Présentation1 CI 2003
Post on 05-Apr-2018
TRANSCRIPT
-
7/31/2019 Présentation1 CI 2003
1/13
Competitive intelligence and the Web
-
Definition
The process of ethically collecting, analyzing and disseminating accurate, relevant, specific, timely, foresighted and actionable intelligence regarding the implications of the business environment, competitors and the organization itself.
[SCIP, 2003]
-
An effective CI project is a continuous cycle made up of several steps:
-
The reasons for the increased use of Internet information in the CI process include:
1. A business Web site contains a variety of information.
2. This information is, for the most part, free.
3. Access to open sources does not require proprietary software, such as access to multiple commercial databases.
-
Web search
A Web search engine usually consists of the following components:
1. Web crawlers, or spiders, collect Web pages using graph-search techniques.
2. An indexing method indexes the collected Web pages and stores the indices in a database.
3. Retrieval and ranking methods retrieve search results from the database and present ranked results to users.
4. A user interface allows users to submit queries and browse the results.
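The four components above can be sketched in a few lines of Python. The link graph, page texts and scoring rule below are illustrative stand-ins, not any real engine's design; a real engine would fetch pages over HTTP and use far richer ranking.

```python
from collections import defaultdict

# 1. "Crawler": breadth-first traversal of a link graph (here, a dict).
def crawl(link_graph, start):
    seen, queue, order = {start}, [start], []
    while queue:
        page = queue.pop(0)
        order.append(page)
        for nxt in link_graph.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return order

# 2. Indexer: build an inverted index mapping each word to the pages containing it.
def build_index(pages):
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

# 3./4. Retrieval and ranking: score pages by how many query terms they contain.
def search(index, query):
    scores = defaultdict(int)
    for word in query.lower().split():
        for url in index.get(word, ()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)

pages = {"a": "competitive intelligence web", "b": "web search engine"}
index = build_index(pages)
print(search(index, "web intelligence"))  # "a" matches both terms, so it ranks first
```

The same breadth-first traversal, index and ranked retrieval appear, vastly scaled up, in any production engine.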
-
The most common method for gathering information from the Web is the use of search engines.
Examples are:
AltaVista [http://www.altavista.com, current September 1, 2003],
Infoseek [http://www.infoseek.com, current September 1, 2003],
Yahoo! [http://www.yahoo.com, current September 1, 2003] and
Google [http://www.google.com, current September 1, 2003].
-
The meta-search engine
When a meta-search engine receives a query, it connects to several popular search engines and integrates the results returned by those engines. Meta-search engines do not keep their own indexes; in effect, they use the indices created by the search engines being searched to respond to the query. In this type of search, if a computer receives a request it cannot fulfill, the request is passed on to a neighboring computer.
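The merge step described above can be sketched as follows. The two "engines" here are stand-in functions returning fixed ranked lists, and the rank-based scoring is one illustrative choice among many; a real meta-search engine would query live services.

```python
from collections import defaultdict

def engine_a(query):
    return ["page1", "page2", "page3"]  # pretend ranked results

def engine_b(query):
    return ["page2", "page4"]

def metasearch(query, engines):
    # Rank-based merge: an earlier rank in any engine earns a higher score.
    # Note that no local index is kept; only the engines' results are used.
    scores = defaultdict(float)
    for engine in engines:
        for rank, url in enumerate(engine(query)):
            scores[url] += 1.0 / (rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

print(metasearch("competitive intelligence", [engine_a, engine_b]))
# "page2" appears near the top of both lists, so it is ranked first
```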
-
INFORMATION ANALYSIS
It becomes necessary to control the search. Control can be achieved by controlling the graph-search techniques. Controlling the search is, in effect, a rudimentary analysis of the information being retrieved: the search should return only those Web pages that are relevant to the query. This initial form of analysis is referred to
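One simple way to control the graph search as described is to expand only pages whose text matches the query, pruning irrelevant branches. The link graph, page texts and single-keyword relevance test below are made-up stand-ins for illustration.

```python
def focused_crawl(link_graph, texts, start, query_terms):
    relevant, frontier, visited = [], [start], {start}
    while frontier:
        page = frontier.pop(0)
        text = texts.get(page, "").lower()
        if any(term in text for term in query_terms):
            relevant.append(page)
            # Only relevant pages have their out-links expanded,
            # so irrelevant regions of the graph are never visited.
            for nxt in link_graph.get(page, []):
                if nxt not in visited:
                    visited.add(nxt)
                    frontier.append(nxt)
    return relevant

graph = {"home": ["ci", "sports"], "ci": ["scip"], "sports": ["scores"]}
texts = {"home": "competitive intelligence portal",
         "ci": "competitive intelligence methods",
         "sports": "match results", "scip": "intelligence society"}
print(focused_crawl(graph, texts, "home", ["intelligence"]))
# the "sports" branch is pruned; its out-links are never followed
```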
-
Web mining
Web Content Mining refines the basic search technique.
Web Structure Mining uses the logical network model of the Web to determine the importance of a Web page.
Web Usage Mining performs data mining on Web logs. A Web log contains clickstream data, which can be analyzed to provide information about the use of the Web server or the behavior of the client, depending upon which clickstream is being analyzed.
Inaccuracies are a significant problem when the Web is used as an information source for a CI project. The issue is information verification.
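A well-known instance of structure mining of the kind described above is link analysis in the style of PageRank. The following is a minimal power-iteration sketch over a toy link graph, not any particular engine's implementation.

```python
def pagerank(links, damping=0.85, iters=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        # Each page gets a teleport share, plus a share of the rank
        # of every page that links to it.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# "a" is linked to by both other pages, so it should rank highest.
links = {"a": ["b"], "b": ["a"], "c": ["a"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # -> a
```

The importance of a page thus falls out of the link structure alone, without inspecting page content.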
-
Information verification
In assessing the accuracy of the information, it is useful to ask the following questions:
Who is the author?
Who maintains (publishes) the Web site?
How current is the Web page?
-
Information security
Information security is a concern. These concerns include:
1. assuring the privacy and integrity of private information;
2. assuring the accuracy of public information;
3. avoiding unintentionally revealing information that ought to be private.
The first of these concerns can be managed through the usual computer- and network-security methods.
The second concern requires some use of Internet security methods (Web defacing, Web-page hijacking, cognitive hacking...).
-
Solutions to the limitations of the use of the Web in the CI process
Development of methods that improve the efficiency and accuracy of text mining for information analysis.
Automating the process of information verification of Web sources in general, and surface-Web sources in particular.
Development of methods for improving security, including the automatic detection of false information, inaccurate information, and negative information.
-
Conclusion