From a position of absolutely no knowledge two years ago, the number of people now capable of building a Web site, or of taking, mixing and re-broadcasting music on-line, not to mention applying artificially intelligent robots to search for information, runs to many millions.
Over a thousand people created Freeserve discussion groups in the first three months of its existence. According to a study by the NEC Research Institute in Princeton, New Jersey, the extent of the Internet indexed by search engines is diminishing rapidly: only about one sixth of the Web is covered. Northern Light had the largest proportion of the Web indexed, yet even this was only 16 percent. Last year HotBot was the largest search engine, with 34 percent of the Net covered.
The study found that on average it takes six months for a new site to be indexed by a search engine, and suggests that the cost of maintaining ever-larger databases is the reason why search engines have indexed relatively little of the available content. According to Steve Lawrence and C. Lee Giles, the authors of the study, 800 million pages of the Web are now searchable. In 1997 that figure was 320 million.
Finding information is getting faster and smarter. The fastest way of finding obscure things on the Internet is to use a 'bot'. Netizens don't care that this is an artificially intelligent robot; it delivers the goods. These robots power a number of search engines and are quite intelligent. They are a boon and make information available fast. Such intelligent engines are needed: the old form of indexing sites is finding Internet growth hard to manage.
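At heart, these bots are programs that fetch a page, pull out the links it contains, and move on to the pages those links point to. A minimal sketch of that core step, using Python's standard html.parser (the page content here is an invented stand-in for a document a real bot would download over HTTP):

```python
from html.parser import HTMLParser

class LinkBot(HTMLParser):
    """Collects the href of every <a> tag -- the step a crawling 'bot'
    repeats as it walks from page to page building an index."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A stand-in for a fetched page; a real bot would retrieve this over HTTP.
page = '<html><body><a href="/news">News</a> <a href="/search">Search</a></body></html>'
bot = LinkBot()
bot.feed(page)
print(bot.links)  # the URLs the bot would visit next
```

A real search-engine bot adds a queue of pages still to visit and a record of pages already seen, but the fetch-and-extract loop above is the essence of it.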
Bots can also be used to provide information, misinformation and even to damn your name. Members of the Internet Society can handle this. They are not scared by technology, and they know it is capable of good and bad. Fast information retrieval and the ability to communicate are at the heart of the Internet. The reputation of an engine incapable of mastering the Internet sinks in days. Engines now use multiple technologies and will quickly be on top of the problem of dead links. In many companies, and for a growing number of Internet research organisations, the amount of information available is so big that clever 'thinking' technologies are coming to the fore.
Some are quite simple and are used all the time. For example, natural-language searching (as opposed to the application of sense interpretations – more later) is used by many search engines and filters out common words such as 'and', 'if', 'or' etc. This is automatic on most search engines (HotBot's advanced search gives you a glimpse of how it works). Boolean searching is simple and in wide use (AND, NOT etc.); using it to search for specifics in newsgroups and listservs is very productive. 'Agents' are the way we all 'open up' all the web pages on a web site, and the reason the process needs so much bandwidth. Fuzzy logic has been around for a long time. It is a form of approximation algorithm, and it is dynamic. In a number of Internet applications it is used at the front end of the search process to capture near-likeness expressions before the search is refined by the next phase (see below). I like it.
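A rough sketch of the first and last of these ideas – stripping common words from a query, and fuzzy matching of near-likeness expressions such as misspellings – might look like this. The stop-word list and vocabulary are illustrative only, not any real engine's behaviour; the fuzzy step uses the edit-similarity matcher in Python's standard difflib:

```python
import difflib

# A tiny illustrative stop-word list; real engines use much larger ones.
STOP_WORDS = {"and", "if", "or", "the", "a", "of", "etc"}

def strip_stop_words(query):
    """Natural-language search step: drop common words that carry no meaning."""
    return [w for w in query.lower().split() if w not in STOP_WORDS]

def fuzzy_match(term, vocabulary, cutoff=0.7):
    """Fuzzy step: capture near-likeness expressions before refinement."""
    return difflib.get_close_matches(term, vocabulary, n=3, cutoff=cutoff)

print(strip_stop_words("the applications of motor car engines"))
# -> ['applications', 'motor', 'car', 'engines']
print(fuzzy_match("enginee", ["engine", "engines", "margin"]))
# -> ['engine', 'engines']
```

The surviving terms are what the engine actually looks up; the fuzzy pass widens the net before the later Bayesian phase narrows it again.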
There are also products that identify the effect of outcomes from PR activities (as well as other forms of communication) on corporate drivers such as sales price, volume, margins etc. An example, which I am trying to get the PR industry to use, is Cognos's 4Thought.
Neural nets identify to what extent a number of factors influence each other. They were developed for tasks such as modelling the outcome of nuclear explosions without actually conducting one. Then there are applications of Bayesian logic, which asks: if the logical answer does not make sense, to what extent does it not make sense, and is it thereby true or untrue? It is these latter two processes that are used most effectively in advanced knowledge-management software. In this way the computer 'learns' from the behaviour of the operator and readjusts the words that are to be used next time a search is made. It is an iterative process and ideal for big volumes of research. The continual refinement takes the engine off down ever-narrower searches, which is great if you don't want people to spend hours on trivia but not much good if you are looking for the incidence of a subject across the whole Internet.
These Bayesian processes are excellent for managing large amounts of information, such as searching the Internet for everything to do with a subject, e.g. 'the applications of motor car engines'. They will quickly refine the search down to 'in a Ford Escort MkIII' if that is the direction the researcher takes, narrowing knowledge acquired from terabytes of information to manageable proportions.
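A toy sketch of that iterative 'learning from the operator' can be built from naive Bayes style log-odds weights: documents the researcher keeps boost their words, documents discarded penalise theirs, and the next round of results is ranked accordingly. All documents and feedback below are invented for illustration; this is the flavour of the technique, not any vendor's implementation:

```python
import math
from collections import Counter

def train(relevant_docs, irrelevant_docs):
    """Learn per-word log-odds of relevance from the researcher's
    feedback, with Laplace (add-one) smoothing."""
    rel = Counter(w for d in relevant_docs for w in d.split())
    irr = Counter(w for d in irrelevant_docs for w in d.split())
    vocab = set(rel) | set(irr)
    n_rel, n_irr, v = sum(rel.values()), sum(irr.values()), len(vocab)
    return {w: math.log((rel[w] + 1) / (n_rel + v))
             - math.log((irr[w] + 1) / (n_irr + v)) for w in vocab}

def score(doc, weights):
    """Log-odds that a new document matches the researcher's direction."""
    return sum(weights.get(w, 0.0) for w in doc.split())

# Invented feedback: the researcher kept Ford Escort pages, discarded others.
relevant = ["ford escort mkiii engine tuning", "ford escort engine parts"]
irrelevant = ["jet engine maintenance", "steam engine history"]
weights = train(relevant, irrelevant)

docs = ["ford escort engine manual", "marine engine oil"]
ranked = sorted(docs, key=lambda d: score(d, weights), reverse=True)
print(ranked[0])  # -> 'ford escort engine manual'
```

Each round of feedback re-trains the weights, which is exactly how the search narrows from 'motor car engines' in general towards the Ford Escort in particular.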
These technologies have other applications. Software like 'Electra' provides semi-intelligent interactive responses. You can hold conversations with this robot, which masquerades behind a graphic of a pretty girl: ask questions and make statements in plain language and get a plain-language response.
Some commercial Web sites have become very interactive, with humanoid bots answering questions from prospects and customers. Responding to enquirers by having a person access a database is now commonplace, and the new programmes are even more helpful in collating data (including intelligent data mining). These artificial but humanoid bots are becoming essential for a good Web presence and an effective Internet reputation, because they can put a 'human' face on the acquisition of information and, in addition, can interpret what the netizen is asking for. Now robots are shaping opinion too, with all that may entail for the reputation manager.
Fully functioning broadcast sound and video is with us; virtual reality is near; cell phones with Internet access will be a great millennium Christmas present; voice-mail (a spoken message sent like e-mail) is available; and dynamic Web pages, with moving pictures and pages that browsing netizens can make themselves, and fully Web-enabled applications (even writing press releases without a word-processing package on your computer) are now becoming usual. I use a fully Web-enabled package every day for monitoring and knowledge management.
And, with the advent of interactive TV, the range of opportunities grows. Cable and Wireless announced in August 1999 that it is to launch a new TV-based Internet service for its customers in the UK, very much of the new genre. The service will allow users to access the Internet, e-mail and on-line shopping facilities via remote control while simultaneously interacting with television broadcasts. The cable-based service offers access to 15 major UK Web sites including Tesco, Barclays, British Airways and Teletext, with plans to provide access to 100 of the Net's top sites by the end of the month. The company already has a subscriber base of 10,000 from a pilot digital TV campaign in Manchester. As we shall see, this is an on-line marketer's dream. The amount of technology now deployed on the Internet is mind-bending in its volume and capability. Being late onto the Internet means we all have to catch up, but the technology is moving away from us fast.