RankingToday.com

Technology for the people

From a position of absolutely no knowledge two years ago, many millions of people are now capable of building a Web site, taking, mixing and re-broadcasting music on-line, not to mention applying artificially intelligent robots to search for information.

Over a thousand people created Freeserve discussion groups in the first three months of its existence. According to a study by the NEC Research Institute in Princeton, New Jersey, the proportion of the Internet indexed by search engines is diminishing rapidly: only about one sixth of the Web is covered by any single engine. Northern Light had the largest proportion of the Web indexed, yet even this was only 16 percent. Last year HotBot was the largest search engine, with 34 percent of the Net covered.

The study found that on average it takes a new site six months to be indexed by a search engine, and suggests that the cost of maintaining ever larger databases is the reason why search engines index relatively little content. According to Steve Lawrence and C. Lee Giles, the authors of the study, 800 million pages of the Web are now searchable. In 1997 that figure was 320 million.

Finding information is getting faster and smarter. The fastest way of finding strange things on the Internet is to use a 'bot'. Netizens don't care that this is an artificially intelligent robot; it delivers the goods. These robots power a number of search engines and are quite intelligent. They are a boon and make information available fast. Such intelligent engines are needed: the old form of indexing sites is finding the Internet's growth hard to manage.
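
To make the idea concrete, here is a minimal sketch of the kind of 'bot' described above: a crawler that fetches a page, extracts its links, and follows them breadth-first. The seed URL, page limit and timeout are illustrative assumptions, not details from the article; real search-engine robots add politeness delays, robots.txt checks and full-text indexing on top of this skeleton.

```python
# Minimal breadth-first web crawler, standard library only.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href target of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed, max_pages=5):
    """Fetch at most max_pages pages, starting at seed, breadth-first."""
    queue, seen, fetched = deque([seed]), {seed}, 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except OSError:
            continue  # dead link: skip it and move on
        fetched += 1
        parser = LinkExtractor()
        parser.feed(html)
        print(f"{url}: {len(parser.links)} links found")
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)


if __name__ == "__main__":
    crawl("https://example.com")  # hypothetical seed URL
```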

Bots can also be used to provide information, misinformation and even to damn your name. Members of the Internet Society can handle this: they are not scared by technology, and they know it is capable of good and bad. Fast information retrieval and the ability to communicate are at the heart of the Internet. The reputation of engines incapable of mastering the Internet sinks in days. They now use multiple technologies and will quickly be on top of the problem of dead links. In many companies, and for a growing number of Internet research organisations, the amount of information available is so big that clever 'thinking' technologies are coming forward to meet the need.

Some are quite simple and are used all the time. For example, natural language searching (as opposed to the application of sense interpretations, of which more later) is used by many search engines and filters out common words such as 'and', 'if' and 'or'. This is automatic on most search engines (HotBot's advanced search gives you a glimpse of how it works). Boolean searching is simple and in wide use (AND, NOT, etc.); using it to search for specifics in newsgroups and listservs is very productive. 'Agents' are the way we all 'open up' all the web pages on a web site, and the reason the process needs so much bandwidth. Fuzzy logic has been around for a long time. It is a form of approximation algorithm, and it is dynamic: in a number of Internet applications it is used at the front end of the search process to capture near-likeness expressions before they are refined by the next phase (see below). I like it. A simple sketch of these techniques follows.
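
The sketch below illustrates the three techniques just named, using only the Python standard library: stop-word filtering for natural language queries, a Boolean AND/NOT match, and a fuzzy front end that catches near-likeness spellings. The stop-word list and document set are illustrative assumptions; real engines use far larger lexicons and inverted indexes.

```python
import difflib

STOP_WORDS = {"and", "if", "or", "the", "a", "of", "to"}  # assumed subset


def strip_stop_words(query):
    """Natural-language filtering: drop common words before matching."""
    return [w for w in query.lower().split() if w not in STOP_WORDS]


def boolean_match(document, required, excluded=()):
    """Boolean search: every 'required' term must appear (AND),
    and no 'excluded' term may appear (NOT)."""
    words = set(document.lower().split())
    return all(t in words for t in required) and not any(t in words for t in excluded)


def fuzzy_terms(term, vocabulary, cutoff=0.8):
    """Fuzzy front end: capture near-likeness spellings of a term
    before the stricter Boolean phase refines the result."""
    return difflib.get_close_matches(term, vocabulary, n=3, cutoff=cutoff)


if __name__ == "__main__":
    docs = ["reputation management on the internet",
            "managing brand reputation offline"]
    terms = strip_stop_words("the reputation of the internet")
    print(terms)  # ['reputation', 'internet']
    print([d for d in docs if boolean_match(d, terms)])  # first doc only
    vocab = sorted({w for d in docs for w in d.split()})
    print(fuzzy_terms("reputaton", vocab))  # catches the misspelling
```

In this arrangement the fuzzy pass widens the net first, and the Boolean pass then narrows it, which mirrors the front-end/refinement ordering described above.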

Article Series

This article is part 8 of a 24-part series. Other articles in this series are shown below:
  1. The Internet Influence
  2. Reputation Management
  3. The Internet Society
  4. How People Use The Internet
  5. The Opinion Formers
  6. A Stakeholder Society
  7. It's Fast
  8. Technology For The People
  9. A Reputation For Responding
  10. Newsgroups, Chat and Cybercast
  11. The Nature of Newsgroups
  12. Chat Overtaking Newsgroups
  13. Cybercasting
  14. The Internet Communities
  15. Neighbourhood Communities
  16. Company Communities
  17. Community Currency
  18. The Effect Of Virtual Communities On The Bottom Line
  19. Political Communities
  20. Cyber Marketers
  21. Global Branding
  22. Accessibility
  23. Information and Content
  24. Cyberbrand Outreach Accessibility