What are search engine spiders?
A spider is a program that visits websites and reads their pages and other information in order to create entries for a search engine’s index. Each of the major web search engines has such a program, which is also known as a “crawler” or a “robot.” Spiders are typically programmed to visit sites that their owners have submitted as new or updated, and they can selectively visit and index entire sites or specific pages. They are called spiders because they usually visit many sites in parallel, their “legs” spanning a large area of the “web.” A spider can crawl through a site’s pages in several ways, most commonly by following the links it finds on each page it reads.
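To make the idea concrete, here is a minimal sketch of a crawler in Python. It fetches a page, extracts its links, and visits them breadth-first up to a small page limit. The start URL and the limit are placeholder assumptions for illustration; real search engine spiders add much more, such as robots.txt handling, politeness delays, and distributed scheduling.

```python
# Minimal illustrative crawler: fetch a page, collect its links, and visit
# them breadth-first up to a fixed page limit. A real spider is far more
# sophisticated (robots.txt, politeness rules, deduplication, scheduling).
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collect href values from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=10):
    """Visit pages breadth-first; return {url: page_html} for each page read."""
    seen, pages = set(), {}
    queue = deque([start_url])
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load
        pages[url] = html
        parser = LinkParser()
        parser.feed(html)
        # Resolve relative links against the current page and queue them.
        queue.extend(urljoin(url, link) for link in parser.links)
    return pages


if __name__ == "__main__":
    # "https://example.com" is just a placeholder starting point.
    for url in crawl("https://example.com", max_pages=5):
        print(url)
```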
Search engine spiders do more than fetch pages: each page they find is indexed and ranked based on its subject matter and a number of other factors. Every engine’s spider and ranking system runs on its own algorithms, programmed by that engine’s software engineers.
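As a rough sketch of what “indexing and ranking” can mean, the following builds an inverted index (word → pages containing it) and ranks pages for a query by simple term frequency. This is a toy illustration only; the algorithms real engines use are proprietary and consider many more signals, such as links between pages.

```python
# Toy indexing and ranking: map each word to the pages it appears on,
# then score pages for a query by how often its words occur.
import re
from collections import defaultdict


def build_index(pages):
    """pages: {url: text}. Returns {word: {url: occurrence_count}}."""
    index = defaultdict(lambda: defaultdict(int))
    for url, text in pages.items():
        for word in re.findall(r"[a-z0-9]+", text.lower()):
            index[word][url] += 1
    return index


def rank(index, query):
    """Score each page by total occurrences of the query's words."""
    scores = defaultdict(int)
    for word in query.lower().split():
        for url, count in index.get(word, {}).items():
            scores[url] += count
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)


# Example with made-up pages and a made-up query.
pages = {
    "https://example.com/a": "spiders crawl web pages and index their content",
    "https://example.com/b": "a crawler visits pages and follows links",
}
print(rank(build_index(pages), "crawl pages"))
```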