
What is the search engine spider definition?

A search engine spider is a software robot that crawls the web to store webpages and their contents in search engine indexes. A webpage can appear in SERPs only if it has first been indexed by a search engine spider.

Search engine spiders follow links within and across websites to discover or revisit webpages. For a new site, its homepage URL can be submitted to a search engine so that the spider visits and indexes it.
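To illustrate the link-following behaviour described above, here is a minimal sketch in Python of a spider that visits pages and queues the links it finds. The start URL, page limit and helper names are hypothetical examples, not the method used by any actual search engine.

# Minimal sketch of link-following crawling; the start URL and the
# max_pages limit are illustrative assumptions.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href values of <a> tags found on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    """Visit pages breadth-first, following the links found on each page."""
    to_visit = [start_url]
    seen = set()
    while to_visit and len(seen) < max_pages:
        url = to_visit.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that cannot be fetched
        parser = LinkExtractor()
        parser.feed(html)
        # Resolve relative links and queue them for a later visit
        to_visit.extend(urljoin(url, link) for link in parser.links)
    return seen

# Example: crawl("https://www.example.com/") returns the set of URLs
# visited; a real search engine would then index their contents.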

To index billions of webpages, search engines run many spiders that crawl the web in parallel.

Webmasters can use a robots.txt file placed at the root of the website to tell spiders how to crawl the site and to ban access to some of its sections. A sitemap may also help indexing.
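As a sketch of how a well-behaved spider honours a robots.txt file, the snippet below uses Python's standard urllib.robotparser module. The site URL, the "ExampleBot" user agent and the banned directory are hypothetical examples.

# A robots.txt placed at the site root such as:
#   User-agent: *
#   Disallow: /private/
# would ban all spiders from the /private/ section of the site.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # download and parse the robots.txt file at the site root

# A well-behaved spider checks each URL before crawling it
if rp.can_fetch("ExampleBot", "https://www.example.com/private/page.html"):
    print("Allowed to crawl this page")
else:
    print("robots.txt bans access to this page")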

Spiders have difficulty indexing Flash-based content and content generated through JavaScript, and they cannot access content placed behind an identification process (an intranet, for instance).

Search engine spiders are also called search engine bots or crawlers.

 
Published on Sunday 28 July 2013, updated on Wednesday 31 July 2013.