Coding Cheatsheets - Learn web development code and tutorials for Software developers which will helps you in project. Get help on JavaScript, PHP, XML, and more.



In SEO, the term "crawl" refers to the way a search engine spider reads the text and follows the links on a site. As the spider robot moves through the pages of a site, it is crawling the site and collecting information for the search engine. During crawling, Google also discovers the links pointing to your website from other sites, and it associates the content and reputation of those linking sites with your own.

Before a search engine can tell you where a file or document is, it must be found. To find information on the hundreds of millions of Web pages that exist, a search engine employs special software robots, called spiders, to build lists of the words found on Web sites. When a spider is building its lists, the process is called Web crawling.
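The crawl step described above can be sketched in a few lines of Python: given a page's HTML, the spider collects the words (for the search engine's word lists) and the links to follow next. The sample page and URL below are hypothetical; a real spider would fetch pages over HTTP and respect robots.txt.

```python
from html.parser import HTMLParser

class SpiderParser(HTMLParser):
    """One crawl step: gather words to index and links to follow."""
    def __init__(self):
        super().__init__()
        self.links = []   # URLs the spider will crawl next
        self.words = []   # words collected for the search index

    def handle_starttag(self, tag, attrs):
        # Follow-up links come from the href of each anchor tag
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # Text content is split into words for the index
        self.words.extend(data.split())

# Hypothetical page content, standing in for a fetched document
page = """<html><body>
<p>Welcome to our site about coding.</p>
<a href="https://example.com/tutorials">Tutorials</a>
</body></html>"""

spider = SpiderParser()
spider.feed(page)
print(spider.links)   # ['https://example.com/tutorials']
print(spider.words)   # ['Welcome', 'to', 'our', ..., 'Tutorials']
```

A full crawler would repeat this step for every URL in `spider.links`, keeping a set of already-visited pages so it never crawls the same page twice.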
