Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol (robots such as Googlebot crawl through websites with the intention of indexing them in Google Search results). If a robot wants to visit a website URL, say http://www.example.com/welcome.html, it first checks for http://www.example.com/robots.txt. This file contains instructions telling robots which parts of the site they may or may not crawl.
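This check can be sketched with Python's standard-library `urllib.robotparser` module. The robots.txt content below is a hypothetical example (the rules and URLs are assumptions for illustration, not taken from a real site):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: allow everything except /private/
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
# parse() takes the file's lines; in a real crawler you would instead call
# parser.set_url("http://www.example.com/robots.txt") followed by parser.read()
parser.parse(robots_txt.splitlines())

# can_fetch(useragent, url) answers: may this robot visit this URL?
print(parser.can_fetch("Googlebot", "http://www.example.com/welcome.html"))       # True
print(parser.can_fetch("Googlebot", "http://www.example.com/private/data.html"))  # False
```

A well-behaved crawler performs this check before every request; note that robots.txt is advisory, so it only restrains robots that choose to honor it.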