What Is Robots.txt?

Robots.txt is shorthand for the “robots exclusion standard” (also known as the robots exclusion protocol). It is a standard websites use to communicate with web crawlers, telling them which pages on the site they should not crawl. The benefit of this can be found in Google’s guidelines, which state that a robots.txt file should be used “to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines”.
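
As a rough illustration, a minimal robots.txt file might look like the one below. The file is assumed to sit at the root of the site (for example, example.com/robots.txt), and the paths shown are hypothetical:

    # Rules for all crawlers (* matches any user agent)
    User-agent: *
    # Ask crawlers to skip auto-generated search result pages
    Disallow: /search/
    # Everything else remains open to crawling
    Allow: /

Here, the User-agent line names which crawler the rules apply to, each Disallow line lists a path that crawler is asked to avoid, and Allow marks paths that may still be crawled. Note that these directives are advisory: well-behaved crawlers honor them, but the file does not technically block access.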
