What is Robots.txt?

Robots.txt is a plain text file, placed at the root of a website (e.g. `https://example.com/robots.txt`), that communicates with the web crawlers (robots) used by search engines. Before crawling a site, a well-behaved crawler fetches this file and follows its rules, which state which parts of the site the crawler may or may not visit. The file controls crawling, not ranking: it does not tell search engines how relevant a page is to a user's search, only whether crawlers are allowed to read that page at all.
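For illustration, a minimal robots.txt might look like the sketch below (the paths and sitemap URL are hypothetical examples, not required values):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/      # keep crawlers out of the admin area
Allow: /admin/help/    # but allow this public subsection

# Stricter rules for one specific crawler
User-agent: ExampleBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` line starts a group of rules for the named crawler (`*` means all crawlers), and `Disallow`/`Allow` lines list URL path prefixes that group may not or may crawl.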
