Robots.txt file

Robots.txt is a plain text file placed at the root of a website to communicate with bots such as search engine crawlers. It contains sets of instructions that specify which folders, pages, and/or file types a crawler is not allowed to access. However, robots.txt is not an access-control mechanism: URLs it disallows can still be discovered and indexed when other pages link to them.
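
As a minimal sketch, a robots.txt file that blocks all crawlers from two hypothetical directories (the paths here are placeholders) while leaving the rest of the site open might read:

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

The "User-agent" line names which crawlers the rules apply to ("*" means all of them), and each "Disallow" line lists a path prefix those crawlers should not fetch.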

Robots.txt files can keep search engines from crawling duplicate content, low-quality or thin pages, and less important pages, so that a site's crawl budget is spent on the pages that matter most.
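
Well-behaved crawlers check these rules before fetching a URL. As a rough illustration of how a bot might do this, the following Python sketch uses the standard library's urllib.robotparser (the example.com URLs are placeholders, not a real site):

    from urllib.robotparser import RobotFileParser

    # Point the parser at the site's robots.txt and download it.
    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # Ask whether a given user agent may fetch a given URL.
    # Returns False if the path matches a Disallow rule for that agent.
    print(rp.can_fetch("*", "https://example.com/admin/settings"))
    print(rp.can_fetch("*", "https://example.com/blog/"))

A crawler would run a check like this for every URL in its queue and skip any that come back disallowed.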