The robots.txt file tells search engine crawlers which parts of a website they should and should not visit. It contains a list of paths the site owner asks crawlers not to crawl or index; these directives are advisory, so well-behaved crawlers honour them but nothing technically enforces them. An example of a page you would not want crawled by search engines is one where sensitive account data is held.
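A minimal robots.txt illustrating this might look like the following sketch (the `/account/` and `/admin/` paths are hypothetical examples, not taken from any real site):

```
# Applies to all crawlers
User-agent: *
# Ask crawlers not to visit these paths
Disallow: /account/
Disallow: /admin/
```

Note that the file itself is publicly readable at the site root (e.g. `https://example.com/robots.txt`), so listing a path here only discourages crawling; it does not protect the content behind it.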