Bots (also called crawlers or spiders) are automated programs that search engines use to discover new websites and new content, following sitemaps, links, and other signals to find, crawl, and index your pages so they can be shown in search results. Bots can be allowed or blocked using a robots.txt file placed at the root of your site.
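
As a sketch, a simple robots.txt (served at the site root, e.g. https://example.com/robots.txt) might look like the following; the paths, bot name, and sitemap URL are placeholders for illustration:

```
# Apply to all bots: allow everything except the /admin/ path
User-agent: *
Disallow: /admin/

# Block one specific bot from the whole site ("BadBot" is an example name)
User-agent: BadBot
Disallow: /

# Tell crawlers where to find your sitemap (placeholder URL)
Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt is a request, not an enforcement mechanism: well-behaved crawlers honor it, but it does not prevent access by bots that choose to ignore it.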