The robots.txt file is used to tell the bots that crawl a website which pages or sections of the site (or the whole thing) they should not crawl, and to point them to the location of the site's sitemap(s). Strictly speaking, robots.txt controls crawling rather than indexing: a blocked URL can still show up in search results if other sites link to it. For example, you could use robots.txt to tell Googlebot to stay out of the administrative area of your WordPress blog, or to tell Baidu or Yandex to skip your entire website. Care should be taken to ensure you do not accidentally tell Google to skip your entire website.
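As a sketch of how these directives look and how a well-behaved bot interprets them, here is a hypothetical robots.txt (the domain `example.com` and the rules are made up for illustration) parsed with Python's standard-library `urllib.robotparser`, which implements the same matching logic a compliant crawler uses:

```python
from urllib import robotparser

# Hypothetical rules: keep Googlebot out of the WordPress admin area,
# keep Yandex off the entire site, and advertise the sitemap location.
rules = """\
User-agent: Googlebot
Disallow: /wp-admin/

User-agent: Yandex
Disallow: /

Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/"))      # True
print(rp.can_fetch("Yandex", "https://example.com/"))              # False
```

Note that `can_fetch` answers "may this bot crawl this URL?", not "will this URL be indexed?" — which is exactly the distinction discussed below.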
Note well: Bots sometimes ignore the directives in robots.txt. To actually keep a page out of Googlebot's or Bingbot's (or any other bot's) index, the page needs a "noindex" robots meta tag — and the page must remain crawlable, because the bot can only see that tag if it is allowed to fetch the page. For anything that must stay private, the file should also be password-protected. Password protection is the only 100% certain method of keeping search engine bots from indexing a page.
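For reference, the "noindex" robots meta tag mentioned above is a one-line addition inside the page's `<head>`:

```html
<head>
  <!-- Tells compliant crawlers not to include this page in their index -->
  <meta name="robots" content="noindex">
</head>
```

Remember that this only works if the bot is permitted to crawl the page; blocking the same URL in robots.txt would prevent the bot from ever seeing the tag.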