Website robots.txt probing tool: Parsero

robots.txt is a text file placed in the root directory of a website, and it is the first file a search engine looks at when visiting a site. When a crawler accesses a site, it first checks whether robots.txt exists in the site's root directory. If it does, the crawler restricts its visit to the scope the file permits; if the file does not exist, the crawler can access every page on the site that is not password-protected. Because sites list directories they want to keep search engines away from in robots.txt, probing this file can itself reveal important information about the site. Kali Linux provides a small tool, Parsero, which reads the robots.txt file of a specified website and checks whether each listed entry is actually accessible. Note: this tool must be installed manually with apt-get.
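The idea behind such a prober can be sketched in a few lines of Python. This is a minimal illustration of the technique, not Parsero's actual code, and the helper names (`parse_disallows`, `probe`) are hypothetical: download robots.txt, collect the Disallow entries, then request each path to see whether it is really reachable.

```python
# Sketch of robots.txt probing (hypothetical helpers, not Parsero itself):
# extract Disallow paths, then check the HTTP status of each one.
from urllib.parse import urljoin
from urllib.request import urlopen


def parse_disallows(robots_txt: str) -> list[str]:
    """Return the paths listed in Disallow directives of a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()      # drop trailing comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:                              # an empty Disallow allows everything
                paths.append(path)
    return paths


def probe(base_url: str) -> dict[str, int]:
    """Fetch robots.txt from base_url and report each disallowed path's status."""
    robots = urlopen(urljoin(base_url, "/robots.txt")).read().decode()
    results = {}
    for path in parse_disallows(robots):
        try:
            results[path] = urlopen(urljoin(base_url, path)).status
        except Exception as exc:                  # HTTPError carries the 4xx/5xx code
            results[path] = getattr(exc, "code", 0)
    return results
```

Calling `probe("https://example.com")` would return a mapping such as `{"/admin/": 200, "/private/": 403}`; any "hidden" directory that still answers 200 is exactly the kind of leak this probing is meant to find.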