Website robots.txt Detection Tool Parsero

Source: Internet
Author: User
Tags: kali linux
robots.txt is a text file located in the root directory of a website, and it is the first file a search engine requests when visiting the site. When a crawler accesses a site, it first checks whether robots.txt exists in the root directory. If the file exists, the crawler restricts its visit to the scope defined in it; if the file does not exist, the crawler can reach every page on the site that is not password protected. To keep search engines away from important pages, site operators list those directories in robots.txt. Probing this file can therefore reveal valuable information about the site.

Kali Linux provides a small tool, Parsero, that retrieves the robots.txt file of a specified website and checks whether the listed entries are actually accessible. Note that this tool is not installed by default and must be installed manually with the apt-get command.
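The check Parsero performs can be illustrated with a minimal Python sketch: fetch robots.txt from the target, extract the Disallow entries, and request each one to see whether it is actually reachable. This is only an illustration of the idea, not Parsero's own code; the target host below is a placeholder.

#!/usr/bin/env python3
# Minimal sketch: fetch a site's robots.txt, list its Disallow paths,
# and report the HTTP status of each listed path.
import urllib.request
import urllib.error

TARGET = "http://www.example.com"  # placeholder, replace with the site to test

def fetch(url):
    # Return (status_code, body) for a URL; None status on connection failure.
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status, resp.read().decode(errors="replace")
    except urllib.error.HTTPError as err:
        return err.code, ""
    except urllib.error.URLError:
        return None, ""

def main():
    status, body = fetch(TARGET + "/robots.txt")
    if status != 200:
        print("robots.txt not available (status: %s)" % status)
        return

    # Collect the paths the site asks crawlers to stay away from.
    disallowed = [
        line.split(":", 1)[1].strip()
        for line in body.splitlines()
        if line.lower().startswith("disallow:") and line.split(":", 1)[1].strip()
    ]

    # Check whether each "hidden" path is actually accessible.
    for path in disallowed:
        code, _ = fetch(TARGET + path)
        print("%s%s -> %s" % (TARGET, path, code))

if __name__ == "__main__":
    main()

Parsero automates exactly this kind of enumeration and reports which disallowed entries still respond, which is why the robots.txt file is a useful starting point when assessing a website.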
