Penetration Testing: Information Reconnaissance

Source: Internet
Author: User
Tags: ftp, naming convention, svn, versions, web services, subdomain

Penetration testing, part one: information reconnaissance (source: freebuf)

1. Analyze the content and functionality of the target website

(1) First determine which language the site is written in, and whether several languages are mixed. You can work this out by viewing the site's source files, observing the site's links, capturing submitted requests, and so on.

(2) Crawl the site's directories. Using tools to spider the site supports the previous step and makes its results more accurate. Archive the crawl results and, if possible, determine whether the site is built on a common off-the-shelf program; if it is, record which one, then proceed to the next step.

(3) Based on the crawl results from the previous step, run brute-force directory detection against the site root or key directories. If the site uses a common program, work out whether it has been customized (secondary development); if it is a non-standard program, look for key directories and files among the detected paths.

Detailed testing method for this step:

1. Request some file or directory names that are unlikely to exist, then request directories and filenames known from the crawl, to learn how the server handles requests for invalid resources.

2. Use the site crawl results as the basis for this step's brute-force directory detection, scanning critical directories or all of them (a minimal sketch of such a scanner appears after this list).

3. Determine how the server responds when a file cannot be found, and use keyword matching on those responses to distinguish valid resources from invalid ones.

4. Collect the results of the brute-force scan and manually verify that the detected directories are valid.

5. Repeat the above steps to uncover more and more critical directories and files.
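The brute-force detection described above is easy to script. The sketch below is a minimal, hypothetical example: it assumes a Python environment with the third-party requests package, a made-up wordlist file dirs.txt, and it separates valid from invalid resources by status code plus a keyword taken from a known "not found" response, as step 3 suggests.

    import requests  # third-party HTTP library, assumed to be installed

    TARGET = "http://www.example.com"     # hypothetical target site
    WORDLIST = "dirs.txt"                 # hypothetical wordlist of directory/file names
    NOT_FOUND_KEYWORD = "Page not found"  # keyword taken from a known invalid response (step 3)

    def looks_valid(path):
        # Request a candidate path and judge it by status code and the "not found" keyword.
        r = requests.get(f"{TARGET}/{path}", timeout=5, allow_redirects=False)
        return r.status_code in (200, 301, 302, 403) and NOT_FOUND_KEYWORD not in r.text

    with open(WORDLIST) as f:
        for line in f:
            candidate = line.strip()
            if candidate and looks_valid(candidate):
                print("possible resource:", f"{TARGET}/{candidate}")

Any hits reported by a script like this should still be verified by hand, as step 4 notes.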

(4) Through the above steps, build a complete picture of the website's directory structure and enumerate all directory names, file names, and file extensions. Study how the developers name things, work out their naming rules, and use them to infer further directories and filenames.

Detailed testing method for this step:

1. Review the full list of filenames and work out the logic behind their naming. For example, if you find files such as addnews.php and viewnews.php, it is worth trying editnews.php and delnews.php. Often just a few filenames are enough to reveal a developer's naming habits. Depending on personal style, a developer may use verbose names (addnewuser.php), concise names (adduser.php), abbreviated names (addusr.php), or more cryptic names (addu.php). Understanding the naming convention in use helps you guess the exact names of content you have not yet found (a minimal sketch of this kind of candidate generation appears after this list).

2. Some naming schemes use numbers or dates as identifiers, which makes hidden content easy to infer. Static pages often use this kind of naming.

For example, on the Pkav.net team blog, image filenames in articles were not renamed and followed a date-plus-incrementing-number scheme: the images in an article published on December 12, 2012 were 1.jpg, 2.jpg, 3.jpg, so their paths were /2012-12-12/1.jpg, /2012-12-12/2.jpg, /2012-12-12/3.jpg. When a password-protected article was posted on the blog and only team members knew the password, an attacker could still guess the exact image addresses from the naming rule used for published articles, and thereby learn the general content of the article from its images.

3. Check all client-side code, such as HTML and JS, for clues about the server side, hidden form fields, and so on (see the second sketch after this list). Comments in particular can bring surprises: some common programs put a link to the admin backend on the home page, but because the site administrator does not want ordinary visitors to see it, the link is commented out, and reading the HTML source reveals the exact address. Many admin backends also keep the link addresses of all their function modules in JS, hiding them only after checking the current user's permissions, so reading the JS directly exposes them. Some developers even record sensitive information in comments: I have repeatedly obtained database names, and sometimes full database connection details and SQL query statements, from comments alone.

4. Take what has already been enumerated and try the same guesses elsewhere.

If the file a.php exists in the /111/ directory, try whether the same file exists in /222/. Also try all enumerated filenames with common alternate suffixes: if index.php is known to exist, try txt, bak, src, inc, and tmp variants such as index.txt and index.bak, or append to the original suffix, as in index.php.bak. This can turn up precompiled versions, development versions, or backups of these files. Suffixes can also be inferred from the language the site uses, for example .cs source files behind an ASP.NET site or .java files behind a JSP site.

5. Search for files left behind by developer tools and text editors, such as SVN's .svn/entries, the .bak files produced by the automatic backup feature of editors like UltraEdit, the widely used .tmp suffix, and leftovers such as index.php~1. These details can all yield important clues, so do not skip them in your tests.

6. Automate the above steps: extract the stems of all known filenames, directories, and suffixes, and run batch detection against every directory.

7. If the above steps have revealed a consistent naming scheme, use that convention to test across the entire site.

8. Keep repeating the above steps to gather more key information; how far you take it depends on time and your own imagination.
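As a minimal illustration of points 1, 2, 4, 6, and 7, the sketch below generates candidate URLs from a few observed names. The verb list, suffix list, and paths are assumptions made up for the example, not taken from any real site; the article's addnews.php/viewnews.php and date-based image paths are reused as seeds.

    from datetime import date

    observed = ["addnews.php", "viewnews.php"]        # filenames actually found on the site
    verbs = ["add", "view", "edit", "del", "update"]  # hypothetical verbs to substitute
    alt_suffixes = ["txt", "bak", "src", "inc", "tmp"]

    candidates = set()

    for name in observed:
        stem, ext = name.rsplit(".", 1)
        # Points 1 and 7: swap the leading verb to guess sibling scripts (editnews.php, delnews.php, ...)
        for known in verbs:
            if stem.startswith(known):
                noun = stem[len(known):]
                for v in verbs:
                    candidates.add(f"{v}{noun}.{ext}")
        # Point 4: alternate suffixes and appended suffixes (addnews.bak, addnews.php.bak, ...)
        for s in alt_suffixes:
            candidates.add(f"{stem}.{s}")
            candidates.add(f"{name}.{s}")

    # Point 2: date-plus-counter paths for static content, e.g. /2012-12-12/1.jpg
    d = date(2012, 12, 12)
    for i in range(1, 6):
        candidates.add(f"/{d.isoformat()}/{i}.jpg")

    for c in sorted(candidates):
        print(c)

The generated candidates would then be fed to the same kind of probing loop shown earlier.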
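For point 3, the comments and script references in a page can be pulled out automatically. The following second sketch is a simple, assumed approach using only Python's standard library and a regular expression for HTML comments; the target URL is a placeholder.

    import re
    import urllib.request

    URL = "http://www.example.com/"   # hypothetical target page

    html = urllib.request.urlopen(URL, timeout=10).read().decode("utf-8", errors="replace")

    # HTML comments sometimes hide admin links, database names, or test notes.
    for comment in re.findall(r"<!--(.*?)-->", html, re.S):
        print("comment:", comment.strip()[:200])

    # Referenced JS files frequently list backend module URLs that are only hidden client-side.
    for script in re.findall(r'<script[^>]+src=["\']([^"\']+)["\']', html, re.I):
        print("script:", script)

Each discovered JS file is worth downloading and reading in the same way.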

(5) Use public information, such as search engines and cached site snapshots, as well as documentation published by the developers of the program the site uses, to gather more information about the target site.

1. Use several different search engines and their cached snapshots to obtain the target site's index and historical content.

2. Use basic search-engine query operators, such as:

site:www.hao123.com (returns everything the search engine has crawled from the target site). site:www.hao123.com keyword (returns all indexed pages of the target site containing the keyword; useful keywords include "site backend", "admin backend", "change password", "password recovery", and so on). site:www.hao123.com inurl:admin.php (returns all pages of the target site whose address contains admin.php; admin.php, manage.php, or other keywords can be used to locate key function pages). link:www.hao123.com (returns all pages that link to the target site, including its developers' personal blogs, development logs, third-party companies, partners, and so on). related:www.hao123.com (returns pages "similar" to the target site, which may reveal which generic program it is based on, and so on).

3. Do not limit yourself to web search; try image search, news search, and other functions to locate specific information.

4. Look for key information in search-engine snapshots. Program error messages can leak the site's physical paths, and some snapshots preserve test-stage content: for example, a backend function module may have been crawled before authentication checks were added to every page, and even after the site adds those checks, the search engine's snapshot still retains the exposed content.

5. Use search engines to discover the target site's subdomains and thus more functionality; many sites, for instance, use an admin subdomain for their management backend, such as admin.hao123.com (a minimal subdomain-guessing sketch follows).
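Beyond search engines, common subdomain names can simply be resolved directly. The sketch below is an assumed, minimal approach using Python's standard socket module and a small hypothetical name list; any name that resolves is worth visiting.

    import socket

    DOMAIN = "hao123.com"                                    # target domain from the example above
    COMMON = ["www", "admin", "mail", "dev", "test", "bbs"]  # hypothetical common subdomain names

    for name in COMMON:
        host = f"{name}.{DOMAIN}"
        try:
            ip = socket.gethostbyname(host)
            print(f"{host} -> {ip}")
        except socket.gaierror:
            # Name does not resolve; skip it.
            pass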

(6) Collect information about the people behind the site, such as its developers and its administration and maintenance staff, from across the Internet.


1. List the names, email addresses, and other contact details of everyone who develops and maintains the site, gathered from the site's contact pages, from comments in HTML or JS, and from content pages.

2. Using the advanced search techniques described above, find everything these people have posted on the Internet about the target site and analyze it alongside the information you already have. For example, using this method I once gathered information on a developer of a large domestic website and found that he had put the source code of every page he had developed on a publicly accessible site, free to download, including the site's database connection details and other key content, which made it easy for me to gain access to that large website.

Penetration testing, part one: information reconnaissance (source: hack-test)

Use WHOIS lookup sites, DNS sites, or search engines such as Google to extract information; use social engineering to obtain information; use scanning software to map the target's network environment and open ports and to identify its operating system, databases, and web services; and use various vulnerability scanners to find flaws in network devices, system databases, web services, and so on.

We need as much information about the target site as possible (PS: in my view, during a penetration test this stage matters more than the actual exploitation phase). It includes:

1. DNS records (A, NS, TXT, MX): whois, http://searchdns.netcraft.com, nslookup (see the DNS lookup sketch after this list)

2. Web server type (IIS, Apache, Tomcat)

3. Domain registrant's information (the company holding the domain, etc.)

4. Name, telephone number, email, and address of the target site's administrator and related personnel

5. Script types supported by the target site (PHP, ASP, JSP, ASP.NET, CFM): whatweb

6. Operating system of the target site (Unix, Linux, Windows, Solaris)

7. Ports open on the target site: nmap -sS -A -O -P0 -sV target
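The DNS records in point 1 can be pulled programmatically as well as with nslookup. The sketch below assumes the third-party dnspython package (dns.resolver, version 2.x); the domain is just the placeholder used earlier in this article.

    import dns.resolver   # third-party package "dnspython" (2.x), assumed installed

    DOMAIN = "hao123.com"  # placeholder target domain

    for rtype in ("A", "NS", "TXT", "MX"):
        try:
            answers = dns.resolver.resolve(DOMAIN, rtype)
            for rr in answers:
                print(rtype, rr.to_text())
        except Exception as exc:
            # The record type may simply not exist for this domain.
            print(rtype, "lookup failed:", exc)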

Some Metasploit auxiliary scanner modules available in BT5 (BackTrack 5), useful for analyzing common services:

auxiliary/scanner/ip/ipidseq (IP ID sequence scan)

auxiliary/scanner/smb/smb_version

auxiliary/scanner/mssql/mssql_ping

auxiliary/scanner/mysql/

auxiliary/scanner/ssh/ssh_version

auxiliary/scanner/ftp/ftp_version

auxiliary/scanner/ftp/anonymous

auxiliary/scanner/snmp/snmp_login

auxiliary/scanner/smb/smb_login

auxiliary/scanner/vnc/vnc_none_auth

auxiliary/scanner/x11/open_x11 (/pentest/sniffers/xspy/)
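These modules are all run from msfconsole in the same general way; the short session below shows the pattern with auxiliary/scanner/smb/smb_version, using a placeholder target range for RHOSTS.

    msf > use auxiliary/scanner/smb/smb_version
    msf auxiliary(smb_version) > set RHOSTS 192.168.1.0/24
    msf auxiliary(smb_version) > set THREADS 10
    msf auxiliary(smb_version) > run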

This article is from the "No Mark" blog; please be sure to keep this source: http://hucwuhen.blog.51cto.com/6253667/1288043
