Web Security Practices (5): Overall Tasks and Automated Tools for Web Application Analysis


Author: Xuanhun

Prerequisites: None

Series navigation: http://www.cnblogs.com/xuanhun/archive/2008/10/25/1319523.html

Security Technology Zone: http://space.cnblogs.com/group/group_detail.aspx?gid=100566

Preface

The Web Security Practices series focuses on practical research into, and some programming implementations of, the material in Hacking Exposed: Web Applications (Web Application Security Secrets and Solutions), 2nd Edition. If you are already thoroughly familiar with that book, you can skip this article.

 

Body

Starting today we are officially engaged in application analysis. Please do not complain that it is always analysis rather than attack, with no "real" content. In fact, as long as you analyze carefully, the vulnerabilities will naturally show themselves, and once you know a vulnerability, can the attack be far behind? Today's content is very simple: mainly my personal thoughts and the basic use of several tools.

  I. Application analysis should take a broad view

Analyzing a website cannot rely on luck alone; the real test is not missing any detail. Yet you do not need to start from the details: focus first on identifying the trunk, then analyze it in depth. So what is the overall view of a web application? Its hierarchical structure. Once the layers are understood, we look at the content, files, and functions of each layer, and then move on to analyzing file extensions, languages, forms, and so on. The details will be introduced later.

  II. Layered structure detection methods
    1. Start from the root directory of the website and crawl along the links.
    2. For hidden directories or interfaces, we can only guess, or reason from other clues such as error messages (see the sketch after this list).
    3. For well-known systems, search engines always hold surprises for us.
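
To make item 2 concrete, here is a minimal sketch, not from the original article, that guesses hidden directories by probing a few common names and watching the HTTP status codes; the target host and the candidate names are placeholders chosen for illustration:

    #!/bin/bash
    # Probe a few common directory names and report the HTTP status code for each.
    # Target host and candidate names are hypothetical, for illustration only.
    for dir in admin backup include test upload; do
        code=$(curl -s -o /dev/null -w "%{http_code}" "http://www.example.com/$dir/")
        echo "$dir -> HTTP $code"
    done

A 403 or a redirect to a login page can be just as informative as a 200, while outright 404s can usually be discarded.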

III. The hierarchical structure of the website does not represent the file organization structure on the actual host.

I do not need to elaborate on this point; anyone with some knowledge of how websites are built will understand it. The actual file organization can be mapped to any location in the website hierarchy.

Even so, I am afraid no one is willing to make the mapping that complex when building a website. From a management point of view, simple and clear is always best. And a simple website is a great thing for us analysts.

IV. Automated Tools

The most basic and powerful technique in application analysis is mirroring the entire application to a local copy. For a large web application, crawling and analyzing every directory by hand is unimaginable, and many excellent crawling tools give us a choice. These tools are by nature limited to the overall task: the higher the degree of automation, the lower the intelligence. So I consider them the best tools for mapping the hierarchy (even if the results are not the most comprehensive), but not the first choice for detailed analysis.

(1) lynx

This was one of the earliest text-mode browsers and is a good URL-capture tool. I have not used it much myself, so I will not introduce it in detail here.
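
As a rough illustration (not from the original article), lynx can dump the list of links a page references directly from the command line; the URL below is a placeholder:

    # Dump only the list of URLs the page references, rather than the rendered text.
    lynx -dump -listonly http://www.example.com/ > links.txt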

(2) wget

wget is a command-line tool, available on both Windows and Unix, that can be used to download the contents of a website.

Its purpose is simple: download things. It does not offer result filtering or similar extras, but overall it is still very powerful. Run it with --help and you will see its many parameters.

The -r parameter tells the program to follow every link, which builds a complete directory tree of the crawled website.
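
For example, a minimal mirroring command might look like the following; the target URL, recursion depth, and output directory are placeholders chosen for illustration:

    # Follow links recursively up to 3 levels deep and store the copy under ./mirror
    wget -r -l 3 -P mirror http://www.example.com/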

It can also download the output an application produces for each parameter value.

For example, the same login.asp page may be downloaded multiple times:

    login.asp?id=012568
    login.asp?id=012566
    login.asp?id=012554

 

It also has some advanced options, such as support for proxies and HTTP basic authentication.
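
A sketch of those options follows; the credentials, proxy address, and target URL are placeholders, and exact flag spellings can vary slightly between wget versions:

    # Mirror a site through a local proxy while supplying HTTP basic-auth credentials.
    wget -r --http-user=alice --http-password=secret \
         -e use_proxy=yes -e http_proxy=127.0.0.1:8080 \
         http://www.example.com/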

I will not cover the other parameters for now; it is better to try them out yourself.

(3) Teleport Pro

This is a Windows program with a graphical interface. It provides detailed configuration options and is well suited to beginners.

The disadvantage of this tool is that its output is not easy to search with utilities such as grep and findstr, so analyzing the information we have gathered becomes harder.
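
By contrast, a plain directory mirror (such as one produced by wget -r) can be searched directly; here is a small sketch with placeholder paths:

    # List files in the local mirror that contain a form definition,
    # then list dynamic-page files by extension.
    grep -ril "<form" mirror/
    find mirror/ \( -name "*.asp" -o -name "*.aspx" -o -name "*.jsp" \) -print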

(4) Black Widow

It can serve as a replacement for Teleport Pro. It provides an interface for searching and collecting specified information, supports search tools, includes its own browser, offers many more configurable parameters, and can download files to a specified directory.

(5) Offline Explorer Pro

This is the best software of this kind that we currently use. It supports various authentication protocols, including NTLM, as well as HTTPS access. It also has a nice editing feature.

Making really good use of this software does not seem easy, but we only use it for whole-site downloads. As for further analysis of the information we have gathered, let's talk about that tomorrow, OK? I'm a little tired.
