Using Firefox to Attack Web 2.0 Applications (2)

III. Challenges Faced by Web 2.0 Security Assessment
In the asynchronous model of Web 2.0, web applications no longer need to refresh and redirect pages frequently. As a result, many of the server-side resources an application uses are hidden from view. This creates three important challenges for security researchers trying to understand a web application:
(1) Discovering hidden calls: it is difficult to detect, promptly and accurately, the XHR calls issued by a page once the browser has loaded it.
(2) Crawling: traditional web crawlers cannot reproduce all of the page behavior that takes place inside the browser. When a page uses JavaScript and an XHR object to fetch resources from the server, a traditional crawler never sees those resources.
(3) Discovering call logic: web applications are driven by JavaScript, and it is hard to isolate the call logic behind a specific event. A typical HTML page loads three or four JavaScript files from the server, each containing many functions, so when an event fires it is difficult to determine which functions are actually invoked.
We need methods and tools that overcome these obstacles during a web application security assessment. This article shows how the Firefox browser and a few of its plug-ins can be used to address these challenges.
3.1 Discovering Hidden Calls
A Web 2.0 application may download only a single page from the server, yet build the final page through several XHR calls that asynchronously fetch resources or JavaScript from the server. The challenge is therefore to identify those XHR calls and the resources they pull down, because those resources are where security researchers can find vulnerabilities. Let's start with a simple example.
Suppose you visit hxxp://example.com/news.aspx to read the day's business news, as shown in Figure 2.

Figure 2 A simple news display page
As a Web 2.0 application, it sends Ajax calls to the server through XHR objects. We can identify all of these XHR calls with Firebug, a plug-in for the Firefox browser. Before browsing the page, enable the "Show XMLHttpRequests" option, as shown in Figure 3.

Figure 3 Configuring Firebug to record XMLHttpRequest calls
With XMLHttpRequest interception enabled, we can browse the page again. Every XHR call sent to the server is now detected and recorded by Firebug, as shown in Figure 4.

Figure 4 Ajax call record
We can see that the browser sends several requests through XHR: it downloads the Dojo Ajax framework from the server and then issues a call to fetch the content of a news article.
Looking at the page source more closely, we find the following JavaScript code:
function getnews()
{
    var http;
    http = new XMLHttpRequest();
    http.open("GET", "getnews.aspx?date=09262006", true);
    http.onreadystatechange = function()
    {
        if (http.readyState == 4) {
            var response = http.responseText;
            document.getElementById('result').innerHTML = response;
        }
    };
    http.send(null);
}
This code sends an asynchronous call to the web server, requesting the resource:
getnews.aspx?date=09262006
The server executes this ASPX page and returns the output to the browser, where it is placed inside the element whose ID is result. This is a typical example of an Ajax call made through an XHR object.
By analyzing the page with Firebug, we can record every XHR call it makes and so discover internal URLs, query strings, and POST requests that may contain security vulnerabilities. Taking the code above as an example: if the value of the date parameter is not handled properly on the server, a SQL injection vulnerability may exist.
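As a quick illustration of how such a finding can be followed up (this sketch is not from the original article), the same XHR pattern can be reused from the browser's JavaScript console to replay the discovered request with a test value and inspect the raw response. The probeGetNews() helper name and the single-quote payload are illustrative assumptions; the endpoint and parameter come from the code above.
// Illustrative sketch: replay the internal URL discovered with Firebug, varying
// the date parameter, and examine the raw response for database error messages.
// probeGetNews() is a hypothetical helper name; the endpoint comes from the page.
function probeGetNews(dateValue)
{
    var http = new XMLHttpRequest();
    http.open("GET", "getnews.aspx?date=" + encodeURIComponent(dateValue), true);
    http.onreadystatechange = function()
    {
        if (http.readyState == 4) {
            // A SQL error string or an unexpected change in the content suggests
            // the parameter is not handled safely on the server.
            console.log(http.status, http.responseText);
        }
    };
    http.send(null);
}
probeGetNews("09262006");   // the original value seen in Firebug
probeGetNews("09262006'");  // the same value with a trailing quote, a common first probe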
3.2 Web Crawler Problems and Browser Simulation
A web crawler is an important tool for assessing the security of a web application: it crawls every page of the application and collects links. In a Web 2.0 application, however, links often point to JavaScript functions, which in turn fetch new page content through XHR objects. A traditional crawler loses this information. For example, consider the following simple set of links:
Go1
Go2
Go3
When the "go1" link is clicked, The getme () function is executed. The code of the getme () function is as follows. This function may be implemented in a separate Javascript file.
function getme()
{
    var http;
    http = new XMLHttpRequest();
    http.open("GET", "hi.html", true);
    http.onreadystatechange = function()
    {
        if (http.readyState == 4) {
            var response = http.responseText;
            document.getElementById('result').innerHTML = response;
        }
    };
    http.send(null);
}
When this code runs, it sends an HTTP GET request for the resource hi.html on the server.
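To see why a traditional crawler never requests hi.html, consider the following rough sketch (not from the original article) of what such a crawler effectively does: it scans the static markup for href targets. Because hi.html appears only inside the getme() function, it never shows up in that list.
// Illustrative sketch: collect only the URLs written directly into the HTML,
// roughly what a traditional crawler extracts from this page.
var html = document.documentElement.innerHTML;   // the page's current markup
var staticLinks = [];
var re = /href\s*=\s*["']([^"']+)["']/gi;
var m;
while ((m = re.exec(html)) !== null) {
    staticLinks.push(m[1]);                      // only statically declared link targets
}
console.log(staticLinks);                        // "hi.html" is not among them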
Can we simulate these click events automatically? The Firefox plug-in Chickenfoot makes this possible: it provides a JavaScript-based API and extends the browser with a programmable interface.
With Chickenfoot, you can write simple JavaScript to automate browser behavior, so tasks such as crawling a web page can be carried out automatically. For example, the short script below simulates click events on every link in the page. This approach has a clear advantage over a traditional crawler: each onclick event actually fires its XHR call and produces a real response. A traditional crawler can only try to parse the JavaScript and guess at possible links, which is no substitute for the response of the real onclick event.
l = find('link');
for (i = 0; i < l.count; i++) {
    click(l[i]);
}
Load this script into the Chickenfoot console and run it; the result is shown in Figure 5.

Figure 5 Use Chickenfoot to simulate an onclick event
In this way, you can write JavaScript scripts inside the Firefox browser to assess the security of Ajax-based web applications.
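A slightly larger pass might combine page loading, link clicking, and form interaction in one script. The sketch below is an assumption of how that could look, reusing the loop from the snippet above together with Chickenfoot's go(), find(), click(), and enter() commands; the URL and the form field patterns are placeholders. Run it from the Chickenfoot console with Firebug's "Show XMLHttpRequests" option enabled, so that every XHR triggered by the simulated actions is logged.
// Load the application under assessment (placeholder URL).
go("http://example.com/news.aspx");
// Exercise every link on the page, as in the snippet above, so that any
// onclick-driven XHR calls are issued and recorded by Firebug.
l = find('link');
for (i = 0; i < l.count; i++) {
    click(l[i]);
}
// Exercise a form as well: type a value into a textbox and submit it, so an
// XHR issued by the form's JavaScript handler also appears in Firebug's log.
// "search" and "search button" are hypothetical Chickenfoot keyword patterns.
enter("search", "test");
click("search button");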
