Installing and using tcpdump to count HTTP requests on Ubuntu



Installation

Installing tcpdump from source takes a few steps:

1. Download the libpcap and tcpdump source tarballs from:

http://www.tcpdump.org/

2. Install the packages required for C compilation:

The code is as follows:

apt-get install build-essential

3. Install the libpcap prerequisites:

The code is as follows:

apt-get install flex
apt-get install bison

4. Install libpcap.

tcpdump cannot be used without this library.

The code is as follows:

tar xvfz libpcap-1.2.1.tar.gz    # decompress

Change into the extracted directory, then run:

The code is as follows:

./configure     # generate the Makefile
make            # compile
make install    # install

The library files are installed in /usr/lib by default, and the header files in /usr/include.

5. Install tcpdump

The code is as follows:

tar xvfz tcpdump-4.2.1.tar.gz    # decompress

Change into the extracted directory, then run:

The code is as follows:

./configure     # generate the Makefile
make            # compile
make install    # install

To test whether the installation succeeded, run tcpdump on the command line; if it starts printing network traffic, tcpdump is working.

6. Problems you may encounter:

The code is as follows:

$ tcpdump

tcpdump: no suitable device found

Reason: network capture requires root permissions. Switch to the root user and tcpdump will work normally.

Using tcpdump to count HTTP requests

The HTTP request statistics referred to here are the QPS (requests per second) and the 10 most-visited URLs. Normally such statistics are computed from the web site's access logs. But when you land on an unfamiliar server and need to see the top 10 URLs immediately, for example to judge whether the site is under attack, tcpdump is much simpler: you do not need to know where the site logs live, or whether logging is even enabled. Capture the current HTTP packets directly with tcpdump, then filter them to get the statistics you want. This feature has been integrated into ezhttp.

The statistical method is described below.

1. Capture 10 seconds of packets.

The code is as follows:

tcpdump -i eth0 'tcp[20:2]=0x4745 or tcp[20:2]=0x504f' -w /tmp/tcp.cap -s 0 2>&1 &

sleep 10

kill `ps aux | grep tcpdump | grep -v grep | awk '{print $2}'`

This command monitors the network interface eth0 and captures TCP packets whose bytes 21-22 (the first two payload bytes, assuming a 20-byte TCP header) are "GE" or "PO", i.e. packets that begin a GET or POST request, and writes them to the file /tmp/tcp.cap.
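The magic constants in the filter are just ASCII codes: 0x4745 is "GE" and 0x504f is "PO". A quick sketch confirms the mapping (`to_hex` is a hypothetical helper written for this illustration, not part of tcpdump):

```shell
# to_hex is a hypothetical helper: it packs a two-character ASCII string
# into the hex value that the tcp[20:2] filter compares against.
to_hex() {
  printf '%02x%02x\n' "'${1:0:1}" "'${1:1:1}"
}

to_hex GE   # prints 4745 -> the filter's 0x4745 (start of "GET")
to_hex PO   # prints 504f -> the filter's 0x504f (start of "POST")
```

The same trick gives the constant for any other method prefix, e.g. "HE" for HEAD requests.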

2. We now have the last 10 seconds of traffic as a binary capture file. The next step is to extract the GET/POST URL and Host of each request with the strings command.

The code is as follows:

strings /tmp/tcp.cap | grep -E "GET /|POST /|Host:" | grep --no-group-separator -B 1 "Host:" | grep --no-group-separator -A 1 -E "GET /|POST /" | awk '{url=$2; getline; host=$2; printf("%s%s\n", host, url)}' > /tmp/url.txt

This command is the key step of this article: strings prints all printable character sequences in the binary file /tmp/tcp.cap, grep and awk then filter out the HTTP requests, and the resulting URLs (domain name + URI) are written to /tmp/url.txt.
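The awk part pairs each request line with the Host: header that follows it. A small sketch on made-up sample input (example.com and the paths are invented, not from a real capture) shows the pairing:

```shell
# Simulated output of the grep filters: each GET/POST line is immediately
# followed by its Host: header (sample data only).
pairs=$(printf 'GET /index.html HTTP/1.1\nHost: example.com\nPOST /login HTTP/1.1\nHost: example.com\n' |
  awk '{url=$2; getline; host=$2; printf("%s%s\n", host, url)}')
echo "$pairs"
# example.com/index.html
# example.com/login
```

For every request line, $2 is the URI; getline advances to the Host: line, whose $2 is the domain; printf joins them into one URL per request.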

3. We now have roughly 10 seconds' worth of accessed URLs, and the statistics follow easily. For example:

Compute the QPS:

The code is as follows:

((qps=$(wc -l /tmp/url.txt | cut -d ' ' -f 1)/10))
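As a sanity check, the same arithmetic can be run against a throwaway file (the 50 sample lines below are invented):

```shell
# Generate 50 fake URL lines, then compute QPS over a 10-second window
# exactly as the command above does.
tmpfile=$(mktemp)
for i in $(seq 1 50); do echo "example.com/page$i"; done > "$tmpfile"
((qps=$(wc -l "$tmpfile" | cut -d ' ' -f 1)/10))
echo "$qps"   # 50 requests / 10 seconds = 5
rm -f "$tmpfile"
```

Note that bash `((...))` arithmetic is integer-only, so fractional QPS values are truncated.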

List the top 10 most-visited URLs, excluding static files:

The code is as follows:

grep -v -i -E "\.(gif|png|jpg|jpeg|ico|js|swf|css)" /tmp/url.txt | sort | uniq -c | sort -nr | head -n 10
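A sketch on invented data shows how this pipeline behaves: static assets are dropped by the grep, and uniq -c / sort -nr rank the remaining URLs by frequency.

```shell
# Hypothetical captured URLs, including static assets that the filter removes.
printf '%s\n' \
  example.com/api/list example.com/api/list example.com/api/list \
  example.com/index.html example.com/index.html \
  example.com/logo.png example.com/style.css > /tmp/demo_url.txt

grep -v -i -E "\.(gif|png|jpg|jpeg|ico|js|swf|css)" /tmp/demo_url.txt |
  sort | uniq -c | sort -nr | head -n 10
# api/list (3 hits) ranks first, index.html (2 hits) second;
# logo.png and style.css are filtered out.
```

Remember that uniq -c only counts adjacent duplicates, which is why the sort before it is required.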
