Web attack log analysis guide
Web applications often face suspicious activity for different reasons: a script kiddie running an automated vulnerability scanner against the site, someone fuzzing a parameter to find SQL injection, and so on. In many such cases, you need to analyze the logs on the web server to understand what is happening. In severe cases, a forensic investigation may be required.
In addition, there are other scenarios.
As an administrator, it is really important to understand how to analyze logs from a security perspective.
People who have just started hacking/penetration testing must understand why they should never test or scan websites without permission.
This article covers the basics of log analysis needed to handle the scenarios above.
0x01 preparation
For demonstration purposes, I made the following settings.
Apache server
Pre-installed in Kali Linux
Run the following command to enable this function:
service apache2 start
MySQL
Pre-installed in Kali Linux
Run the following command to enable this function:
service mysql start
Vulnerable web applications built using PHP-MySQL
I developed a vulnerable web application in PHP and deployed it on the Apache/MySQL setup mentioned above. I then ran some automated tools (ZAP and w3af) in Kali Linux against the URLs of this vulnerable application. Now let's look at different situations in log analysis.
0x02 log records in the Apache service
It is recommended to maintain web server logs for different reasons.
The default location of Apache access logs on Debian-based systems is /var/log/apache2/access.log.
Logging by itself only stores events on the server; we still need to analyze the logs to draw useful conclusions. In the next section, we will see how to analyze Apache access logs to find out whether there are any attack attempts against the site.
Analyze logs
Manual check
When the log is small, or when we are looking for a specific keyword, we can take the time to examine it manually with tools such as grep.
For example, we can search for all requests containing the keyword "union" in the URL. Finding a request such as "union select 1,2,3,4,5" in a URL makes it obvious that someone with the IP address 192.168.56.105 attempted SQL injection. Similarly, we can search for any other keywords of interest.
Likewise, we can search for requests trying to read "/etc/passwd", which are obvious local file inclusion attempts. In our logs there are many such attempts, all sent from the IP address 127.0.0.1; these requests were generated by automated tools.
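These manual checks can be reproduced with grep. Below is a minimal sketch that builds a tiny sample log first (the entries, IPs, and timestamps are illustrative; on a real server you would point grep at /var/log/apache2/access.log):

```shell
# Build a small illustrative access log (sample entries, not real traffic)
cat > /tmp/sample_access.log <<'EOF'
192.168.56.105 - - [10/Oct/2016:13:55:36 +0000] "GET /index.php?id=1 union select 1,2,3,4,5 HTTP/1.1" 200 512
127.0.0.1 - - [10/Oct/2016:13:56:01 +0000] "GET /view.php?page=../../../../etc/passwd HTTP/1.1" 200 1024
10.0.0.7 - - [10/Oct/2016:13:57:12 +0000] "GET /about.html HTTP/1.1" 200 2048
EOF

# Case-insensitive search for SQL injection keywords
grep -i "union" /tmp/sample_access.log

# Search for local file inclusion attempts
grep "/etc/passwd" /tmp/sample_access.log
```

Each grep prints the matching log lines, so the offending client IPs are visible immediately.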
In many cases it is easy to tell whether requests were sent by an automated scanner. Automated scanners are noisy and send large numbers of known attack payloads when testing an application.
For example, IBM AppScan uses the word "appscan" in many of its payloads, so seeing such requests in the log tells you what is happening.
Microsoft Excel is also a handy tool for opening and analyzing log files: import the log using a space as the delimiter. This is useful when no dedicated log-analysis tool is available.
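As a lightweight alternative to Excel, awk also splits the log on spaces. A sketch (field numbers assume the Apache common/combined log format; the sample file is illustrative):

```shell
# Illustrative sample log in common log format
cat > /tmp/sample2.log <<'EOF'
192.168.56.105 - - [10/Oct/2016:13:55:36 +0000] "GET /a.php HTTP/1.1" 200 512
192.168.56.105 - - [10/Oct/2016:13:55:40 +0000] "GET /b.php HTTP/1.1" 404 512
127.0.0.1 - - [10/Oct/2016:13:56:01 +0000] "GET /c.php HTTP/1.1" 200 1024
EOF

# Count requests per client IP ($1) - a quick "pivot table" on the command line
awk '{n[$1]++} END {for (ip in n) print n[ip], ip}' /tmp/sample2.log | sort -rn
```

The most active clients appear at the top, which is often enough to spot a noisy scanner.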
Besides keywords, basic knowledge of HTTP status codes is very important during analysis: 1xx codes are informational, 2xx indicate success, 3xx redirection, 4xx client errors (such as 403 Forbidden or 404 Not Found), and 5xx server errors.
0x03 Web shells
Webshells are another threat to websites and servers. A webshell can give an attacker full control of the server, and in some cases it can be used to access all other sites hosted on the same server.
The same access.log file can be opened in Microsoft Excel; here I applied a filter to the column listing the files requested by clients.
Observing closely, we can see that a file named "b374k.php" was accessed. "b374k" is a popular webshell, so this file is highly suspicious. The corresponding status code "200" indicates that someone uploaded the webshell and accessed it successfully.
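A quick way to hunt for well-known webshell names in the access log is to combine grep with an awk filter on the status code. A sketch (the name list and sample entries are illustrative; $9 is the status code field in the common log format):

```shell
# Illustrative sample log entries
cat > /tmp/sample3.log <<'EOF'
192.168.56.105 - - [10/Oct/2016:14:01:00 +0000] "GET /uploads/b374k.php HTTP/1.1" 200 3072
192.168.56.105 - - [10/Oct/2016:14:01:05 +0000] "GET /uploads/c99.php HTTP/1.1" 404 512
EOF

# Match common webshell names, then keep only successful (200) responses
grep -Ei "b374k|c99|r57|wso" /tmp/sample3.log | awk '$9 == 200'
```

Only the b374k.php request survives the filter here, since the c99.php request returned 404.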
A webshell uploaded to a server does not always keep its original name; in many cases attackers rename it to avoid suspicion. We have to be smart and check whether the accessed files are regular files or look unusual, and further inspect the file type and timestamps of anything suspicious.
One single quote for the win
It is a well-known fact that SQL injection is one of the most common vulnerabilities in web applications, and most beginners in web application security start by learning it. Identifying a traditional SQL injection is easy: append a single quote to a URL parameter and see whether it breaks the query.
Anything we pass to the server will be recorded and traced.
The following shows an access-log entry recording a single quote passed to the parameter "user" while testing for SQL injection.
%27 is the URL encoding of the single quote.
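To confirm what an encoded payload decodes to, Python's standard urllib can be called from the shell. A sketch (assumes python3 is available; the parameter name "user" is taken from the example above):

```shell
# Decode the percent-encoded payload seen in the log
python3 -c "from urllib.parse import unquote; print(unquote('user=%27'))"
# prints: user='
```

This is handy when a log line contains long runs of percent-encoding that are hard to read by eye.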
For administration purposes, we can also enable query monitoring to see which queries were executed against the database.
If we look closely, we can see that the query triggered by the request shown earlier is being executed, with the single quote passed into the parameter "user".
0x04 use automated tools for analysis
When there are large numbers of logs, manual inspection becomes difficult. In that scenario, in addition to some manual checks, we can use automated tools.
Although there are many efficient commercial tools, I would like to introduce a free tool called "Scalp".
According to its official page, Scalp is a log analyzer for the Apache server designed to find security problems. The main idea is to walk through large log files and extract possible attacks sent via HTTP/GET.
Scalp is a Python script, so Python must be installed on the machine.
The tool's help output is shown below.
As we can see, the log file to analyze is supplied with the -l flag, and a filter file with the -f flag, which Scalp uses to identify possible attacks in the access.log file.
We can use the filters in the PHPIDS project to detect any malicious attempts.
The following code block is an excerpt from that filter file.
<filter>
  <id>12</id>
  <rule><![CDATA[(?:etc\/\W*passwd)]]></rule>
  <description>Detects etc/passwd inclusion attempts</description>
  <tags>
    <tag>dt</tag>
    <tag>id</tag>
    <tag>lfi</tag>
  </tags>
  <impact>5</impact>
</filter>
It is a rule set defined with XML tags to detect different kinds of attacks. The snippet above is an example rule that detects file inclusion attempts; other rules detect other attack types in the same way.
After downloading the filter file, put it in the same folder as Scalp.
Run the following command to analyze the logs with Scalp:
python scalp-0.4.py -l /var/log/apache2/access.log -f filter.xml -o output --html
"output" is the directory where the report is saved; if it does not exist, Scalp creates it automatically. The --html flag generates the report in HTML format. As we can see, Scalp reports that it analyzed 4001 of 4024 lines and found 296 attack patterns.
Running the preceding command generates a report in the output directory; we can open it in a browser to view the results. The output below shows a small part of the detected directory traversal attempts.
Log records in MySQL
This section describes how to analyze and monitor attacks in databases.
The first step is to check which variables are set; we can do this with "show variables;" as shown below.
The output of the preceding command is displayed.
As we can see, general logging is enabled here (the default value is OFF).
Another important setting is "log_output": here it is set to write results to a file, though it can also write to tables.
We can also see that "log_slow_queries" is ON (the default value is OFF).
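The checks above can be run from the MySQL prompt. A sketch (variable names follow the MySQL 5.x era described here; changing globals requires sufficient privileges, and newer versions use slow_query_log instead of log_slow_queries):

```sql
-- List logging-related server variables
SHOW VARIABLES LIKE '%log%';

-- Enable the general query log at runtime and write it to a file
SET GLOBAL general_log = 'ON';
SET GLOBAL log_output = 'FILE';
```

Setting these at runtime avoids a server restart, but they revert on restart unless also placed in the configuration file.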
MySQL query monitoring
Generally, the general query log records established client connections and the statements received from clients. As mentioned earlier, it is disabled by default because it reduces performance. It can be enabled from the MySQL terminal or by editing the MySQL configuration file, as shown below.
I am using the vim editor to open the "my.cnf" file located in the /etc/mysql directory.
Scrolling down, we can see the logging section we enabled. These logs are written to a file called "mysql.log".
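The relevant section of my.cnf looks roughly like this. A sketch only: option names vary by MySQL version (older releases of the kind described here used a bare `log = ...` option, while MySQL 5.1+ uses general_log/general_log_file), so check your server's documentation:

```ini
# /etc/mysql/my.cnf (excerpt) - enable the general query log.
# Warning: this log is a performance killer; enable it only while troubleshooting.
[mysqld]
general_log_file = /var/log/mysql/mysql.log
general_log      = 1
```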
We can also see the warning that this log type is a performance killer.
The Administrator usually uses this function to troubleshoot the problem.
We can also see the slow query log setting, "log_slow_queries", which records queries that take a long time to execute.
Now everything is ready. If someone hits the database with malicious queries, we can observe them in these logs, as follows:
It shows that a query hit the database named "webservice" and tried to bypass authentication using SQL injection.
0x05 more log records
By default, Apache logs only the request line and headers, so POST body data does not appear in the access log. To record POST data, we can use an Apache module called "mod_dumpio".
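A minimal sketch of enabling mod_dumpio on Apache 2.4 (directive names differ on Apache 2.2, which used DumpIOLogLevel instead of the per-module LogLevel; enable the module first, e.g. with a2enmod dump_io on Debian):

```apacheconf
# In the Apache configuration, after mod_dumpio is loaded:
LogLevel dumpio:trace7
DumpIOInput On
DumpIOOutput On
```

Note that mod_dumpio writes the dumped request and response data to the error log, not to access.log, and at trace7 it is extremely verbose, so it should only be enabled temporarily.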