Before this, part of my job was to log into a server and copy down a string of text. A very simple operation, but it had to be repeated about 50 times, each round taking about three minutes, so a single pass cost two hours. So I wrote this tool to fetch the data automatically.
The tool does three things: log in, download, and extract.
For the login part: because the server uses Windows authentication, the tool has to simulate a login before it can read the page data.
First, capture the HTTP traffic with Fiddler and look for this string in the request headers:

Authorization: Basic QWRtaW5pc3RyYXRvcjptYW5hZ2U=
Base64-decoding it gives: Administrator:manage.
That is the user name and password joined by a colon, with the whole pair Base64-encoded. "Authorization: Basic" is an HTTP authentication scheme, and the header is usually set via setRequestProperty.
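To see how that header value is produced, here is a minimal sketch using the JDK's built-in java.util.Base64, with the credentials recovered above:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthHeader {
    public static void main(String[] args) {
        // Encode "username:password" to produce the Basic auth header value.
        String credentials = "Administrator:manage";
        String header = "Basic " + Base64.getEncoder()
                .encodeToString(credentials.getBytes(StandardCharsets.UTF_8));
        System.out.println(header);  // Basic QWRtaW5pc3RyYXRvcjptYW5hZ2U=

        // Decoding the captured string recovers the credentials.
        String decoded = new String(
                Base64.getDecoder().decode("QWRtaW5pc3RyYXRvcjptYW5hZ2U="),
                StandardCharsets.UTF_8);
        System.out.println(decoded);  // Administrator:manage
    }
}
```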
After logging in, the page content can be fetched directly and the data extracted from it. Finally, a loop over all the server addresses is added at the outermost layer, so a single run retrieves the data from every server at once.
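The extraction step is plain string slicing: repeatedly find the marker "package-info.dsp?package=", skip past it, and read up to the closing quote. A minimal sketch on a made-up HTML fragment (the fragment and package names are illustrative, not actual server output):

```java
public class ExtractPackages {
    // Collect every value that follows the marker, up to the closing quote.
    static String extract(String html) {
        final String marker = "package-info.dsp?package=";
        StringBuilder result = new StringBuilder();
        int pos;
        while ((pos = html.indexOf(marker)) != -1) {
            html = html.substring(pos + marker.length());      // skip past the marker
            result.append(html, 0, html.indexOf("\"")).append(",");
        }
        return result.toString();
    }

    public static void main(String[] args) {
        String sample = "<a href=\"package-info.dsp?package=WmPublic\">"
                      + "<a href=\"package-info.dsp?package=WmRoot\">";
        System.out.println(extract(sample));  // WmPublic,WmRoot,
    }
}
```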
Finally, the full code:
import java.io.*;
import java.net.*;

public class GetPackageFromWeb {
    public static void main(String[] args) throws Exception {
        String[] servers = {"192.168.0.144:23342", "192.168.0.144:23343"};
        StringBuilder result = new StringBuilder();
        for (int i = 0; i < servers.length; i++) {
            String packageList = getWebPage("http://" + servers[i] + "/WmRoot/package-list.dsp");
            // Each package link looks like: package-info.dsp?package=NAME"
            while (packageList.indexOf("package-info.dsp?package=") != -1) {
                // Skip past the 25-character marker, then read up to the closing quote.
                packageList = packageList.substring(
                        packageList.indexOf("package-info.dsp?package=") + 25);
                String tempPackage = packageList.substring(0, packageList.indexOf("\""));
                result.append(tempPackage).append(",");
            }
            result.append("\n");  // one line of CSV per server
        }
        BufferedWriter bufferedWriter = new BufferedWriter(new FileWriter("C:\\result.csv"));
        bufferedWriter.write(result.toString());
        bufferedWriter.close();
    }

    public static String getWebPage(String addr) throws Exception {
        URL url = new URL(addr);
        HttpURLConnection httpUrlCon = (HttpURLConnection) url.openConnection();
        // Basic auth header captured with Fiddler ("Administrator:manage" in Base64).
        String author = "Basic QWRtaW5pc3RyYXRvcjptYW5hZ2U=";
        httpUrlCon.setRequestProperty("Authorization", author);
        httpUrlCon.connect();
        BufferedReader bufRead = new BufferedReader(
                new InputStreamReader(httpUrlCon.getInputStream(), "UTF-8"));
        StringBuffer strBuf = new StringBuffer();
        String line;
        while ((line = bufRead.readLine()) != null) {
            strBuf.append(line);
        }
        return strBuf.toString();
    }
}
And that is my record of a simple web data crawl.