1. Why use proxy IPs:
When crawling web pages, some target sites have anti-crawler mechanisms: frequent visits and regular access patterns are detected, and the offending IP is blocked. In that case you can use proxy IPs; when one IP is blocked, switch to another.
2. Proxy IP classification:
Proxy IPs fall into several categories: transparent proxies, anonymous proxies, distorting proxies, and high-anonymity (elite) proxies. For crawling, high-anonymity proxies are generally used.
3. Use RequestConfig.custom().setProxy(proxy).build() to set the proxy IP:
import java.io.IOException;
import org.apache.http.HttpEntity;
import org.apache.http.HttpHost;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public static void main(String[] args) throws ClientProtocolException, IOException {
    // Create an HttpClient instance
    CloseableHttpClient httpClient = HttpClients.createDefault();
    // Create an HttpGet instance
    HttpGet httpGet = new HttpGet("http://www.tuicool.com");
    // Set the proxy IP, the connect timeout, the socket (read) timeout,
    // and the timeout for obtaining a connection from the connection manager
    HttpHost proxy = new HttpHost("122.228.25.97", 8101);
    RequestConfig requestConfig = RequestConfig.custom()
            .setProxy(proxy)
            .setConnectTimeout(100)
            .setSocketTimeout(10000)
            .setConnectionRequestTimeout(10000) // value garbled in the original; 10000 ms assumed
            .build();
    httpGet.setConfig(requestConfig);
    // Set the request header
    httpGet.setHeader("User-Agent",
            "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:45.0) Gecko/20100101 Firefox/45.0");
    CloseableHttpResponse response = httpClient.execute(httpGet);
    if (response != null) {
        HttpEntity entity = response.getEntity(); // get the response entity
        if (entity != null) {
            System.out.println("Page content: " + EntityUtils.toString(entity, "UTF-8"));
        }
        response.close();
    }
    if (httpClient != null) {
        httpClient.close();
    }
}
4. How do we obtain proxy IPs in actual development?
We can use HttpClient to crawl the 20 most recent high-anonymity proxy IPs from http://www.xicidaili.com/ and save them to a list. When the current IP is blocked and we get a connection timeout, we take the next IP out of the list, and so on. When the number of IPs in the list drops below 5, we re-crawl proxy IPs to refill the list.
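The rotation scheme above can be sketched as a small proxy pool. The class name, method names, and refill threshold below are illustrative assumptions, not from the original article; the real refill step would crawl http://www.xicidaili.com/ with HttpClient as described.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Minimal sketch (hypothetical) of the proxy-rotation idea described above.
public class ProxyPool {
    private static final int REFILL_THRESHOLD = 5; // re-crawl when fewer than 5 IPs remain
    private final Deque<String> proxies = new ArrayDeque<>();

    // In the article, refilling means crawling ~20 fresh high-anonymity
    // proxies; here the caller supplies them so the sketch stays offline.
    public void refill(List<String> freshProxies) {
        proxies.addAll(freshProxies);
    }

    public boolean needsRefill() {
        return proxies.size() < REFILL_THRESHOLD;
    }

    // Take the next proxy, e.g. after the current one hits a connect timeout.
    public String next() {
        return proxies.poll(); // null when the pool is empty
    }

    public int size() {
        return proxies.size();
    }

    public static void main(String[] args) {
        ProxyPool pool = new ProxyPool();
        pool.refill(List.of("1.1.1.1:8080", "2.2.2.2:8080", "3.3.3.3:8080"));
        System.out.println(pool.next());        // takes the first proxy
        System.out.println(pool.needsRefill()); // true, since only 2 remain
    }
}
```

On a timeout, the crawler would call next() to switch proxies, and check needsRefill() before each batch of requests to decide when to re-crawl.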
HttpClient (Part 4): using proxy IPs and timeout settings