Some time ago I looked into how to capture a website's content. wget is a good tool for this: it is installed by default on many UNIX operating systems, and a Windows build (wget for Windows) is also available; a quick Google search will turn it up.
In the past two days I needed to write a program that captures web images: it analyzes a page's source code to obtain the links in it, follows those links to gather image information, and then downloads images of a specified size.
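The link-gathering step can be illustrated with a minimal sketch. The start page URL and the regular expression below are illustrative assumptions, not the original program's exact logic:

using System;
using System.Net;
using System.Text.RegularExpressions;

class ImgLinkSketch
{
    static void Main()
    {
        using (WebClient client = new WebClient())
        {
            // Download the page source (assumed start page).
            string html = client.DownloadString("http://example.com/");
            // Collect the src attribute of every <img> tag in the source.
            foreach (Match m in Regex.Matches(html, "<img[^>]+src=[\"']([^\"']+)[\"']", RegexOptions.IgnoreCase))
            {
                Console.WriteLine(m.Groups[1].Value); // candidate image URL to download
            }
        }
    }
}

A real crawler would also resolve relative URLs and check each image's size before downloading, as described above.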
Here is a piece of code that fetches one image and writes it to the browser:
private void PrintImg(string baseUrl, string paramId)
{
    this.Response.Clear();
    string strUrl2 = baseUrl + "icon/" + paramId + ".gif";
    System.Net.HttpWebRequest hwReq = (System.Net.HttpWebRequest)System.Net.HttpWebRequest.Create(strUrl2);
    System.Net.HttpWebResponse hwRep = (System.Net.HttpWebResponse)hwReq.GetResponse();
    System.Drawing.Image bmp = System.Drawing.Image.FromStream(hwRep.GetResponseStream());
    System.IO.MemoryStream ms = new System.IO.MemoryStream();
    bmp.Save(ms, System.Drawing.Imaging.ImageFormat.Png); // re-encode the fetched .gif as PNG
    Response.ClearContent(); // the HTTP headers must be modified to output the image data
    Response.ContentType = "image/png"; // must match the format the image was saved in above
    Response.BinaryWrite(ms.ToArray());
    this.Response.End();
}
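For context, a method like this would typically be called from an ASP.NET page event. In the hypothetical usage sketch below, the query-string parameter name and base URL are assumptions:

protected void Page_Load(object sender, System.EventArgs e)
{
    // Serve the image when the page is requested as e.g. Icon.aspx?id=123.
    string id = Request.QueryString["id"]; // assumed parameter name
    if (!string.IsNullOrEmpty(id))
    {
        PrintImg("http://example.com/", id); // assumed base URL
    }
}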
However, when fetching a remote image through a virtual path like this, an exception may be thrown if the network is slow or the connection fails. So wrap the GetResponse call in a try/catch:
System.Net.HttpWebResponse hwRep = null;
try
{
    hwRep = (System.Net.HttpWebResponse)hwReq.GetResponse();
}
catch (System.Net.WebException ex)
{
    // Catch the specific failure and preserve the original cause.
    throw new ApplicationException("network exception", ex);
}
Obtaining the HttpWebResponse object this way can be slow, so be aware of this behavior and design your application around it.
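One practical arrangement is to bound how long the request may block. A minimal sketch, assuming hwReq is the HttpWebRequest created above (the values are illustrative):

hwReq.Timeout = 5000;          // max milliseconds GetResponse may block
hwReq.ReadWriteTimeout = 5000; // max milliseconds for reads on the response stream

When the timeout elapses, GetResponse throws a WebException, which the catch block above already handles.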