"Disclaimer: Copyright, welcome reprint, please do not use for commercial purposes. Contact mailbox: feixiaoxing @163.com "
There is plenty of web-crawling code on the Internet these days. But ever since I looked at how the Go language downloads a web page, I have found it to be the simplest approach of all. If you don't believe it, take a look for yourself.
package main

import (
	"fmt"
	"log"
	"net/http"
	"os"
)

func main() {
	resp, err := http.Get("http://www.baidu.com")
	if err != nil {
		// handle error
		fmt.Println(err)
		log.Fatal(err)
	}
	defer resp.Body.Close()

	if resp.StatusCode == http.StatusOK {
		fmt.Println(resp.StatusCode)
	}

	buf := make([]byte, 1024)

	// create file
	f, err1 := os.OpenFile("baidu.html", os.O_RDWR|os.O_CREATE|os.O_APPEND, os.ModePerm)
	if err1 != nil {
		panic(err1)
	}
	defer f.Close()

	// read the response body in chunks and write each chunk to the file
	for {
		n, _ := resp.Body.Read(buf)
		if 0 == n {
			break
		}
		f.WriteString(string(buf[:n]))
	}
}
Note the functions used here: http.Get, os.OpenFile, resp.Body.Read, and f.WriteString. You can probably guess what each one does. In fact, as their names suggest, they perform the HTTP download, open (and create) the file, read the response body, and write to the file. I wonder whether everyone guessed correctly. Interested readers can copy this code and try it out.
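For comparison, here is a minimal sketch of a more idiomatic variant. This is my own addition rather than code from the original article: it streams the response body straight into the file with io.Copy from the standard library, which replaces the manual buffer loop above. The URL and output file name are the same placeholders used in the example.

package main

import (
	"io"
	"log"
	"net/http"
	"os"
)

func main() {
	// Fetch the page; the URL is just the example from the article.
	resp, err := http.Get("http://www.baidu.com")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// Create (or truncate) the output file.
	f, err := os.Create("baidu.html")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Stream the body into the file; io.Copy handles the buffering.
	if _, err := io.Copy(f, resp.Body); err != nil {
		log.Fatal(err)
	}
}

The io.Copy version also propagates read errors instead of silently stopping the loop, which the original buffer loop does not.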
If I remember correctly, this code was adapted from somewhere else. The original author is welcome to contact me, and I will add an attribution note later.
Copyright notice: This is the blogger's original article; it may not be reproduced without the blogger's permission.