Objective
I have wanted to study Go for a long time, but preparation for graduate study and internships kept pushing it back; I only occasionally looked at the basic syntax and never applied it to any actual coding. Now that I am a senior with far fewer courses, I decided to use Go to grab some image data from the Internet and then provide an interface over it, which will also give me some network data to work with when I later learn iOS.
I will not introduce Go itself here; as a beginner I could not explain it clearly anyway, and it would just be idle chatter.
The main features I want to implement are the following:
Crawl image links and related data from high-quality image websites;
Storing the acquired data in a MySQL database;
Provide a simple JSON interface so that the data can be fetched through a URL (a rough sketch follows this list).
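To make the third point concrete, here is a minimal sketch of what such a JSON interface could look like using only the standard library. The Image struct, the /images path, and the port are assumptions made here for illustration; the actual interface is built later in the article.

package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// Image is a hypothetical shape for the data the interface will return.
type Image struct {
	Title string `json:"title"`
	URL   string `json:"url"`
}

func main() {
	// Serve a JSON list of images at /images.
	http.HandleFunc("/images", func(w http.ResponseWriter, r *http.Request) {
		images := []Image{{Title: "example", URL: "http://example.com/1.jpg"}}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(images)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}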
Preparatory work
Install go and configure the environment
Since I use OS X myself, I also wrote an article on installing Go on a Mac; if you use a Mac, you can refer to it. On Windows, a quick Baidu search will also turn up a good solution.
Planning the small program
Under $GOPATH/src, create a project folder indiepic as the directory for this small program. Every Go program has exactly one package main, so create a new Go file indiepic.go in the project folder as the main file:
package main

import "fmt"

func main() {
	fmt.Println("Hello World")
}
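If Go is installed correctly, running go run indiepic.go inside the project folder should print Hello World, which confirms the environment is set up.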
Because this file will later be the one that starts up and serves the HTTP data, while fetching the data and storing it in the database are operations that run only occasionally and do not need to execute on every start, it is cleaner for readability to organize them into a separate package and have main call into that package only when data actually needs to be crawled.
Therefore, create a new folder crawldata in the project folder; this is the package we need. The code for fetching the data, storing it in the database, and reading it back out again will all be written as functions in this package.
Under the crawldata folder, create two files, crawldata.go and database.go: one for crawling the data and one for database access.
The folder structure is as follows:
indiepic
├── README.md
├── crawldata
│   ├── crawldata.go
│   └── database.go
└── indiepic.go
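To show how this layout maps onto code, the following is a rough skeleton of the crawldata package and of how main would call it; the function name Crawl is a placeholder chosen here for illustration, not a name fixed by the article.

// crawldata/crawldata.go
package crawldata

import "fmt"

// Crawl fetches the image data from the target site and stores it in the database.
// The real logic is filled in later; this stub only marks where it will live.
func Crawl() {
	fmt.Println("crawling...")
}

// indiepic.go
package main

import "indiepic/crawldata"

func main() {
	// Call the crawler only when fresh data is needed; normally the program
	// would just start the HTTP interface.
	crawldata.Crawl()
}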
The next step is to start implementing the Data Capture section.
The main image site to crawl is http://www.gratisography.com/
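As a very first step toward that crawler, here is a minimal sketch that simply fetches the site's HTML with the standard net/http package; it assumes nothing about how the crawler described next will actually parse the page.

package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
)

func main() {
	// Fetch the home page of the target site.
	resp, err := http.Get("http://www.gratisography.com/")
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()

	// Read the raw HTML; extracting the image links comes later.
	body, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		fmt.Println("read failed:", err)
		return
	}
	fmt.Println("fetched", len(body), "bytes of HTML")
}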