The today you are wasting is the tomorrow that someone who died yesterday longed for; the present you dislike is a past that the future you will never be able to return to.
Today I noticed I had collected quite a few songs and wanted to download them all, and I decided to do it in Go (don't ask why).
The download section is organized as a library with the following code:
package download

import (
    "sync"
    "time"
)

type Urls struct {
    Urls []string
    Wg   sync.WaitGroup
    Chs  chan int  // caps the number of downloads running at once
    Ans  chan bool // download status reported by each goroutine
}

// InitUrl initializes the download addresses. Whether they come from a config
// file or somewhere else depends on the project; here they were produced by a
// crawler whose code is not shown.
func (u *Urls) InitUrl(end chan bool) {
    for i := 0; i < 20; i++ {
        u.Urls = append(u.Urls, "https://studygolang.com/articles/2228")
    }
    end <- true
}

// downloadHandle does the actual download; store the content as needed.
func downloadHandle(url string) string {
    time.Sleep(3 * time.Second)
    return ""
}

// Work is the job each goroutine runs.
// url: the download address
// Chs: caps the number of concurrent downloads
// Ans: reports the download status of each goroutine
func (u *Urls) Work(url string) {
    defer func() {
        <-u.Chs // this task is finished, release a slot
        u.Wg.Done()
    }()
    downloadHandle(url)
    u.Ans <- true // report that the download is complete
}
The calling code:
package main

import (
    dl "downloadAndstup/download"
)

func main() {
    end := make(chan bool)
    u := dl.Urls{
        Chs: make(chan int, 5), // download 5 at a time by default
        Ans: make(chan bool),
    }
    // initialize the URLs
    go u.InitUrl(end)
    if ok := <-end; ok {
        // dispatch the download goroutines
        go func() {
            for _, v := range u.Urls {
                u.Chs <- 1 // limit concurrency: each download takes a slot, and this blocks once all 5 are taken
                u.Wg.Add(1)
                go u.Work(v)
            }
            u.Wg.Wait()  // wait for every dispatched goroutine to finish
            close(u.Ans) // otherwise the range below would never end
        }()
        // quietly wait for each download to complete
        for range u.Ans {
        }
    }
}
With the code above, as soon as one download task finishes it gives up its slot and a new goroutine is started, so there are always up to 5 downloads running at the same time.
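The mechanism doing the limiting is the buffered channel used as a semaphore. Below is a minimal, self-contained sketch of just that pattern, separate from the library above; the name sem and the Sleep call are only illustrative stand-ins for a real download.

package main

import (
    "fmt"
    "sync"
    "time"
)

func main() {
    sem := make(chan struct{}, 5) // at most 5 workers may run at the same time
    var wg sync.WaitGroup
    for i := 0; i < 20; i++ {
        sem <- struct{}{} // takes a slot; blocks once all 5 slots are in use
        wg.Add(1)
        go func(id int) {
            defer func() {
                <-sem // give the slot back so the next worker can start
                wg.Done()
            }()
            time.Sleep(500 * time.Millisecond) // stand-in for a real download
            fmt.Println("finished", id)
        }(i)
    }
    wg.Wait()
}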
What if the requirement changes to downloading in batches: start 5 downloads at the same time, wait for all 5 to finish, and only then start the next 5? That will be covered in a follow-up post.
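One possible way to meet that batch requirement is to wait on a WaitGroup per batch before dispatching the next one. This is only a sketch, not the follow-up mentioned above; batchDownload and fakeDownload are made-up names for illustration.

package main

import (
    "fmt"
    "sync"
    "time"
)

// fakeDownload stands in for the real download of one URL.
func fakeDownload(url string) {
    time.Sleep(500 * time.Millisecond)
    fmt.Println("done:", url)
}

// batchDownload starts batchSize downloads at once and waits for all of them
// to finish before moving on to the next batch.
func batchDownload(urls []string, batchSize int) {
    for start := 0; start < len(urls); start += batchSize {
        end := start + batchSize
        if end > len(urls) {
            end = len(urls)
        }
        var wg sync.WaitGroup
        for _, u := range urls[start:end] {
            wg.Add(1)
            go func(u string) {
                defer wg.Done()
                fakeDownload(u)
            }(u)
        }
        wg.Wait() // the next batch starts only after this batch completes
    }
}

func main() {
    urls := make([]string, 12)
    for i := range urls {
        urls[i] = fmt.Sprintf("https://example.com/song/%d", i)
    }
    batchDownload(urls, 5)
}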