I quite like the Fun (jokes) section of the Phoenix News (iFeng) client, so I wrote a Python program to download the URLs of all those jokes.
The following program only collects the article URLs from the first listing page; with a small modification it can crawl all the articles.
#!/usr/bin/python
# -*- coding: utf-8 -*-
import re

import requests

# Mimic the mobile client so the AJAX endpoint answers normally.
headers = {
    "Host": "i.ifeng.com",
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:34.0) Gecko/20100101 Firefox/34.0",
    "Accept": "*/*",
    "Accept-Language": "zh-cn,zh;q=0.8,en-us;q=0.5,en;q=0.3",
    "Content-Type": "application/x-www-form-urlencoded",
    "X-Requested-With": "XMLHttpRequest",
    "Referer": "http://i.ifeng.com/news/djch/fun/dir?vt=5&cid=17899&mid=64DflZ",
}

# Fetch the first listing page (p=1) of the Fun section as JSON.
rsp = requests.get(
    "http://i.ifeng.com/news/djch/fun/ajaxlist.php?p=1&cid=17899&htmltype=dir",
    timeout=5, headers=headers)
html = rsp.json()

# Each entry in the response carries an "aid=...&" fragment; pull the article ids out.
pattern = re.compile(r'aid=(.+?)&')
aid_list = re.findall(pattern, str(html))

# Build and print the article URL for every aid found on the page.
for aid in aid_list:
    url = "http://i.ifeng.com/news/djch/fun/news?vt=5&cid=0&aid=%s&mid=64DflZ&all=1&p=2" % aid
    print(url)
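A minimal sketch of the modification mentioned above: since the listing endpoint takes a `p` query parameter for the page number, incrementing it until a page comes back with no article ids should visit every page. The stopping condition, the `max_pages` cap, and the helper names (`ajaxlist_url`, `crawl_all`, the injectable `fetch` callable) are my assumptions for illustration, not part of the original program.

```python
import re


def ajaxlist_url(page):
    # Build the listing URL for a given page number (same endpoint as above,
    # with p varying instead of being fixed at 1).
    return ("http://i.ifeng.com/news/djch/fun/ajaxlist.php"
            "?p=%d&cid=17899&htmltype=dir" % page)


def crawl_all(fetch, max_pages=100):
    """Collect article ids from every listing page until one comes back empty.

    `fetch` is any callable mapping a URL to the response text, so the
    pagination logic can be exercised without network access.
    """
    pattern = re.compile(r'aid=(.+?)&')
    aids = []
    for page in range(1, max_pages + 1):
        found = pattern.findall(fetch(ajaxlist_url(page)))
        if not found:  # an empty page means we have run past the last one
            break
        aids.extend(found)
    return aids
```

To use it against the real site, `fetch` could be something like `lambda u: str(requests.get(u, timeout=5, headers=headers).json())`, reusing the headers defined earlier.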
Crawling the Phoenix client's jokes