Python: scraping "ONE" (wufazhuce.com) article info into a database

Source: Internet
Uploaded by: User

Tags: python, article, crawler

# coding: utf-8
import time

import requests
from bs4 import BeautifulSoup
import pymysql


# Fetch and parse the daily article page
def getartinfo(url):
    page = requests.get(url).content
    soup = BeautifulSoup(page, 'lxml')
    res = {}
    res['curr'] = soup.find('div', class_="comilla-cerrar").string.strip()
    res['title'] = soup.find('h2', class_="articulo-titulo").string.strip()
    res['author'] = soup.find('p', class_="articulo-autor").string.strip()
    res['contents'] = soup.find('div', class_="articulo-contenido")
    res['add_time'] = int(time.time())
    return res


# Fetch and parse the daily Q&A page
def getqueinfo(url):
    page = requests.get(url).content
    soup = BeautifulSoup(page, 'lxml')
    res = {}
    res['title'] = soup.find('h4').string.strip()
    res['curr'] = soup.find('div', class_="cuestion-contenido").string.strip()
    res['author'] = soup.find('p', class_="cuestion-editor").string.strip()
    # The answer body is the second div with this class
    res['contents'] = soup.find_all('div', class_="cuestion-contenido")[1]
    res['add_time'] = int(time.time())
    return res


# Scrape the "ONE" home page for today's article and Q&A links
url = "http://wufazhuce.com/"
page = requests.get(url).content
soup = BeautifulSoup(page, 'lxml')

# Daily article
art_list = soup.find_all("p", class_="one-articulo-titulo")
art_url = art_list[0].a.get('href')
artinfo = getartinfo(art_url)

# Daily Q&A
que_list = soup.find_all("p", class_="one-cuestion-titulo")
que_url = que_list[0].a.get('href')
queinfo = getqueinfo(que_url)

conn = pymysql.connect(host='localhost', port=3306, user='root',
                       password='root', db='one', charset='utf8')
cursor = conn.cursor()
# Parameterized queries let the driver escape any quotes in the scraped text
sql = ("INSERT INTO day_art (title, curr, author, contents, add_time) "
       "VALUES (%s, %s, %s, %s, %s)")
cursor.execute(sql, (artinfo['title'], artinfo['curr'], artinfo['author'],
                     str(artinfo['contents']), artinfo['add_time']))
cursor.execute(sql, (queinfo['title'], queinfo['curr'], queinfo['author'],
                     str(queinfo['contents']), queinfo['add_time']))
conn.commit()
cursor.close()
conn.close()
print('ok')
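If an INSERT is built by formatting scraped text directly into the SQL string, a single quote character in an article title breaks the statement and opens the door to SQL injection; parameterized queries let the driver handle escaping. A minimal, runnable sketch of the pattern, using the stdlib sqlite3 in memory as a stand-in for pymysql (pymysql uses %s placeholders where sqlite3 uses ?; the day_art schema below is an assumption for illustration):

```python
import sqlite3
import time

# Hypothetical day_art schema, assumed here for illustration only
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute(
    "CREATE TABLE day_art (title TEXT, curr TEXT, author TEXT, "
    "contents TEXT, add_time INTEGER)"
)

artinfo = {
    "title": "It's a quote-bearing title",  # would break a string-formatted INSERT
    "curr": "quote of the day",
    "author": "someone",
    "contents": "<div>article body</div>",
    "add_time": int(time.time()),
}

# Placeholders: the driver escapes each value, so embedded quotes are safe
cursor.execute(
    "INSERT INTO day_art (title, curr, author, contents, add_time) "
    "VALUES (?, ?, ?, ?, ?)",
    (artinfo["title"], artinfo["curr"], artinfo["author"],
     artinfo["contents"], artinfo["add_time"]),
)
conn.commit()

row = cursor.execute("SELECT title, author FROM day_art").fetchone()
```

The same tuple-of-values call shape works with pymysql's `cursor.execute(sql, params)`, only with `%s` placeholders.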
