Crawler: Simulating a Website Login
Use Selenium with PhantomJS to simulate logging in to Douban: https://www.douban.com/
#!/usr/bin/python3
# -*- coding: utf-8 -*-
__author__ = 'mayi'

"""Simulate logging in to Douban: https://www.douban.com/"""

from selenium import webdriver

# Create a browser object backed by PhantomJS;
# executable_path specifies where the PhantomJS binary is located
driver = webdriver.PhantomJS(executable_path=r"D:\Program Files\phantomjs\bin\phantomjs")

# get() waits until the page is fully loaded before the program continues
driver.get("https://www.douban.com/")

# Wait up to 3 seconds
driver.implicitly_wait(3)

# Take a snapshot of the page before logging in
driver.save_screenshot("DoubanHome.jpg")

email = input("Enter your email/mobile phone: ")
password = input("Enter your password: ")

# Enter the account and password
driver.find_element_by_id("form_email").send_keys(email)
driver.find_element_by_id("form_password").send_keys(password)

# Simulate clicking the login button
driver.find_element_by_xpath("//input[@class='bn-submit']").click()

# Wait up to 3 seconds
driver.implicitly_wait(3)

# Take a snapshot after logging in
driver.save_screenshot("DoubanLanding.jpg")

# Save the page source after login
with open("douban.html", "w", encoding="UTF-8") as f:
    f.write(driver.page_source)

# Close the current page; if it is the only open page, the browser exits
driver.close()

# Quit the browser
driver.quit()
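To see how the element lookups in the script behave, here is a minimal sketch using Python's standard-library ElementTree against a simplified, hypothetical stand-in for the login form (the real Douban markup may differ); it mirrors the id lookup and the XPath `//input[@class='bn-submit']` used above:

```python
import xml.etree.ElementTree as ET

# A simplified, assumed version of the login form markup (not the real page)
form_html = """
<form>
  <input id="form_email" name="form_email" type="text"/>
  <input id="form_password" name="form_password" type="password"/>
  <input class="bn-submit" type="submit" value="Login"/>
</form>
"""

root = ET.fromstring(form_html)

# Rough equivalent of find_element_by_id("form_email"): match on the id attribute
email_input = root.find(".//input[@id='form_email']")

# Rough equivalent of the XPath "//input[@class='bn-submit']"
submit_button = root.find(".//input[@class='bn-submit']")

print(email_input.get("name"))     # form_email
print(submit_button.get("value"))  # Login
```

Selenium resolves these selectors against the live browser DOM rather than a static string, but the matching logic is the same: one element by its `id`, the other by an attribute predicate in an XPath expression.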