I recently lost my job and posted my resume on several job sites. Supposedly, refreshing a resume bumps it back to the top of recruiters' search results, so I thought of writing a small program to refresh mine for me. I happened to be learning Python, and I was too lazy to fire up Visual Studio.
Refreshing a resume is also more practical than the usual "simulate a login to 163" tutorial!
Create a new text file on the desktop (the kind you would open in Notepad), change the .txt extension to .py, and open it. That is all the setup you need.
The script is very simple; the code below still shows my heavy C# coding habits.
Two tools are worth recommending for capturing the HTTP traffic: HttpWatch, which embeds in the browser, and Fiddler2, which runs standalone. Both are powerful and essential for this kind of crawler work.
[Python]
import urllib.parse, urllib.request, http.cookiejar

# Submit a form to the given URL with the given POST data
def GetUrlRequest(iUrl, iStrPostData):
    postdata = urllib.parse.urlencode(iStrPostData)
    postdata = postdata.encode(encoding='utf8')
    header = {'User-Agent': 'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)'}
    req = urllib.request.Request(
        url=iUrl,
        data=postdata,
        headers=header)
    return urllib.request.urlopen(req).read().decode('utf8')

# Set up cookie handling so the login session is reused by later requests
cookie = http.cookiejar.CookieJar()
cookieProc = urllib.request.HTTPCookieProcessor(cookie)
opener = urllib.request.build_opener(cookieProc)
urllib.request.install_opener(opener)

# Login information
strLoginInfo = {
    'chk_remember_pwd': 'on',
    'user_login': 'xxx@163.com',
    'user_pwd': 'xxx'
}
urlLogin = 'http://xxx/user/ajaxlogin/?IsMd5=1'
print('Login result: ' + GetUrlRequest(urlLogin, strLoginInfo))

# Refresh the resume
urlRefresh = 'http://xxx/resume/refreshresume/'
strRefresh = {'res_id': 'xxx'}
print('Refresh result: ' + GetUrlRequest(urlRefresh, strRefresh))
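To keep the resume refreshed without starting the script by hand each time, the last call can be wrapped in a simple timer loop. This is just a minimal sketch built on the script above: GetUrlRequest, urlRefresh and strRefresh come from that code, while the loop itself and the 30-minute interval are my own assumptions, not anything the site requires.
[Python]
import time

# Hypothetical wrapper: refresh the resume every 30 minutes.
# Assumes the login code above has already run, so the session cookie
# installed via install_opener() is still valid.
REFRESH_INTERVAL_SECONDS = 30 * 60  # assumed interval, tune to taste

def RefreshForever():
    while True:
        try:
            print('Refresh result: ' + GetUrlRequest(urlRefresh, strRefresh))
        except Exception as e:
            # A network hiccup or an expired session should not kill the loop
            print('Refresh failed: ' + str(e))
        time.sleep(REFRESH_INTERVAL_SECONDS)

# RefreshForever()  # uncomment to keep the script running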
I decode the responses as UTF-8 here; adjust the encoding to whatever the target site actually serves, for example as sketched below.
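For instance, if a site serves GBK-encoded pages (common on Chinese job sites), only the decode step needs to change. The variant below is my own illustration, not part of the original script; it simply turns the response encoding into a parameter.
[Python]
# Hypothetical variant of GetUrlRequest with the response encoding as a parameter.
import urllib.parse, urllib.request  # already imported in the main script

def GetUrlRequestWithEncoding(iUrl, iStrPostData, iEncoding='utf8'):
    postdata = urllib.parse.urlencode(iStrPostData).encode('utf8')
    header = {'User-Agent': 'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)'}
    req = urllib.request.Request(url=iUrl, data=postdata, headers=header)
    return urllib.request.urlopen(req).read().decode(iEncoding)

# Example: a site that returns GBK pages (URL and form data are placeholders)
# html = GetUrlRequestWithEncoding('http://xxx/some/page/', {'res_id': 'xxx'}, iEncoding='gbk')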
The environment is Python 3.2.3.
Execution result: