Whether we are writing a forum-flooding robot, a resource-grabbing robot, or a helper for a web-based game, the first step to implement is the user login. So how do we simulate a user's login with C#? To implement it, we must first understand how a typical web site determines whether a user is logged in.
HTTP is a stateless, connectionless protocol: the content and status of one exchange are unrelated to the previous one. To maintain a lasting interaction with the user, the web site establishes a Session in the server's memory during the browser's first exchange with the server, and this Session identifies the user (browser). Each Session has a unique ID. When the Session is first established, the server sends this ID to the browser, and every subsequent request the browser sends to the server includes the SessionID, which is how the browser identifies itself.
The server keeps the Session information in memory, so what does the browser use to save the SessionID the server assigned? Yes, a cookie. When the browser sends a request that does not include a SessionID cookie, the server treats it as a brand-new session, allocates a piece of server memory for that Session, and then sends the Session ID back to the browser in the HTTP headers using Set-Cookie.
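In C#, this cookie round trip can be reproduced by attaching one shared CookieContainer to every HttpWebRequest: the SessionID delivered by Set-Cookie is stored on the first response and replayed automatically on later requests. A minimal sketch (the URL is a placeholder, not one of the sites discussed below):

```csharp
using System;
using System.Net;

class SessionDemo
{
    static void Main()
    {
        // One CookieContainer shared by all requests plays the role of the
        // browser's cookie store: the server's Set-Cookie (for example
        // ASP.NET_SessionId=...) is captured from the first response and
        // sent back on every later request.
        CookieContainer cookies = new CookieContainer();

        HttpWebRequest first = (HttpWebRequest)WebRequest.Create("http://example.com/");
        first.CookieContainer = cookies;
        first.GetResponse().Close();

        // The container now holds whatever cookies the server set.
        Console.WriteLine("Cookies held: " + cookies.GetCookieHeader(new Uri("http://example.com/")));

        // A second request through the same container automatically carries
        // the SessionID cookie, so the server sees the same session.
        HttpWebRequest second = (HttpWebRequest)WebRequest.Create("http://example.com/");
        second.CookieContainer = cookies;
        second.GetResponse().Close();
    }
}
```

The key design point is that the CookieContainer, not any single request, represents "the browser": as long as every request shares it, the session survives across requests.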
Now that the principle is clear, let's implement a login to a real web site. Here I take Shanda's (sdo.com) login as the example.
To write this kind of protocol-level network program, a packet-capture tool is indispensable. First we use the capture tool to analyze what a normal browser sends and receives, then use C# to simulate the browser's traffic. There are many capture tools, and which one you use is a matter of personal taste. I mainly use HTTP Analyzer, which is dedicated to HTTP: it does not capture unrelated packets from other protocols, which keeps the analysis clean.
1. It is best to clear all of IE's cookie records first, so they do not interfere with the capture analysis; then start the capture program.
2. In IE, enter the fast-login address http://zh.sdo.com/web1.0/home/fastlogin.asp. We will see that many request and response packets have already been captured.
3. Enter the username and password and click Login. Once IE has logged in normally, stop capturing; all the information we need has been captured. As shown in the figure:
4. Shanda's login mechanism is fairly complex and involves several servers. After analysis (a rather long process that is specific to each web site, so I will not write it out here), the login mechanism is:
1) IE requests the page https://cas.sdo.com:80/cas/login?service=http://zh.sdo.com/web1.0/home/index.asp. This page gives IE a SessionId, for example: Set-Cookie: ASP.NET_SessionId=avcbse55l5e03suqi4dx3555; path=/
2) At the same time, IE receives a ticket in the HTTP response body. This ticket will be needed during login. Of course other sites do not necessarily do this; it is what the analysis of Shanda revealed. Location.href = http://www.sdo.com/login2.asp?lt=sd-1420e593-d2cf-4c9c-b249-07fe27932a21-2008-05-06_01%3a25%3a41.484&service=http%3a%2f%2fzh.sdo.com%2fweb1.0%2fhome%2ffastlogin.asp%3ftest%3d1; The lt parameter is what I call the ticket.
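The lt ticket has to be pulled out of the login page's HTML before we can post it back. A regular expression is enough for that. The pattern below is an assumption based on the location.href line captured above; a different page layout would need a different expression:

```csharp
using System;
using System.Text.RegularExpressions;

class TicketDemo
{
    // Extract the lt "ticket" from the login page's HTML.
    // Assumes the page embeds it as "lt=..." inside a redirect URL,
    // as seen in the captured traffic; stops at &, ;, or a quote.
    static string ExtractTicket(string html)
    {
        Match m = Regex.Match(html, @"lt=([^&;""']+)");
        // The value is URL-encoded (%3a etc.), so decode it before use.
        return m.Success ? Uri.UnescapeDataString(m.Groups[1].Value) : null;
    }

    static void Main()
    {
        string html = "location.href = http://www.sdo.com/login2.asp?" +
            "lt=sd-1420e593-d2cf-4c9c-b249-07fe27932a21-2008-05-06_01%3a25%3a41.484" +
            "&service=http%3a%2f%2fzh.sdo.com%2fweb1.0%2fhome%2ffastlogin.asp%3ftest%3d1;";
        Console.WriteLine(ExtractTicket(html));
        // → sd-1420e593-d2cf-4c9c-b249-07fe27932a21-2008-05-06_01:25:41.484
    }
}
```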
3) The lt we obtained, the username, the password, and a few other irrelevant parameters are then POSTed to https://cas.sdo.com:80/cas/Login.PostTarget.aspx?service=http://zh.sdo.com/web1.0/home/fastlogin_after.asp. The captured POST data looks like: warn=false&_eventId=submit&idType=0&gameArea=0&gameType=0&challenge=3623&lt=sd-1420e593-d2cf-4c9c-b249-07fe27932a21-2008-05-06_01%3a25%3a41.484&username=studyzy&password=1234&ekey=&challenge=3623. Here we only care about the three parameters lt, username, and password.
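Putting the three steps together, a C# sketch of the whole flow looks like this. It is a sketch under the assumptions above: the form field names come from the captured POST data, the username/password are placeholders, and the regex for lt assumes the page layout seen in the capture. One CookieContainer is shared by the GET and the POST so the SessionId from step 1) is carried into step 3):

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;
using System.Text.RegularExpressions;

class LoginDemo
{
    static void Main()
    {
        CookieContainer cookies = new CookieContainer();

        // Step 1: GET the CAS login page. The session cookie and the
        // lt ticket both arrive in this response.
        string loginUrl = "https://cas.sdo.com:80/cas/login?service=http://zh.sdo.com/web1.0/home/index.asp";
        string html = Fetch(loginUrl, cookies, null);

        // Step 2: pull the lt ticket out of the returned HTML.
        Match m = Regex.Match(html, @"lt=([^&;""']+)");
        string lt = m.Success ? Uri.UnescapeDataString(m.Groups[1].Value) : "";

        // Step 3: POST the ticket plus credentials. Field names are the
        // ones seen in the captured request; username/password are placeholders.
        string postUrl = "https://cas.sdo.com:80/cas/Login.PostTarget.aspx?service=http://zh.sdo.com/web1.0/home/fastlogin_after.asp";
        string form = "warn=false&_eventId=submit&idType=0&gameArea=0&gameType=0" +
                      "&challenge=3623&lt=" + Uri.EscapeDataString(lt) +
                      "&username=studyzy&password=1234&ekey=";
        string result = Fetch(postUrl, cookies, form);
        Console.WriteLine(result);
    }

    // GET when body == null, otherwise POST the body as a URL-encoded form.
    static string Fetch(string url, CookieContainer cookies, string body)
    {
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
        req.CookieContainer = cookies;   // same container = same session
        if (body != null)
        {
            req.Method = "POST";
            req.ContentType = "application/x-www-form-urlencoded";
            byte[] data = Encoding.UTF8.GetBytes(body);
            req.ContentLength = data.Length;
            using (Stream s = req.GetRequestStream())
                s.Write(data, 0, data.Length);
        }
        using (StreamReader r = new StreamReader(req.GetResponse().GetResponseStream()))
            return r.ReadToEnd();
    }
}
```

After the POST succeeds, the same CookieContainer holds the logged-in session, so any further request made through it is treated by the site as coming from the logged-in user.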