In recent Socket network development, the first problem that must be solved is the stability of communication between the server and the client. Even this most basic requirement takes great care to get right. I use a two-way heartbeat mechanism; the general process is as follows:
1. The client sends a heartbeat packet to the server at intervals.
2. After receiving the heartbeat packet from the client, the server immediately returns a heartbeat packet to the client.
3. If the server fails to receive a heartbeat packet from a client three consecutive times, it considers that client disconnected.
4. If the client sends three heartbeat packets in a row without receiving any response, it considers its connection to the server lost (a client-side sketch of this logic follows the list).
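As a concrete illustration of steps 1 and 4, here is a minimal client-side sketch. It assumes a connected TcpClient, a one-byte heartbeat packet, and a 3-second send interval; all names and values are illustrative, not the original implementation.

using System;
using System.Net.Sockets;
using System.Threading;

class HeartbeatClient
{
    const byte HeartbeatByte = 0xFF; // assumed one-byte heartbeat packet
    const int IntervalMs = 3000;     // assumed heartbeat interval
    const int MaxMisses = 3;         // three unanswered heartbeats = lost connection

    readonly NetworkStream _stream;
    int _misses;

    public HeartbeatClient(TcpClient client) => _stream = client.GetStream();

    // Call this from the receive loop whenever the server's heartbeat reply arrives.
    public void OnHeartbeatReply() => Interlocked.Exchange(ref _misses, 0);

    public void Run(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            _stream.Write(new[] { HeartbeatByte }, 0, 1); // step 1: send a heartbeat
            Thread.Sleep(IntervalMs);                     // wait for the reply to come in
            // Step 4: the receive path resets _misses on each reply, so if the
            // counter reaches three, three heartbeats in a row went unanswered.
            if (Interlocked.Increment(ref _misses) >= MaxMisses)
            {
                Console.WriteLine("No response to three heartbeats; connection lost.");
                break;
            }
        }
    }
}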
OK, that's straying a bit from the topic in the title.
For the server's periodic check of client connections, I used a System.Threading.Timer. During later testing, however, I found that this task stopped executing after a period of time, which seemed very strange. After some analysis, I traced it to how the timer was created; starting a System.Threading.Timer takes nothing more than the following:
var timer = new System.Threading.Timer(CheckOnlineClient, null, 1000, 6000);
CheckOnlineClient is the method that checks whether each client is still online.
This definition means the timer starts after a 1-second delay and then executes the CheckOnlineClient method every 6 seconds.
However, I had forgotten something very important: the .NET CLR's garbage collection. The timer is only assigned to a local variable, and nothing else references it, so from the CLR's point of view it is eligible for collection. After a period of time the timer gets collected, and the CheckOnlineClient method can no longer execute.
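The fix is simply to hold a reference that keeps the timer reachable, for example by storing it in a field of a long-lived object. A minimal sketch, assuming the check runs inside a server class (the Server type and its members here are illustrative):

using System.Threading;

class Server
{
    // Storing the timer in a field keeps it reachable from a live object,
    // so the GC will not collect it while the server instance is alive.
    private Timer _checkTimer;

    public void Start()
    {
        // 1-second delay, then run CheckOnlineClient every 6 seconds
        _checkTimer = new Timer(CheckOnlineClient, null, 1000, 6000);
    }

    private void CheckOnlineClient(object state)
    {
        // iterate over the connected clients and drop any whose
        // heartbeats have gone unanswered three times in a row
    }
}

It is also good practice to dispose the timer when the server shuts down. A field is the right tool for keeping a long-running timer alive; GC.KeepAlive, by contrast, only extends an object's lifetime up to the point of the call.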