How to calculate the time difference in milliseconds



Calculating the time difference in milliseconds is a common requirement...

 

The project at hand is a Windows program, so the first function that comes to mind is GetTickCount(). MSDN, however, notes that its resolution is limited to that of the system timer, typically in the range of 10 to 16 milliseconds.

Try a program:

#include <stdio.h>
#include <windows.h>

int main(void)
{
    DWORD dwLastTime = GetTickCount();
    for (int i = 0; i != 10; ++i)
    {
        DWORD dwCurrentTime = GetTickCount();
        printf("GetTickCount = %lums TimeDiff = %lums\n", dwCurrentTime, dwCurrentTime - dwLastTime);
        dwLastTime = dwCurrentTime;
        Sleep(500);
    }
    return 0;
}

 

Across the 10 iterations, each measured interval usually deviates from the expected 500 ms by at least 1 ms, and the deviation can reach 15 ms, which matches the actual resolution described in MSDN.

Therefore, it is unreliable to use GetTickCount () to calculate the time difference in milliseconds!

 

Now, how can we meet our needs?

Requirement 1: calculate the time difference in milliseconds.

Requirement 2: The return value should ideally be an unsigned long (ULONG), to stay compatible with existing code.

 

Solution 1:

clock_t clock(void);

This function returns the number of clock ticks elapsed since the program started (CLOCKS_PER_SEC ticks make one second; on Windows, clock() measures wall-clock time). Wrap it to return milliseconds:

#include <ctime>

ULONG GetTickCountClock()
{
    return (ULONG)((LONGLONG)clock() * 1000 / CLOCKS_PER_SEC);
}
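The same conversion works without any Windows headers. Below is a minimal portable sketch of the idea (the name GetTickCountClockPortable is introduced here, not part of the original article); note that on POSIX systems clock() measures CPU time rather than wall-clock time, so time spent sleeping does not advance it there.

```cpp
#include <ctime>

// Portable equivalent of GetTickCountClock: convert clock() ticks to
// milliseconds using CLOCKS_PER_SEC. The intermediate long long avoids
// overflow in the multiplication by 1000.
unsigned long GetTickCountClockPortable()
{
    return (unsigned long)((long long)clock() * 1000 / CLOCKS_PER_SEC);
}
```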

Test results:

 

Solution 2:

SYSTEMTIME FILETIME

Through SYSTEMTIME and FILETIME, we can obtain the time elapsed since midnight on January 1, 1601 (UTC), in units of 100 nanoseconds.

This value is easily accurate enough, but it is a LONGLONG rather than a ULONG. That doesn't matter: we can use it to calibrate the native GetTickCount().
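As a side note, the 100-nanosecond FILETIME units convert to familiar Unix-epoch milliseconds with one division and one subtraction. A minimal sketch (not part of the original article; FiletimeToUnixMs is a name introduced here, and EPOCH_DIFF_MS is the well-known 1601-to-1970 offset of 11,644,473,600 seconds expressed in milliseconds):

```cpp
#include <cstdint>

// FILETIME counts 100-nanosecond intervals since 1601-01-01 (UTC).
// 10,000 such intervals make one millisecond.
constexpr uint64_t FILETIME_TICKS_PER_MS = 10000ULL;
// Offset between the 1601 epoch and the 1970 Unix epoch, in milliseconds.
constexpr uint64_t EPOCH_DIFF_MS = 11644473600000ULL;

uint64_t FiletimeToUnixMs(uint64_t filetime100ns)
{
    return filetime100ns / FILETIME_TICKS_PER_MS - EPOCH_DIFF_MS;
}
```

For example, the FILETIME value 116444736000000000 corresponds exactly to the Unix epoch, so the function returns 0 for it.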

ULONG GetTickCountCalibrate()
{
    static ULONG s_ulFirstCallTick = 0;
    static LONGLONG s_ullFirstCallTickMS = 0;

    SYSTEMTIME systemtime;
    FILETIME filetime;
    GetLocalTime(&systemtime);
    SystemTimeToFileTime(&systemtime, &filetime);
    LARGE_INTEGER liCurrentTime;
    liCurrentTime.HighPart = filetime.dwHighDateTime;
    liCurrentTime.LowPart = filetime.dwLowDateTime;
    LONGLONG llCurrentTimeMS = liCurrentTime.QuadPart / 10000;

    if (s_ulFirstCallTick == 0)
    {
        s_ulFirstCallTick = GetTickCount();
    }
    if (s_ullFirstCallTickMS == 0)
    {
        s_ullFirstCallTickMS = llCurrentTimeMS;
    }

    return s_ulFirstCallTick + (ULONG)(llCurrentTimeMS - s_ullFirstCallTickMS);
}

Test results:

 

Precision comparison

Obtain the current time every 50 ms, compare the gap between TimeDiff and 50, and count 1000 times:

#include <math.h>
#include <stdio.h>
#include <windows.h>

// Uses GetTickCountCalibrate() defined above.
int main(void)
{
    int nMaxDeviation = 0;
    int nMinDeviation = 99;
    int nSumDeviation = 0;

    DWORD dwLastTime = GetTickCountCalibrate();
    Sleep(50);

    for (int i = 0; i != 1000; ++i)
    {
        DWORD dwCurrentTime = GetTickCountCalibrate();
        int nDeviation = abs((int)(dwCurrentTime - dwLastTime) - 50);
        nMaxDeviation = nDeviation > nMaxDeviation ? nDeviation : nMaxDeviation;
        nMinDeviation = nDeviation < nMinDeviation ? nDeviation : nMinDeviation;
        nSumDeviation += nDeviation;
        dwLastTime = dwCurrentTime;
        Sleep(50);
    }
    printf("nMaxDeviation = %2dms, nMinDeviation = %dms, nSumDeviation = %4dms, AverDeviation = %.3fms\n",
           nMaxDeviation, nMinDeviation, nSumDeviation, nSumDeviation / 1000.0f);

    return 0;
}

The accuracy of GetTickCount, GetTickCountClock, and GetTickCountCalibrate is compared as follows:

GetTickCount           nMaxDeviation = 13ms, nMinDeviation = 3ms, nSumDeviation = 5079ms, AverDeviation = 5.079ms
GetTickCountClock      nMaxDeviation =  2ms, nMinDeviation = 0ms, nSumDeviation =    4ms, AverDeviation = 0.004ms
GetTickCountCalibrate  nMaxDeviation =  1ms, nMinDeviation = 0ms, nSumDeviation =    3ms, AverDeviation = 0.003ms

As you can see, the error of the native GetTickCount is far too large: the maximum error is 13 ms and the average error is about 5 ms, so it certainly cannot meet millisecond-level timing requirements.

The precision of GetTickCountClock and GetTickCountCalibrate is almost the same, both of which can meet the time requirements in milliseconds.

The difference is that GetTickCountClock starts timing from the current program running, and GetTickCountCalibrate starts timing from the system startup.
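On modern compilers there is also a portable, monotonic option the article does not cover: std::chrono::steady_clock. A minimal sketch (GetTickCountChrono is a name made up here, not a Win32 API), useful when code must run on more than one platform:

```cpp
#include <chrono>

// Millisecond tick count based on std::chrono::steady_clock, which is
// monotonic and immune to wall-clock adjustments. The epoch is
// unspecified, so only differences between two calls are meaningful.
unsigned long long GetTickCountChrono()
{
    using namespace std::chrono;
    return (unsigned long long)duration_cast<milliseconds>(
        steady_clock::now().time_since_epoch()).count();
}
```

Because steady_clock is monotonic, a later call never returns a smaller value than an earlier one, which is exactly the property the calibration trick above works to recover.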

 

Overflow

The maximum value of the four-byte ULONG is 4294967295 ms, that is, about 49.7 days; beyond that the counter wraps around to zero.
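The wraparound is less dangerous than it looks for computing differences: unsigned subtraction is performed modulo 2^32, so the difference of two tick counts remains correct across a single wrap, as long as the real interval is under 49.7 days. A small sketch (TickDiff is a helper name introduced here):

```cpp
#include <cstdint>

// Difference between two 32-bit tick counts. Modular (mod 2^32) unsigned
// arithmetic makes this correct even if the counter wrapped once between
// the two readings.
uint32_t TickDiff(uint32_t now, uint32_t earlier)
{
    return now - earlier;
}
// Example: earlier = 4294967195 (101 ms before the wrap), now = 200
// (200 ms after it); TickDiff(now, earlier) yields 301.
```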


How to calculate the time difference with milliseconds in Excel

1 second = 1000 milliseconds, so your 2 minutes 54 seconds 72 milliseconds is really 2'54.072"; subtracting 2'51.079" gives the result: 2.993 seconds.

I see what you mean: the values in the table are stored as text, so they cannot be calculated directly.
Here is one way to solve it. Say A1 contains 2 minutes 54 seconds (72 ms) and B1 contains 2 minutes 51 seconds (79 ms), and you want C1 to show the difference: 2.993 seconds. Is that right?

If so, proceed as follows. First use Text to Columns to split each value such as "2 minutes 54 seconds 72" into three numeric columns: choose "Other" as the delimiter and enter the characters for "minutes" and "seconds" (this takes two passes, one per delimiter). Do the same for the second group of values; if there are no empty columns next to the data, insert some first. After this, A1 = 2, B1 = 54, C1 = 72, D1 = 2, E1 = 51, F1 = 79. Then enter a subtraction formula in G1, for example: = A1-D1 & "minutes" & (B1+C1/1000)-(E1+F1/1000) & "seconds"
The formula returns: 0 minutes 2.993 seconds.
Drag to fill the formula down.
I hope that is clear.

If you then need to restore the table to its previous format, do the following.
Copy column G and paste it back as values. Insert a column before A1 and before E1 (before each "minutes" column), and in the new column enter the formula: = B1 & "minutes" & C1 & "seconds" & D1
In the column before E1 enter the formula: = F1 & "minutes" & G1 & "seconds" & H1
Drag to fill each, copy and paste as values to remove the formulas you just entered, and delete the six columns B, C, D, F, G, and H, so that the table is restored to its original form.

How to calculate the time difference in milliseconds

Subtract the dates to get the difference in days, subtract the times to get the difference in seconds, and subtract the milliseconds to get the difference in milliseconds; then add the parts together.
Alternatively, convert both time points to numbers and subtract them.
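The "convert both time points to numbers and subtract" approach can be sketched in a few lines of C++ (ToMilliseconds is a helper name introduced here): express each time as a total millisecond count, then subtract once.

```cpp
// Convert a minutes/seconds/milliseconds triple into a single millisecond
// count; differences between two such counts are then plain subtraction.
long long ToMilliseconds(int minutes, int seconds, int millis)
{
    return (minutes * 60LL + seconds) * 1000LL + millis;
}
// Example from the Excel question above:
// 2m54s072ms - 2m51s079ms = 174072 - 171079 = 2993 ms = 2.993 seconds.
```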
 
