Starting from basic concepts, this article discusses the data structures and functions used for date and time operations in C/C++, covering timing, obtaining the time, time arithmetic, and display formats. It also demonstrates, through numerous examples, the detailed use of the functions and data structures declared in the time.h header file.
Keywords: UTC (Coordinated Universal Time), calendar time, epoch, clock tick
1. Concepts
Just as string handling in C++ has many pitfalls worth noting, so do operations on time. Recently, many people in technical discussion groups have asked about manipulating, obtaining, and displaying time in C/C++. In this article, the author introduces how to work with dates and times in C++.
The C/C++ libraries offer many ways to manipulate and use time. Before using them, however, you need to understand a few concepts related to "time" and "date", mainly the following:
Coordinated Universal Time (UTC): also known as world standard time, and for everyday purposes equivalent to Greenwich Mean Time (GMT). Local time is expressed as an offset from UTC: China, for example, is eight hours ahead of UTC (UTC+8), while the eastern United States is UTC-5.
Calendar time: a time represented as the number of seconds elapsed from a standard point in time to that time. The standard point may differ between compilers, but within one compilation system it is fixed, and every calendar time in that system is measured from it. In that sense calendar time is a "relative" time; but no matter which time zone you are in, the calendar time at a given instant, measured from the same standard point, is the same.
Epoch: a point in time. In standard C/C++, a time point is represented as an integer: the number of seconds (that is, the calendar time) by which it differs from the standard point in time.
Clock tick: the basic clock timing unit (also called a clock tick). The length of one tick is determined by the implementation and the CPU; a clock tick is not the same thing as a CPU clock cycle, but rather the basic timing unit used by C/C++.
We can use the time.h header file from the ANSI standard library. The time and date facilities defined in this header, both in their structure definitions and in their naming, have an unmistakable C-language style. Below, I explain how to use the date and time functions in C++.
2. Timing
The timing function in C/C++ is clock(), and its associated data type is clock_t. In MSDN, the clock function is declared as follows:
clock_t clock(void);
This function returns the number of CPU clock ticks elapsed between the start of the program's process and the call to clock(); MSDN calls this the wall-clock time. clock_t is the data type used to hold this value; in time.h we can find its definition:
#ifndef _CLOCK_T_DEFINED
typedef long clock_t;
#define _CLOCK_T_DEFINED
#endif
Clearly, clock_t is a long integer. time.h also defines a constant, CLOCKS_PER_SEC, which indicates how many clock ticks there are per second. It is defined as follows:
#define CLOCKS_PER_SEC ((clock_t) 1000)
You can see that on this implementation the value returned by clock() increases by 1 every millisecond (one thousandth of a second). So, as in the following example, you can use the expression clock()/CLOCKS_PER_SEC to compute how long the process itself has been running:
#include <stdio.h>
#include <time.h>

void elapsed_time(void)
{
/* clock_t is a long integer here, so cast and print with %ld */
printf("Elapsed time: %ld secs.\n", (long)(clock()/CLOCKS_PER_SEC));
}
Of course, you can also use the clock function to measure how long your machine takes to run a loop, or how much time some other operation consumes:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main( void )
{
long i = 10000000L;
clock_t start, finish;
double duration;
/* Measure the duration of an event */
printf( "Time to do %ld empty loops is ", i );
start = clock();
while( i-- ) ;
finish = clock();
duration = (double)(finish - start) / CLOCKS_PER_SEC;
printf( "%f seconds\n", duration );
system("pause"); /* Windows-specific: keeps the console window open */
return 0;
}
On the author's machine, the program produced the following output:
Time to do 10000000 empty loops is 0.030000 seconds
We see that on this implementation a clock tick is 1 millisecond long, so the timer's precision is 1 millisecond. Can we make timing more precise by redefining CLOCKS_PER_SEC to a larger value? If you try, you will find it does not work: CLOCKS_PER_SEC is fixed by the implementation, and the smallest timing unit of clock() in standard C/C++ on this platform is one millisecond.