How does the C language calculate program run time?


The timing function in C/C++ is clock(), and the data type associated with it is clock_t. In MSDN, the clock function is declared as follows:

clock_t clock (void);

This function returns the number of CPU clock ticks elapsed from the start of the program process to the call to clock(); MSDN describes this as wall-clock time. clock_t is the data type used to hold this tick count, and in time.h we can find its definition:

#ifndef _CLOCK_T_DEFINED
typedef long clock_t;
#define _CLOCK_T_DEFINED
#endif

It is clear that clock_t is a long integer. time.h also defines a constant, CLOCKS_PER_SEC, which indicates how many clock ticks make up one second. It is defined as follows:

#define CLOCKS_PER_SEC ((clock_t) 1000)

In other words, every millisecond (one thousandth of a second) the value returned by clock() increases by 1. For example, you can use the expression clock() / CLOCKS_PER_SEC to calculate how long the process itself has been running:

void elapsed_time(void)
{
    printf("Elapsed time: %u secs.\n", (unsigned)(clock() / CLOCKS_PER_SEC));
}

Of course, you can also use the clock function to measure how long your machine takes to run a loop or handle some other event:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    long i = 10000000L;
    clock_t start, finish;
    double duration;
    /* Measure the duration of an event */
    printf("Time to do %ld empty loops is ", i);
    start = clock();
    while (i--)
        ;
    finish = clock();
    duration = (double)(finish - start) / CLOCKS_PER_SEC;
    printf("%f seconds\n", duration);
    system("pause");
    return 0;
}

On the author's machine, the output is as follows:

Time to do 10000000 empty loops is 0.030000 seconds

We can see that the clock tick here is 1 millisecond long, so the timing resolution is 1 millisecond. Could we make the timing more precise by redefining CLOCKS_PER_SEC to a larger value? If you try, you will find that this does not work: the constant merely describes the tick rate of the implementation, it does not control it. In this C/C++ implementation, the smallest unit of timing is one millisecond.

Referenced from: http://blog.csdn.net/querdaizhi/article/details/6925156
