Performance test of the C language qsort function
An intuitive way to gauge an algorithm's complexity is to measure its running time on input of a given size.
Taking the qsort function provided by the C standard library as an example, we can measure the time it takes to sort 1 million elements and observe its O(n log n) cost:
The C code is as follows:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 1000000

/* Comparison callback with the signature qsort expects:
   int (*comp)(const void *, const void *) */
int compare(const void *p1, const void *p2)
{
    float a = *(const float *)p1;
    float b = *(const float *)p2;
    return (a > b) - (a < b);   /* negative, zero, or positive */
}

int main(void)
{
    static float x[N];   /* static storage avoids a 4 MB stack allocation */
    srand(time(NULL));
    clock_t t1 = clock();
    for (int j = 0; j < 10; j++) {
        for (long i = 0; i < N; i++)
            x[i] = (float)rand() / RAND_MAX;
        qsort(x, N, sizeof(float), compare);
    }
    for (int i = 0; i < 10; i++)
        printf("%f ", x[i]);
    printf("\n");
    clock_t t2 = clock();
    printf("Floating point sorting time: %f seconds\n",
           (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}
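A note on the comparator: qsort expects it to return a negative, zero, or positive value to indicate ordering. Simply returning the result of a > comparison maps both "less than" and "equal" to 0, which breaks that contract and can produce an incorrectly sorted array; the (a > b) - (a < b) idiom always yields -1, 0, or 1 and, unlike returning a - b, cannot overflow or truncate for float inputs.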
Compile it with gcc qsort_test.c and run it on a laptop. One million random floats are generated and sorted 10 times; the result is as follows:
~/tmp$ ./a.out
0.000000 0.000001 0.000001 0.000002 0.000002 0.000004 0.000004 0.000005 0.000006 0.000006
Floating point sorting time: 2.236941 seconds
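To go a step further, the same timing approach can confirm the O(n log n) growth empirically: time the sort at several doubling input sizes and check that each doubling costs a bit more than 2x. Below is a minimal sketch; the size range and doubling schedule are illustrative choices, not part of the original test.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static int compare(const void *p1, const void *p2)
{
    float a = *(const float *)p1;
    float b = *(const float *)p2;
    return (a > b) - (a < b);
}

int main(void)
{
    srand(time(NULL));
    /* For O(n log n), doubling n should multiply the running time by
       2 * log(2n) / log(n), i.e. slightly more than 2. */
    for (long n = 250000; n <= 4000000; n *= 2) {
        float *x = malloc(n * sizeof(float));
        if (x == NULL)
            return 1;
        for (long i = 0; i < n; i++)
            x[i] = (float)rand() / RAND_MAX;
        clock_t t1 = clock();
        qsort(x, n, sizeof(float), compare);
        clock_t t2 = clock();
        printf("n = %7ld: %f seconds\n", n,
               (double)(t2 - t1) / CLOCKS_PER_SEC);
        free(x);
    }
    return 0;
}

If the printed times roughly double (plus a little) at each step, the measurement is consistent with O(n log n) behavior.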