To get a quantitative, data-level feel for an algorithm's complexity, a direct method is to measure its execution time. Take the C library's qsort as an example: timing it over 10 rounds of 1 million random values gives a concrete sense of its O(n log n) time cost.
The C code is as follows:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 1000000

/* comparator for qsort: int (*comp)(const void *, const void *);
   must return negative/zero/positive, not just 0 or 1 */
int compare(const void *p1, const void *p2)
{
    float a = *(const float *)p1, b = *(const float *)p2;
    return (a > b) - (a < b);
}

int main(void)
{
    float x[N];
    srand(time(NULL));
    clock_t t1 = clock();
    for (int j = 0; j < 10; j++) {          /* 10 rounds of 1 million values */
        for (long i = 0; i < N; i++)
            x[i] = (float)rand() / RAND_MAX;
        qsort(x, N, sizeof(float), compare);
    }
    for (int i = 0; i < 10; i++)            /* print the 10 smallest values */
        printf("%f ", x[i]);
    printf("\n");
    clock_t t2 = clock();
    printf("Time for floating-point sorting algorithm: %f sec\n",
           (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}
Compile with gcc qsort_test.c and run on a laptop. Generating and sorting 1 million random values 10 times produces:
~/tmp$ ./a.out
0.000000 0.000001 0.000001 0.000002 0.000002 0.000004 0.000004 0.000005 0.000006 0.000006
Time for floating-point sorting algorithm: 2.236941 sec
Copyright notice: this is an original post by the blog author; do not reproduce without permission.
Performance test of the C-language qsort function