A basic but curious question: do addition, multiplication, and division really perform the same inside an algorithm? Conventional computer-architecture analysis says the performance of addition, multiplication, and division decreases in that order, but by how much?
To find out, write a C program that makes 30 passes over one million floating-point values and times each of the three operations. The code is as follows:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 1000000

void add(float x[], long n) {
    float sum = 0;
    for (long i = 0; i < n; i++)
        sum += x[i];
}

void prod(float x[], long n) {
    float sum = 1;
    for (long i = 0; i < n; i++)
        sum *= x[i];
}

void divide(float x[], long n) {   /* renamed from div: stdlib.h already declares div() */
    for (long i = 0; i < n; i++)
        x[i] /= 3.0f;
}

int main(void) {
    static float x[N];             /* static: 4 MB as a local would overflow the stack */
    clock_t t1, t2;

    for (long i = 0; i < N; i++)   /* initialize the data; uninitialized floats can skew timing */
        x[i] = 1.0f;

    t1 = clock();
    for (int i = 0; i < 30; i++) add(x, N);
    t2 = clock();
    printf("Million data addition time: %f seconds\n", (double)(t2 - t1) / CLOCKS_PER_SEC);

    t1 = clock();
    for (int i = 0; i < 30; i++) prod(x, N);
    t2 = clock();
    printf("Million data multiplication time: %f seconds\n", (double)(t2 - t1) / CLOCKS_PER_SEC);

    t1 = clock();
    for (int i = 0; i < 30; i++) divide(x, N);
    t2 = clock();
    printf("Million data division time: %f seconds\n", (double)(t2 - t1) / CLOCKS_PER_SEC);

    return 0;
}
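The post does not say how the program was built; the timings below look consistent with an unoptimized build, e.g. gcc bench.c -o bench (the file name is my placeholder). Note that add and prod compute a sum that is never used, so at higher optimization levels such as -O2 the compiler is free to delete those loops entirely and the measurement becomes meaningless.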
The results of two runs are as follows:
Million data addition time: 0.157051 seconds
Million data multiplication time: 0.184712 seconds
Million data division time: 0.161014 seconds
-----------------------------------------
Million data addition time: 0.156099 seconds
Million data multiplication time: 0.184023 seconds
Million data division time: 0.159588 seconds
Across both runs the three times are nearly identical. What is going on here?
Should we conclude that the basic operations in an algorithm are indistinguishable in cost?
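One plausible reading (my interpretation, not stated in the original) is that this benchmark is bound by loop overhead and memory traffic rather than by the arithmetic units: the operations in each iteration are independent of one another, so a pipelined FPU can overlap them and hide the latency differences. A sketch that chains every operation on the previous result, so that latency cannot be hidden, might look like the following; CHAIN, one, and sink are names I introduce for illustration, and the volatile qualifiers keep the compiler from folding the loops away.

#include <stdio.h>
#include <time.h>

#define CHAIN 100000000L    /* length of each serial dependency chain */

volatile float one = 1.0f;  /* volatile load defeats constant folding */
volatile float sink;        /* volatile store keeps each result "used" */

int main(void) {
    clock_t t1, t2;
    float acc;

    /* chained addition: every iteration depends on the previous result */
    acc = 0.0f;
    t1 = clock();
    for (long i = 0; i < CHAIN; i++) acc += one;
    t2 = clock();
    sink = acc;
    printf("chained addition:       %f seconds\n", (double)(t2 - t1) / CLOCKS_PER_SEC);

    /* chained multiplication */
    acc = 1.0f;
    t1 = clock();
    for (long i = 0; i < CHAIN; i++) acc *= one;
    t2 = clock();
    sink = acc;
    printf("chained multiplication: %f seconds\n", (double)(t2 - t1) / CLOCKS_PER_SEC);

    /* chained division */
    acc = 1.0f;
    t1 = clock();
    for (long i = 0; i < CHAIN; i++) acc /= one;
    t2 = clock();
    sink = acc;
    printf("chained division:       %f seconds\n", (double)(t2 - t1) / CLOCKS_PER_SEC);

    return 0;
}

On typical hardware one would expect the chained division loop to run several times slower than the chained addition and multiplication loops, since floating-point divide has a much higher latency than add or multiply; the exact ratios depend on the CPU and compiler.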