A basic but strange question: is there really no performance difference between addition, multiplication, and division when an algorithm runs? Textbook analysis says the three get progressively slower in that order, but by how much?
I wrote a C program that runs 30 passes over one million floats to measure the time difference. The code is as follows:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 1000000
#define ROUNDS 30

void add(float x[], long n) {
    float sum = 0;
    for (long i = 0; i < n; i++) sum += x[i];
}

void prod(float x[], long n) {
    float sum = 1;
    for (long i = 0; i < n; i++) sum *= x[i];
}

void divide(float x[], long n) {
    for (long i = 0; i < n; i++) x[i] /= 3.0;
}

int main(void) {
    static float x[N];   /* static: a 4 MB array would overflow a typical stack */
    clock_t t1, t2;

    t1 = clock();
    for (int i = 0; i < ROUNDS; i++) add(x, N);
    t2 = clock();
    printf("Million data addition time: %f seconds\n", (double)(t2 - t1) / CLOCKS_PER_SEC);

    t1 = clock();
    for (int i = 0; i < ROUNDS; i++) prod(x, N);
    t2 = clock();
    printf("Million data multiplication time: %f seconds\n", (double)(t2 - t1) / CLOCKS_PER_SEC);

    t1 = clock();
    for (int i = 0; i < ROUNDS; i++) divide(x, N);
    t2 = clock();
    printf("Million data division time: %f seconds\n", (double)(t2 - t1) / CLOCKS_PER_SEC);

    return 0;
}
The results are as follows:
Million data addition time: 0.157051 seconds
Million data multiplication time: 0.184712 seconds
Million data division time: 0.161014 seconds
-----------------------------------------
Million data addition time: 0.156099 seconds
Million data multiplication time: 0.184023 seconds
Million data division time: 0.159588 seconds
What is going on here?
Should we conclude that the basic arithmetic operations really cost about the same?