Optimizing an Algorithm for Counting the Number of 1 Bits in a Binary Integer in C
To count the number of 1 bits in the binary representation of an integer:
int one(int m)
{
    int count = 0;
    while (m != 0) {
        if (m % 2 == 1)  /* take mod 2 to test one bit at a time, then divide by 2 */
            count++;
        m /= 2;
    }
    return count;  /* the original version was missing this return */
}
Or the equivalent bitwise version:
int two(int m)
{
    int count = 0;
    while (m != 0) {
        count += m & 1;  /* same principle as above, but bit operations are safer */
        m >>= 1;
    }
    return count;
}
However, both algorithms still spend iterations examining the 0 bits. Is there an algorithm whose work depends only on the 1 bits in the binary sequence? The following algorithm implements exactly that idea:
int three(int m)
{
    int count = 0;
    while (m != 0) {
        count++;
        m &= m - 1;  /* m & (m - 1) clears the lowest 1 bit each iteration */
    }
    return count;
}
Of the three, the third algorithm is the most efficient, since its loop runs once per 1 bit rather than once per bit, but even faster methods should exist.