Problem Description:
Given a one-byte (8-bit) unsigned integer, count the number of 1s in its binary representation. The algorithm should be as efficient as possible.
Solutions:
#include <iostream>
using namespace std;

#define BYTE unsigned char

/* Solution 1: use integer division. Repeatedly divide v by 2 and test the
   remainder; each odd remainder corresponds to one 1 bit in the binary
   representation. */
int Count1(BYTE v) {
    int sum = 0;
    while (v) {
        if (v % 2 == 1) sum++;
        v = v / 2;
    }
    return sum;
}
/* Solution 2: O(log2 v). Use bit operations: ANDing v with 0x01 tests
   whether the lowest bit is 1, and shifting right then discards that bit. */
int Count2(BYTE v) {
    int sum = 0;
    while (v) {
        sum += v & 0x01;  // must be +=, not =, so every 1 bit is counted
        v >>= 1;
    }
    return sum;
}
/* Solution 3: O(m), where m is the number of 1 bits in v.
   v & (v - 1) clears the lowest set bit each iteration (for example,
   10100010 & 10100001 = 10100000), so the algorithm's complexity depends
   only on the number of 1 bits. */
int Count3(BYTE v) {
    int sum = 0;
    while (v) {
        v &= v - 1;
        sum++;
    }
    return sum;
}
int main() {
    BYTE v = 0xA2;  // binary 10100010, which contains three 1 bits
                    // (the original literal 10100010 would be read as decimal)
    cout << Count1(v) << endl;
    cout << Count2(v) << endl;
    cout << Count3(v) << endl;
    return 0;
}