While studying information theory last semester, I came across the topic of communication complexity, and I find it a very interesting field. Andrew Chi-Chih Yao did foundational work in this area in the late 1970s.
To introduce the definition of communication complexity, I will borrow a paragraph from matrix67's blog:
Communication complexity: A holds data $x$ and B holds data $y$, and they want to jointly compute the value of a Boolean function $f(x, y)$ of $x$ and $y$. At minimum, how many bits must the two transmit in order to compute $f(x, y)$?
Below are two interesting examples, both from class: the first is an exam question from a previous year, and the second is a problem the teacher posed without giving an answer.
Note: the computation counts as complete as soon as either party knows $f(x, y)$.
Problem 1:
A and B each hold a positive integer, $x$ and $y$ respectively, with neither exceeding $n$. It is known that one of the two numbers is exactly twice the other. They want to compute the sum of the two numbers. How many bits of data must be transmitted?
Obviously an $O(\log n)$ scheme exists: A transmits the binary representation of his number to B, and B can then complete the computation.
You will soon notice that this is more than necessary. Since one number is twice the other, the binary representations of A's and B's numbers are identical except for the number of trailing zeros; in fact, A only needs to tell B how many trailing zeros his number has. Since the trailing-zero count is at most $\log_2 n$, encoding it in binary takes only $O(\log \log n)$ bits.
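As a sanity check, here is a minimal Python sketch of this scheme (the function names are my own, and the code assumes both inputs are positive integers):

```python
def trailing_zeros(v):
    """Number of trailing zeros in the binary representation of v > 0."""
    t = 0
    while v % 2 == 0:
        v //= 2
        t += 1
    return t

def protocol_trailing_count(x, y):
    """A holds x, B holds y; one number is twice the other.
    A's entire message is t_a, the trailing-zero count of x.
    B strips his own trailing zeros to obtain the shared odd part,
    reconstructs x, and outputs the sum."""
    t_a = trailing_zeros(x)        # A's message: O(log log n) bits
    odd = y >> trailing_zeros(y)   # both numbers share this odd part
    return (odd << t_a) + y
```

For example, with $x = 12$ and $y = 24$, B reconstructs $x$ from the shared odd part 3 and outputs 36, regardless of which party holds which number.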
However, this scheme clearly does not use all the available information: it relies only on one number being a $2^k$ multiple of the other, so it would work just as well if the ratio were 4 or 8. Since the problem fixes the ratio at exactly 2, there should be room for optimization.
Is there a better way?
In fact, sending two bits suffices. Note that the trailing-zero counts of the two numbers differ by exactly 1. So A only needs to send his trailing-zero count modulo 4, i.e., the last two binary digits of that count; from this, B can tell whether his own number is half of A's or twice A's.
For example, if A sends 11 (i.e., 3) and B's trailing-zero count is 2, then B knows his number must be half of A's; in the other case A's count would have been 1, and A would have sent 01 instead.
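Here is a small Python sketch of the two-bit protocol (hypothetical names; positive integers assumed). The key point is that the two candidate counts $t_B - 1$ and $t_B + 1$ differ by 2, so they never collide modulo 4:

```python
def trailing_zeros(v):
    """Number of trailing zeros in the binary representation of v > 0."""
    t = 0
    while v % 2 == 0:
        v //= 2
        t += 1
    return t

def protocol_two_bits(x, y):
    """A sends t_a mod 4 (two bits); B resolves which of t_b - 1 and
    t_b + 1 it is, reconstructs x from the shared odd part, and
    returns x + y."""
    msg = trailing_zeros(x) % 4   # A's entire 2-bit message
    t_b = trailing_zeros(y)
    t_a = t_b + 1 if (t_b + 1) % 4 == msg else t_b - 1
    return ((y >> t_b) << t_a) + y
```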
From here, the 1-bit scheme is within reach: since the two trailing-zero counts differ by 1, exactly one of them is even and the other odd. So B already knows the parity of A's count without exchanging any information.
Therefore A need not send the lowest binary digit of his trailing-zero count; it is enough for A to send the second-lowest digit. B combines that bit with the parity he already knows to recover A's count modulo 4, and then finishes exactly as before.
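The full 1-bit protocol can be simulated as follows (a sketch with names of my own choosing; positive integers assumed):

```python
def trailing_zeros(v):
    """Number of trailing zeros in the binary representation of v > 0."""
    t = 0
    while v % 2 == 0:
        v //= 2
        t += 1
    return t

def protocol_one_bit(x, y):
    """A sends a single bit: the second-lowest bit of his trailing-zero
    count.  B already knows the lowest bit for free (the two counts
    differ by 1, so their parities are opposite), reassembles the count
    mod 4, and proceeds as in the two-bit protocol."""
    bit = (trailing_zeros(x) >> 1) & 1    # A's entire 1-bit message
    t_b = trailing_zeros(y)
    t_a_mod_4 = 2 * bit + (1 - t_b % 2)   # low bit is known for free
    t_a = t_b + 1 if (t_b + 1) % 4 == t_a_mod_4 else t_b - 1
    return ((y >> t_b) << t_a) + y
```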
Looking only at the final protocol, its design seems almost unbelievable; yet the reasoning that leads to it is perfectly natural.
Problem 2:
Now consider three people, A, B, and C, each holding an $N$-bit binary number; call the three numbers $x$, $y$, and $z$.
The difference is that the numbers are not in their owners' hands but on their foreheads.
That is, each person can see the other two numbers but not their own.
Communication is by broadcast: whenever one person sends information, the other two both receive it.
For three bits (0 or 1), define their majority as follows: if at least two of the three bits are 0, the majority is 0; otherwise at least two of them are 1, and the majority is 1.
More generally, for three $N$-bit binary numbers, define their bitwise majority by taking the majority of each bit position separately. For example, if the three numbers are:
0011001
1101100
1001110
then their bitwise majority is:
1001100
Back to our problem. We want to determine: is the number of 1s in the bitwise majority of $x$, $y$, and $z$ odd or even? (In the example above the answer is odd, since there are three 1s.)
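The definition and the worked example can be checked with a short Python sketch (`bitwise_majority` is a name of my own choosing):

```python
def bitwise_majority(a, b, c):
    """Bitwise majority of three equal-length binary strings."""
    return "".join("1" if bits.count("1") >= 2 else "0"
                   for bits in zip(a, b, c))

x, y, z = "0011001", "1101100", "1001110"
m = bitwise_majority(x, y, z)
print(m, m.count("1") % 2)   # the majority string and the parity of its 1s
```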
The obvious method sends $N$ bits: A broadcasts the number on B's forehead, after which B knows his own number, hence all three numbers, and can compute the answer.
Can it be faster?
Since this problem has no official answer, I will write up my own solution. Conclusion first: 2 bits suffice.
Note that if the majority in some bit position is 1, there are two possibilities: all three bits in that position are 1, or exactly two of them are 1 and the third is 0.
Suppose the former occurs in $p$ of the $N$ positions and the latter in $q$ positions. The problem asks for the parity of $p + q$.
First consider the following scheme, which sends $2 \log N$ bits:

A looks at the numbers on B's and C's foreheads and counts the positions in which both have a 1; call this count $a$. A broadcasts the binary representation of $a$.

Then B counts the positions in which the numbers on A's and C's foreheads are both 1; call this count $b$. B broadcasts the binary representation of $b$.

C counts the positions in which the numbers on A's and B's foreheads are both 1, and keeps this count $c$ to himself.

Now consider the value of $a + b + c$.

If all three bits in a position are 1, then each of the three people sees two 1s in that position, so the position is counted three times.

If exactly two bits in a position are 1, then only one person (the one with a 0 on their forehead) sees two 1s there, so the position is counted exactly once.

Therefore $a + b + c = 3p + q$, which has the same parity as $p + q$! So C can obtain the answer.

Since only the parity is needed, only the last binary digit matters: it suffices for A and B to broadcast just the last bits of $a$ and $b$, and C can then determine the answer. Only 2 bits are used.
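The 2-bit protocol can be simulated end to end (function names are my own; the three arguments are equal-length binary strings):

```python
def count_double_ones(u, v):
    """Positions where both visible strings have a 1."""
    return sum(1 for p, q in zip(u, v) if p == "1" and q == "1")

def protocol_problem2(x, y, z):
    """A sees y and z, B sees x and z, C sees x and y.  A and B each
    broadcast the last bit of their count; C adds his own count.
    Since a + b + c = 3p + q, the total's parity equals the parity of
    p + q, the number of 1s in the bitwise majority."""
    bit_a = count_double_ones(y, z) % 2   # A's 1-bit broadcast
    bit_b = count_double_ones(x, z) % 2   # B's 1-bit broadcast
    c = count_double_ones(x, y)           # C keeps this private
    return (bit_a + bit_b + c) % 2        # parity of p + q
```

On the example strings from above, the result is 1 (odd), matching the three 1s in the majority 1001100.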
Open Question:
Is there a 1-bit solution?