Drinking games, probability distributions and convolution
I've always had a fondness for anything involving probability, including a little game we used to play while drinking.
Put a cup on the table. The people around the table take turns rolling two dice at once. If the sum Y of the two dice is not one of {7, 8, 9}, the player is safe: no drinking, and the dice pass to the next person. If Y = 7, the player pours some alcohol into the cup (as little as they like) and rolls again. If Y = 8, the player drinks half of what is in the cup and rolls again. And if Y = 9, well, that's miserable: drink everything in the cup, and keep rolling.
When I was in college, my classmates affectionately called the game "789".
Obviously, the players care a great deal about the probability that the two dice (call them X1 and X2) sum to 7, 8, or 9. More generally, they care about the probability distribution (probability mass function) of Y = X1 + X2. I always regretted that I couldn't work out the distribution of Y on the spot back then. If I become a teacher someday and can show this off to my students over a drink, those years will not have been wasted.
First, consider the event Y = 7. Its probability equals the total probability of the six mutually exclusive events (X1, X2) = (1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1). Looking closely at these six events, and noting that X1 and X2 are independent, so that P(X1, X2) = P(X1)P(X2), we get the following equation:

P(Y = 7) = Σ_{k=1..6} P(X1 = k) P(X2 = 7 − k) = 6 × (1/36) = 1/6.
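As a quick numerical check, here is a minimal Python sketch (my addition, not part of the original post) that evaluates the sum above with exact fractions:

```python
from fractions import Fraction

# PMF of one fair die: P(X = k) = 1/6 for k = 1..6
die = {k: Fraction(1, 6) for k in range(1, 7)}

# P(Y = 7) = sum over k of P(X1 = k) * P(X2 = 7 - k)
p7 = sum(die[k] * die[7 - k] for k in range(1, 7))
print(p7)  # 1/6
```

Each of the six ordered pairs contributes 1/36, giving 6/36 = 1/6.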
Doesn't that look familiar? Mathematically speaking, it is a convolution. When it comes to understanding convolution, I think the linear time-invariant (LTI) system is a good example: the output function is the convolution of the input function with the unit impulse response. So why does a convolution show up when we compute the distribution of a sum of independent random variables?
I don't want to answer with a rigorous mathematical proof; let me explain by analogy with signals and systems. Simply put, the output of an LTI system is a linear combination of many, many time-shifted unit impulse responses. Turned around, that sentence says the output at time t is the sum of the responses to all inputs, each weighted by the input value (at its input time) and shifted by the elapsed time. Since many teachers have already covered this in detail, I won't say more [1][2][3][7].
So what does the convolution that appears in the probability calculation have to do with an LTI system? As I understand it, X1 can be read as the independent variable (time) of the input function, and P(X1) as its value (the dependent variable); likewise X2 is the time variable of the unit impulse response and P(X2) its value. The independence of X1 and X2 is what justifies multiplying the two functions, which corresponds to the linearity of the system. And because the output index is the sum X1 + X2, that is, a pulse entering at time X1 produces its response X2 time units later, the setup matches the definition of a time-invariant system. The distribution of Y can therefore be likened to the output of an LTI system.
Of course, strictly speaking the analogy is not rigorous, but I wanted to introduce a new concept through an easy-to-understand one.
Stating the conclusion without proof: if two independent continuous random variables X1 and X2 have probability densities f_X1 and f_X2, then the probability density of Y = X1 + X2 is

f_Y(y) = ∫ f_X1(x) f_X2(y − x) dx.

For discrete random variables, the formula above simplifies to a sum:

P(Y = y) = Σ_x P(X1 = x) P(X2 = y − x).
Friends familiar with convolution know that the convolution of two rectangular windows is a triangular window. Skipping the calculation, the distribution of Y comes out as

P(Y = y) = (6 − |y − 7|) / 36, for y = 2, ..., 12.
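To make the triangle concrete, here is a sketch (my addition, assuming NumPy is available) that convolves the two uniform die PMFs and prints the whole distribution:

```python
import numpy as np

die = np.full(6, 1 / 6)      # uniform PMF of one die, values 1..6
pmf = np.convolve(die, die)  # PMF of Y = X1 + X2, for y = 2..12
for y, p in zip(range(2, 13), pmf):
    print(f"P(Y = {y:2d}) = {p:.4f}")
```

The output rises linearly from P(Y = 2) = 1/36 up to P(Y = 7) = 6/36 and falls back down: the triangular window.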
As you can see, Y = 7 is the most likely outcome, followed by Y = 6 and Y = 8 (tied). If you want to make the game more brutal, you might as well change the special numbers to 6, 7, 8.
Further analysis, say, the average amount a player ends up drinking, would require some extra assumptions (a discretized pour volume, uniform mixing of the wine) and a Markov chain in which each state transition carries a reward. What we just computed is only the one-step transition probability. It is actually a rather complicated problem; perhaps I'll write a sequel when I get the chance.
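As a teaser for that sequel, here is a toy Monte Carlo sketch of the game. The pour and drink rules below are entirely my own simplifying assumptions (a 7 pours exactly one unit, an 8 drinks half the cup, a 9 empties it), not part of the original post:

```python
import random

def simulate(n_rolls=100_000, seed=42):
    """Estimate the average units drunk per roll under toy assumptions:
    7 -> pour 1 unit into the cup, 8 -> drink half, 9 -> drink all."""
    rng = random.Random(seed)
    cup = drunk = 0.0
    for _ in range(n_rolls):
        y = rng.randint(1, 6) + rng.randint(1, 6)
        if y == 7:
            cup += 1.0        # pour one unit, roll again
        elif y == 8:
            drunk += cup / 2  # drink half the cup
            cup /= 2
        elif y == 9:
            drunk += cup      # drink everything
            cup = 0.0
    return drunk / n_rolls

print(f"average units drunk per roll ~ {simulate():.3f}")
```

In the long run almost every unit poured is eventually drunk, so the estimate should hover near P(Y = 7) = 1/6 ≈ 0.167.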
Finally, since we are on the subject of convolution, I'd like to say a bit more about integral transforms. By definition, the moment generating function (MGF) of a continuous random variable Y can be written as

M_Y(t) = E[e^{tY}] = ∫ e^{ty} f(y) dy,

where f(y) is the probability density of Y; in the example above it is the PMF P(Y).
Isn't that just a bilateral Laplace transform (with s = −t)? And since the Laplace transform turns convolution into multiplication, for independent random variables X1 and X2 defining Y = X1 + X2, the MGF of Y is the product of the individual MGFs:

M_Y(t) = M_X1(t) M_X2(t).

The distribution of Y can then be recovered by the inverse Laplace transform.
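The product property is easy to check numerically. The following sketch (my addition) compares the MGF of the dice sum with the product of the two single-die MGFs:

```python
import math

def mgf(pmf, t):
    """MGF of a discrete distribution given as {value: probability}."""
    return sum(p * math.exp(t * x) for x, p in pmf.items())

die = {k: 1 / 6 for k in range(1, 7)}

# Build the PMF of Y = X1 + X2 by direct convolution
y_pmf = {}
for a, pa in die.items():
    for b, pb in die.items():
        y_pmf[a + b] = y_pmf.get(a + b, 0.0) + pa * pb

t = 0.3
print(abs(mgf(y_pmf, t) - mgf(die, t) ** 2))  # tiny: M_Y(t) = M_X1(t) * M_X2(t)
```

The difference is at the level of floating-point rounding, as expected.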
For example, it is now easy to show that the sum of independent Gaussian random variables is again Gaussian: if a random variable X obeys N(μ, σ²), then

M_X(t) = exp(μt + σ²t²/2),

and the product of two such functions has the same exponential-quadratic form, with the means and the variances adding.
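As a numerical check of that Gaussian claim (my own sketch, assuming NumPy): convolve two Gaussian densities on a fine grid and compare with the Gaussian whose mean and variance are the sums.

```python
import numpy as np

def gauss(x, mu, sigma):
    """Gaussian probability density of N(mu, sigma^2)."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-20.0, 20.0, 4001)  # odd length keeps the grid centered at 0
dx = x[1] - x[0]

mu1, s1 = 1.0, 1.5
mu2, s2 = -0.5, 2.0
# Riemann-sum approximation of the density convolution
conv = np.convolve(gauss(x, mu1, s1), gauss(x, mu2, s2), mode="same") * dx
target = gauss(x, mu1 + mu2, np.sqrt(s1 ** 2 + s2 ** 2))

print(np.max(np.abs(conv - target)))  # close to zero: the sum is Gaussian again
```

The maximum discrepancy is negligible, confirming that the convolution of the two densities is the N(μ1 + μ2, σ1² + σ2²) density.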
Postscript: One day I happened upon teacher Yu Xin's blog post "Wrapped but not tangled: convolution", and a related search revealed that ScienceNet hosts a whole topic of blog posts about convolution (http://news.sciencenet.cn/news/sub18.aspx?id=851). You don't know until you look; once I looked, it was really wonderful. Watching the masters illuminate the meaning of convolution from every angle, this junior benefited greatly. But only in the posts of teacher Shang [4] and teacher Tian Zanyong [5] did I find convolution connected to probability; as a student of statistical signal processing, I took the liberty of writing this article as a tribute to their work. Teacher Li Xiaowen's lecture is very interesting, explaining convolution with power series; that was the first time I had seen it done that way [6]. ^_^
Since I refer to their writings, I cannot help but share my own reading of them, so forgive my presumption in commenting on the "Big Talk on Convolution" topic, and please excuse me, teachers, if I get something wrong. I feel that teacher Homotis's [7] understanding of convolution is a signal processor's understanding, and that is how I understand it too; please let me count myself a disciple. My first contact with convolution was in Oppenheim's signals-and-systems books, which explain convolution by fully exploiting the properties of LTI systems and linear combinations. Later I read two or three other signals-and-systems textbooks, and they give the same explanation. But that is certainly not the only way to understand convolution; I don't even remember how my transforms textbook treated it. In short, these are just different ways of understanding, and which you prefer is a matter of personal taste.
Teacher Tang Changjie's blog [8] has many interesting examples that are well suited to convolution modeling.
There are many more blog posts about convolution; I won't list them all here, please see the topic page.
PS: 1. Modeling little problems from daily life is one of my hobbies. It shouldn't count as a waste of time; after all, a written-up post can later be used in class.
2. Dear Father, if you read this article, don't worry: it is not that I am always drinking and that is why I think about these things. I don't drink; please rest assured.
3. Baidu Encyclopedia is very unfriendly to mathematical entries; editing formulas there is painful, and the MGF entry is written miserably.
"1" Yu Xin Teacher's "wrapped and not messy convolution", with the introduction of LTI system.
"2" Cao Guangfo Teacher's "big story", the math teacher tells convolution
"3" Prof. Xiaogang Wang Teacher's "roll", or "no roll", this is a problem, physics teacher to tell convolution
"4" Shang's "about convolution---and its application in vibration, central limit theorem and renewal theory"
"5" Tian Zanyong Teacher's "convolution in probability"
"6" Li Xiaowen Teacher's "The year to the students to tell convolution", through the multiplication of power series to tell convolution.
"7" Homotis Teacher's "scientific research Small Note:" Volume "and" Not volume ".
"8" Tang Changjie Teacher's "radiation, iodine, salt, air raids and convolution-----a discussion of the teaching difficulties."
Related topic: Big Talk on Convolution
Original article: http://blog.sciencenet.cn/blog-624263-786132.html. This article is from Wang Yunlong's blog on ScienceNet; please indicate the source when reproducing.