1. Let X be a random variable whose values range over an alphabet of M symbols. Prove that 0 <= H(X) <= log2 M.
Proof: Over all probability distributions P, the entropy is maximized when all M symbols are equally likely; this follows from Jensen's inequality, since H(X) = E[log2(1/P(X))] <= log2 E[1/P(X)] <= log2 M. The maximum is Hmax(X) = log2 M,
so H(X) <= log2 M.
Since X is a random variable, each value x occurs with probability P(x), and by the axioms of probability 0 <= P(x) <= 1.
From H(X) = -Σ P(x) log2 P(x), every term -P(x) log2 P(x) >= 0 when 0 <= P(x) <= 1, so H(X) >= 0.
Therefore 0 <= H(X) <= log2 M.
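A quick numerical check of the bound, as a minimal sketch in Python (the entropy helper, the choice M = 4, and the use of random Dirichlet distributions are assumptions for illustration, not part of the assignment): every distribution over M symbols satisfies 0 <= H <= log2 M, and the uniform distribution attains the maximum.

    import numpy as np

    def entropy(p):
        """First-order entropy in bits; terms with P(x) = 0 contribute 0."""
        p = np.asarray(p, dtype=float)
        nz = p[p > 0]
        return float(-np.sum(nz * np.log2(nz)))

    M = 4
    rng = np.random.default_rng(0)
    for _ in range(5):
        p = rng.dirichlet(np.ones(M))      # a random distribution over M symbols
        h = entropy(p)
        print(f"H = {h:.4f}, bound holds: {0.0 <= h <= np.log2(M)}")

    # The uniform distribution attains the maximum log2(M) = 2 bits for M = 4.
    print(entropy([1.0 / M] * M))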
2. Prove that if the elements of a sequence are i.i.d. (independent and identically distributed), the entropy of the sequence equals the first-order entropy.
Proof: The entropy of the sequence is H = lim(n→∞) (1/n) Gn, where
Gn = -Σ(i1) Σ(i2) ... Σ(in) P(X1=i1, X2=i2, ..., Xn=in) log P(X1=i1, X2=i2, ..., Xn=in).
When the elements of the sequence are i.i.d., the joint probability factors into n identical marginal terms, so
Gn = -n Σ(i1) P(X1=i1) log P(X1=i1),
and therefore H = -Σ P(x1) log P(x1), which is exactly the first-order entropy.
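To illustrate the result numerically, the sketch below (Python; the example distribution and helper names are assumptions) computes Gn exactly for an i.i.d. source by summing over all n-tuples, and shows that Gn/n equals the first-order entropy for every n.

    import itertools
    import numpy as np

    def first_order_entropy(p):
        """H1 = -sum P(x) log2 P(x), in bits per symbol."""
        p = np.asarray(p, dtype=float)
        nz = p[p > 0]
        return float(-np.sum(nz * np.log2(nz)))

    def gn_iid(p, n):
        """Gn for an i.i.d. source: sum over all n-tuples of -P(tuple) log2 P(tuple)."""
        gn = 0.0
        for tup in itertools.product(range(len(p)), repeat=n):
            prob = float(np.prod([p[i] for i in tup]))   # joint probability factors
            if prob > 0:
                gn -= prob * np.log2(prob)
        return gn

    p = [0.5, 0.25, 0.125, 0.125]          # an assumed example distribution
    h1 = first_order_entropy(p)
    for n in (1, 2, 3):
        print(n, gn_iid(p, n) / n, h1)     # Gn/n equals H1 for every n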
3. Given the symbol set A = {a1, a2, a3, a4}, find the first-order entropy under the following conditions:
(a) P(a1) = P(a2) = P(a3) = P(a4) = 1/4
Solution: H(A) = -4 × (1/4) log2(1/4) = -log2(1/4) = -(-2) = 2
H(A) = 2 bits/symbol
(b) P(a1) = 1/2, P(a2) = 1/4, P(a3) = P(a4) = 1/8
Solution: H = -Σ P(ai) log2 P(ai) = -(1/2) log2(1/2) - (1/4) log2(1/4) - 2 × (1/8) log2(1/8)
H = 7/4 = 1.75 bits/symbol
(c) P(a1) = 0.505, P(a2) = 1/4, P(a3) = 1/8, P(a4) = 0.12
Solution: H = -Σ P(ai) log2 P(ai) = -0.505 log2(0.505) - (1/4) log2(1/4) - (1/8) log2(1/8) - 0.12 log2(0.12)
H ≈ 0.498 + 0.5 + 0.375 + 0.367 ≈ 1.74 bits/symbol
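All three parts can be checked with a short script, shown below as a minimal sketch in Python (the helper function name is an assumption):

    import numpy as np

    def first_order_entropy(p):
        """H = -sum P(ai) log2 P(ai), in bits per symbol."""
        p = np.asarray(p, dtype=float)
        nz = p[p > 0]
        return float(-np.sum(nz * np.log2(nz)))

    print(first_order_entropy([1/4, 1/4, 1/4, 1/4]))      # (a) -> 2.0
    print(first_order_entropy([1/2, 1/4, 1/8, 1/8]))      # (b) -> 1.75
    print(first_order_entropy([0.505, 1/4, 1/8, 0.12]))   # (c) -> about 1.74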
Second assignment 140705010027 Tracy Wu