"Machine Learning algorithm-python implementation" Maximum likelihood estimation (Maximum likelihood)


1. Background

Maximum likelihood estimation is a statistical method that comes up often in probability theory. The general idea: given a probability density f, we draw a sample and can compute the probability of that sample occurring under f; the estimate is the parameter value that makes this probability largest. Of course, maximum likelihood comes in many variants; here I implement a simple one, which can be adapted as an actual project requires. I learned this from the wiki article; for the address, please click here. Below is a particularly simple example (taken from the wiki article on maximum likelihood): discrete distribution, discrete finite parameter space.
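To state the idea a little more precisely, here is the standard textbook formulation (added for clarity; the symbols n, x_i and theta are my notation, not from the original post). For independent observations x_1, ..., x_n drawn from a density f with parameter theta,

L(\theta) = \prod_{i=1}^{n} f(x_i \mid \theta), \qquad \hat{\theta} = \arg\max_{\theta} L(\theta)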

Consider a coin-tossing example. Suppose the coin has distinguishable heads and tails sides. We toss this coin 80 times (that is, we draw a sample and record the outcomes, writing heads as H and tails as T). Write the probability of tossing heads as p, so the probability of tossing tails is 1 - p. Suppose we observe 49 heads and 31 tails, i.e. 49 times H and 31 times T. Suppose further that the coin was taken from a box containing three coins, whose probabilities of landing heads are p = 1/3, p = 1/2 and p = 2/3. The coins are not labeled, so we cannot know which one was used. Using maximum likelihood estimation, we can work out from the experimental data (i.e. the sampled data) which coin was most likely the one used. The likelihood function is

P(H = 49 \mid p) = \binom{80}{49} \, p^{49} (1 - p)^{31},

which takes one of the following three values:

P(H = 49 \mid p = 1/3) \approx 0.000
P(H = 49 \mid p = 1/2) \approx 0.012
P(H = 49 \mid p = 2/3) \approx 0.054

We can see that the likelihood function attains its maximum when p = 2/3, so p = 2/3 is the maximum likelihood estimate.
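As a quick sanity check of those three values, here is a small sketch of my own (not the blogger's code; it only uses the standard library):

from math import factorial

def binom_pmf(k, n, p):
    # C(n, k) * p^k * (1 - p)^(n - k)
    return factorial(n) // (factorial(k) * factorial(n - k)) * p ** k * (1 - p) ** (n - k)

for p in (1.0 / 3, 1.0 / 2, 2.0 / 3):
    print(p, binom_pmf(49, 80, p))
# the values are roughly 0.000, 0.012 and 0.054; the maximum is at p = 2/3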


2. Implementation
One thing worth mentioning: the formula involves a factorial, and my first instinct was to implement the factorial with recursion. A quick search turned up the fact that Python's reduce is more convenient and solves it in one line:

def factorial(x):
    return reduce(lambda x, y: x * y, range(1, x + 1))
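A small usage note of my own (not from the original post): on Python 3 reduce is no longer a builtin, so it has to be imported from functools; and because reduce is given no initial value, factorial(0) raises a TypeError on the empty range instead of returning 1.

from functools import reduce  # needed on Python 3; harmless on Python 2.6+

def factorial(x):
    return reduce(lambda x, y: x * y, range(1, x + 1))

print(factorial(5))  # 120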

Complete Project:
"' Created on 2014-8-22@author:garvinmaximum likelihood theory practicthis code are base on the Http://zh.wikipedia.org/wi Ki/%e6%9c%80%e5%a4%a7%e4%bc%bc%e7%84%b6%e4%bc%b0%e8%ae%a1 "W=2.0/3h=49t=31def DefineParam ():    H=h    T=t    return h,tdef Maximumlikelihood (p=w):    H,t=defineparam ()    f1=factorial (h+t)/(factorial (H) *factorial (T))        f2= (p**h) * ((1.0-p) **t)            return F1*F2        def factorial (x):    return reduce (lambda x,y:x*y,range (1,x+1))            


The result matches the example above: with h = 49 and t = 31, calling maximumLikelihood() returns the probability of the observed data when p = 2/3, the largest of the three candidates.


Code address: please click here

/********************************

* This article is from the blog "Bo Li Garvin"

* When reprinting, please indicate the source: http://blog.csdn.net/buptgshengod

******************************************/



"Machine Learning algorithm-python implementation" Maximum likelihood estimation (Maximum likelihood)
