[Probability theory] Bayes' Theorem

Source: Internet
Author: User

Basic knowledge description:

Joint probability:

Definition: the probability that multiple random variables in a probability distribution satisfy their respective conditions at the same time.

For example, if X and Y are both normally distributed random variables, P(X < 4, Y < 0) is a joint probability: the probability that the conditions X < 4 and Y < 0 hold simultaneously. The joint probability of X and Y is written P(XY), P(X, Y), or P(X ∩ Y).
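As a minimal sketch, assuming X and Y are *independent* standard normal variables (an assumption not stated in the text), the joint probability factors into a product of the two marginal probabilities, each obtained from the normal CDF:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of a normal distribution, computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# For independent X, Y ~ N(0, 1): P(X < 4, Y < 0) = P(X < 4) * P(Y < 0)
p_joint = normal_cdf(4) * normal_cdf(0)
print(p_joint)  # just under 0.5, since P(X < 4) is nearly 1 and P(Y < 0) = 0.5
```

If X and Y were dependent, the joint probability would no longer factor this way and would have to come from the joint distribution itself.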

 

Conditional Probability:

Definition: the probability that event A occurs given that another event B has already occurred.

Example: the probability of event A under condition B is written P(A | B).
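A small sketch with a fair die (my own example, not from the source): let A = "the roll is even" and B = "the roll is at least 4"; then P(A | B) = P(AB) / P(B):

```python
from fractions import Fraction

outcomes = range(1, 7)                    # a fair six-sided die
A = {n for n in outcomes if n % 2 == 0}   # even roll: {2, 4, 6}
B = {n for n in outcomes if n >= 4}       # roll at least 4: {4, 5, 6}

p_B = Fraction(len(B), 6)                 # P(B)  = 3/6 = 1/2
p_AB = Fraction(len(A & B), 6)            # P(AB) = P({4, 6}) = 2/6 = 1/3
p_A_given_B = p_AB / p_B                  # P(A | B) = (1/3) / (1/2) = 2/3
print(p_A_given_B)  # 2/3
```

Conditioning on B shrinks the sample space to {4, 5, 6}, within which two of the three outcomes are even.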

 

Bayes' theorem:

Generally, the probability of event A given event B is different from the probability of event B given event A; however, there is a definite relationship between the two, and Bayes' theorem is the statement of that relationship. As a normative principle, Bayes' theorem is valid for all interpretations of probability. However, frequentists and Bayesians disagree about how probabilities are assigned in applications: a frequentist assigns probabilities based on the frequency of a random event or the proportion of a sample in a population, while a Bayesian assigns probabilities to propositions whose truth is unknown, as degrees of belief. One consequence is that Bayesians have more occasions to apply Bayes' theorem.

For two events A and B, the following can be obtained:

P(A | B) = P(AB) / P(B), P(B | A) = P(AB) / P(A) => P(A | B) = P(B | A) * P(A) / P(B)

Therefore, Bayes' theorem relates the conditional probabilities and marginal probabilities of random events A and B:

P(A | B) = P(B | A) * P(A) / P(B) ∝ L(A | B) * P(A)

Here, L(A | B) is the likelihood of A given that B occurs; as a function of A it equals P(B | A).
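To make the relation concrete, here is a sketch with invented numbers (a standard diagnostic-test example, not from the source): let A = "has the disease" and B = "test is positive".

```python
# Assumed numbers for illustration: 1% prevalence, 99% sensitivity,
# 5% false-positive rate.
p_A = 0.01             # prior P(A): prevalence of the disease
p_B_given_A = 0.99     # likelihood P(B | A): test sensitivity
p_B_given_notA = 0.05  # P(B | not A): false-positive rate

# Law of total probability: P(B) = P(B|A) P(A) + P(B|not A) P(not A)
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Bayes' theorem: P(A | B) = P(B | A) * P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_A_given_B, 4))  # ≈ 0.1667: a positive test leaves P(A|B) low
```

Even with an accurate test, the small prior P(A) keeps the posterior modest, which is exactly the interplay the formula describes.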

In Bayes' theorem, each term has a common name:

1. P(A) is the prior probability (or marginal probability) of A. It is called the "prior" because it does not take any information about B into account.

2. P(A | B) is the conditional probability of A given B. It is also called the posterior probability of A, because it is obtained after the value of B is known.

3. P(B | A) is the conditional probability of B given that A has occurred. In Bayes' theorem it plays the role of the likelihood of A given B.

4. P(B) is the prior probability (or marginal probability) of B, and acts as a normalizing constant.

Bayesian formula:

Posterior probability = (likelihood * prior probability) / normalizing constant

That is, the posterior probability is proportional to the product of the prior probability and likelihood.

Therefore, posterior probability = standardized likelihood * prior probability

Example:

For events A and B: the posterior probability of A is P(A | B), the prior probability of A is P(A), the likelihood is P(B | A), and the normalizing constant is P(B), so the standardized likelihood is P(B | A) / P(B).
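The decomposition above can be sketched numerically (invented numbers, for illustration only): the standardized likelihood P(B | A) / P(B) is the factor that scales the prior up or down into the posterior.

```python
# Assumed numbers matching the formula's notation.
p_A = 0.3             # prior P(A)
p_B_given_A = 0.8     # likelihood P(B | A)
p_B_given_notA = 0.4  # P(B | not A), needed to obtain P(B)

# Normalizing constant P(B) by the law of total probability.
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

standardized_likelihood = p_B_given_A / p_B
posterior = standardized_likelihood * p_A  # posterior = std. likelihood * prior

# Identical to the direct form P(A | B) = P(B | A) * P(A) / P(B):
assert abs(posterior - p_B_given_A * p_A / p_B) < 1e-12
print(round(posterior, 4))
```

Here the standardized likelihood is greater than 1 (observing B favors A), so the posterior ends up larger than the prior.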
