Bayesian decision

1. Introduction of simple examples

2. Prior probability

3. Posterior probability

4. Minimum error rate decision

5. Minimum risk Bayesian decision

1. Bayesian formula
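For c classes w1, …, wc with prior probabilities P(wi) and class-conditional probability densities p(x|wi), the Bayes formula gives the posterior probability of class wi for an observation x:

P(wi|x) = p(x|wi) P(wi) / Σj p(x|wj) P(wj)

The denominator is the total probability p(x) of observing x, so the posterior probabilities over all classes sum to 1.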

2. Simple examples

In everyday life we can quickly divide the people we see on the street into two categories: male and female. The people on the street are the samples we observe, and assigning each person to "male" or "female" is the process of making a decision. This is a classification problem.

Classification can therefore be thought of as decision-making: based on the observations, what decision should we make about each sample?

Suppose I hold a coin in my hand and ask you to guess its denomination. This can also be regarded as a classification decision: you have to choose among all the possible coins. Assume the coin can be 1 jiao, 5 jiao, or 1 yuan.

If you are told beforehand that the coin can only be 1 jiao or 5 jiao, the problem becomes a two-class classification problem.

3. Prior probabilities

Some problems involving prior probability

4. Posterior probabilities

5. Decision-making

7. Example

In a certain region, the prior probabilities of the two cell classes, normal (w1) and abnormal (w2), are P(w1) and P(w2) respectively.

For a cell to be identified, the observed value is x; the class-conditional probability densities p(x|w1) and p(x|w2) are read from the class-conditional probability density curves.

Classify the cell.

Solution: compute the two posterior probabilities P(w1|x) and P(w2|x) using the Bayes formula.

8. Minimum error rate decision

According to the Bayesian decision rule, because

P(w1|x) = 0.818 > P(w2|x) = 0.182,

x is classified as a normal cell.
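A minimal sketch of this calculation in Python. The concrete numbers P(w1) = 0.9, P(w2) = 0.1, p(x|w1) = 0.2 and p(x|w2) = 0.4 are assumptions (they are not stated above), but they reproduce the posteriors 0.818 and 0.182 quoted in the example:

```python
# Posterior computation and minimum-error-rate decision for the two-class cell example.
# The priors and likelihoods below are assumed values that reproduce the quoted posteriors.
prior = {"normal": 0.9, "abnormal": 0.1}        # P(w1), P(w2)
likelihood = {"normal": 0.2, "abnormal": 0.4}   # p(x|w1), p(x|w2), read from the density curves

evidence = sum(likelihood[c] * prior[c] for c in prior)              # p(x)
posterior = {c: likelihood[c] * prior[c] / evidence for c in prior}  # P(wi|x)

print(posterior)                           # {'normal': 0.818..., 'abnormal': 0.181...}
print(max(posterior, key=posterior.get))   # 'normal' -- the class with the larger posterior
```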

More generally, suppose there is an observation x:

If P(w1|x) > P(w2|x), we naturally judge that the true class is w1.

If P(w2|x) > P(w1|x), we prefer to choose w2.

The probability of error when deciding according to this rule is

P(error|x) = P(w2|x) if we decide w1, and P(error|x) = P(w1|x) if we decide w2; that is, P(error|x) = min{ P(w1|x), P(w2|x) }.

Obviously, for a given x, the rule above minimizes this probability of error.

The question is, can this rule minimize the average error probability?

The average error probability is

P(error) = ∫ P(error|x) p(x) dx.

If for every x we make P(error|x) as small as possible, then this integral is also as small as possible, so the rule minimizes the average error probability as well.

9. Minimum risk Bayesian decision

The preceding decision rule is given under the principle of minimum error rate. Depending on the situation, however, we may care not only about the error rate but also about the loss each error causes. Mistaking a 5-jiao coin for a 1-jiao coin is not the same as mistaking a 1-jiao coin for a 5-jiao coin.

In cancer cell recognition, if a normal cell is misjudged as a cancer cell, the patient suffers a mental burden and unnecessary further examination; this is a loss, or risk. Conversely, if a cancer cell is misjudged as a normal cell, the loss is far greater: the valuable opportunity for early detection of the cancer may be missed, with consequences that can threaten the patient's life.

It is inappropriate in many cases to treat both kinds of errors equally.

So-called minimum-risk Bayesian decision-making seeks the optimal decision when different kinds of errors incur different losses.

Basic idea:

Minimizing the error rate is not always the best criterion.

Cancer Cell classification

Two types of errors:

Cancer cells -> normal cells

Normal cells -> cancer cells

The costs (losses) of the two errors are different.

It can be better to accept a somewhat larger total error rate in exchange for a smaller total loss.

This introduces a broader concept associated with loss: risk.

When making decisions, the risks involved should be taken into account.

The minimum-risk Bayesian decision rule is designed precisely to reflect this.
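A minimal sketch of the minimum-risk rule, reusing the posteriors from the cell example. The conditional risk of each action ai is R(ai|x) = Σj λ(ai|wj) P(wj|x), and the rule chooses the action with the smallest risk. The loss values below are assumptions chosen only to illustrate how the decision can change:

```python
# Minimum-risk Bayesian decision: pick the action with the smallest conditional risk
#   R(a_i | x) = sum_j loss(a_i | w_j) * P(w_j | x)
posterior = {"normal": 0.818, "abnormal": 0.182}    # posteriors from the cell example

# loss[action][true_class]: assumed cost of taking `action` when the true class is `true_class`
loss = {
    "decide_normal":   {"normal": 0.0, "abnormal": 6.0},  # missing an abnormal cell is costly
    "decide_abnormal": {"normal": 1.0, "abnormal": 0.0},  # a false alarm costs much less
}

risk = {a: sum(loss[a][c] * posterior[c] for c in posterior) for a in loss}
print(risk)                       # {'decide_normal': 1.092, 'decide_abnormal': 0.818}
print(min(risk, key=risk.get))    # 'decide_abnormal'
```

With these assumed losses the minimum-risk decision is the opposite of the minimum-error-rate decision: although the abnormal class has the smaller posterior, the much larger loss of missing it dominates the risk.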

10. Classification method of Bayesian decision theory

Pros: works with a relatively small amount of data and can handle multi-class problems.

Cons: Sensitive to the way the input data is prepared.

Application: A common algorithm for document classification.

Code example

Problem Analysis:

Example description

Converting the text into a unified form

Code implementation
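As a sketch of putting raw text into a unified form, one common approach is to split each document into lowercase word tokens. The helper name text_parse and the regular expression are assumptions, not code from the original article:

```python
import re

def text_parse(text):
    """Split raw text into lowercase word tokens, dropping very short strings."""
    tokens = re.split(r"\W+", text)
    return [tok.lower() for tok in tokens if len(tok) > 2]

print(text_parse("This book is the BEST book on Python, M.L. or anything else!"))
# ['this', 'book', 'the', 'best', 'book', 'python', 'anything', 'else']
```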

Building the vocabulary (dictionary)
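A minimal sketch of the vocabulary-building step; the function name create_vocab_list is an assumption:

```python
def create_vocab_list(dataset):
    """Collect the set of all unique words appearing in the tokenized documents."""
    vocab = set()
    for doc in dataset:
        vocab |= set(doc)
    return sorted(vocab)   # sorted for a stable word order

docs = [["my", "dog", "has", "flea", "problems"],
        ["maybe", "not", "take", "him", "to", "dog", "park"]]
print(create_vocab_list(docs))
```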

Word vectors: set-of-words and bag-of-words models
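A sketch of the two document-vector models: the set-of-words model records whether a word occurs, the bag-of-words model records how often it occurs. The function names are assumptions:

```python
def set_of_words_vec(vocab, doc):
    """Set-of-words model: 1 if the word occurs in the document, else 0."""
    vec = [0] * len(vocab)
    for word in doc:
        if word in vocab:
            vec[vocab.index(word)] = 1
    return vec

def bag_of_words_vec(vocab, doc):
    """Bag-of-words model: count how many times each word occurs."""
    vec = [0] * len(vocab)
    for word in doc:
        if word in vocab:
            vec[vocab.index(word)] += 1
    return vec

vocab = ["dog", "flea", "park", "stupid"]
print(set_of_words_vec(vocab, ["dog", "dog", "flea"]))   # [1, 1, 0, 0]
print(bag_of_words_vec(vocab, ["dog", "dog", "flea"]))   # [2, 1, 0, 0]
```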

Class probabilities

Class conditional probability density
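A sketch of the training step, which estimates the class prior P(c = 1) and the per-word class-conditional probabilities p(w|c) from the training word vectors. NumPy and the function name train_nb are assumptions, not the original article's code:

```python
import numpy as np

def train_nb(train_matrix, labels):
    """Estimate P(class = 1) and the per-word conditionals p(w|c) for a two-class problem.

    train_matrix: one word vector (as produced above) per training document.
    labels: the 0/1 class label of each document."""
    train_matrix = np.asarray(train_matrix, dtype=float)
    labels = np.asarray(labels)

    p_class1 = labels.mean()                           # class prior P(c = 1)

    counts1 = train_matrix[labels == 1].sum(axis=0)    # per-word counts inside class 1
    counts0 = train_matrix[labels == 0].sum(axis=0)    # per-word counts inside class 0
    p_w_given_1 = counts1 / counts1.sum()              # p(w | c = 1)
    p_w_given_0 = counts0 / counts0.sum()              # p(w | c = 0)
    return p_w_given_0, p_w_given_1, p_class1
```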

Making the classification decision:
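A sketch of the decision step: under the naive independence assumption, p(x|c) is the product of the per-word conditionals, and we compare p(x|c) P(c) for the two classes. The function name classify_nb is an assumption:

```python
import numpy as np

def classify_nb(word_vec, p_w_given_0, p_w_given_1, p_class1):
    """Return 1 if p(x|c=1) * P(c=1) exceeds p(x|c=0) * P(c=0), else 0."""
    word_vec = np.asarray(word_vec, dtype=float)
    # With 0/1 (or count) word vectors, p ** word_vec keeps only the words present in x.
    score1 = np.prod(p_w_given_1 ** word_vec) * p_class1
    score0 = np.prod(p_w_given_0 ** word_vec) * (1.0 - p_class1)
    return 1 if score1 > score0 else 0
```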

There is a problem when the class-conditional probabilities are multiplied together with the prior probability: a word that never appeared in the training data for a class makes its conditional probability zero, and the product of many small probabilities underflows in floating-point arithmetic.
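A sketch of the usual two fixes, under the same assumptions as the training sketch above: Laplace smoothing (adding one to every word count) removes the zero probabilities, and working with log probabilities turns the long product into a sum that does not underflow:

```python
import numpy as np

def train_nb_smoothed(train_matrix, labels):
    """Like train_nb above, but with Laplace smoothing and log probabilities."""
    train_matrix = np.asarray(train_matrix, dtype=float)
    labels = np.asarray(labels)
    p_class1 = labels.mean()

    # Add-one (Laplace) smoothing: an unseen word no longer forces a zero probability.
    counts1 = train_matrix[labels == 1].sum(axis=0) + 1.0
    counts0 = train_matrix[labels == 0].sum(axis=0) + 1.0
    log_p_w_1 = np.log(counts1 / counts1.sum())
    log_p_w_0 = np.log(counts0 / counts0.sum())
    return log_p_w_0, log_p_w_1, p_class1

def classify_nb_log(word_vec, log_p_w_0, log_p_w_1, p_class1):
    """Compare log p(x|c) + log P(c); sums of logs do not underflow the way products do."""
    word_vec = np.asarray(word_vec, dtype=float)
    score1 = np.dot(word_vec, log_p_w_1) + np.log(p_class1)
    score0 = np.dot(word_vec, log_p_w_0) + np.log(1.0 - p_class1)
    return 1 if score1 > score0 else 0
```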

Results

Example: using the Bayesian method to classify e-mail messages

Normal data

Junk e-mail data

Classification steps

Junk e-mail classification
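A sketch of the junk e-mail classification workflow, built from the helpers sketched above; the hold-out size and the random hold-out evaluation are assumptions about the setup:

```python
import random

def spam_test(ham_texts, spam_texts, holdout=10):
    """Train on labeled e-mails and report the error rate on a random hold-out set.

    ham_texts / spam_texts: lists of raw e-mail bodies (normal and junk).
    Uses text_parse, create_vocab_list, set_of_words_vec, train_nb_smoothed
    and classify_nb_log from the sketches above."""
    docs = [text_parse(t) for t in ham_texts + spam_texts]
    labels = [0] * len(ham_texts) + [1] * len(spam_texts)
    vocab = create_vocab_list(docs)

    # Randomly hold out a few messages for testing, train on the rest.
    indices = list(range(len(docs)))
    random.shuffle(indices)
    test_idx, train_idx = indices[:holdout], indices[holdout:]

    train_matrix = [set_of_words_vec(vocab, docs[i]) for i in train_idx]
    train_labels = [labels[i] for i in train_idx]
    log_p0, log_p1, p_spam = train_nb_smoothed(train_matrix, train_labels)

    errors = sum(
        classify_nb_log(set_of_words_vec(vocab, docs[i]), log_p0, log_p1, p_spam) != labels[i]
        for i in test_idx
    )
    print("hold-out error rate: %.2f" % (errors / float(holdout)))
```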

