Course Description: This is the last lesson of the course. The author first summarizes the theories, methods, models, and paradigms of machine learning, then introduces the application of Bayesian theory and aggregation methods in machine learning.

Course Outline:
1. The machine learning map.
2. Bayesian theory.
3. Aggregation.
1. The Machine Learning Map

There are so many machine learning methods and models that the sheer variety can be dizzying, and it is tempting to simply list them all. However, we do not recommend learning these methods and models one by one; it is easy to get lost among them. If we understand the overall framework of machine learning well enough, we can command the whole multitude of methods.

The basic framework of machine learning is outlined below, and it deserves a deep understanding. Theory is the foundation of machine learning, the source of everything else; without theoretical support, everything is mere guesswork. Paradigms give our problems a broad direction: different problems correspond to different paradigms, and once we know the paradigm of the problem at hand, a solution is easier to find, because each paradigm comes with preliminary guidelines. Techniques are the concrete means by which we solve a problem.
2. Bayesian Theory

According to Bayesian theory:

P(h = f | D) = P(D | h = f) · P(h = f) / P(D)

where h is a hypothesis, D is the dataset, and f is the target function. P(h = f) is the prior probability, and P(D | h = f) is the likelihood. As long as we know P(h = f) and P(D | h = f), we can tell which hypothesis h is the better approximation of f given the dataset. The question is: how do we obtain P(h = f) and P(D | h = f)? In general, there is no way to compute P(h = f) directly; its value can only be estimated from experience. Computing P(D | h = f) must also take the influence of noise into account. In a word: hard to find. If we do know the prior probability, then we can apply Bayes' rule as above. Bayesian theory is very useful, but it cannot be explained in a few words; see books dedicated to the subject.
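The comparison of hypotheses via Bayes' rule can be sketched in a few lines. This is a minimal illustrative example, not from the lecture: the hypotheses here are two candidate biases of a coin, the priors are assumed to be uniform, and the posterior P(h = f | D) is computed by normalizing likelihood × prior.

```python
# Minimal sketch (assumed example): comparing hypotheses by their
# posterior P(h = f | D) ∝ P(D | h = f) · P(h = f).
# The "target function" is a coin whose bias is either 0.5 or 0.8.

def likelihood(data, p_heads):
    """P(D | h): probability of the observed flips given bias p_heads."""
    prob = 1.0
    for flip in data:
        prob *= p_heads if flip == 1 else (1.0 - p_heads)
    return prob

def posterior(data, hypotheses, priors):
    """Normalized posterior P(h | D) for each candidate hypothesis."""
    unnormalized = [likelihood(data, h) * p for h, p in zip(hypotheses, priors)]
    total = sum(unnormalized)
    return [u / total for u in unnormalized]

data = [1, 1, 1, 0, 1, 1]   # six flips, five heads
hypotheses = [0.5, 0.8]     # candidate values of P(heads)
priors = [0.5, 0.5]         # prior "estimated from experience"
post = posterior(data, hypotheses, priors)
# With five heads in six flips, the posterior favors the 0.8 hypothesis.
```

Note that the priors here are exactly the quantity the lecture says is hard to obtain; assuming them uniform is itself a judgment call.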
3. Aggregation

What is aggregation? It means running a number of independent learning processes and combining their results into a final output. Note that the learning processes here are independent of each other.

There are two types of aggregation:
1) After the fact: combine solutions that already exist.
2) Before the fact: build the solutions that will be combined.

For the first type, in the regression setting, suppose we now have a hypothesis set h_1, h_2, ..., h_T. Then the aggregate is

g(x) = a_1·h_1(x) + a_2·h_2(x) + ... + a_T·h_T(x)

where the weights a_t are chosen to minimize the error of the aggregated hypothesis on the data. For the second type, there are methods such as bagging and boosting.
California Institute of Technology Open Course: Machine Learning and Data Mining, Epilogue (Lecture 18, final session)