July Online April machine learning algorithm class notes -- No. 1
Preface
Machine learning is a multidisciplinary field, drawing on probability theory, statistics, convex analysis, feature engineering, and more. I have recently been following the July Online algorithm course to study machine learning; it is packed with practical content and examples, and working through it builds understanding faster than reading alone, so here is a rough summary for easy review later.

I. How to understand "machine learning"

1. The goal of machine learning: to have computer programs automatically improve their performance as experience accumulates.

2. Definition: for some class of tasks T and performance measure P, if a computer program's performance on T, as measured by P, improves with experience E, then we say the program learns from experience E.
For example, Chinese chess:
Task T: playing chess
Performance measure P: the percentage of games won against opponents
Training experience E: playing games against itself, studying recorded games, etc.
3. Machine learning and AI: machine learning is a branch of artificial intelligence. We use a computer to design a system that learns, in some specified way, from the training data provided; as training proceeds, the system keeps learning and improving its performance, and the learned model, with its optimized parameters, can then be used to predict the output of related problems.
4. Classification of machine learning algorithms:
(1) Supervised learning: e.g. user click/purchase prediction, house price prediction
(2) Unsupervised learning: e.g. mail/news clustering
(3) Reinforcement learning: e.g. dynamic systems and robot control
Common algorithms for each category:
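As a minimal sketch of the first two categories (my own, assuming scikit-learn and NumPy, which the notes do not mention): linear regression as a representative supervised learner and k-means as a representative unsupervised one. Reinforcement-learning algorithms such as Q-learning have no counterpart in scikit-learn and are omitted here.

```python
# Minimal sketch: supervised learning fits inputs to known labels,
# unsupervised learning finds structure in unlabeled inputs.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Supervised: house-price-style regression, the target y is observed.
X = rng.uniform(50, 200, size=(100, 1))        # e.g. floor area
y = 3000 * X[:, 0] + rng.normal(0, 1e4, 100)   # price with noise
model = LinearRegression().fit(X, y)
print("predicted price for area 120:", model.predict([[120]])[0])

# Unsupervised: news/mail-style clustering, no labels given.
docs = rng.normal(size=(100, 5))               # stand-in feature vectors
clusters = KMeans(n_clusters=3, n_init=10).fit_predict(docs)
print("cluster of first document:", clusters[0])
```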
II. Relevant mathematical background
1. Derivatives: the second derivative reflects how fast the slope changes, and so characterizes the concavity/convexity of the curve.
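As a quick numeric illustration (my own sketch, not from the course), the sign of a finite-difference estimate of the second derivative distinguishes convex from concave regions:

```python
# Minimal sketch: estimate f'' by central differences and read
# convexity off its sign (f'' > 0 convex, f'' < 0 concave).
import numpy as np

def second_derivative(f, x, h=1e-4):
    """Central-difference estimate of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

f = np.exp                          # exp is convex everywhere
print(second_derivative(f, 0.0))    # ~1.0 > 0, so convex at 0
g = np.log                          # log is concave on (0, inf)
print(second_derivative(g, 1.0))    # ~-1.0 < 0, so concave at 1
```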
2. Introducing information entropy: for a discrete random variable $X$ that takes value $x_i$ with probability $p_i$, the entropy is $H(X) = -\sum_i p_i \log p_i$, which measures the uncertainty of $X$.
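A minimal sketch (my addition, assuming NumPy) showing that entropy peaks at the uniform distribution and vanishes for a deterministic one:

```python
# Minimal sketch: H(X) = -sum_i p_i log p_i, in bits.
import numpy as np

def entropy(p):
    """Shannon entropy of a discrete distribution (base 2)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 log 0 is taken as 0
    return -np.sum(p * np.log2(p))

print(entropy([0.5, 0.5]))   # 1.0 bit: maximal uncertainty
print(entropy([1.0, 0.0]))   # 0.0 bits: no uncertainty
print(entropy([0.9, 0.1]))   # ~0.469 bits
```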
3. Taylor's formula:
Purpose: decompose a complicated function into a polynomial, which makes it simpler to work with.
Application: examine the relationship among the graphs of entropy, the Gini index, and the classification error rate. Expanding $f(p) = -\ln p$ to first order at $p = 1$ gives $-\ln p \approx 1 - p$, so $H(X) = -\sum_i p_i \ln p_i \approx \sum_i p_i (1 - p_i)$, which is exactly the Gini index; this is why the two curves look so similar.
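A small sketch (my own, not from the notes) comparing the three impurity measures for a binary class probability $p$:

```python
# Minimal sketch: for a binary split with class probability p,
# compare (scaled) entropy, Gini index, and classification error.
import numpy as np

p = np.linspace(0.01, 0.99, 5)
entropy = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
gini = 2 * p * (1 - p)                # = 1 - p^2 - (1-p)^2
error = np.minimum(p, 1 - p)          # misclassification rate

# Half the entropy, the Gini index, and the error rate all peak
# at p = 0.5 and vanish at p = 0 or 1, hence the similar graphs.
for row in zip(p, entropy / 2, gini, error):
    print("p=%.2f  H/2=%.3f  gini=%.3f  err=%.3f" % row)
```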
4. Convex function property: $f$ is convex if for any $x, y$ and $\theta \in [0, 1]$, $f(\theta x + (1-\theta) y) \le \theta f(x) + (1-\theta) f(y)$; in expectation form this is Jensen's inequality, $f(E[X]) \le E[f(X)]$.
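A quick numeric check of Jensen's inequality (my own sketch) with the convex function $f(x) = x^2$:

```python
# Minimal sketch: Jensen's inequality f(E[X]) <= E[f(X)] for convex f.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100000)         # X ~ N(0, 1)
f = np.square                       # x^2 is convex
print(f(x.mean()))                  # f(E[X]) ~ 0
print(f(x).mean())                  # E[f(X)] ~ 1 (the variance of X)
```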
5. Basic formulas of probability theory: conditional probability $P(A \mid B) = P(AB)/P(B)$, the law of total probability $P(A) = \sum_i P(A \mid B_i) P(B_i)$, and Bayes' formula $P(B_i \mid A) = \frac{P(A \mid B_i) P(B_i)}{\sum_j P(A \mid B_j) P(B_j)}$.
An example application of Bayes' formula:
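As a stand-in sketch (my own hypothetical numbers, the classic rare-disease test), the posterior can be far smaller than the test accuracy suggests:

```python
# Minimal sketch: Bayes' formula with hypothetical numbers.
# P(D|+) = P(+|D) P(D) / (P(+|D) P(D) + P(+|~D) P(~D))
p_d = 0.001          # prior: disease prevalence (assumed)
p_pos_d = 0.99       # sensitivity: P(positive | disease)
p_pos_nd = 0.05      # false positive rate: P(positive | no disease)

posterior = (p_pos_d * p_d) / (p_pos_d * p_d + p_pos_nd * (1 - p_d))
print("P(disease | positive) = %.4f" % posterior)   # ~0.0194
```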
6. Summary of common distributions:
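As a hedged sketch (my addition, assuming NumPy), sampling a few of the usual distributions and checking the textbook means and variances empirically:

```python
# Minimal sketch: sample common distributions and compare the
# empirical mean/variance with the textbook values.
import numpy as np

rng = np.random.default_rng(0)
n = 100000
samples = {
    "Bernoulli(0.3)":   rng.binomial(1, 0.3, n),     # mean p, var p(1-p)
    "Binomial(10,0.3)": rng.binomial(10, 0.3, n),    # mean np, var np(1-p)
    "Poisson(4)":       rng.poisson(4, n),           # mean = var = lambda
    "Normal(0,1)":      rng.normal(0, 1, n),         # mean mu, var sigma^2
    "Exponential(2)":   rng.exponential(1 / 2, n),   # mean 1/lambda
}
for name, x in samples.items():
    print("%-18s mean=%.3f var=%.3f" % (name, x.mean(), x.var()))
```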
More: July Algorithm, April machine learning course