Here are the notes for lecture five. There will be several points.
1. Training and Testing
Both of these are about data. Training uses the data to find a final hypothesis; testing does not. Once we have a final hypothesis and want to evaluate it, that is testing.
2. Another point: verify that learning is feasible. First, let me show you an inequality. As mentioned in note 2, the inequality is

P[|E_in - E_out| > epsilon] <= 2 M exp(-2 epsilon^2 N),

and the complexity of your hypothesis set is reflected by M.
However, M can be infinite, which makes the bound almost meaningless, and because of this the guarantee on your hypothesis would be useless. If we can replace M with another quantity that is not meaningless, meaning not infinite, then we can carry out our learning in an actual model (our learning is feasible).
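To see numerically why a huge M ruins the guarantee, here is a small sketch that simply evaluates the union-bound Hoeffding expression 2 M exp(-2 epsilon^2 N) (the numbers chosen for M, N, and epsilon are illustrative assumptions of mine):

```python
import math

def hoeffding_bound(M, N, eps):
    # Union bound over M hypotheses: P[|E_in - E_out| > eps] <= 2 M e^(-2 eps^2 N)
    return 2 * M * math.exp(-2 * eps**2 * N)

# With one hypothesis and plenty of data the bound is tiny...
small = hoeffding_bound(M=1, N=1000, eps=0.1)      # well below 1

# ...but as M grows, the bound exceeds 1 and guarantees nothing at all.
useless = hoeffding_bound(M=10**9, N=1000, eps=0.1)
```

A probability bound greater than 1 is vacuous, which is exactly the sense in which an infinite (or merely enormous) M makes the inequality meaningless.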
What is M? It was mentioned before that M is the maximum number of hypotheses. So can we figure out the effective number of hypotheses to replace M? The answer turns out to be yes.
The maximum number of effective hypotheses comes from the different choices on the different points: if the number of points is a, and the number of choices for each point is b, then the maximum number of distinct labelings (dichotomies) is b^a. For binary labels on N points this is 2^N.
But it is not that simple: several dichotomies may not be generated by the hypothesis set at all, so in general the number of dichotomies that can be generated is less than b^a.
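Here is a hedged sketch of that gap. The "positive ray" hypothesis set (h_t(x) = [x >= t]) is my own choice of example, not necessarily the one used in the lecture; the code enumerates which of the 2^N possible dichotomies on N points this set can actually generate.

```python
from itertools import product

def positive_ray_dichotomies(xs):
    """Dichotomies generated by h_t(x) = (x >= t) on the points xs."""
    xs = sorted(xs)
    # One threshold per region between (and outside) the points suffices.
    thresholds = [xs[0] - 1] + [x + 0.5 for x in xs]
    return {tuple(x >= t for x in xs) for t in thresholds}

points = [1.0, 2.0, 3.0, 4.0]
achievable = positive_ray_dichotomies(points)
all_dichotomies = set(product([False, True], repeat=len(points)))

# Positive rays generate only N + 1 dichotomies, far fewer than 2**N.
print(len(achievable), "of", len(all_dichotomies))
```

On 4 points, only 5 of the 16 conceivable labelings are achievable: a dichotomy like (True, False, True, False) can never be produced by a single threshold.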
Let's come back to the inequality. We can prove mathematically that if M can be replaced by a polynomial, which means the effective number of hypotheses in the set is not infinite, then we can declare that learning is feasible using this hypothesis set. There is a new statement that will be proved next lecture: if, for some number of points, the number of dichotomies falls below its maximum value 2^N, then the effective number of hypotheses can be replaced by a polynomial, and learning is feasible using the hypothesis set.
According to the statement above, if there are dichotomies that cannot be generated by the hypothesis set, then that hypothesis set is feasible for learning.
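The polynomial replacement can be illustrated numerically. Assuming the bound that the lecture says will be proved next time has the form m_H(N) <= sum over i < k of C(N, i), where k is the smallest number of points on which some dichotomy cannot be generated (a break point), this sketch compares it against 2^N; the choice k = 3 is an arbitrary illustration of mine.

```python
from math import comb

def poly_bound(N, k):
    # Sum of binomial coefficients C(N, 0) + ... + C(N, k - 1):
    # a polynomial in N of degree k - 1, unlike the exponential 2**N.
    return sum(comb(N, i) for i in range(k))

k = 3  # assumed break point, for illustration only
for N in (5, 10, 20):
    print(N, poly_bound(N, k), 2**N)
```

The gap widens quickly: at N = 20 the polynomial bound is in the hundreds while 2^N is over a million, which is why replacing M with this quantity rescues the inequality.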
Notes for the video Machine Learning and Data Mining -- Training vs Testing.