Overfitting (see the figure below) means the model fits the training data well but performs poorly on test data it has never seen.
The cause, as the figure suggests, is too many features, some of which may be redundant.
How do we solve this problem? The first thought might be to reduce the number of features, but that has to be done manually.
Second, look at the problem in a different way. If, as in the overfitting example above, theta3 and theta4 are made very small, even zero, overfitting can be alleviated to some extent. As shown in the figure below, we attach a coefficient of 1000 to these terms, so that minimizing the cost function forces theta3 and theta4 to become very small. The resulting curve is the pink one.
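The trick above can be sketched directly in code. This is a minimal illustration, assuming a linear model with five parameters (theta0 through theta4) and using the 1000 coefficient from the text as the illustrative penalty, not a tuned value:

```python
import numpy as np

def penalized_cost(theta, X, y, penalty=1000.0):
    """Squared-error cost plus a large penalty on theta[3] and theta[4].

    Minimizing this drives theta3 and theta4 toward zero, flattening
    the high-order terms that cause overfitting.  The coefficient 1000
    is the illustrative value from the text, not a tuned hyperparameter.
    """
    m = len(y)
    residual = X @ theta - y
    return (residual @ residual) / (2 * m) + penalty * (theta[3] ** 2 + theta[4] ** 2)
```

Any nonzero value in theta3 or theta4 now adds a large amount to the cost, so the minimizer is pushed toward solutions where those coefficients are nearly zero.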
So our predecessors proved the following formula (the proof is a headache; see Professor Hsuan-Tien Lin's "Machine Learning Foundations" course at National Taiwan University).
A parameter lambda is introduced here. The choice of lambda is crucial: a good lambda avoids overfitting, while a lambda that is too large drives all the theta values toward zero, resulting in underfitting.
After the regularization term is added to J(theta), its partial derivatives change accordingly, which directly affects the iterative update formula for theta in the gradient descent algorithm.
One thing to note here: lambda does not act on theta0, because x0 = 1 is an artificial feature we added ourselves.
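The modified gradient-descent update can be sketched like this. It is a minimal single-step implementation under the usual conventions: theta0 gets the plain update, while every other component is additionally shrunk by alpha * lambda / m:

```python
import numpy as np

def gd_step(theta, X, y, alpha, lam):
    """One regularized gradient-descent step for linear regression.

    theta[0] gets the ordinary update; theta[1:] are additionally
    shrunk by alpha * lam / m, which is the effect of the
    regularization term in the partial derivative.
    """
    m = len(y)
    grad = X.T @ (X @ theta - y) / m          # unregularized gradient
    new_theta = theta - alpha * grad
    new_theta[1:] -= alpha * (lam / m) * theta[1:]   # shrinkage, skipping theta0
    return new_theta
```

Equivalently, each regularized component is first multiplied by (1 - alpha * lambda / m), a number slightly less than 1, and then moved along the error gradient, which is why regularization is often described as "weight shrinkage."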
In logistic regression, regularization works the same way and has similar effects.
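For comparison, here is a sketch of the regularized gradient for logistic regression, assuming the standard cross-entropy cost. The hypothesis changes to the sigmoid, but the regularization term in the gradient is identical in form, and theta0 is again left unpenalized:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_grad(theta, X, y, lam):
    """Gradient of the regularized logistic-regression cost.

    Same shape as the linear case: (lam / m) * theta is added to every
    component of the gradient except the one for theta[0].
    """
    m = len(y)
    grad = X.T @ (sigmoid(X @ theta) - y) / m
    grad[1:] += (lam / m) * theta[1:]   # no penalty on the intercept
    return grad
```

So a single gradient-descent implementation can serve both models; only the hypothesis function differs.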