Remedies for high-bias and high-variance problems:
1. Solutions to a high-variance problem: get more training examples, try a smaller set of features, increase the regularization parameter lambda.
2. Solutions to a high-bias problem: add more features, add polynomial features (such as x1*x2, x1 squared, and so on), decrease lambda.
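The two cases above can be told apart by comparing the training error with the cross-validation error. A minimal sketch of that diagnosis, where the threshold values are illustrative choices and not from the course:

```python
# Sketch: diagnosing high bias vs. high variance from error rates.
# The thresholds (target_error, gap of 0.02) are illustrative assumptions.

def diagnose(train_error, cv_error, target_error=0.05):
    """Return a rough diagnosis given error rates in [0, 1]."""
    if train_error > target_error and cv_error - train_error < 0.02:
        # Both errors are high and close together: the model underfits.
        return "high bias"
    if train_error <= target_error and cv_error - train_error >= 0.02:
        # Low training error but much worse CV error: the model overfits.
        return "high variance"
    return "acceptable"

print(diagnose(0.15, 0.16))  # high bias: try more features, lower lambda
print(diagnose(0.01, 0.12))  # high variance: try more data, higher lambda
```

High bias calls for the remedies in item 2, high variance for those in item 1.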
The choice of hidden layers affects how well the model fits:
With too few hidden layers, the neural network is simple and has few parameters, so it tends to underfit.
With too many hidden layers, the network is complex and has many parameters, so it tends to overfit, and the computational cost is also large.
In practice, larger neural networks tend to perform better, and if overfitting appears it can be corrected with regularization. Using a large neural network together with regularization to control overfitting usually works better than using a small neural network.
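The regularization mentioned here adds an L2 penalty on the weights to the cost function. A minimal sketch of that penalty for a single weight matrix, where the function and variable names are illustrative:

```python
import numpy as np

# Sketch: L2 regularization as used to control overfitting in a large
# network, shown for one weight matrix. Names here are illustrative.

def regularized_cost(unreg_cost, weights, lam, m):
    """Add the L2 penalty (lambda / (2m)) * sum(w^2) to an unregularized cost."""
    return unreg_cost + (lam / (2 * m)) * np.sum(weights ** 2)

w = np.array([[1.0, -2.0], [0.5, 0.0]])
# sum of squares = 1 + 4 + 0.25 = 5.25; penalty = (1 / 20) * 5.25 = 0.2625
print(regularized_cost(0.30, w, lam=1.0, m=10))  # 0.5625
```

Increasing lambda shrinks the weights and reduces overfitting; setting it too high pushes the model toward underfitting, which is why lambda itself is tuned on the cross-validation set.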
Finally, we need to choose the number of hidden layers. Using a single hidden layer is a reasonable default, but if you want to pick the most appropriate number, you can split the data into training, validation, and test sets, then train a neural network with one hidden layer, then one with two, one with three, and so on. This gives you several candidate models, say with one, two, and three hidden layers respectively. Evaluate each model on the cross-validation set, compute the cross-validation error JCV in each case, and then select the network architecture that performs best.
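The selection step above reduces to picking the architecture with the lowest cross-validation error. A minimal sketch, where the JCV values are hypothetical placeholders standing in for the results of actual training runs:

```python
# Sketch: choose the architecture with the lowest cross-validation error.
# The error values below are hypothetical; in practice each would come
# from training that network and evaluating it on the validation set.

candidates = {
    "1 hidden layer": 0.12,   # hypothetical JCV
    "2 hidden layers": 0.08,  # hypothetical JCV
    "3 hidden layers": 0.10,  # hypothetical JCV
}

best = min(candidates, key=candidates.get)
print(best)  # prints "2 hidden layers" for these placeholder values
```

The test set is held out from this comparison and used only once at the end, to report the generalization error of the selected model.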
Stanford University public course, Machine Learning: Advice for Applying Machine Learning | Deciding What to Try Next (Revisited) (covering remedies for high-bias and high-variance problems and the choice of hidden layers).