Deep learning: Assuming a deep neural network is properly regularized, can adding more layers actually make the performance degrade?
I found this really puzzling. A deeper NN is supposed to be at least as powerful as a shallower NN. I have already used dropout to prevent overfitting. How can the performance degrade?

Answer by Yoshua Bengio (my lab has been one of the three that started the deep learning approach, bac...):

If you don't change the size of the layers and just add more layers, capacity should increase, so you could be overfitting. However, you should check whether training error increases or decreases. If it increases (which is also very plausible), it means that adding the layer made the optimization harder, with the optimization methods and initialization that you are using. That could also explain your problem. However, if training error decreases and test error increases, you are overfitting.
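Below is a minimal sketch (not from the original answer) of the diagnostic Bengio describes: train a shallower and a deeper network and compare training error against test error. The synthetic dataset, network widths, and hyperparameters are illustrative assumptions, not anything specified in the question.

```python
# Sketch of the train-vs-test-error diagnostic, assuming PyTorch and a toy dataset.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Illustrative synthetic binary-classification data (assumption).
X = torch.randn(2000, 20)
y = (X[:, :5].sum(dim=1) > 0).long()
X_train, y_train = X[:1500], y[:1500]
X_test, y_test = X[1500:], y[1500:]

def make_mlp(n_hidden_layers, width=64, p_drop=0.5):
    """Build an MLP of the requested depth, with dropout as in the question."""
    layers, in_dim = [], 20
    for _ in range(n_hidden_layers):
        layers += [nn.Linear(in_dim, width), nn.ReLU(), nn.Dropout(p_drop)]
        in_dim = width
    layers.append(nn.Linear(in_dim, 2))
    return nn.Sequential(*layers)

def error_rate(model, X, y):
    model.eval()
    with torch.no_grad():
        return (model(X).argmax(dim=1) != y).float().mean().item()

def train_and_evaluate(n_hidden_layers, epochs=200, lr=1e-3):
    model = make_mlp(n_hidden_layers)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        model.train()
        opt.zero_grad()
        loss = loss_fn(model(X_train), y_train)
        loss.backward()
        opt.step()
    return error_rate(model, X_train, y_train), error_rate(model, X_test, y_test)

# Compare a shallower and a deeper network.
for depth in (2, 8):
    train_err, test_err = train_and_evaluate(depth)
    print(f"{depth} hidden layers: train error {train_err:.3f}, test error {test_err:.3f}")

# Interpretation, per the answer above:
# - If the deeper model's TRAINING error is higher, the extra layers made
#   optimization harder for this optimizer/initialization.
# - If training error goes down but TEST error goes up, the deeper model
#   is overfitting despite the regularization.
```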