When training neural networks, dropout is a technique used to prevent the network from overfitting. We won't go into the details of the technique here, but there is an interesting evolutionary interpretation of it. Higher organisms in nature have evolved sexual reproduction, which is often explained by the fact that it allows beneficial mutated genes to spread through a population. The dropout analogy suggests, however, that sexual reproduction is not just about making genes easier to spread: the strategy of mixing genes also increases the robustness of each individual gene. Why? Imagine an organism that reproduced asexually. Its genes could adapt to the environment through intricate co-adaptations within that single genome. But when offspring must be produced by mixing the genes of two parents, the co-adaptations within any single genome are broken up. Each gene must not only carry out its physiological function on its own, but also work well with random sets of genes from other individuals, and both pressures make each gene more robust. Then, if the environment changes suddenly, for example if radiation destroys a third of your genes, you may still survive long enough to reproduce the next generation.
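For concreteness, here is a minimal sketch of how dropout is commonly implemented (the "inverted dropout" variant, where activations are rescaled at train time so that test-time inference needs no change). The function name `dropout_forward`, the NumPy implementation, and the choice of p = 0.5 are illustrative assumptions, not code from the papers below.

```python
import numpy as np

def dropout_forward(x, p=0.5, train=True, rng=None):
    """Inverted dropout.

    At train time, zero each unit independently with probability p and
    scale the survivors by 1/(1-p), so the expected activation matches
    the test-time value. At test time, pass activations through unchanged.
    """
    if not train:
        return x
    if rng is None:
        rng = np.random.default_rng()
    # mask entries are 0 (dropped) or 1/(1-p) (kept and rescaled)
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask
```

Just as a gene cannot rely on a fixed set of partner genes, a unit whose inputs are randomly zeroed out cannot rely on any particular co-adapted combination of other units, which is the analogy the paragraph above is drawing.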
References: "Dropout: A Simple Way to Prevent Neural Networks from Overfitting" (Srivastava et al.); "Improving Neural Networks by Preventing Co-adaptation of Feature Detectors" (Hinton et al.)
An interesting evolutionary interpretation of dropout