Dropout (neural networks)
Dropout is a regularization technique for reducing overfitting in neural networks by preventing complex co-adaptations on the training data. It is an efficient way of performing model averaging with neural networks.[1] The term "dropout" refers to randomly dropping out units (both hidden and visible) from the network during training.[2]
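The idea above can be sketched in a few lines of NumPy. This is a minimal illustration of "inverted" dropout, a common variant that rescales the surviving activations at training time so that no change is needed at test time; the function and parameter names are illustrative, not from the cited papers (the original formulation instead scales the weights at test time).

```python
import numpy as np

def dropout(x, drop_prob, training=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, each unit is zeroed independently with
    probability drop_prob, and survivors are scaled by
    1 / (1 - drop_prob) so the expected activation is unchanged.
    At test time the function is the identity.
    """
    if not training or drop_prob == 0.0:
        return x
    rng = rng or np.random.default_rng()
    keep_prob = 1.0 - drop_prob
    mask = rng.random(x.shape) < keep_prob  # Bernoulli keep/drop mask
    return x * mask / keep_prob

# Example: with drop_prob = 0.5, surviving units are doubled.
activations = np.ones(8)
train_out = dropout(activations, 0.5, training=True)
test_out = dropout(activations, 0.5, training=False)
```

Because a fresh random mask is drawn on every forward pass, each training step effectively trains a different "thinned" sub-network, which is what makes dropout an inexpensive form of model averaging.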
References
- ↑ "Improving neural networks by preventing co-adaptation of feature detectors". arXiv:1207.0580. Retrieved July 26, 2015.
- ↑ "Dropout: A Simple Way to Prevent Neural Networks from Overfitting". Jmlr.org. Retrieved July 26, 2015.
This article is issued from Wikipedia (version of Sunday, January 17, 2016). The text is available under the Creative Commons Attribution/Share Alike license, but additional terms may apply for the media files.