Paul Werbos
Paul J. Werbos (born 1947) is a scientist best known for his 1974 Harvard University Ph.D. thesis, which first described the process of training artificial neural networks through backpropagation of errors.[1] The thesis, along with some supplementary material, is reprinted in his book The Roots of Backpropagation (ISBN 0-471-59897-6). He was also a pioneer of recurrent neural networks.[2]
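Backpropagation trains a network by applying the chain rule backward through its layers to obtain the gradient of the error with respect to every weight. The following minimal NumPy sketch is purely illustrative (it is not taken from Werbos's thesis); the two-layer network, XOR task, and learning rate are assumptions chosen only to show the idea.

```python
# Illustrative sketch of backpropagation on a tiny two-layer network (XOR task).
# Not Werbos's original formulation; sizes and learning rate are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(10000):
    # forward pass
    h = sigmoid(X @ W1)          # hidden activations
    out = sigmoid(h @ W2)        # network output

    # backward pass: chain rule applied layer by layer
    err = out - y                          # derivative of squared error w.r.t. output
    d_out = err * out * (1 - out)          # through the output sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)     # error propagated back to the hidden layer

    # gradient-descent weight updates
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h

print(np.round(out, 3))  # outputs approach [0, 1, 1, 0]
```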
Werbos was one of the original three two-year Presidents of the International Neural Network Society (INNS). He has also won the IEEE Neural Network Pioneer Award for the discovery of backpropagation and other basic neural network learning frameworks such as Adaptive Dynamic Programming.
He currently works for the National Science Foundation.
References
- Paul J. Werbos. Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. PhD thesis, Harvard University, 1974.
- Paul J. Werbos. "Backpropagation through time: what it does and how to do it." Proceedings of the IEEE, vol. 78, no. 10, pp. 1550–1560, October 1990. doi:10.1109/5.58337.