Back-propagation is fundamental to deep learning. Geoffrey Hinton (one of its pioneers) recently said we should "throw it all away and start over". So what should we do? I'll describe how back-propagation works and how it's used in deep learning, then cover 7 interesting research directions that could overtake back-propagation in the near term.
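As a quick refresher before the video: back-propagation just applies the chain rule to push the loss gradient back through the network, then nudges each weight downhill. Here's a minimal sketch on a single sigmoid neuron (the names and toy data are illustrative, not from the repo above):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(w, b, x, y, lr=0.5):
    # Forward pass: prediction and squared-error loss.
    p = sigmoid(w * x + b)
    loss = 0.5 * (p - y) ** 2
    # Backward pass (chain rule): dL/dz = (p - y) * sigmoid'(z),
    # where sigmoid'(z) = p * (1 - p).
    dz = (p - y) * p * (1 - p)
    # Gradient-descent update on each parameter.
    w -= lr * dz * x
    b -= lr * dz
    return w, b, loss

w, b = 0.0, 0.0
losses = []
for _ in range(100):
    w, b, loss = train_step(w, b, x=1.0, y=1.0)
    losses.append(loss)

print(losses[0], "->", losses[-1])  # loss shrinks as weights follow the gradient
```

Real networks do the same thing layer by layer, with the backward pass reusing intermediate values from the forward pass.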

Code for this video:
https://github.com/llSourcell/7_Research_Directions_Deep_Learning

Please Subscribe! And like. And comment.

Follow me:
Twitter: https://twitter.com/sirajraval
Facebook: https://www.facebook.com/sirajology

More learning resources:

https://deeplearning4j.org/deepautoencoder
https://deeplearning4j.org/glossary

Hinton says we should scrap back-propagation and invent new methods (discussion on r/MachineLearning)


A Step by Step Backpropagation Example

Generative Adversarial Networks Explained:
http://kvfrans.com/generative-adversial-networks-explained/

Join us in the Wizards Slack channel:
http://wizards.herokuapp.com/

And please support me on Patreon:
https://www.patreon.com/user?u=3191693
Instagram: https://www.instagram.com/sirajraval/
