Published on July 9, 2018

The most popular optimization strategy in machine learning is gradient descent. When gradient descent is applied to neural networks, it's called back-propagation. In this video, I'll use analogies, animations, equations, and code to give you an in-depth understanding of this technique. Back-propagation uses calculus, specifically the chain rule, to compute the gradients we need to update our machine learning models. Once you feel comfortable with back-propagation, everything else becomes easier. Enjoy!
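To make that concrete, here is a minimal NumPy sketch (my own illustration, not the code from the linked repo): a tiny two-layer network learns XOR, with every gradient obtained by applying the chain rule layer by layer, followed by a plain gradient descent update. The network size, learning rate, and step count are arbitrary choices for the toy example.

import numpy as np

np.random.seed(0)

# Toy dataset: XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Parameters of a 2-4-1 network (illustrative sizes)
W1 = np.random.randn(2, 4)
b1 = np.zeros(4)
W2 = np.random.randn(4, 1)
b2 = np.zeros(1)
lr = 1.0  # learning rate

for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)       # hidden activations
    y_hat = sigmoid(h @ W2 + b2)   # predictions

    # Mean squared error loss
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: chain rule, layer by layer
    d_yhat = 2 * (y_hat - y) / y.size      # dL/dy_hat
    d_z2 = d_yhat * y_hat * (1 - y_hat)    # through the output sigmoid
    dW2 = h.T @ d_z2                       # gradient for W2
    db2 = d_z2.sum(axis=0)                 # gradient for b2
    d_h = d_z2 @ W2.T                      # error propagated to hidden layer
    d_z1 = d_h * h * (1 - h)               # through the hidden sigmoid
    dW1 = X.T @ d_z1                       # gradient for W1
    db1 = d_z1.sum(axis=0)                 # gradient for b1

    # Gradient descent update
    W1 -= lr * dW1
    b1 -= lr * db1
    W2 -= lr * dW2
    b2 -= lr * db2

print("final loss:", loss)
print("predictions:", y_hat.round(3).ravel())

Running this drives the loss toward zero and the predictions toward 0, 1, 1, 0 — the same update rule, scaled up and automated by frameworks, is what trains deep networks.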

Code for this video:
https://github.com/llSourcell/backpropagation_explained

Please Subscribe! And like. And comment. That’s what keeps me going.

Want more education? Connect with me here:
Twitter: https://twitter.com/sirajraval
Facebook: https://www.facebook.com/sirajology
Instagram: https://www.instagram.com/sirajraval

This video is part of my Machine Learning Journey course:
https://github.com/llSourcell/Machine_Learning_Journey

Join us in the Wizards Slack channel:
http://wizards.herokuapp.com/

Sign up for the next course at The School of AI:
https://www.theschool.ai

And please support me on Patreon:
https://www.patreon.com/user?u=3191693
