Published on August 12, 2018

DeepMind released a paper just a few days ago describing a module for neural networks called the Neural Arithmetic Logic Unit (NALU). Although deep neural networks can learn to represent and manipulate numerical information, they don't generalize well outside the range of numbers seen during training: train a network on the numbers 1-10 and it won't be able to count to 11. To improve this, the researchers designed an architecture that represents numerical quantities as linear activations, manipulates them with primitive arithmetic operators, and controls which operator is applied with learned gates. It's really fascinating stuff; I'll detail how it works in this video.

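In a nutshell, the NALU combines two paths: a neural accumulator (NAC) whose weights are pushed toward {-1, 0, 1} so it learns to add and subtract inputs, and a second copy of the same cell operating in log space so it can multiply and divide, with a learned sigmoid gate blending the two. Here's a minimal sketch of that idea, assuming PyTorch; the parameter names (W_hat, M_hat, G) follow the paper, and this is an illustrative implementation, not DeepMind's official code (see the repo link below for the code used in the video).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NALU(nn.Module):
    def __init__(self, in_dim, out_dim, eps=1e-7):
        super().__init__()
        self.eps = eps
        # NAC weights: W = tanh(W_hat) * sigmoid(M_hat) biases entries toward
        # {-1, 0, 1}, so the unit learns to add/subtract inputs rather than rescale them.
        self.W_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        self.M_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        # Gate weights: decide per output whether to use the additive or multiplicative path.
        self.G = nn.Parameter(torch.empty(out_dim, in_dim))
        for p in (self.W_hat, self.M_hat, self.G):
            nn.init.xavier_uniform_(p)

    def forward(self, x):
        W = torch.tanh(self.W_hat) * torch.sigmoid(self.M_hat)
        a = F.linear(x, W)                         # additive path (NAC)
        m = torch.exp(F.linear(torch.log(torch.abs(x) + self.eps), W))  # multiplicative path, in log space
        g = torch.sigmoid(F.linear(x, self.G))     # learned gate
        return g * a + (1 - g) * m

# Example (hypothetical): a NALU layer mapping 2 inputs to 1 output, e.g. to learn addition.
layer = NALU(2, 1)
y = layer(torch.tensor([[3.0, 4.0]]))
```

Because the additive path uses the raw inputs and the multiplicative path works in log space, the same learned weights keep working on numbers far outside the training range, which is exactly the extrapolation failure described above.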
Code for this video:
https://github.com/llSourcell/Neural_Arithmetic_Logic_Units

Please Subscribe! And like. And comment. That’s what keeps me going.

Want more education? Connect with me here:
Twitter: https://twitter.com/sirajraval
Instagram: https://www.instagram.com/sirajraval
Facebook: https://www.facebook.com/sirajology

This video is a part of my Machine Learning Journey course:
https://github.com/llSourcell/Machine_Learning_Journey

More Learning Resources:
https://arxiv.org/abs/1808.00508
https://deepmind.com/blog/

Join us in the Wizards Slack channel:
http://wizards.herokuapp.com/

Sign up for the next course at The School of AI:
https://www.theschool.ai

And please support me on Patreon:
https://www.patreon.com/user?u=3191693
