DeepMind released a paper just a few days ago describing a module for neural networks called the Neural Arithmetic Logic Unit (NALU). Although deep neural networks can learn to represent and manipulate numerical information, they don’t generalize well outside of the range of numbers encountered during training. Meaning, train one on the numbers 1–10 and it won’t be able to count to 11. To improve this ability, the researchers created an architecture that represents numerical quantities as linear activations, which are manipulated using primitive arithmetic operators controlled by learned gates. It’s really fascinating stuff; I’ll detail how it works in this video.
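The gated architecture described above can be sketched in a few lines of NumPy. This is a minimal, illustrative forward pass following the NALU equations from the paper (an additive path, a multiplicative path computed in log-space, and a learned gate mixing them); the weights below are random and untrained, so the function names and shapes are just assumptions for demonstration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nalu_forward(x, W_hat, M_hat, G, eps=1e-7):
    """One NALU forward pass (untrained; weights are illustrative).

    x: input vector, shape (in_dim,)
    W_hat, M_hat, G: parameter matrices, shape (out_dim, in_dim)
    """
    # NAC weights: tanh * sigmoid biases entries toward {-1, 0, 1},
    # so the cell tends to add/subtract inputs rather than scale them.
    W = np.tanh(W_hat) * sigmoid(M_hat)
    a = W @ x                                   # additive path
    # Multiplicative path: add in log-space, then exponentiate,
    # which turns sums into products/quotients of the inputs.
    m = np.exp(W @ np.log(np.abs(x) + eps))
    g = sigmoid(G @ x)                          # learned gate
    return g * a + (1.0 - g) * m                # gated mix of the two paths

# Illustrative run with random (not learned) weights:
rng = np.random.default_rng(0)
x = np.array([2.0, 3.0])
W_hat = rng.normal(size=(1, 2))
M_hat = rng.normal(size=(1, 2))
G = rng.normal(size=(1, 2))
y = nalu_forward(x, W_hat, M_hat, G)
```

In training, `W_hat`, `M_hat`, and `G` would be learned by backpropagation; because the arithmetic is linear (or log-linear) in the inputs, the learned operations extrapolate to numbers far outside the training range.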
Code for this video:
Please Subscribe! And like. And comment. That’s what keeps me going.
This video is part of my Machine Learning Journey course:
Join us in the Wizards Slack channel:
Sign up for the next course at The School of AI:
And please support me on Patreon: