Learning division with neural arithmetic logic modules
To achieve systematic generalisation, it first makes sense to master simple tasks such as arithmetic. Of the four fundamental arithmetic operations (+,-,$\times$,$\div$), division is considered the most difficult for both humans and computers. In this paper we show that robustly learning division in a systematic manner remains a challenge even at the simplest level of dividing two numbers. We propose two novel approaches for division, which we call the Neural Reciprocal Unit (NRU) and the Neural Multiplicative Reciprocal Unit (NMRU), and present improvements for an existing division module, the Real Neural Power Unit (Real NPU). Experiments in learning division with input redundancy on 225 different training sets find that our proposed modifications to the Real NPU obtain an average success rate of 85.3$\%$, improving over the original by 15.1$\%$. Our NMRU approach can further improve the success rate to 91.6$\%$.
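As background to the modules named above: NALU-family units of this kind typically express multiplication and division as a weighted sum in log-space, so a learned weight of $-1$ on an input corresponds to taking its reciprocal. The following is a minimal illustrative sketch of that general exp-log trick only — the function name is hypothetical, and the paper's actual Real NPU, NRU, and NMRU add input gating and sign handling not shown here.

```python
import math

def power_unit(x, w, eps=1e-8):
    """Compute prod_i |x_i|**w_i via exp of a weighted log-sum.

    This is the core mechanism NALU-style multiplicative units share
    (simplified sketch; not the exact formulation from the paper).
    eps guards against log(0).
    """
    return math.exp(sum(wi * math.log(abs(xi) + eps) for xi, wi in zip(x, w)))

# Division a / b corresponds to learned weights [1, -1]:
result = power_unit([6.0, 2.0], [1.0, -1.0])  # ~ 3.0
```

With redundant inputs, a weight near 0 effectively ignores that input (|x|^0 = 1), which is why learning to select the relevant operands is part of the challenge the paper studies.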
Text
2110.05177v2
- Author's Original
More information
Published date: 11 October 2021
Additional Information:
28 pages, 24 figures. New experiments included (Section 7 and Appendix G)
Keywords:
cs.NE, cs.LG, stat.ML
Identifiers
Local EPrints ID: 469130
URI: http://eprints.soton.ac.uk/id/eprint/469130
PURE UUID: a902a04e-65b4-4b2d-894e-ef00d443ffae
Catalogue record
Date deposited: 07 Sep 2022 17:09
Last modified: 17 Mar 2024 03:58
Contributors
Author:
Bhumika Mistry
Author:
Katayoun Farrahi
Author:
Jonathon Hare