Seq2Seq experiment with mathematical expressions

After reading the article “How to Learn to Add Numbers with seq2seq Recurrent Neural Networks” by Jason Brownlee (which I suggest reading before going on), I decided to try an experiment with more complex expressions, such as -(10+5) or 4+ -2. The code, together with some extra information and test results, is published in this gist: https://goo.gl/ZmH6Tf.
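To give an idea of the kind of data involved, here is a minimal sketch of a generator for such expressions. The function names, operand range, and padding lengths below are my own illustrative assumptions, not the exact ones used in the gist:

```python
import random

# Hypothetical generator for expressions such as "-(10+5)" or "4+ -2";
# the actual generator used in the gist may differ.
def random_expression(max_operand=20):
    a, b = random.randint(0, max_operand), random.randint(0, max_operand)
    op = random.choice(['+', '-'])
    expr = '{}{}{}'.format(a, op, b)
    r = random.random()
    if r < 0.3:
        # Negate the whole expression, e.g. "-(10+5)"
        expr = '-({})'.format(expr)
    elif r < 0.6:
        # Negate the second operand, e.g. "4+ -2"
        expr = '{}{} -{}'.format(a, op, b)
    return expr

def make_pair(max_len_in=12, max_len_out=6):
    expr = random_expression()
    # eval is acceptable here because we generate the strings ourselves
    result = str(eval(expr))
    # Pad both sides to fixed lengths for the seq2seq model
    return expr.ljust(max_len_in), result.ljust(max_len_out)

if __name__ == '__main__':
    for _ in range(3):
        print(make_pair())
```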

Unfortunately, the results are not extraordinary: there are still many errors. However, I think this depends on the size of the dataset and on the limited ability of Seq2Seq networks to generalize. I’m working on an enhanced version that allows a bit more generalization.

The complete Python script is in the gist linked above (Keras 2 with a Theano or TensorFlow backend is needed; moreover, I used Scikit-Learn for binarization).
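As a rough sketch of the architecture described here, something along these lines can be set up with Keras 2 and Scikit-Learn. The alphabet, layer sizes, and sequence lengths are illustrative assumptions, not the exact values used in the gist:

```python
import numpy as np
from sklearn.preprocessing import LabelBinarizer
from keras.models import Sequential
from keras.layers import LSTM, RepeatVector, TimeDistributed, Dense

# Character vocabulary for expressions and results (illustrative choice)
ALPHABET = list('0123456789+-() ')
MAX_LEN_IN, MAX_LEN_OUT = 12, 6

# Scikit-Learn binarizer for one-hot encoding each character
binarizer = LabelBinarizer()
binarizer.fit(ALPHABET)

def encode(s, length):
    # Pad to a fixed length and one-hot encode character by character
    return binarizer.transform(list(s.ljust(length)))

# Encoder-decoder: the LSTM encoder compresses the input expression into
# a fixed vector, RepeatVector feeds that vector to the decoder at every
# timestep, and a TimeDistributed softmax emits one character per
# output position.
model = Sequential()
model.add(LSTM(128, input_shape=(MAX_LEN_IN, len(ALPHABET))))
model.add(RepeatVector(MAX_LEN_OUT))
model.add(LSTM(128, return_sequences=True))
model.add(TimeDistributed(Dense(len(ALPHABET), activation='softmax')))
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])
model.summary()
```

Training then amounts to stacking encoded (expression, result) pairs into 3D arrays and calling model.fit on them; at prediction time, the argmax over the softmax outputs is decoded back to characters.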
