ML Algorithms Addendum: Hopfield Networks

Hopfield networks (named after the scientist John Hopfield) are a family of recurrent neural networks with bipolar thresholded neurons. Even if they have been replaced by more efficient models, they represent an excellent example of associative memory, based on the shaping of an energy surface. In the following picture, there’s…
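As a rough sketch of the idea (the patterns and network size below are made-up examples, not taken from the post), a discrete Hopfield network reduces to a Hebbian weight matrix plus asynchronous sign updates, each of which can only lower the network energy:

import numpy as np

# Store bipolar (+1/-1) patterns with the Hebbian outer-product rule
patterns = np.array([[1, -1, 1, -1],
                     [1, 1, -1, -1]])
n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)  # no self-connections

def recall(x, steps=10):
    # Asynchronous updates: each neuron takes the sign of its local field,
    # descending the energy E = -0.5 * x @ W @ x until a fixed point is reached
    x = x.copy()
    for _ in range(steps):
        for i in np.random.permutation(n):
            x[i] = 1 if W[i] @ x >= 0 else -1
    return x

print(recall(np.array([1, -1, -1, -1])))  # recovers the first stored pattern

Feeding in a corrupted pattern and letting the dynamics settle into the nearest energy minimum is exactly the associative-memory behavior the post describes.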

Quickprop: an almost forgotten neural training algorithm

Standard back-propagation is probably the best neural training algorithm for shallow and deep networks; however, it is based on the chain rule of derivatives, and an update in the first layers requires knowledge back-propagated from the last layer. This non-locality, especially in deep neural networks, reduces the biological plausibility…
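For reference, here is a minimal sketch of Fahlman's Quickprop update on a toy one-dimensional loss (the quadratic objective, learning rate, and growth limit are illustrative choices, not values from the post): each weight is treated independently, the loss is locally approximated by a parabola, and the step jumps toward its estimated minimum using only the current and previous gradients.

import numpy as np

def quickprop(grad_fn, w, lr=0.1, max_growth=1.75, epochs=30):
    # dw(t) = dw(t-1) * g(t) / (g(t-1) - g(t)): the minimum of the parabola
    # fitted through the last two gradient observations
    g_prev = grad_fn(w)
    dw = -lr * g_prev  # plain gradient step to bootstrap the recursion
    for _ in range(epochs):
        w += dw
        g = grad_fn(w)
        denom = g_prev - g
        if abs(denom) > 1e-12:
            step = dw * g / denom
            # bound the growth factor to keep the quadratic jump stable
            step = np.clip(step, -abs(dw) * max_growth, abs(dw) * max_growth)
        else:
            step = -lr * g  # fall back to plain gradient descent
        dw, g_prev = step, g
    return w

# Minimize (w - 3)^2, whose gradient is 2(w - 3): converges near w = 3
print(quickprop(lambda w: 2.0 * (w - 3.0), w=0.0))

The secant-style jump is what lets Quickprop take much larger steps than plain gradient descent whenever the loss is locally well approximated by a parabola.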

Artificial Intelligence is a matter of Language

“The limits of my language mean the limits of my world.” (L. Wittgenstein)   When Jacques Lacan proposed his psychoanalytical theory based on the influence of language on human beings, many listeners were initially astonished. Is language an actual limitation? In popular culture, it isn’t. It cannot be! But,…

An annotated path to start with Machine Learning

“Do not worry about your difficulties in Mathematics. I can assure you mine are still greater.” (A. Einstein)   Machine Learning is becoming more and more widespread and, day after day, new computer scientists and engineers begin their long jump into this wonderful world. Unfortunately, the number of theories, algorithms, applications,…

ML Algorithms Addendum: Hebbian Learning

Hebbian Learning is one of the most famous learning theories, proposed by the Canadian psychologist Donald Hebb in 1949, many years before his results were confirmed through neuroscientific experiments. Artificial Intelligence researchers immediately understood the importance of his theory when applied to artificial neural networks and, even if more efficient algorithms…
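As a flavor of what the rule does in practice (a minimal sketch with synthetic data; the distribution and learning rate are made-up), a single linear neuron trained with Oja's stabilized variant of Hebb's rule aligns its weight vector with the direction of maximum variance of the input:

import numpy as np

rng = np.random.default_rng(0)
# Zero-mean 2D data stretched along the first axis
X = rng.normal(size=(1000, 2)) * np.array([3.0, 0.5])

w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x                    # linear neuron output
    # Oja's rule: the Hebbian term eta*y*x plus a decay that keeps ||w|| bounded
    w += eta * y * (x - y * w)

print(w / np.linalg.norm(w))     # close to [±1, 0], the first principal component

The plain Hebbian update (delta_w = eta * y * x) alone would grow without bound; the decay term is what turns the neuron into a principal-component extractor.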

Hodgkin-Huxley spiking neuron model in Python

The Hodgkin-Huxley model (published in 1952 in The Journal of Physiology [1]) is the most famous spiking neuron model (even if there are simpler alternatives, like the “integrate-and-fire” model, which performs quite well). It’s made up of a system of four ordinary differential equations that can be easily integrated using several…
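For instance (a minimal sketch, not the post's code: the parameters are the standard squid-axon values and the stimulus current is an arbitrary choice), the four equations can be integrated with scipy.integrate.odeint:

import numpy as np
from scipy.integrate import odeint

# Membrane capacitance (uF/cm^2), conductances (mS/cm^2), reversal potentials (mV)
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.387

def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))

def hh(y, t, I_ext):
    V, n, m, h = y
    dV = (I_ext
          - g_Na * m**3 * h * (V - E_Na)  # sodium current
          - g_K * n**4 * (V - E_K)        # potassium current
          - g_L * (V - E_L)) / C_m        # leak current
    dn = alpha_n(V) * (1 - n) - beta_n(V) * n
    dm = alpha_m(V) * (1 - m) - beta_m(V) * m
    dh = alpha_h(V) * (1 - h) - beta_h(V) * h
    return [dV, dn, dm, dh]

t = np.linspace(0.0, 50.0, 5000)       # ms
y0 = [-65.0, 0.317, 0.052, 0.596]      # resting potential and gating variables
sol = odeint(hh, y0, t, args=(10.0,))  # 10 uA/cm^2 step current
print(sol[:, 0].max())                 # peak of the action potentials, around +40 mV

Plotting sol[:, 0] against t shows the characteristic train of action potentials elicited by the step current.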

Twitter Sentiment Analysis with Gensim Word2Vec and Keras Convolutional Networks

Word2Vec (https://code.google.com/archive/p/word2vec/) offers a very interesting alternative to classical NLP based on term-frequency matrices. In particular, as each word is embedded into a high-dimensional vector, it’s possible to consider a sentence as a sequence of points that determine an implicit geometry. For this reason, the idea of considering 1D…
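In outline (a minimal sketch, not the post's pipeline: the sequence length, embedding size, and filter counts are illustrative), once every tweet is turned into a matrix of Word2Vec vectors, a stack of Conv1D layers in Keras can slide along the word axis:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, GlobalMaxPooling1D, Dropout, Dense

max_len, embed_dim = 15, 512  # words per tweet, Word2Vec vector size

# Each sample is a (max_len, embed_dim) matrix: one embedding per token,
# so the 1D convolution looks at n-gram-like windows of word vectors
model = Sequential([
    Conv1D(32, kernel_size=3, activation='relu', input_shape=(max_len, embed_dim)),
    Conv1D(32, kernel_size=3, activation='relu'),
    GlobalMaxPooling1D(),
    Dropout(0.5),
    Dense(1, activation='sigmoid'),  # positive vs. negative sentiment
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Random stand-ins for the embedded tweets and their sentiment labels
X = np.random.normal(size=(100, max_len, embed_dim)).astype('float32')
y = np.random.randint(0, 2, size=(100, 1))
model.fit(X, y, epochs=1, batch_size=32)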

Lossy image autoencoders with convolution and deconvolution networks in Tensorflow

Autoencoders are a very interesting deep learning application because they allow a substantial dimensionality reduction of an entire dataset with a controllable loss level. The Jupyter notebook for this small project is available in the GitHub repository: https://github.com/giuseppebonaccorso/lossy_image_autoencoder. The structure of a generic autoencoder is represented in the following figure:…
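In outline (a minimal sketch using the Keras API on top of TensorFlow; the image size, filter counts, and code dimensions are illustrative, not the notebook's values): convolutions compress the input into a small code, transposed convolutions ("deconvolutions") reconstruct it, and the reconstruction error acts as the controllable loss.

import numpy as np
from tensorflow.keras import layers, Model

inputs = layers.Input(shape=(32, 32, 3))
x = layers.Conv2D(16, 3, strides=2, padding='same', activation='relu')(inputs)  # 16x16
x = layers.Conv2D(32, 3, strides=2, padding='same', activation='relu')(x)       # 8x8
code = layers.Conv2D(8, 3, strides=2, padding='same', activation='relu')(x)     # 4x4x8 bottleneck
x = layers.Conv2DTranspose(32, 3, strides=2, padding='same', activation='relu')(code)
x = layers.Conv2DTranspose(16, 3, strides=2, padding='same', activation='relu')(x)
outputs = layers.Conv2DTranspose(3, 3, strides=2, padding='same', activation='sigmoid')(x)

autoencoder = Model(inputs, outputs)
autoencoder.compile(optimizer='adam', loss='mse')  # pixel-wise reconstruction error

# Random stand-in images in [0, 1], trained to reproduce themselves
X = np.random.uniform(size=(64, 32, 32, 3)).astype('float32')
autoencoder.fit(X, X, epochs=1, batch_size=16)

The ratio between the input (32*32*3 values) and the bottleneck (4*4*8 values) gives the compression factor, and the MSE on held-out images measures how lossy it is.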

Keras-based Neural Artistic Style Transfer

I’ve just moved my Keras-based Neural Artistic Style Transfer GIST to a dedicated repository: https://github.com/giuseppebonaccorso/Neural_Artistic_Style_Transfer. Please always refer to it, because the GIST is no longer maintained. See also: Neural artistic style transfer experiments with Keras – Giuseppe Bonaccorso. Artistic style transfer using neural networks is a technique proposed by…