Keras-based Deepdream experiment with VGG19


I've just published a repository (https://github.com/giuseppebonaccorso/keras_deepdream) with a Jupyter notebook containing a Deepdream (https://github.com/google/deepdream) experiment built with Keras and a pre-trained VGG19 convolutional network. The experiment (still a work in progress) is based on suggestions provided by the Deepdream team in this blog post (http://googleresearch.blogspot.ch/2015/06/inceptionism-going-deeper-into-neural.html), but it works in a slightly different way: I use a Gaussian pyramid and average the rescaled result of each level with the next one. A total variation loss could be employed as well, but after some experiments I preferred to remove it because of its blurring effect.
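For readers who want to experiment, here is a minimal sketch of that multi-scale loop, written against TensorFlow 2 / Keras rather than taken from the notebook itself. The layer names, octave count, blending weights and step size are illustrative assumptions, not the settings used in the repository:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications import vgg19

# Feature extractor over a few intermediate VGG19 layers (assumed choice)
base = vgg19.VGG19(weights='imagenet', include_top=False)
layer_names = ['block4_conv2', 'block5_conv1']
extractor = tf.keras.Model(base.input,
                           [base.get_layer(n).output for n in layer_names])

def dream_loss(image):
    # Gradient ascent maximizes the mean activation of the chosen layers
    return tf.add_n([tf.reduce_mean(a) for a in extractor(image)])

def gradient_ascent_step(image, step_size):
    with tf.GradientTape() as tape:
        tape.watch(image)
        loss = dream_loss(image)
    grads = tape.gradient(loss, image)
    grads /= tf.maximum(tf.reduce_mean(tf.abs(grads)), 1e-8)  # normalize
    return image + step_size * grads

def deepdream(image, octaves=4, octave_scale=1.4, iterations=20, step=0.01):
    # Gaussian-pyramid-style list of target shapes, coarsest level first
    base_shape = tf.cast(tf.shape(image)[1:3], tf.float32)
    shapes = [tf.cast(base_shape / octave_scale ** i, tf.int32)
              for i in reversed(range(octaves))]
    dreamed = tf.image.resize(image, shapes[0])
    for shape in shapes:
        # Average the upscaled result of the previous level with the
        # source image at the current resolution, then keep dreaming
        dreamed = 0.5 * tf.image.resize(dreamed, shape) \
                + 0.5 * tf.image.resize(image, shape)
        for _ in range(iterations):
            dreamed = gradient_ascent_step(dreamed, step)
    return dreamed

# Hypothetical usage: 'input.jpg' is a placeholder file name
img = np.asarray(tf.keras.utils.load_img('input.jpg'), dtype=np.float32)
result = deepdream(tf.constant(vgg19.preprocess_input(img)[None, ...]))
```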

Here are some examples obtained with different layer selections and iteration counts:

It's possible to create amazing videos by progressively zooming into the same image. This is an example created with 1500 frames:

Deepdream animation with Keras and VGG19
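The zoom loop behind such animations can be sketched as follows (again an assumption, not the notebook's exact code): dream, crop the centre of the result, rescale it back to the original size, and repeat, collecting one frame per iteration. It reuses the deepdream function from the snippet above; the per-frame zoom factor is an assumed value.

```python
import tensorflow as tf

def zoom_animation(image, n_frames=1500, zoom=1.01, **dream_kwargs):
    # 'deepdream' comes from the previous sketch; 'zoom' is assumed
    h, w = image.shape[1], image.shape[2]
    frames = []
    for _ in range(n_frames):
        image = deepdream(image, **dream_kwargs)
        # Crop the centre and rescale back: a small zoom into the dream
        image = tf.image.central_crop(image, 1.0 / zoom)
        image = tf.image.resize(image, (h, w))
        frames.append(image[0])
    return frames
```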


See also:

Neural artistic style transfer experiments with Keras – Giuseppe Bonaccorso

Artistic style transfer using neural networks is a technique proposed by Gatys, Ecker and Bethge in the paper arXiv:1508.06576 [cs.CV], which exploits a trained convolutional network to reconstruct the elements of a picture while adopting the artistic style of a particular painting.