Tag Archives: LSTM

Teaching recurrent Neural Networks about Monet


via Teaching recurrent Neural Networks about Monet.

Recurrent Neural Networks have boomed in popularity over the past few months, thanks in part to articles like Andrej Karpathy's amazing The Unreasonable Effectiveness of Recurrent Neural Networks.

Long story short, Recurrent Neural Networks (RNNs) are a type of neural network that operates over sequences of vectors, with units that carry their state history from one element of the sequence to the next.
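To make that concrete, here is a minimal sketch of a vanilla RNN step in plain NumPy (the weight names are illustrative, not taken from any library): the hidden state h is what carries information from earlier elements of the sequence into the current one.

```python
import numpy as np

def rnn_step(x, h_prev, W_xh, W_hh, b_h):
    """One vanilla RNN step: the new hidden state mixes the
    current input x with the previous hidden state h_prev."""
    return np.tanh(x @ W_xh + h_prev @ W_hh + b_h)

# Toy dimensions and random weights, just to show the mechanics.
input_dim, hidden_dim = 4, 8
rng = np.random.default_rng(0)
W_xh = rng.normal(size=(input_dim, hidden_dim)) * 0.1
W_hh = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1
b_h = np.zeros(hidden_dim)

# Run the cell over a short sequence of vectors, carrying the state along.
h = np.zeros(hidden_dim)
for x in rng.normal(size=(5, input_dim)):   # a sequence of 5 input vectors
    h = rnn_step(x, h, W_xh, W_hh, b_h)
print(h.shape)  # (8,) -- the final hidden state summarizes the whole sequence
```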

Neural Networks are increasingly easy to use, especially in the Python ecosystem, where libraries like Caffe, Keras or Lasagne make assembling a neural network a trivial task.

I was checking the Keras documentation and found an example that generates text in the style of Nietzsche's writings via a Long Short-Term Memory network (LSTM).

I ran the example, and after a couple of hours the model started producing pretty convincing, Nietzsche-looking text.
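In the spirit of that Keras example, the sketch below shows the core of a character-level LSTM text model: one-hot encode sliding windows of characters and train the network to predict the next character. The placeholder corpus, layer size and single epoch are my own simplifications; the real example downloads the Nietzsche text and trains far longer.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense, Activation

# Placeholder corpus; the real example reads the Nietzsche text file.
text = "replace this with the Nietzsche corpus " * 50
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}
maxlen = 40  # characters of context used to predict the next character

# One-hot encode overlapping windows of maxlen characters -> next character.
windows = [(text[i:i + maxlen], text[i + maxlen])
           for i in range(len(text) - maxlen)]
X = np.zeros((len(windows), maxlen, len(chars)), dtype=bool)
y = np.zeros((len(windows), len(chars)), dtype=bool)
for n, (window, next_char) in enumerate(windows):
    for t, c in enumerate(window):
        X[n, t, char_to_idx[c]] = 1
    y[n, char_to_idx[next_char]] = 1

# A small character-level LSTM: read maxlen one-hot characters,
# output a probability distribution over the next character.
model = Sequential()
model.add(LSTM(128, input_shape=(maxlen, len(chars))))
model.add(Dense(len(chars)))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')

model.fit(X, y, batch_size=128, epochs=1)  # train much longer for good samples
```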

RecurrentJS: Deep Recurrent Neural Networks and LSTMs in Javascript.


via karpathy/recurrentjs · GitHub.

RecurrentJS is a Javascript library that implements:

  • Deep Recurrent Neural Networks (RNN)
  • Long Short-Term Memory networks (LSTM)
  • In fact, the library is more general, because it has functionality to construct arbitrary expression graphs over which it can perform automatic differentiation, similar to what you may find in Theano for Python or in Torch. Currently the code uses this very general functionality to implement RNN/LSTM, but one can build arbitrary Neural Networks and do automatic backprop (a minimal sketch of the expression-graph idea follows this list).
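RecurrentJS itself is JavaScript, so the following is only a hypothetical Python sketch of the expression-graph idea, not the library's actual API: each operation records how to push gradients back to its inputs, and backward() walks the graph in reverse to accumulate them.

```python
class Node:
    """A scalar value in an expression graph, with a recorded backward rule."""
    def __init__(self, value, parents=(), backward_rule=lambda grad: ()):
        self.value = value
        self.grad = 0.0
        self._parents = parents
        self._backward_rule = backward_rule  # grad of output -> grads of parents

    def __add__(self, other):
        return Node(self.value + other.value, (self, other),
                    lambda grad: (grad, grad))

    def __mul__(self, other):
        return Node(self.value * other.value, (self, other),
                    lambda grad: (grad * other.value, grad * self.value))

    def backward(self):
        """Backprop from this node by walking the graph in reverse order."""
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node._parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            for parent, g in zip(node._parents, node._backward_rule(node.grad)):
                parent.grad += g

# Build the expression y = a * b + a and backprop through it.
a, b = Node(2.0), Node(3.0)
y = a * b + a
y.backward()
print(y.value, a.grad, b.grad)  # 8.0, dy/da = b + 1 = 4.0, dy/db = a = 2.0
```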