Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) are among the hottest areas in Deep Learning.

 

1. Recurrent Neural Network (RNN)

Unlike a feed-forward network, an RNN can handle sequence input as well as sequence output: it processes one element at a time while carrying a hidden state from step to step.
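As a minimal sketch of this idea (all sizes and variable names here are illustrative assumptions, not from any particular library), a vanilla RNN cell updates its hidden state from the current input and the previous state, producing one output per input step:

```python
import numpy as np

# Illustrative sizes; chosen only for this example
input_size, hidden_size = 3, 4
rng = np.random.default_rng(0)

# Parameters of a vanilla RNN cell
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input-to-hidden
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden-to-hidden
b_h = np.zeros(hidden_size)

def rnn_step(x, h_prev):
    """One time step: new hidden state from input x and previous state h_prev."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b_h)

# Feed in a sequence of 5 input vectors; the output is also a sequence
h = np.zeros(hidden_size)
outputs = []
for x in rng.standard_normal((5, input_size)):
    h = rnn_step(x, h)
    outputs.append(h)
```

Because the same weights are reused at every step, the network can accept sequences of any length.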

 

2. Long short term memory (LSTM)

LSTM is one of the RNN architectures that provide a solution to the exploding and vanishing gradient problems. It also offers a way to handle the long-term dependencies that arise in long sequences. The structure described here follows the one given by Alex Graves of Google DeepMind (Generating Sequences With Recurrent Neural Networks, June 2014, p. 5).

These are the formulas that describe the structure of the LSTM.

LSTM structure (i: input gate, f: forget gate, o: output gate, c: memory cell)
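For reference, using the gate names above, the LSTM equations as given in the cited Graves paper (with peephole weights from the memory cell into each gate) can be written as:

```latex
\begin{aligned}
i_t &= \sigma\!\left(W_{xi} x_t + W_{hi} h_{t-1} + W_{ci} c_{t-1} + b_i\right) \\
f_t &= \sigma\!\left(W_{xf} x_t + W_{hf} h_{t-1} + W_{cf} c_{t-1} + b_f\right) \\
c_t &= f_t \, c_{t-1} + i_t \tanh\!\left(W_{xc} x_t + W_{hc} h_{t-1} + b_c\right) \\
o_t &= \sigma\!\left(W_{xo} x_t + W_{ho} h_{t-1} + W_{co} c_t + b_o\right) \\
h_t &= o_t \tanh(c_t)
\end{aligned}
```

Here $\sigma$ is the logistic sigmoid, $x_t$ the input, and $h_t$ the hidden state; the forget gate $f_t$ decides how much of the old cell state $c_{t-1}$ to keep, while the input gate $i_t$ decides how much new information to write.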

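These gate equations can be sketched directly in code. The following is a simplified single-step LSTM cell (it omits the peephole terms for brevity; all names and sizes are assumptions made for this example, not the paper's implementation):

```python
import numpy as np

# Illustrative sizes; chosen only for this example
n_in, n_hid = 3, 4
rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One weight matrix per gate, each acting on the concatenated [x, h_prev]
W_i, W_f, W_o, W_c = [rng.standard_normal((n_hid, n_in + n_hid)) * 0.1
                      for _ in range(4)]
b_i, b_f, b_o, b_c = [np.zeros(n_hid) for _ in range(4)]

def lstm_step(x, h_prev, c_prev):
    """One LSTM time step (peephole connections omitted for brevity)."""
    z = np.concatenate([x, h_prev])
    i = sigmoid(W_i @ z + b_i)                    # input gate
    f = sigmoid(W_f @ z + b_f)                    # forget gate
    o = sigmoid(W_o @ z + b_o)                    # output gate
    c = f * c_prev + i * np.tanh(W_c @ z + b_c)   # memory cell update
    h = o * np.tanh(c)                            # new hidden state
    return h, c

# Run a short sequence through the cell
h = np.zeros(n_hid)
c = np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):
    h, c = lstm_step(x, h, c)
```

The key design point is the additive cell update `c = f * c_prev + i * ...`: because the cell state is carried forward by (gated) addition rather than repeated matrix multiplication, gradients can flow over many time steps without vanishing or exploding as quickly as in a vanilla RNN.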