Recurrent neural networks
The basic idea behind recurrent neural networks begins with viewing the data as vectors. In the figure Fixed sized inputs of neural networks, which represents a traditional neural network, each node in the network accepts a scalar value and generates another scalar value. An equivalent way to view this architecture is that each layer in the network accepts a vector as its input and generates another vector as its output, so an entire layer can be rolled up into a single node. The figures Neural network horizontally rolled up and Neural network vertically rolled up illustrate this representation (a short code sketch after the figures makes the vector-to-vector view concrete):

Neural network horizontally rolled up

Neural network vertically rolled up
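To make this rolled-up, vector-to-vector view concrete, here is a minimal NumPy sketch of a single layer acting on a vector. The tanh nonlinearity, the shapes, and all variable names are illustrative assumptions, not details taken from the figures:

```python
import numpy as np

def dense_layer(x, W, b):
    # One layer viewed as a vector-to-vector map: y = tanh(W x + b)
    return np.tanh(W @ x + b)

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))   # weights of a layer with 4 inputs and 3 units (hypothetical sizes)
b = np.zeros(3)               # bias vector
x = rng.normal(size=4)        # one input vector
y = dense_layer(x, W, b)      # one output vector of shape (3,)
print(y.shape)
```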
The figure Neural network vertically rolled up shows a simple RNN representation, a one-to-one RNN: one input is mapped to one output through one hidden layer.
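As a rough sketch of this one-to-one case, the following NumPy code maps a single input vector to a single output vector through one hidden layer that also feeds back into itself. The weight names (W_xh, W_hh, W_hy), the tanh nonlinearity, and the sizes are conventional choices assumed here, not specified by the figures:

```python
import numpy as np

def rnn_step(x, h_prev, W_xh, W_hh, W_hy, b_h, b_y):
    # One-to-one RNN step: one input vector -> one output vector,
    # with the hidden layer receiving both the input and its own previous state.
    h = np.tanh(W_xh @ x + W_hh @ h_prev + b_h)  # hidden state update
    y = W_hy @ h + b_y                           # output computed from the hidden state
    return y, h

rng = np.random.default_rng(1)
n_in, n_hid, n_out = 4, 5, 3                 # illustrative sizes
W_xh = rng.normal(size=(n_hid, n_in))        # input-to-hidden weights
W_hh = rng.normal(size=(n_hid, n_hid))       # recurrent hidden-to-hidden weights
W_hy = rng.normal(size=(n_out, n_hid))       # hidden-to-output weights
b_h, b_y = np.zeros(n_hid), np.zeros(n_out)

h = np.zeros(n_hid)                          # initial hidden state
x = rng.normal(size=n_in)                    # one input vector
y, h = rnn_step(x, h, W_xh, W_hh, W_hy, b_h, b_y)
print(y.shape)                               # (3,)
```

Calling rnn_step repeatedly, feeding each returned h back in as h_prev, is what unrolls the network over a sequence; the one-to-one case above uses just a single step.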
RNN architectures
RNNs come in many different architectures. In this section, we will go over some basic RNN architectures and discuss how they fit different text mining applications.