Adding a convolutional layer
We can add a one-dimensional CNN layer and a max-pooling layer after the embedding layer; the pooled feature maps are then fed to the LSTM.
Here is our embedding layer:
model = Sequential()
model.add(Embedding(top_words, embedding_vector_length, input_length=max_review_length))
We can apply a convolution layer with a small kernel (kernel_size) of size 3, with 32 output filters (filters):

model.add(Conv1D(filters=32, kernel_size=3, padding="same", activation="relu"))
Next, we add a pooling layer; the size of the region to which max pooling is applied (pool_size) is equal to 2:

model.add(MaxPooling1D(pool_size=2))
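As a quick sanity check on the shapes involved, the following pure-Python sketch (not Keras itself) traces how the timestep dimension changes through these two layers, assuming a review length of 500 (a value chosen here for illustration), "same" padding on the convolution, and max pooling over non-overlapping windows of size 2:

```python
def conv1d_same_len(n_steps, kernel_size):
    # With padding="same", Conv1D preserves the timestep dimension
    # regardless of kernel_size.
    return n_steps

def maxpool1d_len(n_steps, pool_size):
    # Max pooling with non-overlapping windows (stride == pool_size)
    # divides the timestep dimension by pool_size.
    return n_steps // pool_size

max_review_length = 500  # assumed value for illustration
after_conv = conv1d_same_len(max_review_length, kernel_size=3)
after_pool = maxpool1d_len(after_conv, pool_size=2)
print(after_conv, after_pool)  # 500 250
```

So the LSTM sees a sequence half as long as the original review, with 32 features per timestep (one per filter), which is exactly the feature consolidation described above.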
The next layer is an LSTM layer, with 100 memory units:
model.add(LSTM(100))
The final layer is a Dense output layer with a single neuron and a sigmoid activation function, to make 0 or 1 predictions for the two classes (good and bad) in this binary classification problem:

model.add(Dense(1, activation='sigmoid'))
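Putting the steps together, here is a sketch of the full model as a self-contained script, using the modern tf.keras argument names (filters, kernel_size) and MaxPooling1D with pool_size=2 to match the pooling region of 2 described above; the vocabulary size, embedding length, and review length are assumed values for illustration:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Input, Embedding, Conv1D,
                                     MaxPooling1D, LSTM, Dense)

# Assumed hyperparameters for illustration.
top_words = 5000                # vocabulary size
embedding_vector_length = 32    # embedding dimension
max_review_length = 500         # padded review length

model = Sequential()
# Explicit Input layer instead of input_length=, which newer Keras
# versions no longer accept on Embedding.
model.add(Input(shape=(max_review_length,)))
model.add(Embedding(top_words, embedding_vector_length))
model.add(Conv1D(filters=32, kernel_size=3, padding="same", activation="relu"))
model.add(MaxPooling1D(pool_size=2))
model.add(LSTM(100))
model.add(Dense(1, activation="sigmoid"))

# Standard setup for binary classification.
model.compile(loss="binary_crossentropy", optimizer="adam",
              metrics=["accuracy"])
model.summary()
```

The model can then be trained as usual with model.fit(X_train, y_train), where X_train contains integer-encoded, padded reviews and y_train the 0/1 labels.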