Dropout optimization
During the learning phase, the connections to the next layer can be limited to a subset of the neurons, reducing the number of weights to be updated; this learning optimization technique is called dropout. Dropout is therefore a technique used to decrease overfitting in networks with many layers and/or neurons. In general, dropout layers are placed after the layers that contain a large number of neurons, and hence of trainable weights.
This technique sets to 0, and thereby excludes, the activation of a certain percentage of the neurons of the preceding layer. The probability that a neuron's activation is set to 0 is given by the dropout rate parameter of the layer, a number between 0 and 1: in practice, the activation of a neuron is discarded, that is, set to 0, with probability equal to the dropout rate, and kept otherwise.
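The mechanism can be illustrated with a minimal NumPy sketch (the function name and the rescaling of the surviving activations by 1/(1 - rate), the "inverted dropout" used by most frameworks, are assumptions added here for completeness, not details stated above):

```python
import numpy as np

def dropout(activations, rate, training=True):
    """Zero each activation with probability `rate` during training.

    Surviving activations are scaled by 1 / (1 - rate) so that the
    expected value of the layer output stays the same at inference time.
    """
    if not training or rate == 0.0:
        return activations
    keep_prob = 1.0 - rate
    # Binary mask: 1 keeps the neuron's activation, 0 discards it.
    mask = np.random.binomial(1, keep_prob, size=activations.shape)
    return activations * mask / keep_prob

# Example: with rate=0.5, roughly half of the activations are set to 0.
a = np.ones((1, 10))
print(dropout(a, rate=0.5))
```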
The neurons excluded by this operation therefore contribute neither to the forward propagation nor to the subsequent backward propagation of the error.
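As a sketch of the typical placement described above, the following Keras snippet inserts Dropout layers right after the dense layers that hold most of the trainable weights (layer sizes and dropout rates are illustrative assumptions):

```python
import tensorflow as tf

# Dropout layers follow the large fully connected layers;
# the argument is the dropout rate (probability of zeroing an activation).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(512, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(10, activation="softmax"),
])
```

Note that the dropout mask is applied only during training; at prediction time the layers pass their inputs through unchanged.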