Add Dropout Layer
Synopsis
Adds a dropout layer to your neural net structure.
Description
This operator has to be placed inside the subprocess of the Deep Learning, Deep Learning (Tensor), or Autoencoder operator. It adds a dropout layer to the neural net structure. Dropout is a regularization technique meant to reduce the chance of overfitting by randomly dropping neurons during training. The fraction of neurons to drop is set with the dropout rate parameter.
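The operator handles dropout internally; the following Python-style sketch is only meant to illustrate the idea behind a dropout layer and is not part of the extension. It assumes the common "inverted dropout" formulation, in which each neuron is zeroed with a probability equal to the dropout rate during training and the layer passes activations through unchanged at prediction time.

```python
import numpy as np

def dropout_forward(activations, dropout_rate, training=True):
    """Illustrative inverted dropout: each unit is zeroed with
    probability `dropout_rate` during training; surviving units are
    scaled by 1 / (1 - dropout_rate) so expected activations match
    inference time, where the layer is a no-op."""
    if not training or dropout_rate == 0.0:
        return activations
    keep_prob = 1.0 - dropout_rate
    mask = np.random.binomial(1, keep_prob, size=activations.shape)
    return activations * mask / keep_prob

# Example: a batch of 4 samples with 5 neurons each, 30% dropout.
batch = np.ones((4, 5))
print(dropout_forward(batch, dropout_rate=0.3))                   # some entries zeroed
print(dropout_forward(batch, dropout_rate=0.3, training=False))   # unchanged
```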
Input
layerArchitecture
The network configuration set up by preceding operators. Connect this port to the layerArchitecture output port of another layer operator, or to the layer port of the "Deep Learning" operator if this layer is the first one.
Output
layerArchitecture
The network with the configuration for this Dropout layer added. Connect this port to the layerArchitecture input port of the next layer operator, or to the layer port on the right side of the "Deep Learning" operator.
Parameters
Dropout rate
Define a rate between 0 and 1 that determines the probability of randomly removing each neuron of this layer during training. Dropout is only applied during training and helps to reduce overfitting (see the illustration after the parameter descriptions).
Layer name
Provide a name for the layer for ease of identification when inspecting the model or reusing it.
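To build intuition for the dropout rate parameter, the standalone snippet below (an illustration only, not part of the operator) simulates how the chosen rate translates into the average fraction of neurons disabled per training pass.

```python
import numpy as np

# With rate r, each neuron is dropped with probability r per training pass,
# so on average a fraction r of the layer is inactive.
rng = np.random.default_rng(seed=0)
for rate in (0.1, 0.3, 0.5):
    mask = rng.random(10_000) >= rate          # True = neuron kept
    print(f"rate={rate}: ~{1 - mask.mean():.2f} of neurons dropped")
```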