DropoutLayer
represents a net layer that sets its input elements to zero with probability 0.5 during training, multiplying the remainder by 2.
DropoutLayer[p]
sets its input elements to zero with probability p during training.
Details and Options
- DropoutLayer[…][input] explicitly computes the output from applying the layer.
- DropoutLayer[…][{input1,input2,…}] explicitly computes outputs for each of the inputi.
- DropoutLayer is typically used inside NetChain, NetGraph, etc.
- The following optional parameters can be included:
    Method    "Dropout"    dropout method to use
- Possible explicit settings for the Method option include (see the sketch after this list):
    "Dropout"         sets the input elements to zero with probability p during training, multiplying the remainder by 1/(1-p)
    "AlphaDropout"    keeps the mean and variance of the input constant; designed to be used together with the ElementwiseLayer["SELU"] activation
- DropoutLayer exposes the following ports for use in NetGraph etc.:
    "Input"     a numerical tensor or sequence of tensors of arbitrary rank
    "Output"    a numerical tensor or sequence of tensors of arbitrary rank
- DropoutLayer normally infers the dimensions of its input from its context in NetChain etc. To specify the dimensions explicitly as {n1,n2,…}, use DropoutLayer["Input"->{n1,n2,…}].
- DropoutLayer only randomly sets input elements to zero during training. During evaluation, DropoutLayer leaves the input unchanged, unless NetEvaluationMode->"Train" is specified when applying the layer.
- DropoutLayer is commonly used as a form of neural network regularization.
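As an illustrative sketch (not from the original page), the "AlphaDropout" method might be paired with a "SELU" activation inside a NetChain like this; the layer sizes and the dropout probability 0.1 are arbitrary assumptions, and the option is written Method->"AlphaDropout" following the options table above:

    net = NetChain[{
        LinearLayer[10],
        ElementwiseLayer["SELU"],
        DropoutLayer[0.1, Method -> "AlphaDropout"]
      }, "Input" -> 5]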
Examples
Basic Examples (2)
Create a DropoutLayer:
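(The original code cell is not included; a minimal reconstruction:)

    layer = DropoutLayer[]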
Create a DropoutLayer and apply it to an input, which remains unchanged:
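(A minimal reconstruction; the input values are assumed. Outside of training, the layer acts as the identity:)

    layer = DropoutLayer[];
    layer[{1., 2., 3., 4.}]

    {1., 2., 3., 4.}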
Use NetEvaluationMode to force training behavior of DropoutLayer:
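(A minimal reconstruction; the output is random, so the result shown is only one possibility. With the default p=0.5, surviving elements are scaled by 1/(1-0.5) = 2:)

    layer[{1., 2., 3., 4.}, NetEvaluationMode -> "Train"]

    {2., 0., 6., 0.}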
Scope (2)
Options (1)
Properties & Relations (1)
Possible Issues (1)
See Also
BatchNormalizationLayer NetEvaluationMode NetChain NetGraph NetTrain ElementwiseLayer
Introduced in 2016 (11.0) | Updated in 2017 (11.2)