BatchNormalizationLayer
represents a trainable net layer that normalizes its input data by learning the data mean and variance.
Details and Options
- The following optional parameters can be included (a sketch of setting them explicitly appears after this list):
  "Beta"            Automatic   learnable bias parameters
  "Epsilon"         0.001       stability parameter
  "Gamma"           Automatic   learnable scaling parameters
  "Momentum"        0.9         momentum used during training
  "MovingMean"      Automatic   moving estimate of the mean
  "MovingVariance"  Automatic   moving estimate of the variance
- With Automatic settings, gamma, beta, the moving variance, and the moving mean are added automatically when NetInitialize or NetTrain is used.
- If gamma, beta, the moving variance, and the moving mean have been added, BatchNormalizationLayer[…][input] explicitly computes the output from applying the layer.
- BatchNormalizationLayer[…][{input1,input2,…}] explicitly computes outputs for each of the inputi.
- NetExtract can be used to extract gamma, beta, the moving variance, and the moving mean from a BatchNormalizationLayer object. A sketch of explicit evaluation and extraction appears after this list.
- BatchNormalizationLayer is typically used inside NetChain, NetGraph, etc. to regularize and speed up network training, as in the NetChain sketch after this list.
- BatchNormalizationLayer exposes the following ports for use in NetGraph etc.:
  "Input"   a rank-1 or rank-3 tensor
  "Output"  a rank-1 or rank-3 tensor
- When it cannot be inferred from other layers in a larger net, the option "Input"->{n1,n2,…} can be used to fix the input dimensions of BatchNormalizationLayer.
- BatchNormalizationLayer updates the values of "MovingVariance" and "MovingMean" during training with NetTrain, as in the training sketch after this list.
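A minimal sketch of setting the optional parameters explicitly; the particular values are illustrative, not recommendations:

BatchNormalizationLayer["Epsilon" -> 0.01, "Momentum" -> 0.99]  (* larger stability parameter, slower-moving statistics *)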
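A sketch of explicit evaluation and extraction, assuming length-3 vector inputs with arbitrary values:

(* NetInitialize adds gamma, beta and the moving estimates *)
layer = NetInitialize@BatchNormalizationLayer["Input" -> 3];

(* apply the layer to a single input *)
layer[{1., 2., 3.}]

(* apply the layer to a list of inputs, computing an output for each *)
layer[{{1., 2., 3.}, {4., 5., 6.}}]

(* extract the learnable and moving arrays *)
NetExtract[layer, "Gamma"]
NetExtract[layer, "MovingMean"]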
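A sketch of typical use inside NetChain; the surrounding layers and the rank-3 input dimensions are assumptions for illustration, fixed explicitly with "Input" because nothing else in this net determines them:

NetChain[
 {ConvolutionLayer[16, 3], BatchNormalizationLayer[], Ramp},
 "Input" -> {3, 32, 32}  (* rank-3 input: 3 channels, 32x32 *)
]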
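A sketch of the moving estimates being updated during training; the data and training settings are toy placeholders:

net = NetInitialize@NetChain[{BatchNormalizationLayer[]}, "Input" -> 2];
NetExtract[net, {1, "MovingMean"}]  (* initial moving mean *)

trained = NetTrain[net,
  {{1., 2.} -> {0., 1.}, {3., 5.} -> {1., 0.}},
  MaxTrainingRounds -> 5];

(* the moving mean will typically have shifted toward the mean of the training inputs *)
NetExtract[trained, {1, "MovingMean"}]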
Examples
Basic Examples (2)
Create a BatchNormalizationLayer:
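A sketch of the input this example describes; the resulting layer remains uninitialized until NetInitialize or NetTrain supplies its arrays:

BatchNormalizationLayer[]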
Create an initialized BatchNormalizationLayer that takes a vector and returns a vector:
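A sketch assuming length-3 vectors; any vector length works here:

NetInitialize@BatchNormalizationLayer["Input" -> 3]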
Scope (2)
Options (3)
Applications (1)
Possible Issues (3)
See Also
DropoutLayer · NetEvaluationMode · ConvolutionLayer · PoolingLayer · LocalResponseNormalizationLayer · NetChain · NetGraph · NetInitialize · NetTrain · NetExtract
Related Guides
Introduced in 2016 (11.0)