SoftmaxLayer
represents a softmax net layer.
Details and Options
- SoftmaxLayer[][input] explicitly computes the output for input.
- SoftmaxLayer[][{input1,input2,…}] explicitly computes outputs for each of the input_i.
- SoftmaxLayer is typically used inside NetChain, NetGraph, etc. to normalize the output of other layers in order to use them as class probabilities for classification tasks.
- SoftmaxLayer exposes the following ports for use in NetGraph etc.:
  - "Input": a numerical tensor of dimensions d1×d2×…×dn
  - "Output": a numerical tensor of dimensions d1×d2×…×dn
- When it cannot be inferred from other layers in a larger net, the option "Input"->n can be used to fix the input dimensions of SoftmaxLayer.
- SoftmaxLayer effectively normalizes the exponential of the input tensor, producing vectors that sum to 1. The innermost dimension is used as the normalization dimension. Explicitly, when SoftmaxLayer is given a rank-k input tensor x, it produces the tensor whose element at position {i1,…,ik} is Exp[x[[i1,…,ik]]] / Total[Exp[x[[i1,…,ik-1]]]].
- Equivalently, SoftmaxLayer computes Normalize[Exp[v],Total] when applied to a vector v, and is mapped onto level k-1 when applied to a tensor of rank k.
- SoftmaxLayer["Input"->shape] allows the shape of the input to be specified. Possible forms for shape are:
  - n: a vector of size n
  - {d1,d2,…}: a tensor of dimensions d1×d2×…
  - {"Varying",d1,d2,…}: a varying number of tensors of dimensions d1×d2×…
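The normalization described above can be sketched numerically. This is an illustrative NumPy stand-in for what SoftmaxLayer computes, not Wolfram Language code; the function name `softmax` is an assumption for the sketch:

```python
import numpy as np

def softmax(x):
    """Normalize Exp[x] along the innermost (last) dimension.

    Subtracting the per-vector maximum is a standard numerical-stability
    trick; it does not change the result, since softmax is invariant
    under adding a constant to every element of a vector.
    """
    x = np.asarray(x, dtype=float)
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# A rank-2 input: each innermost vector (row) is normalized independently.
y = softmax([[1.0, 2.0, 3.0],
             [0.0, 0.0, 0.0]])
print(y.sum(axis=-1))  # each row sums to 1
```

Note how the normalization maps over the innermost vectors of a higher-rank tensor, matching the "mapped onto level k-1" description above.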
Examples
Basic Examples (2)
Create a SoftmaxLayer:
Create a SoftmaxLayer that takes a vector of length 5 as input:
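For intuition about the second example, here is a self-contained numeric sketch (again NumPy, an illustrative assumption rather than the Wolfram API) of applying the same normalization to a vector of length 5:

```python
import numpy as np

# Illustrative stand-in for applying SoftmaxLayer["Input" -> 5] to a vector;
# not Wolfram Language code.
def softmax(v):
    e = np.exp(np.asarray(v, dtype=float))
    return e / e.sum()

v = [1.0, 2.0, 3.0, 4.0, 5.0]   # a vector of length 5
p = softmax(v)
print(p.sum())      # sums to 1 (up to floating-point rounding)
print(p.argmax())   # the largest input receives the largest probability
```

Because the outputs are positive and sum to 1, they can be read directly as class probabilities, which is why SoftmaxLayer typically sits at the end of a classification net.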
Scope (4)
Properties & Relations (3)
See Also
CrossEntropyLossLayer ▪ LinearLayer ▪ SummationLayer ▪ TotalLayer ▪ NetChain ▪ NetGraph ▪ NetTrain ▪ NetDecoder
Related Guides
Introduced in 2016 (11.0)