MeanAbsoluteLossLayer
MeanAbsoluteLossLayer[] represents a loss layer that computes the mean absolute loss between its "Input" and "Target" ports.
Details and Options
- MeanAbsoluteLossLayer exposes the following ports for use in NetGraph etc.:
  "Input"	a tensor of arbitrary rank
  "Target"	a tensor of the same rank as "Input"
  "Loss"	a real number
- MeanAbsoluteLossLayer[…][<|"Input"->in,"Target"->target|>] explicitly computes the output from applying the layer (see the sketches after this list).
- MeanAbsoluteLossLayer[…][<|"Input"->{in1,in2,…},"Target"->{target1,target2,…}|>] explicitly computes outputs for each of the ini and targeti.
- MeanAbsoluteLossLayer is typically used inside NetGraph to construct a training network.
- MeanAbsoluteLossLayer[…] can be provided as the third argument to NetTrain when training a specific network (see the sketch after this list).
- MeanAbsoluteLossLayer["port"->shape] allows the shape of the given input "port" to be specified. Possible forms for shape, sketched after this list, include:
  "Real"	a single real number
  n	a vector of length n
  {n1,n2,…}	a tensor of dimensions n1×n2×…
  "Varying"	a variable-length vector
  {"Varying",n2,n3,…}	a variable-length sequence of tensors of dimensions n2×n3×…
Examples
Basic Examples (3)
Create a MeanAbsoluteLossLayer:
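For example:

  MeanAbsoluteLossLayer[]  (* ports left with unspecified dimensions *)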
Create a MeanAbsoluteLossLayer that takes length-3 vectors:
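A minimal sketch, fixing both the "Input" and "Target" ports to length-3 vectors:

  MeanAbsoluteLossLayer["Input" -> 3, "Target" -> 3]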
Create a NetGraph containing a MeanAbsoluteLossLayer:
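A minimal sketch, using a small LinearLayer as the net whose output is compared against the target (the layer choice and sizes are illustrative):

  net = NetGraph[
    {LinearLayer[3, "Input" -> 2], MeanAbsoluteLossLayer[]},
    {1 -> NetPort[2, "Input"]}]

The unconnected "Target" port of the loss layer is exposed as a port of the graph, so the graph's "Loss" output can be minimized with NetTrain.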
Scope (4)
Applications (1)
Properties & Relations (2)
Possible Issues (1)
See Also
MeanSquaredLossLayer ▪ CrossEntropyLossLayer ▪ NetGraph ▪ NetTrain ▪ ManhattanDistance
Related Guides
Introduced in 2016 (11.0)