NetGraph
NetGraph[{layer1,layer2,…},{m1->n1,m2->n2,…}]
specifies a neural net defined by a graph in which the output of layer mi is given as input to layer ni.
NetGraph[<|"name1"->layer1,"name2"->layer2,…|>,{"namem1"->"namen1",…}]
specifies a net with explicitly named layers.
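For illustration, the two constructor forms can be written as follows (the particular layers and sizes here are arbitrary):

```wolfram
(* positional form: layers are referenced by index in the connection rules *)
net = NetGraph[{LinearLayer[10], Ramp, SoftmaxLayer[]}, {1 -> 2, 2 -> 3}]

(* named form: the same graph with explicitly named layers *)
net = NetGraph[
  <|"linear" -> LinearLayer[10], "relu" -> Ramp, "soft" -> SoftmaxLayer[]|>,
  {"linear" -> "relu", "relu" -> "soft"}]
```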
Details
- For a net with a single input port, NetGraph[…][data] gives the result of applying the net to data.
- For a net with multiple input ports, NetGraph[…][<|port1->data1,…|>] provides data to each port.
- For a net with a single output port, NetGraph[…][data] gives the output for that port.
- For a net with multiple output ports, NetGraph[…][data,oport] gives the output for the output port named oport. NetGraph[…][data] or NetGraph[…][data,All] gives an association of the outputs for all ports.
- NetGraph[…][data,NetPortGradient[iport]] gives the gradient of the output with respect to the value of the input port iport.
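A minimal evaluation sketch, assuming a single-input, single-output net (the layer sizes are illustrative):

```wolfram
(* initialize a small graph so it can be applied to data *)
net = NetInitialize@
  NetGraph[{LinearLayer[3], SoftmaxLayer[]}, {1 -> 2}, "Input" -> 2]

net[{1.0, 2.0}]                            (* output of the single output port *)
net[{1.0, 2.0}, NetPortGradient["Input"]]  (* gradient w.r.t. the input port *)
```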
- Normal[NetGraph[…]] will return a list or association of the layers used to construct the graph.
- If one or more input or output ports of any layers are left unconnected, these will become ports of the entire NetGraph.
- If multiple output ports of layers are left unconnected and share the same name, they will become separate ports of the entire NetGraph with names "Port1", "Port2", etc.
- Input or output ports for the entire NetGraph can be created by specifying NetPort["input"]->… or …->NetPort["output"] in the list of connections.
- If the nth layer has more than one input port or more than one output port, these can be disambiguated by writing NetPort[{n,"port"}] or NetPort[n,"port"].
- If a layer has a port that accepts multiple inputs, such as CatenateLayer or TotalLayer, multiple connections can be made simultaneously by writing {m1,m2,…}->n, which is equivalent to …,m1->n,…,m2->n,…. The inputs mi are always passed to n in the order m1,m2,….
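A sketch of the multiple-connection form, assuming a single graph input fed to two layers whose outputs are catenated:

```wolfram
(* {1, 2} -> 3 connects both layer outputs to the CatenateLayer at once;
   the output is a length-5 vector (2 + 3) *)
NetGraph[{LinearLayer[2], LinearLayer[3], CatenateLayer[]},
  {NetPort["Input"] -> 1, NetPort["Input"] -> 2, {1, 2} -> 3},
  "Input" -> 4]
```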
- A linear chain of connections within the graph can be specified as layer1->layer2->…->layern, which will cause each layeri to be connected to layeri+1.
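For example, the chain form below is equivalent to writing {1 -> 2, 2 -> 3}:

```wolfram
(* each layer is connected to the next in the chain *)
NetGraph[{LinearLayer[10], Ramp, SoftmaxLayer[]}, {1 -> 2 -> 3}]
```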
- When ambiguous, the tensor shapes of input and output ports of the entire graph can be specified with options of the form "port"->shape. Valid shapes include:
-
"Real"                a single real number
"Integer"             a single integer
n                     a vector of length n
{n1,n2,…}             a tensor of dimensions n1×n2×…
"Varying"             a variable-length vector
{"Varying",n2,n3,…}   a variable-length sequence of tensors of dimensions n2×n3×…
NetEncoder[…]         an encoder (for input ports)
NetDecoder[…]         a decoder (for output ports)
"type"                NetEncoder["type"] or NetDecoder["type"]
{n,coder}             an encoder or decoder mapped over a sequence of length n
- NetGraph supports the following special layer specifications when giving individual layers:
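A sketch of explicit port shapes, assuming a length-16 input vector and a two-class output decoder (the class names are illustrative):

```wolfram
(* the input is declared as a length-16 vector; the output is decoded
   as a class label via a NetDecoder *)
NetGraph[{LinearLayer[2], SoftmaxLayer[]}, {1 -> 2},
  "Input" -> 16,
  "Output" -> NetDecoder[{"Class", {"no", "yes"}}]]
```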
-
Ramp, LogisticSigmoid, …   ElementwiseLayer[f]
Plus, Times, Divide, …     ThreadingLayer[f]
{layer1,layer2,…}          NetChain[{layer1,layer2,…}]
- The StandardForm of NetGraph shows the connectivity of layers in the graph and annotates edges with the dimensions of the tensor that each edge represents. Clicking a layer in the graph shows more information about that layer.
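For example, bare functions can stand in for layers (a minimal sketch; the sizes are arbitrary):

```wolfram
(* Ramp is interpreted as ElementwiseLayer[Ramp];
   Plus is interpreted as ThreadingLayer[Plus] and sums its two inputs *)
NetGraph[{LinearLayer[3], Ramp, Plus},
  {1 -> 2, {1, 2} -> 3}, "Input" -> 3]
```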
- The TraditionalForm of NetGraph shows a more publication-appropriate depiction of the graph.
- Take[NetGraph[…],{start,end}] returns a subgraph of the given NetGraph that contains only the layers that connect start and end. The following forms can be given for start and end:
-
n, "layer"             a specified layer
NetPort[layer,"port"]  the specified input or output port of a layer
NetPort["port"]        an input or output port of the entire graph
All                    all of the inputs or outputs of the graph
{spec1,spec2,…}        the union of the speci
- VertexDelete[NetGraph[…],layer] deletes one or more layers from a NetGraph, returning a new graph. Layers such as ElementwiseLayer[…] that have the same input and output size will be removed such that their output is connected directly to their input.
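A sketch of taking a subgraph, assuming a three-layer chain:

```wolfram
net = NetGraph[{LinearLayer[4], Ramp, LinearLayer[2]}, {1 -> 2 -> 3}]

Take[net, {2, 3}]                 (* subgraph from layer 2 through layer 3 *)
Take[net, {NetPort["Input"], 2}]  (* from the graph input through layer 2 *)
```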
- NetGraph[…][data,opts] specifies that options should be used in applying the net to data. Possible options include:
-
NetEvaluationMode   "Test"   what mode to use in performing evaluation
TargetDevice        "CPU"    the target device on which to perform evaluation
- With the setting NetEvaluationMode->"Training", layers such as DropoutLayer will behave as they do for training rather than ordinary evaluation.
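A sketch of the evaluation-mode option, assuming an initialized net containing a DropoutLayer (the dropout probability is illustrative):

```wolfram
net = NetInitialize@
  NetGraph[{LinearLayer[4], DropoutLayer[0.5]}, {1 -> 2}, "Input" -> 4]

net[{1., 2., 3., 4.}]                                   (* dropout inactive *)
net[{1., 2., 3., 4.}, NetEvaluationMode -> "Training"]  (* dropout active *)
```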
- NetGraph[…][[spec]] extracts the layer specified by spec from the net.
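For example (a minimal sketch; NetExtract gives the same result):

```wolfram
net = NetGraph[{LinearLayer[3], Ramp}, {1 -> 2}]

net[[2]]            (* the second layer, an ElementwiseLayer *)
NetExtract[net, 2]  (* equivalent extraction *)
```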
Examples
Basic Examples (2)
Scope (9)
Applications (1)
Properties & Relations (4)
Possible Issues (1)
See Also
NetModel NetPort NetChain NetInitialize NetTrain NetExtract NetEncoder NetDecoder LinearLayer ElementwiseLayer ClassifierMeasurements
Related Guides
Introduced in 2016 (11.0)