Neural Networks
The Wolfram Language has state-of-the-art capabilities for the construction, training and deployment of neural network machine learning systems. Many standard layer types are available and are assembled symbolically into a network, which can then immediately be trained and deployed on available CPUs and GPUs.
Automated Machine Learning
Classify — automatic training and classification using neural networks and other methods
Predict — automatic training and data prediction
FeatureExtraction — automatic feature extraction from image, text, numeric, and other data
ImageIdentify — fully trained image identification for common objects
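The automated functions require no network specification at all; a minimal sketch (tiny invented dataset for illustration):

```wolfram
(* Classify picks features, method, and parameters automatically *)
classifier = Classify[{1.1 -> "low", 1.3 -> "low", 3.2 -> "high", 3.4 -> "high"}];

(* Apply the resulting ClassifierFunction to new data *)
classifier[1.2]
classifier[3.3, "Probabilities"]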
Net Representation
NetGraph — symbolic representation of trained or untrained net graphs to be applied to data
NetChain — symbolic representation of a simple chain of net layers
NetPort — symbolic representation of a named input or output port for a layer
NetExtract — extract properties and weights etc. from nets
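A NetChain and the equivalent NetGraph; a sketch assuming a small two-class classifier:

```wolfram
(* Linear chain of layers applied in order *)
chain = NetChain[{LinearLayer[16], Ramp, LinearLayer[2], SoftmaxLayer[]}, "Input" -> 4];

(* The same topology written as an explicit graph *)
graph = NetGraph[
   {LinearLayer[16], Ramp, LinearLayer[2], SoftmaxLayer[]},
   {1 -> 2, 2 -> 3, 3 -> 4}];

(* Pull out the weight matrix of the first layer *)
NetExtract[NetInitialize[chain], {1, "Weights"}]
```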
Net Operations
NetTrain — train parameters in any net from examples
NetInitialize — randomly initialize parameters for a network
NetReplacePart — replace arrays or ports on existing networks
NetPortGradient — symbolic representation of the gradient of a net with respect to a port
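How these operations fit together; a sketch with a small regression net and synthetic data:

```wolfram
(* NetTrain initializes parameters automatically, then fits them from examples *)
net = NetChain[{LinearLayer[8], Ramp, LinearLayer[1]}, "Input" -> 2];
data = Flatten@Table[{x, y} -> {x + y}, {x, 0., 1., 0.1}, {y, 0., 1., 0.1}];
trained = NetTrain[net, data];

(* NetPortGradient differentiates the net with respect to a port *)
trained[{0.3, 0.4}, NetPortGradient["Input"]]
```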
Prebuilt Material
NetModel — complete pre-trained net models
ResourceData — access to training data, networks, etc.
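Pre-trained models are fetched by name; a sketch using a model and a dataset from the Wolfram resource system:

```wolfram
(* Download a complete pre-trained net by name *)
lenet = NetModel["LeNet Trained on MNIST Data"];

(* Training data can be fetched the same way *)
mnist = ResourceData["MNIST"];
```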
Basic Layers
LinearLayer — trainable layer with dense connections computing w.x+b
ElementwiseLayer — apply a specified function to each element in a tensor
SoftmaxLayer — layer globally normalizing elements to the unit interval
Loss Layers
MeanSquaredLossLayer ▪ MeanAbsoluteLossLayer ▪ CrossEntropyLossLayer ▪ ContrastiveLossLayer
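A loss layer can be attached explicitly via the LossFunction option of NetTrain; a minimal regression sketch:

```wolfram
net = NetChain[{LinearLayer[4], Ramp, LinearLayer[1]}, "Input" -> 1];
data = Table[{x} -> {2 x}, {x, 0., 1., 0.1}];

(* Train under an explicit mean-squared-error loss *)
trained = NetTrain[net, data, LossFunction -> MeanSquaredLossLayer[]];
```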
Elementwise Computation Layers
ElementwiseLayer ▪ ThreadingLayer ▪ ConstantTimesLayer ▪ ConstantPlusLayer
Structure Manipulation Layers
CatenateLayer ▪ FlattenLayer ▪ ReshapeLayer ▪ ReplicateLayer ▪ PaddingLayer ▪ PartLayer ▪ TransposeLayer
Array Operation Layers
ConstantArrayLayer — embed a learned constant array into a NetGraph
SummationLayer ▪ TotalLayer ▪ AggregationLayer ▪ DotLayer
Convolutional and Filtering Layers
ConvolutionLayer ▪ DeconvolutionLayer ▪ PoolingLayer ▪ ResizeLayer ▪ SpatialTransformationLayer
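These layers compose into standard convolutional architectures; a LeNet-style sketch for 28×28 grayscale input:

```wolfram
lenet = NetChain[{
    ConvolutionLayer[20, 5], Ramp, PoolingLayer[2, 2],
    ConvolutionLayer[50, 5], Ramp, PoolingLayer[2, 2],
    FlattenLayer[], LinearLayer[500], Ramp,
    LinearLayer[10], SoftmaxLayer[]},
   "Input" -> NetEncoder[{"Image", {28, 28}, ColorSpace -> "Grayscale"}],
   "Output" -> NetDecoder[{"Class", Range[0, 9]}]];
```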
Recurrent Layers
BasicRecurrentLayer ▪ GatedRecurrentLayer ▪ LongShortTermMemoryLayer
Sequence-Handling Layers
EmbeddingLayer — trainable layer for embedding integers into continuous vector spaces
SequenceLastLayer ▪ SequenceReverseLayer ▪ SequenceMostLayer ▪ SequenceRestLayer ▪ UnitVectorLayer
SequenceAttentionLayer — trainable layer for finding weights for inputs based on queries
Training Optimization Layers
ImageAugmentationLayer ▪ BatchNormalizationLayer ▪ DropoutLayer ▪ LocalResponseNormalizationLayer ▪ InstanceNormalizationLayer
Higher-Order Network Construction
NetMapOperator — define a network that maps over a sequence
NetFoldOperator — define a recurrent network that folds in elements of a sequence
NetPairEmbeddingOperator ▪ NetNestOperator
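The operators wrap an inner net; a sketch of NetMapOperator applied to a variable-length sequence:

```wolfram
(* Map the same LinearLayer over every element of a sequence *)
mapper = NetInitialize[NetMapOperator[LinearLayer[3, "Input" -> 2]]];

(* A length-3 sequence of 2-vectors yields a length-3 sequence of 3-vectors *)
mapper[{{1., 2.}, {3., 4.}, {5., 6.}}]
```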
Encoding & Decoding
NetEncoder — convert images, categories, etc. to net-compatible numerical arrays
NetDecoder — interpret net-generated numerical arrays as images, probabilities, etc.
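Encoders and decoders attach to net ports; a sketch of an image-in, class-out net:

```wolfram
enc = NetEncoder[{"Image", {32, 32}}];       (* image -> numerical array *)
dec = NetDecoder[{"Class", {"cat", "dog"}}]; (* probabilities -> class label *)

net = NetChain[{FlattenLayer[], LinearLayer[2], SoftmaxLayer[]},
   "Input" -> enc, "Output" -> dec];
```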
Activation Functions
Ramp — rectified linear (ReLU)
Tanh ▪ LogisticSigmoid ▪ Exp ▪ Log ▪ Sin ▪ Cos ▪ Sqrt ▪ Abs
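Any of these functions can be used directly in a chain or wrapped in ElementwiseLayer; a small sketch:

```wolfram
(* Ramp applied element-by-element: negative entries become 0 *)
layer = ElementwiseLayer[Ramp];
layer[{-2., -1., 0., 1., 2.}]
```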
Importing & Exporting
"WLNet" — Wolfram Language Net representation format
"MXNet" — MXNet net representation format
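Round-tripping a net through these formats; a sketch (file names are illustrative):

```wolfram
net = NetInitialize[NetChain[{LinearLayer[2], SoftmaxLayer[]}, "Input" -> 3]];

Export["net.wlnet", net];         (* save in the Wolfram net format *)
reloaded = Import["net.wlnet"];

Export["net.json", net, "MXNet"]  (* architecture as JSON, weights in a .params file *)
```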
Managing Data & Training
ClassifierMeasurements — measure accuracy, recall, etc. of a classifier net
DeleteMissing — remove missing data before training
TargetDevice ▪ ValidationSet ▪ TrainingProgressFunction ▪ TrainingProgressCheckpointing ▪ TrainingProgressReporting ▪ LearningRateMultipliers ▪ NetEvaluationMode
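Typical use of the training options (data and checkpoint file name invented for illustration):

```wolfram
net = NetChain[{LinearLayer[8], Ramp, LinearLayer[1]}, "Input" -> 1];
train = Table[{x} -> {Sin[x]}, {x, 0., 3., 0.05}];
test = Table[{x} -> {Sin[x]}, {x, 0.025, 3., 0.1}];

(* Hold out a validation set, pin the device, and checkpoint progress to a file *)
trained = NetTrain[net, train,
   ValidationSet -> test,
   TargetDevice -> "CPU",
   TrainingProgressCheckpointing -> {"File", "checkpoint.wlnet"}];
```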