MATLAB Deep Learning Toolbox

Deep Learning Toolbox

Version 23.2 (R2023b) 01-Aug-2023

Training for Deep Learning

assembleNetwork - Assemble a neural network from pretrained layers

augmentedImageDatastore - Generate batches of augmented image data

imageDataAugmenter - Configure image data augmentation

dlnetwork - Neural network for custom training

trainNetwork - Train a neural network

trainingOptions - Options for training a neural network
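
A minimal sketch of how these training functions typically fit together; the datastore folder, image size, and layer stack below are illustrative assumptions, not part of this reference:

    % Labeled image data (illustrative folder with one subfolder per class).
    imds = imageDatastore("digitData", "IncludeSubfolders", true, ...
        "LabelSource", "foldernames");

    % Optional augmentation via imageDataAugmenter + augmentedImageDatastore.
    augmenter = imageDataAugmenter("RandRotation", [-10 10]);
    augimds = augmentedImageDatastore([28 28 1], imds, ...
        "DataAugmentation", augmenter);

    % Define training options, then train with trainNetwork.
    opts = trainingOptions("sgdm", "MaxEpochs", 5, "Plots", "training-progress");
    layers = [
        imageInputLayer([28 28 1])
        convolution2dLayer(3, 16, "Padding", "same")
        reluLayer
        fullyConnectedLayer(10)
        softmaxLayer
        classificationLayer];
    net = trainNetwork(augimds, layers, opts);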

Layers for Deep Learning

additionLayer - Addition layer

averagePooling2dLayer - 2-D average pooling layer

averagePooling3dLayer - 3-D average pooling layer

batchNormalizationLayer - Batch normalization layer

bilstmLayer - Bidirectional long short-term memory (biLSTM) layer

classificationLayer - Classification output layer for a neural network

clippedReluLayer - Clipped rectified linear unit (ReLU) layer

concatenationLayer - Concatenation layer

convolution2dLayer - 2-D convolution layer for Convolutional Neural Networks

convolution3dLayer - 3-D convolution layer for Convolutional Neural Networks

crop2dLayer - 2-D crop layer

crop3dLayer - 3-D crop layer

crossChannelNormalizationLayer - Local response normalization along channels

depthConcatenationLayer - Depth concatenation layer

dropoutLayer - Dropout layer

eluLayer - Exponential linear unit (ELU) layer

featureInputLayer - Feature input layer

flattenLayer - Flatten layer

fullyConnectedLayer - Fully connected layer

globalAveragePooling2dLayer - 2-D global average pooling layer

globalAveragePooling3dLayer - 3-D global average pooling layer

globalMaxPooling2dLayer - 2-D global max pooling layer

globalMaxPooling3dLayer - 3-D global max pooling layer

groupedConvolution2dLayer - 2-D grouped convolution layer

groupNormalizationLayer - Group normalization layer

gruLayer - Gated recurrent unit (GRU) layer

imageInputLayer - Image input layer

image3dInputLayer - 3-D image input layer

instanceNormalizationLayer - Instance normalization layer

layerNormalizationLayer - Layer normalization layer

leakyReluLayer - Leaky rectified linear unit (ReLU) layer

lstmLayer - Long short-term memory (LSTM) layer

maxPooling2dLayer - 2-D max pooling layer

maxPooling3dLayer - 3-D max pooling layer

maxUnpooling2dLayer - Max unpooling layer

multiplicationLayer - Element-wise multiplication layer

regressionLayer - Regression output layer for a neural network

reluLayer - Rectified linear unit (ReLU) layer

sequenceFoldingLayer - Sequence folding layer

sequenceInputLayer - Sequence input layer

sequenceUnfoldingLayer - Sequence unfolding layer

sigmoidLayer - Sigmoid layer

softmaxLayer - Softmax layer

swishLayer - Swish layer

tanhLayer - Hyperbolic tangent layer

transposedConv2dLayer - 2-D transposed convolution layer

transposedConv3dLayer - 3-D transposed convolution layer
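
Layers from this list are usually stacked into an array or, for branched topologies, combined in a layer graph; the small residual-style sketch below uses illustrative layer names, with connectLayers (not listed above) wiring the skip connection:

    % Sequential part of a small residual block.
    lgraph = layerGraph([
        imageInputLayer([32 32 3], "Name", "in")
        convolution2dLayer(3, 16, "Padding", "same", "Name", "conv1")
        batchNormalizationLayer("Name", "bn1")
        reluLayer("Name", "relu1")
        convolution2dLayer(3, 16, "Padding", "same", "Name", "conv2")
        additionLayer(2, "Name", "add")
        reluLayer("Name", "relu2")]);

    % Skip connection: feed relu1 into the addition layer's second input.
    lgraph = connectLayers(lgraph, "relu1", "add/in2");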

Custom Layers for Deep Learning

checkLayer - Check layer validity

findPlaceholderLayers - Find placeholder layers in a layer graph or layer array
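
checkLayer is typically run on an instance of a custom layer class with a representative input size; in the sketch below, myCustomLayer is a hypothetical class name:

    % Hypothetical custom layer; the valid input size must match what it expects.
    layer = myCustomLayer("demo");
    validSize = [28 28 1];
    checkLayer(layer, validSize, "ObservationDimension", 4);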

Apps for Deep Learning

analyzeNetwork - Analyze deep learning networks

deepNetworkDesigner - Design and edit deep learning networks

experimentManager - Design and run experiments to train and compare deep learning networks

deepNetworkQuantizer - Quantize deep learning networks for deployment
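
Each of these apps can also be opened programmatically; for example, assuming net is an existing network or layer array:

    analyzeNetwork(net)        % size, activation, and parameter report
    deepNetworkDesigner(net)   % open the same network for interactive editing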

Extract and Visualize Features, Predict Outcomes for Deep Learning

deepDreamImage - Visualize network features using Deep Dream

occlusionSensitivity - Explain network predictions by occluding the inputs

imageLIME - Explain network predictions using LIME

gradCAM - Explain network predictions using Grad-CAM

layerGraph - Create a layer graph

confusionmat - Confusion matrix for classification algorithms

confusionchart - Plot a confusion matrix
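
A sketch of how the explanation and evaluation functions above are commonly combined; net, img, and imdsTest are assumed to exist, and classify (from the toolbox, not listed in this section) supplies the predicted labels:

    % Explain one prediction with a Grad-CAM heat map.
    label = classify(net, img);
    map = gradCAM(net, img, label);
    imshow(img); hold on
    imagesc(map, "AlphaData", 0.5)     % overlay the heat map
    hold off

    % Evaluate a test set with a confusion chart.
    predicted = classify(net, imdsTest);
    confusionchart(imdsTest.Labels, predicted);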

Using Pretrained Networks for Deep Learning

alexnet - Pretrained AlexNet convolutional neural network

darknet19 - Pretrained DarkNet-19 convolutional neural network

darknet53 - Pretrained DarkNet-53 convolutional neural network

densenet201 - Pretrained DenseNet-201 convolutional neural network

googlenet - Pretrained GoogLeNet convolutional neural network

inceptionv3 - Pretrained Inception-v3 convolutional neural network

inceptionresnetv2 - Pretrained Inception-ResNet-v2 convolutional neural network

mobilenetv2 - Pretrained MobileNetV2 convolutional neural network

nasnetmobile - Pretrained NASNet-Mobile convolutional neural network

nasnetlarge - Pretrained NASNet-Large convolutional neural network

resnet18 - Pretrained ResNet-18 convolutional neural network

resnet50 - Pretrained ResNet-50 convolutional neural network

resnet101 - Pretrained ResNet-101 convolutional neural network

squeezenet - Pretrained SqueezeNet convolutional neural network

shufflenet - Pretrained ShuffleNet convolutional neural network

vgg16 - Pretrained VGG-16 convolutional neural network

vgg19 - Pretrained VGG-19 convolutional neural network

xception - Pretrained Xception convolutional neural network

importCaffeLayers - Import Convolutional Neural Network Layers from Caffe

importCaffeNetwork - Import Convolutional Neural Network Models from Caffe

importKerasLayers - Import Convolutional Neural Network Layers from Keras

importKerasNetwork - Import Convolutional Neural Network Models from Keras

importONNXFunction - Import Convolutional Neural Network Function from ONNX Format

importONNXLayers - Import Convolutional Neural Network Layers from ONNX Format

importONNXNetwork - Import Convolutional Neural Network Models from ONNX Format

exportONNXNetwork - Export Convolutional Neural Network Models to ONNX Format

importTensorFlowLayers - Import Convolutional Neural Network Layers from TensorFlow

importTensorFlowNetwork - Import Convolutional Neural Network Models from TensorFlow

exportNetworkToTensorFlow - Export Convolutional Neural Network Models to TensorFlow
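
A common transfer-learning sketch starting from one of the pretrained networks above. The replaced layer names below are specific to GoogLeNet; replaceLayer is not listed in this reference, and augimds and opts are assumed to be an augmentedImageDatastore of the new images (resized to the network's input size) and a trainingOptions object, as in the training sketch earlier:

    % Load a pretrained network and adapt its final layers to a new task.
    net = googlenet;
    lgraph = layerGraph(net);
    numClasses = 5;                    % assumed number of new classes
    lgraph = replaceLayer(lgraph, "loss3-classifier", ...
        fullyConnectedLayer(numClasses, "Name", "fc_new"));
    lgraph = replaceLayer(lgraph, "output", ...
        classificationLayer("Name", "out_new"));

    % Retrain on the new data, then optionally export to ONNX format.
    net2 = trainNetwork(augimds, lgraph, opts);
    exportONNXNetwork(net2, "transferred.onnx");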

Graphical User Interface Functions for Shallow Neural Networks

nnstart - Neural Network Start GUI

nctool - Neural Classification app

nftool - Neural Fitting app

nntraintool - Neural Network Training Tool

nprtool - Neural Pattern Recognition app

ntstool - Neural Time Series app

view - View a neural network.

Shallow Neural Network Creation Functions

cascadeforwardnet - Cascade-forward neural network.

competlayer - Competitive neural layer.

distdelaynet - Distributed delay neural network.

elmannet - Elman neural network.

feedforwardnet - Feed-forward neural network.

fitnet - Function fitting neural network.

layrecnet - Layered recurrent neural network.

linearlayer - Linear neural layer.

lvqnet - Learning vector quantization (LVQ) neural network.

narnet - Nonlinear autoregressive time-series network.

narxnet - Nonlinear autoregressive time-series network with external input.

newgrnn - Design a generalized regression neural network.

newhop - Create a Hopfield recurrent network.

newlind - Design a linear layer.

newpnn - Design a probabilistic neural network.

newrb - Design a radial basis network.

newrbe - Design an exact radial basis network.

patternnet - Pattern recognition neural network.

perceptron - Perceptron.

selforgmap - Self-organizing map.

timedelaynet - Time-delay neural network.
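
A sketch of creating and training one of these shallow networks; simplefit_dataset is one of the example datasets that ships with the toolbox:

    [x, t] = simplefit_dataset;    % built-in example inputs and targets
    net = fitnet(10);              % function-fitting network, 10 hidden neurons
    net = train(net, x, t);
    y = net(x);
    perf = perform(net, t, y)      % mean squared error by default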

Using Shallow Neural Networks

network - Create a custom neural network.

sim - Simulate a neural network.

init - Initialize a neural network.

adapt - Allow a neural network to adapt.

train - Train a neural network.

disp - Display a neural network's properties.

display - Display the name and properties of a neural network.

adddelay - Add a delay to a neural network's response.

closeloop - Convert neural network open feedback to closed feedback loops.

formwb - Form bias and weights into single vector.

getwb - Get all network weight and bias values as a single vector.

noloop - Remove neural network open and closed feedback loops.

openloop - Convert neural network closed feedback to open feedback loops.

removedelay - Remove a delay from a neural network's response.

separatewb - Separate biases and weights from a weight/bias vector.

setwb - Set all network weight and bias values with a single vector.
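
The weight/bias utilities above work as a pair; a sketch, assuming net is a trained shallow network:

    wb = getwb(net);                    % all weights and biases as one column vector
    wb = wb + 0.01*randn(size(wb));     % illustrative perturbation
    net = setwb(net, wb);               % write the modified vector back
    [b, iw, lw] = separatewb(net, wb);  % split into bias, input-weight, layer-weight arrays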

Simulink Support for Shallow Neural Networks

gensim - Generate a Simulink block to simulate a neural network.

setsiminit - Set neural network Simulink block initial conditions

getsiminit - Get neural network Simulink block initial conditions

neural - Neural network Simulink blockset.
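
gensim is called on a trained network; a minimal sketch:

    [x, t] = simplefit_dataset;
    net = feedforwardnet(10);
    net = train(net, x, t);
    gensim(net)          % generates a Simulink model containing the network block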

Training Functions for Shallow Neural Networks

trainb - Batch training with weight & bias learning rules.

trainbfg - BFGS quasi-Newton backpropagation.

trainbr - Bayesian regularization backpropagation.

trainbu - Unsupervised batch training with weight & bias learning rules.

trainc - Cyclical order weight/bias training.

traincgb - Conjugate gradient backpropagation with Powell-Beale restarts.

traincgf - Conjugate gradient backpropagation with Fletcher-Reeves updates.

traincgp - Conjugate gradient backpropagation with Polak-Ribiere updates.

traingd - Gradient descent backpropagation.

traingda - Gradient descent with adaptive learning rate backpropagation.

traingdm - Gradient descent with momentum.

traingdx - Gradient descent with momentum and adaptive learning rate backpropagation.

trainlm - Levenberg-Marquardt backpropagation.

trainoss - One-step secant backpropagation.

trainr - Random order weight/bias training.

trainrp - Resilient backpropagation (Rprop).

trainru - Unsupervised random order weight/bias training.

trains - Sequential order weight/bias training.

trainscg - Scaled conjugate gradient backpropagation.
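
A training function from this list is selected through a network's trainFcn property, either at creation time or afterwards; for example (x and t as in the earlier shallow-network sketch):

    net = feedforwardnet(10, 'trainlm');   % Levenberg-Marquardt at creation
    net = train(net, x, t);

    net.trainFcn = 'trainbr';              % switch to Bayesian regularization
    net.trainParam.epochs = 200;           % per-function parameters live in trainParam
    net = train(net, x, t);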

Plotting Functions for Shallow Neural Networks

plotconfusion - Plot classification confusion matrix.

ploterrcorr - Plot autocorrelation of error time series.

ploterrhist - Plot error histogram.

plotfit - Plot function fit.

plotinerrcorr - Plot input to error time series cross-correlation.

plotperform - Plot network performance.

plotregression - Plot linear regression.

plotresponse - Plot dynamic network time-series response.

plotroc - Plot receiver operating characteristic.

plotsomhits - Plot self-organizing map sample hits.

plotsomnc - Plot self-organizing map neighbor connections.

plotsomnd - Plot self-organizing map neighbor distances.

plotsomplanes - Plot self-organizing map weight planes.

plotsompos - Plot self-organizing map weight positions.

plotsomtop - Plot self-organizing map topology.

plottrainstate - Plot training state values.

plotwb - Plot Hinton diagrams of weight and bias values.
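
A sketch of the most common post-training plots, with net, x, and t as in the earlier shallow-network sketch:

    [net, tr] = train(net, x, t);   % tr is the training record
    y = net(x);
    plotperform(tr)                 % training, validation, and test performance
    plotfit(net, x, t)              % function fit against the targets
    plotregression(t, y)            % outputs regressed against targets
    ploterrhist(t - y)              % error histogram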

List of other Shallow Neural Network Implementation Functions

nnadapt - Adapt functions.

nnderivative - Derivative functions.

nndistance - Distance functions.

nndivision - Division functions.

nninitlayer - Initialize layer functions.

nninitnetwork - Initialize network functions.

nninitweight - Initialize weight functions.

nnlearn - Learning functions.

nnnetinput - Net input functions.

nnperformance - Performance functions.

nnprocess - Processing functions.

nnsearch - Line search functions.

nntopology - Topology functions.

nntransfer - Transfer functions.

nnweight - Weight functions.

Obsolete Functions

nntool - Replaced by nnstart.
