MATLAB Deep Learning Toolbox

Deep Learning Toolbox

Version 23.2 (R2023b) 01-Aug-2023

Training for Deep Learning

assembleNetwork - Assemble a neural network from pretrained layers

augmentedImageDatastore - Generate batches of augmented image data

imageDataAugmenter - Configure image data augmentation

dlnetwork - Neural network for custom training

trainNetwork - Train a neural network

trainingOptions - Options for training a neural network
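
Example (a minimal sketch of this training workflow; the digit image folder below ships with the toolbox, and the small layer array is only a placeholder):

    digitPath = fullfile(matlabroot, 'toolbox', 'nnet', ...
        'nndemos', 'nndatasets', 'DigitDataset');
    imds = imageDatastore(digitPath, 'IncludeSubfolders', true, ...
        'LabelSource', 'foldernames');
    aug = imageDataAugmenter('RandRotation', [-10 10], 'RandXTranslation', [-3 3]);
    auimds = augmentedImageDatastore([28 28], imds, 'DataAugmentation', aug);
    layers = [
        imageInputLayer([28 28 1])
        convolution2dLayer(3, 8, 'Padding', 'same')
        reluLayer
        fullyConnectedLayer(10)
        softmaxLayer
        classificationLayer];
    opts = trainingOptions('sgdm', 'MaxEpochs', 4, 'MiniBatchSize', 128, ...
        'Plots', 'training-progress');
    net = trainNetwork(auimds, layers, opts);   % returns a trained network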

Layers for Deep Learning

additionLayer - Addition layer

averagePooling2dLayer - 2-D average pooling layer

averagePooling3dLayer - 3-D average pooling layer

batchNormalizationLayer - Batch normalization layer

bilstmLayer - Bidirectional long short-term memory (biLSTM) layer

classificationLayer - Classification output layer for a neural network

clippedReluLayer - Clipped rectified linear unit (ReLU) layer

concatenationLayer - Concatenation layer

convolution2dLayer - 2-D convolution layer for Convolutional Neural Networks

convolution3dLayer - 3-D convolution layer for Convolutional Neural Networks

crop2dLayer - 2-D crop layer

crop3dLayer - 3-D crop layer

crossChannelNormalizationLayer - Local response normalization along channels

depthConcatenationLayer - Depth concatenation layer

dropoutLayer - Dropout layer

eluLayer - Exponential linear unit (ELU) layer

featureInputLayer - Feature input layer

flattenLayer - Flatten layer

fullyConnectedLayer - Fully connected layer

globalAveragePooling2dLayer - 2-D global average pooling layer

globalAveragePooling3dLayer - 3-D global average pooling layer

globalMaxPooling2dLayer - 2-D global max pooling layer

globalMaxPooling3dLayer - 3-D global max pooling layer

groupedConvolution2dLayer - 2-D grouped convolution layer

groupNormalizationLayer - Group normalization layer

gruLayer - Gated recurrent unit (GRU) layer

imageInputLayer - Image input layer

image3dInputLayer - 3-D image input layer

instanceNormalizationLayer - Instance normalization layer

layerNormalizationLayer - Layer normalization layer

leakyReluLayer - Leaky rectified linear unit (ReLU) layer

lstmLayer - Long short-term memory (LSTM) layer

maxPooling2dLayer - 2-D max pooling layer

maxPooling3dLayer - 3-D max pooling layer

maxUnpooling2dLayer - Max unpooling layer

multiplicationLayer - Element-wise multiplication layer

regressionLayer - Regression output layer for a neural network

reluLayer - Rectified linear unit (ReLU) layer

sequenceFoldingLayer - Sequence folding layer

sequenceInputLayer - Sequence input layer

sequenceUnfoldingLayer - Sequence unfolding layer

sigmoidLayer - Sigmoid layer

softmaxLayer - Softmax layer

swishLayer - Swish layer

tanhLayer - Hyperbolic tangent layer

transposedConv2dLayer - 2-D transposed convolution layer

transposedConv3dLayer - 3-D transposed convolution layer
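
Example (a minimal sketch of a layer array with a residual-style skip connection; connectLayers is a toolbox function not listed above, and the layer names are arbitrary):

    layers = [
        imageInputLayer([32 32 3], 'Name', 'in')
        convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv_1')
        batchNormalizationLayer('Name', 'bn_1')
        reluLayer('Name', 'relu_1')
        convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv_2')
        additionLayer(2, 'Name', 'add')            % second input is the skip
        globalAveragePooling2dLayer('Name', 'gap')
        fullyConnectedLayer(10, 'Name', 'fc')
        softmaxLayer('Name', 'sm')
        classificationLayer('Name', 'out')];
    lgraph = layerGraph(layers);
    lgraph = connectLayers(lgraph, 'relu_1', 'add/in2');   % close the skip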

Custom Layers for Deep Learning

checkLayer - Check layer validity

findPlaceholderLayers - Find placeholder layers in a layer graph or layer array
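
Example (a minimal sketch; myCustomLayer is a hypothetical user-defined layer class on the path, and the input size must match what the layer expects):

    layer = myCustomLayer;                          % hypothetical custom layer
    checkLayer(layer, [28 28 1], 'ObservationDimension', 4);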

Apps for Deep Learning

analyzeNetwork - Analyze deep learning networks

deepNetworkDesigner - Design and edit deep learning networks

experimentManager - Design and run experiments to train and compare deep learning networks

deepNetworkQuantizer - Quantize deep learning networks for deployment
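
Example (a minimal sketch; squeezenet is used only because it ships with the toolbox without a separate support package):

    net = squeezenet;
    analyzeNetwork(net)          % report layer sizes, parameters, and issues
    deepNetworkDesigner(net)     % open the network in Deep Network Designer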

Extract and Visualize Features, Predict Outcomes for Deep Learning

deepDreamImage - Visualize network features using Deep Dream

occlusionSensitivity - Explain network predictions by occluding the inputs

imageLIME - Explain network predictions using LIME

gradCAM - Explain network predictions using Grad-CAM

layerGraph - Create a layer graph

confusionmat - Confusion matrix for classification algorithms

confusionchart - Plot a confusion matrix
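
Example (a minimal sketch of the explanation functions, assuming a pretrained squeezenet and the peppers.png image that ships with MATLAB):

    net = squeezenet;
    img = imresize(imread('peppers.png'), net.Layers(1).InputSize(1:2));
    [label, scores] = classify(net, img);
    map = gradCAM(net, img, label);                 % class-activation heat map
    imshow(img); hold on
    imagesc(map, 'AlphaData', 0.5); colormap jet; hold off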

Using Pretrained Networks for Deep Learning

alexnet - Pretrained AlexNet convolutional neural network

darknet19 - Pretrained DarkNet-19 convolutional neural network

darknet53 - Pretrained DarkNet-53 convolutional neural network

densenet201 - Pretrained DenseNet-201 convolutional neural network

googlenet - Pretrained GoogLeNet convolutional neural network

inceptionv3 - Pretrained Inception-v3 convolutional neural network

inceptionresnetv2 - Pretrained Inception-ResNet-v2 convolutional neural network

mobilenetv2 - Pretrained MobileNetV2 convolutional neural network

nasnetmobile - Pretrained NASNet-Mobile convolutional neural network

nasnetlarge - Pretrained NASNet-Large convolutional neural network

resnet18 - Pretrained ResNet-18 convolutional neural network

resnet50 - Pretrained ResNet-50 convolutional neural network

resnet101 - Pretrained ResNet-101 convolutional neural network

squeezenet - Pretrained SqueezeNet convolutional neural network

shufflenet - Pretrained ShuffleNet convolutional neural network

vgg16 - Pretrained VGG-16 convolutional neural network

vgg19 - Pretrained VGG-19 convolutional neural network

xception - Pretrained Xception convolutional neural network

importCaffeLayers - Import Convolutional Neural Network Layers from Caffe

importCaffeNetwork - Import Convolutional Neural Network Models from Caffe

importKerasLayers - Import Convolutional Neural Network Layers from Keras

importKerasNetwork - Import Convolutional Neural Network Models from Keras

importONNXFunction - Import Convolutional Neural Network Function from ONNX Format

importONNXLayers - Import Convolutional Neural Network Layers from ONNX Format

importONNXNetwork - Import Convolutional Neural Network Models from ONNX Format

exportONNXNetwork - Export Convolutional Neural Network Models to ONNX Format

importTensorFlowLayers - Import Convolutional Neural Network Layers from TensorFlow

importTensorFlowNetwork - Import Convolutional Neural Network Models from TensorFlow

exportNetworkToTensorFlow - Export Convolutional Neural Network Models to TensorFlow
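
Example (a minimal sketch of transfer learning with a pretrained network plus ONNX export; the support packages must be installed, the class count of 5 is arbitrary, replaceLayer is a toolbox function not listed above, and the layer names are those reported by analyzeNetwork for GoogLeNet):

    net    = googlenet;
    lgraph = layerGraph(net);
    newFC  = fullyConnectedLayer(5, 'Name', 'new_fc');
    newOut = classificationLayer('Name', 'new_output');
    lgraph = replaceLayer(lgraph, 'loss3-classifier', newFC);
    lgraph = replaceLayer(lgraph, 'output', newOut);
    % retrain lgraph with trainNetwork; any trained network can be exported:
    exportONNXNetwork(net, 'googlenet.onnx');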

Graphical User Interface Functions for Shallow Neural Networks

nnstart - Neural Network Start GUI

nctool - Neural Clustering app

nftool - Neural Fitting app

nntraintool - Neural Network Training Tool

nprtool - Neural Pattern Recognition app

ntstool - Neural Time Series app

view - View a neural network.
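
Example (a minimal sketch; each tool opens its app, and view draws a block diagram of any shallow network object):

    nnstart                       % launcher for the shallow network apps
    net = feedforwardnet(10);
    view(net)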

Shallow Neural Network Creation Functions

cascadeforwardnet - Cascade-forward neural network.

competlayer - Competitive neural layer.

distdelaynet - Distributed delay neural network.

elmannet - Elman neural network.

feedforwardnet - Feed-forward neural network.

fitnet - Function fitting neural network.

layrecnet - Layered recurrent neural network.

linearlayer - Linear neural layer.

lvqnet - Learning vector quantization (LVQ) neural network.

narnet - Nonlinear autoregressive time-series network.

narxnet - Nonlinear autoregressive time-series network with external input.

newgrnn - Design a generalized regression neural network.

newhop - Create a Hopfield recurrent network.

newlind - Design a linear layer.

newpnn - Design a probabilistic neural network.

newrb - Design a radial basis network.

newrbe - Design an exact radial basis network.

patternnet - Pattern recognition neural network.

perceptron - Perceptron.

selforgmap - Self-organizing map.

timedelaynet - Time-delay neural network.
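
Example (a minimal sketch using one of the toolbox's sample datasets):

    [x, t] = simplefit_dataset;   % sample inputs and targets
    net  = fitnet(10);            % function-fitting network, 10 hidden neurons
    net  = train(net, x, t);
    y    = net(x);                % equivalent to sim(net, x)
    perf = perform(net, t, y);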

Using Shallow Neural Networks

network - Create a custom neural network.

sim - Simulate a neural network.

init - Initialize a neural network.

adapt - Allow a neural network to adapt.

train - Train a neural network.

disp - Display a neural network's properties.

display - Display the name and properties of a neural network.

adddelay - Add a delay to a neural network's response.

closeloop - Convert neural network open feedback to closed feedback loops.

formwb - Form bias and weights into single vector.

getwb - Get all network weight and bias values as a single vector.

noloop - Remove neural network open and closed feedback loops.

openloop - Convert neural network closed feedback to open feedback loops.

removedelay - Remove a delay from a neural network's response.

separatewb - Separate biases and weights from a weight/bias vector.

setwb - Set all network weight and bias values with a single vector.
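
Example (a minimal sketch of handling all weights and biases as one vector with getwb and setwb; configure is a toolbox function not listed above, and the input/target sizes are arbitrary):

    net = feedforwardnet(5);
    net = configure(net, rand(3, 20), rand(1, 20));   % size the weights
    wb  = getwb(net);             % all weights and biases as a single vector
    wb  = 0.5 * wb;               % e.g. scale every parameter
    net = setwb(net, wb);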

Simulink Support for Shallow Neural Networks

gensim - Generate a Simulink block to simulate a neural network.

setsiminit - Set neural network Simulink block initial conditions.

getsiminit - Get neural network Simulink block initial conditions.

neural - Neural network Simulink blockset.
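
Example (a minimal sketch; gensim requires Simulink and builds a block that simulates the trained network):

    [x, t] = simplefit_dataset;
    net = feedforwardnet(10);
    net = train(net, x, t);
    gensim(net)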

Training Functions for Shallow Neural Networks

trainb - Batch training with weight & bias learning rules.

trainbfg - BFGS quasi-Newton backpropagation.

trainbr - Bayesian regularization backpropagation.

trainbu - Unsupervised batch training with weight & bias learning rules.

trainc - Cyclical order weight/bias training.

traincgb - Conjugate gradient backpropagation with Powell-Beale restarts.

traincgf - Conjugate gradient backpropagation with Fletcher-Reeves updates.

traincgp - Conjugate gradient backpropagation with Polak-Ribiere updates.

traingd - Gradient descent backpropagation.

traingda - Gradient descent with adaptive learning rate backpropagation.

traingdm - Gradient descent with momentum.

traingdx - Gradient descent with momentum and adaptive learning rate backpropagation.

trainlm - Levenberg-Marquardt backpropagation.

trainoss - One step secant backpropagation.

trainr - Random order weight/bias training.

trainrp - RPROP backpropagation.

trainru - Unsupervised random order weight/bias training.

trains - Sequential order weight/bias training.

trainscg - Scaled conjugate gradient backpropagation.
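
Example (a minimal sketch of selecting a training function; any name listed above can be assigned to net.trainFcn before calling train):

    net = feedforwardnet(10, 'trainbr');   % choose it at creation, or ...
    net.trainFcn = 'trainscg';             % ... change it afterwards
    net.trainParam.epochs = 200;
    [x, t] = simplefit_dataset;
    net = train(net, x, t);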

Plotting Functions for Shallow Neural Networks

plotconfusion - Plot classification confusion matrix.

ploterrcorr - Plot autocorrelation of error time series.

ploterrhist - Plot error histogram.

plotfit - Plot function fit.

plotinerrcorr - Plot input to error time series cross-correlation.

plotperform - Plot network performance.

plotregression - Plot linear regression.

plotresponse - Plot dynamic network time-series response.

plotroc - Plot receiver operating characteristic.

plotsomhits - Plot self-organizing map sample hits.

plotsomnc - Plot self-organizing map neighbor connections.

plotsomnd - Plot self-organizing map neighbor distances.

plotsomplanes - Plot self-organizing map weight planes.

plotsompos - Plot self-organizing map weight positions.

plotsomtop - Plot self-organizing map topology.

plottrainstate - Plot training state values.

plotwb - Plot Hinton diagrams of weight and bias values.
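
Example (a minimal sketch; tr is the training record returned by train, and the plots are generated from it and from the network outputs):

    [x, t] = simplefit_dataset;
    net = fitnet(10);
    [net, tr] = train(net, x, t);
    y = net(x);
    plotperform(tr)
    plotregression(t, y)
    ploterrhist(t - y)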

List of other Shallow Neural Network Implementation Functions

nnadapt - Adapt functions.

nnderivative - Derivative functions.

nndistance - Distance functions.

nndivision - Division functions.

nninitlayer - Initialize layer functions.

nninitnetwork - Initialize network functions.

nninitweight - Initialize weight functions.

nnlearn - Learning functions.

nnnetinput - Net input functions.

nnperformance - Performance functions.

nnprocess - Processing functions.

nnsearch - Line search functions.

nntopology - Topology functions.

nntransfer - Transfer functions.

nnweight - Weight functions.
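
Example (a minimal sketch; the entries above are function categories, and individual functions from them are assigned to network properties):

    net = feedforwardnet(10);
    net.layers{1}.transferFcn = 'tansig';       % a transfer function (nntransfer)
    net.performFcn = 'mse';                     % a performance function (nnperformance)
    net.inputs{1}.processFcns = {'mapminmax'};  % a processing function (nnprocess)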

Obsolete Functions

nntool - replaced by nnstart
