gradients

Yongqiang Cheng · 5 hours ago
gradients · derivatives · selu
SELU Function - Derivatives and Gradients. class torch.nn.SELU(inplace=False). https://docs.pytorch.org/docs/stable/generated/torch.nn.SELU.html
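A minimal pure-Python sketch of SELU and its derivative, assuming the fixed constants used by torch.nn.SELU in the linked docs (scale ≈ 1.0507, alpha ≈ 1.6733):

```python
import math

# Fixed (non-learnable) constants from the SELU paper, as used by torch.nn.SELU
SCALE = 1.0507009873554805
ALPHA = 1.6732632423543772

def selu(x: float) -> float:
    """SELU(x) = scale * (x if x > 0 else alpha * (exp(x) - 1))."""
    return SCALE * (x if x > 0 else ALPHA * (math.exp(x) - 1.0))

def selu_grad(x: float) -> float:
    """d/dx SELU(x) = scale for x > 0, scale * alpha * exp(x) otherwise."""
    return SCALE if x > 0 else SCALE * ALPHA * math.exp(x)
```

A quick central-difference check confirms the closed-form gradient on either side of 0.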
Yongqiang Cheng · 6 days ago
gradients · derivatives · elu
ELU Function - Derivatives and Gradients. class torch.nn.ELU(alpha=1.0, inplace=False). https://docs.pytorch.org/docs/stable/generated/torch.nn.ELU.html
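A pure-Python sketch of ELU and its derivative, using the alpha=1.0 default from the torch.nn.ELU signature above:

```python
import math

def elu(x: float, alpha: float = 1.0) -> float:
    """ELU(x) = x for x > 0, alpha * (exp(x) - 1) otherwise."""
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def elu_grad(x: float, alpha: float = 1.0) -> float:
    """d/dx ELU(x) = 1 for x > 0, alpha * exp(x) otherwise.

    For x <= 0 this also equals ELU(x) + alpha, which is why frameworks
    can compute the backward pass from the forward output alone.
    """
    return 1.0 if x > 0 else alpha * math.exp(x)
```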
Yongqiang Cheng · 7 days ago
gradients · derivatives · tanh
Tanh Function - Derivatives and Gradients. class torch.nn.Tanh(*args, **kwargs). https://docs.pytorch.org/docs/stable/generated/torch.nn.Tanh.html
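The tanh derivative has the well-known closed form 1 - tanh(x)^2, so the backward pass can reuse the forward output; a minimal sketch:

```python
import math

def tanh_grad(x: float) -> float:
    """d/dx tanh(x) = 1 - tanh(x)**2 (i.e. sech(x)**2)."""
    t = math.tanh(x)
    return 1.0 - t * t
```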
Yongqiang Cheng · 8 days ago
gradients · derivatives · relu · leaky relu
ReLU Function and Leaky ReLU Function - Derivatives and Gradients. Activation functions decide whether a neuron should be activated by computing the weighted sum of its inputs and adding a bias. They are differentiable operators that transform input signals into outputs, and most of them add nonlinearity. …
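A sketch of both functions and their (sub)gradients; the negative_slope=0.01 default is assumed from torch.nn.LeakyReLU, and the subgradient at x == 0 is taken as 0, a common convention:

```python
def relu(x: float) -> float:
    """ReLU(x) = max(0, x)."""
    return max(0.0, x)

def relu_grad(x: float) -> float:
    """d/dx ReLU(x) = 1 for x > 0, else 0 (subgradient 0 chosen at x == 0)."""
    return 1.0 if x > 0 else 0.0

def leaky_relu(x: float, negative_slope: float = 0.01) -> float:
    """LeakyReLU(x) = x for x > 0, negative_slope * x otherwise."""
    return x if x > 0 else negative_slope * x

def leaky_relu_grad(x: float, negative_slope: float = 0.01) -> float:
    """d/dx LeakyReLU(x) = 1 for x > 0, negative_slope otherwise."""
    return 1.0 if x > 0 else negative_slope
```

The small negative slope is what keeps gradients from vanishing entirely for negative inputs (the "dying ReLU" issue).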
Yongqiang Cheng · 16 days ago
gradients · derivatives · mae · loss function
Mean Absolute Error (MAE) Loss Function - Derivatives and Gradients. https://docs.pytorch.org/docs/stable/generated/torch.nn.L1Loss.html https://github.com/pytorch/pytorch/blob/main/torch/nn/modules/loss.py
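For MAE with mean reduction (the torch.nn.L1Loss default), L = (1/n) * sum(|pred_i - target_i|), so dL/dpred_i = sign(pred_i - target_i) / n; a pure-Python sketch with the subgradient at 0 taken as 0:

```python
def mae_loss(pred: list[float], target: list[float]) -> float:
    """Mean absolute error: (1/n) * sum(|pred_i - target_i|)."""
    n = len(pred)
    return sum(abs(p - t) for p, t in zip(pred, target)) / n

def mae_grad(pred: list[float], target: list[float]) -> list[float]:
    """dL/dpred_i = sign(pred_i - target_i) / n (subgradient 0 where they coincide)."""
    n = len(pred)

    def sign(v: float) -> int:
        return (v > 0) - (v < 0)

    return [sign(p - t) / n for p, t in zip(pred, target)]
```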
Yongqiang Cheng · 8 months ago
deep learning · gradients · matrix-matrix
Gradients of Matrix-Matrix Multiplication in Deep Learning. Understanding Artificial Neural Networks with Hands-on Experience - Part 1: Matrix Multiplication, Its Gradients and Custom Implementations. https://coolgpu.github.io/coolgpu_blog/github/pages/2020/09/22/matrixmultiplication.html
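The standard result for C = A @ B is dL/dA = dL/dC @ B^T and dL/dB = A^T @ dL/dC; a plain-Python sketch on nested lists (no NumPy assumed):

```python
def matmul(a: list[list[float]], b: list[list[float]]) -> list[list[float]]:
    """Matrix product of a (m x k) and b (k x n) as nested lists."""
    m, k, n = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(n)]
            for i in range(m)]

def transpose(a: list[list[float]]) -> list[list[float]]:
    return [list(row) for row in zip(*a)]

def matmul_backward(a, b, grad_c):
    """Given C = A @ B and upstream gradient dL/dC, return (dL/dA, dL/dB).

    dL/dA = dL/dC @ B^T   (shape m x k, matching A)
    dL/dB = A^T @ dL/dC   (shape k x n, matching B)
    """
    return matmul(grad_c, transpose(b)), matmul(transpose(a), grad_c)
```

Note how the shapes work out: each gradient has the same shape as the matrix it corresponds to, which is a quick sanity check when implementing a custom backward pass.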