derivatives

Yongqiang Cheng · 2 days ago
Tags: gradient · derivative · abs · gradients · derivatives
Abs Function - Derivatives and Gradients (导数和梯度)
torch.abs(input: Tensor, *, out: Optional[Tensor]) -> Tensor
https://docs.pytorch.org/docs/stable/generated/torch.abs.html
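A minimal sketch of what this post covers (not code from the post itself): the derivative of |x| is sign(x) for x != 0, and PyTorch's autograd uses the subgradient 0 at x = 0. The sample values are illustrative.

```python
import torch

# d|x|/dx = sign(x) for x != 0; autograd returns the subgradient 0 at x = 0.
x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0], requires_grad=True)
torch.abs(x).sum().backward()
print(x.grad)  # tensor([-1., -1.,  0.,  1.,  1.])
```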
Yongqiang Cheng · 4 days ago
Tags: gradient · derivative · gradients · derivatives · softplus
Softplus Function - Derivatives and Gradients (导数和梯度)
class torch.nn.Softplus(beta=1.0, threshold=20.0)
https://docs.pytorch.org/docs/stable/generated/torch.nn.Softplus.html
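A minimal sketch (not from the post itself) of the relation the title refers to: Softplus(x) = (1/beta) * log(1 + exp(beta * x)), whose derivative is sigmoid(beta * x); here beta = 1.0 and the result is checked against autograd.

```python
import torch
import torch.nn as nn

# Softplus'(x) = sigmoid(beta * x); beta = 1.0 here.
x = torch.linspace(-3.0, 3.0, 7, requires_grad=True)
nn.Softplus(beta=1.0)(x).sum().backward()
print(torch.allclose(x.grad, torch.sigmoid(x.detach())))  # True
```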
Yongqiang Cheng · 10 days ago
Tags: gradient · derivative · gradients · derivatives · selu
SELU Function - Derivatives and Gradients (导数和梯度)
class torch.nn.SELU(inplace=False)
https://docs.pytorch.org/docs/stable/generated/torch.nn.SELU.html
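A minimal sketch (not from the post itself): SELU'(x) = scale for x > 0 and scale * alpha * exp(x) for x < 0, using the fixed alpha and scale constants that torch.nn.SELU hard-codes; the sample values are illustrative.

```python
import torch
import torch.nn as nn

# SELU'(x) = scale                  for x > 0
#          = scale * alpha * exp(x) for x < 0
alpha, scale = 1.6732632423543772, 1.0507009873554805
x = torch.tensor([-2.0, -0.5, 0.5, 2.0], requires_grad=True)
nn.SELU()(x).sum().backward()
xd = x.detach()
expected = torch.where(xd > 0, torch.full_like(xd, scale), scale * alpha * torch.exp(xd))
print(torch.allclose(x.grad, expected))  # True
```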
Yongqiang Cheng · 16 days ago
Tags: gradient · derivative · gradients · derivatives · elu
ELU Function - Derivatives and Gradients (导数和梯度)
class torch.nn.ELU(alpha=1.0, inplace=False)
https://docs.pytorch.org/docs/stable/generated/torch.nn.ELU.html
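A minimal sketch (not from the post itself): ELU'(x) = 1 for x > 0 and alpha * exp(x) for x <= 0, with alpha = 1.0, checked against autograd.

```python
import torch
import torch.nn as nn

# ELU'(x) = 1 for x > 0, alpha * exp(x) for x <= 0 (alpha = 1.0 here).
x = torch.tensor([-2.0, -0.5, 0.5, 2.0], requires_grad=True)
nn.ELU(alpha=1.0)(x).sum().backward()
xd = x.detach()
expected = torch.where(xd > 0, torch.ones_like(xd), torch.exp(xd))
print(torch.allclose(x.grad, expected))  # True
```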
Yongqiang Cheng · 17 days ago
Tags: gradient · derivative · tanh · gradients · derivatives
Tanh Function - Derivatives and Gradients (导数和梯度)
class torch.nn.Tanh(*args, **kwargs)
https://docs.pytorch.org/docs/stable/generated/torch.nn.Tanh.html
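A minimal sketch (not from the post itself): d tanh(x)/dx = 1 - tanh(x)^2, checked against autograd.

```python
import torch
import torch.nn as nn

# tanh'(x) = 1 - tanh(x)^2
x = torch.linspace(-3.0, 3.0, 7, requires_grad=True)
nn.Tanh()(x).sum().backward()
print(torch.allclose(x.grad, 1.0 - torch.tanh(x.detach()) ** 2))  # True
```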
Yongqiang Cheng · 18 days ago
Tags: gradient · derivative · relu · gradients · derivatives · leaky relu
ReLU Function and Leaky ReLU Function - Derivatives and Gradients (导数和梯度)
Activation functions decide whether a neuron should be activated by computing a weighted sum of its inputs and adding a bias. They are differentiable operators that transform input signals into outputs, and most of them add nonlinearity.
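A minimal sketch (not from the post itself): ReLU'(x) is 1 for x > 0 and 0 for x <= 0, while Leaky ReLU replaces the zero branch with negative_slope (0.01 by default); the sample values are illustrative.

```python
import torch
import torch.nn as nn

x = torch.tensor([-2.0, -0.5, 0.5, 2.0], requires_grad=True)

# ReLU'(x): 1 for x > 0, 0 for x <= 0.
nn.ReLU()(x).sum().backward()
print(x.grad)  # tensor([0., 0., 1., 1.])

# LeakyReLU'(x): 1 for x > 0, negative_slope for x <= 0.
x.grad = None
nn.LeakyReLU(negative_slope=0.01)(x).sum().backward()
print(x.grad)  # tensor([0.0100, 0.0100, 1.0000, 1.0000])
```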
Yongqiang Cheng · 1 month ago
Tags: gradient · derivative · mae · gradients · loss function · derivatives
Mean Absolute Error (MAE) Loss Function - Derivatives and Gradients (导数和梯度)
https://docs.pytorch.org/docs/stable/generated/torch.nn.L1Loss.html
https://github.com/pytorch/pytorch/blob/main/torch/nn/modules/loss.py
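A minimal sketch (not from the post itself): with reduction='mean', the gradient of torch.nn.L1Loss with respect to the input is sign(input - target) / N, with sign(0) contributing 0; the sample values are illustrative.

```python
import torch
import torch.nn as nn

# d L1Loss / d input = sign(input - target) / N  (N = 4 here).
pred = torch.tensor([0.5, -1.0, 2.0, 3.0], requires_grad=True)
target = torch.tensor([1.0, -1.5, 2.0, 1.0])
nn.L1Loss(reduction='mean')(pred, target).backward()
print(pred.grad)  # tensor([-0.2500,  0.2500,  0.0000,  0.2500])
```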