Computing Gradients of Special Functions in PyTorch
Ordinary functions

For a simple multivariate function, computing the gradient with respect to each variable is straightforward. For example:

$$f(x, y) = x^2 + y^2$$

Then:

$$\left\{
\begin{aligned}
\nabla_x f(x, y) &= 2x \\
\nabla_y f(x, y) &= 2y
\end{aligned}
\right.$$
```python
import torch

x = torch.tensor([1, 1, 1.0], requires_grad=True)
y = torch.tensor([2, 2, 2.0], requires_grad=True)
z = torch.pow(x, 2) + torch.pow(y, 2)
z.sum().backward()  # backpropagate from the scalar sum
x.grad, y.grad
```

```
(tensor([2., 2., 2.]), tensor([4., 4., 4.]))
```
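As a quick cross-check of the analytic result $\nabla_x f = 2x$ and $\nabla_y f = 2y$, the same gradients can also be obtained with `torch.autograd.grad`, which returns them directly instead of accumulating them into `.grad` (a minimal sketch of the same computation):

```python
import torch

x = torch.tensor([1, 1, 1.0], requires_grad=True)
y = torch.tensor([2, 2, 2.0], requires_grad=True)
z = torch.pow(x, 2) + torch.pow(y, 2)

# Returns the gradients as a tuple instead of writing them into .grad
gx, gy = torch.autograd.grad(z.sum(), (x, y))
print(gx, gy)  # expected: 2*x and 2*y, i.e. [2., 2., 2.] and [4., 4., 4.]
```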
Special functions
1. The Max function
Max usually returns the largest of several input elements; how do we compute its gradient?
$$f(x_0, x_1, \ldots, x_n) = \max(x_0, x_1, \ldots, x_n)$$

- Numerically find the maximum value $a$.
- Rewrite the function element-wise in terms of $a$:
  $$\max(x_0, x_1, \ldots, x_n) = \begin{cases} a & \text{if } x_i < a \\ x_i & \text{if } x_i = a \end{cases}$$
- After this transformation the gradient can be read off directly:
  $$\nabla_{x_i} f = \begin{cases} 0 & \text{if } x_i < a \\ 1 & \text{if } x_i = a \end{cases}$$
In PyTorch, if several elements are tied for the maximum, the gradient of 1 is split evenly among them:
```python
import torch

x = torch.tensor([1, 2, 3, 4, 4, 0.], requires_grad=True)
y = torch.max(x)  # two elements are tied for the maximum
y.backward()
x.grad
```

```
tensor([0.0000, 0.0000, 0.0000, 0.5000, 0.5000, 0.0000])
```
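This tie-splitting rule can be reproduced by hand: mask the elements equal to the maximum and divide by the number of ties. The following is only a sketch of the rule, not PyTorch's actual backward implementation:

```python
import torch

x = torch.tensor([1, 2, 3, 4, 4, 0.])

# Mask the elements tied for the maximum and split the incoming
# gradient (which is 1 here) evenly among them.
mask = (x == x.max()).float()
manual_grad = mask / mask.sum()
print(manual_grad)  # tensor([0.0000, 0.0000, 0.0000, 0.5000, 0.5000, 0.0000])
```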
2. The Clip function
When the input falls outside the given range, the output no longer depends on the input:

$$f(x) = \begin{cases} x & \text{if } a < x < b \\ a & \text{if } x \le a \\ b & \text{if } x \ge b \end{cases}$$
```python
import torch

x = torch.tensor([1, 2, 3, 4, 5, 6, 7.0], requires_grad=True)
y = torch.clip(x, 1.5, 5.5)  # values outside [1.5, 5.5] are clamped to the boundary
y.sum().backward()
x.grad
```

```
tensor([0., 1., 1., 1., 1., 0., 0.])
```
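The same gradient can be reconstructed by hand with a mask that is 1 inside the clipping range and 0 outside it (a sketch; values exactly on the boundaries 1.5 and 5.5 do not occur in this example):

```python
import torch

x = torch.tensor([1, 2, 3, 4, 5, 6, 7.0])
a, b = 1.5, 5.5

# Gradient of clip w.r.t. x: 1 where the input passes through unchanged,
# 0 where the output is pinned to a boundary.
manual_grad = ((x > a) & (x < b)).float()
print(manual_grad)  # tensor([0., 1., 1., 1., 1., 0., 0.])
```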