The relation between the Fisher matrix and the Hessian: proving that the Fisher information is the expectation of the Hessian of the negative log-likelihood

We prove that the Fisher information matrix equals the expectation of the Hessian of the negative log-likelihood.

Notation

  • $f_{\theta}(\cdot)$: the probability density function.

  • $p(x \mid \theta) = p_{\theta}(x) = \prod_{i=1}^{N} f_{\theta}(x_i)$: the likelihood function.

  • $s(\theta) = \nabla_{\theta} \log p_{\theta}(x)$: the score function, i.e. the gradient of the log-likelihood.

  • $I = E_{p_{\theta}(x)}[(\nabla_{\theta} \log p_{\theta}(x))(\nabla_{\theta} \log p_{\theta}(x))^{T}]$: the Fisher information matrix.

  • $I_{i,j}(\theta) = E_{p_{\theta}(x)}[(D_i \log p_{\theta}(x))(D_j \log p_{\theta}(x))]$: the entry in row $i$, column $j$ of the Fisher matrix, where $D_i = \frac{\partial}{\partial \theta_i}$ and $D_{i,j} = \frac{\partial^2}{\partial \theta_i \, \partial \theta_j}$.

  • $H_{i,j} = D_{i,j} \log p_{\theta}(x)$: the entry in row $i$, column $j$ of the Hessian of the log-likelihood.

Proof

The goal is to show:

$$I_{i,j}(\theta) = -E_{p_{\theta}(x)}[H_{i,j}]$$

Start from $H_{i,j}$ and expand with the quotient rule:

$$\begin{aligned}
H_{i,j} &= D_{i,j} \log p_{\theta}(x) \\
&= D_i\left(\frac{D_j\, p_{\theta}(x)}{p_{\theta}(x)}\right) \\
&= \frac{(D_{i,j}\, p_{\theta}(x))\, p_{\theta}(x) - D_i\, p_{\theta}(x)\, D_j\, p_{\theta}(x)}{p_{\theta}^2(x)} \\
&= \frac{D_{i,j}\, p_{\theta}(x)}{p_{\theta}(x)} - \frac{D_i\, p_{\theta}(x)}{p_{\theta}(x)} \cdot \frac{D_j\, p_{\theta}(x)}{p_{\theta}(x)}
\end{aligned}$$
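This quotient-rule expansion can be spot-checked numerically. Below is a small sketch (using a hypothetical $N(m, 1)$ density as the example, with finite differences standing in for exact derivatives):

```python
import numpy as np

# Finite-difference check of the expansion
#   D^2 log p = D^2 p / p - (D p / p)^2
# for a single parameter, using a hypothetical N(m, 1) density at a fixed x.
mu, x, h = 0.7, 1.3, 1e-5

def p(m):
    # N(m, 1) density evaluated at the fixed point x
    return np.exp(-(x - m) ** 2 / 2) / np.sqrt(2 * np.pi)

def d1(f, m):
    # central first difference in the parameter m
    return (f(m + h) - f(m - h)) / (2 * h)

def d2(f, m):
    # central second difference in the parameter m
    return (f(m + h) - 2 * f(m) + f(m - h)) / h ** 2

lhs = d2(lambda m: np.log(p(m)), mu)
rhs = d2(p, mu) / p(mu) - (d1(p, mu) / p(mu)) ** 2
print(lhs, rhs)  # both ≈ -1: for N(m, 1), the second m-derivative of log p is exactly -1
```

The two sides agree up to finite-difference error, and both match the closed-form value $-1$ for this density.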

Taking expectations, the right-hand side of the goal becomes:

$$-E_{p_{\theta}(x)}[H_{i,j}] = -E_{p_{\theta}(x)}\left[\frac{D_{i,j}\, p_{\theta}(x)}{p_{\theta}(x)}\right] + E_{p_{\theta}(x)}\left[\frac{D_i\, p_{\theta}(x)}{p_{\theta}(x)} \cdot \frac{D_j\, p_{\theta}(x)}{p_{\theta}(x)}\right]$$

For the first term (assuming the usual regularity conditions):

$$\begin{aligned}
E_{p_{\theta}(x)}\left[\frac{D_{i,j}\, p_{\theta}(x)}{p_{\theta}(x)}\right] &= \int \frac{D_{i,j}\, p_{\theta}(x)}{p_{\theta}(x)} \cdot p_{\theta}(x)\, dx \\
&= D_{i,j} \int p_{\theta}(x)\, dx && \text{(interchange differentiation and integration)} \\
&= D_{i,j}\, 1 && \text{(derivative of a constant is 0)} \\
&= 0
\end{aligned}$$
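The interchange step can be checked numerically: differentiating a density twice in its parameter and then integrating over $x$ should give (approximately) zero. A sketch under the same hypothetical Gaussian assumption:

```python
import numpy as np

# Numerical check that the integral of D^2 p(x|theta) over x vanishes,
# using a hypothetical N(mu, 1) density and finite differences in mu.
mu, h = 0.5, 1e-3
x = np.linspace(-15.0, 15.0, 60001)
dx = x[1] - x[0]

def density(m):
    return np.exp(-(x - m) ** 2 / 2) / np.sqrt(2 * np.pi)

# second derivative of the density w.r.t. mu, pointwise in x
d2p = (density(mu + h) - 2 * density(mu) + density(mu - h)) / h ** 2

integral = d2p.sum() * dx  # Riemann sum of D^2 p over x
print(integral)            # ≈ 0
```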

And by the chain rule for the logarithm:

$$\frac{D_i\, p_{\theta}(x)}{p_{\theta}(x)} = D_i \log p_{\theta}(x)$$

Hence the remaining term is exactly the Fisher entry:

$$E_{p_{\theta}(x)}\left[\frac{D_i\, p_{\theta}(x)}{p_{\theta}(x)} \cdot \frac{D_j\, p_{\theta}(x)}{p_{\theta}(x)}\right] = E_{p_{\theta}(x)}[(D_i \log p_{\theta}(x))(D_j \log p_{\theta}(x))] = I_{i,j}(\theta)$$

This completes the proof.
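As a sanity check, both sides of the identity can be computed exactly for a simple one-parameter model. The sketch below (a hypothetical Bernoulli example, not from the original text) evaluates $E[(D \log p)^2]$ and $-E[D^2 \log p]$ and compares them with the known closed form $1/(\theta(1-\theta))$:

```python
import numpy as np

# Exact check of I(theta) = -E[H] for a Bernoulli(theta) model.
# log p(x|theta) = x*log(theta) + (1-x)*log(1-theta)
# score:   D  log p = x/theta - (1-x)/(1-theta)
# Hessian: D^2 log p = -x/theta**2 - (1-x)/(1-theta)**2

theta = 0.3
xs = np.array([0.0, 1.0])
probs = np.array([1 - theta, theta])          # p(x|theta) for x in {0, 1}

score = xs / theta - (1 - xs) / (1 - theta)
hess = -xs / theta**2 - (1 - xs) / (1 - theta)**2

fisher = np.sum(probs * score**2)             # E[(D log p)^2]
neg_exp_hess = -np.sum(probs * hess)          # -E[D^2 log p]

print(fisher, neg_exp_hess, 1 / (theta * (1 - theta)))
```

All three values coincide, as the proof predicts.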

In practice, computing $H$ directly is very expensive, whereas computing $I$ and using it as an approximation to $H$ is comparatively cheap. Some pruning methods exploit this, e.g. NAP [Network Automatic Pruning: Start NAP and Take a Nap] (which builds on OBS and OBD).
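To illustrate why $I$ works as a Hessian surrogate, here is a toy sketch of my own (a logistic-regression example, not NAP's actual code): the empirical Fisher is built from per-sample gradient outer products, using only first-order information, and is compared against the exact Hessian of the average negative log-likelihood.

```python
import numpy as np

# Toy comparison of the empirical Fisher against the exact Hessian
# for logistic regression with labels drawn from the model itself.
rng = np.random.default_rng(0)
n, d = 5000, 3
X = rng.normal(size=(n, d))
w = rng.normal(size=d)                       # "true" parameters
p = 1 / (1 + np.exp(-X @ w))                 # model probabilities
y = (rng.random(n) < p).astype(float)        # labels sampled from the model

G = (p - y)[:, None] * X                     # per-sample NLL gradients
fisher = G.T @ G / n                         # empirical Fisher: mean of g g^T
hessian = (X * (p * (1 - p))[:, None]).T @ X / n  # exact average NLL Hessian

print(np.round(fisher, 2))
print(np.round(hessian, 2))
```

Because the labels are drawn from the model itself, the two matrices agree up to sampling noise; the Fisher needs only first-order gradients, which is what makes it attractive as a Hessian surrogate in pruning.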

References:

https://zhuanlan.zhihu.com/p/546885304
https://jaketae.github.io/study/fisher/
https://mark.reid.name/blog/fisher-information-and-log-likelihood.html
https://bobondemon.github.io/2022/01/07/Score-Function-and-Fisher-Information-Matrix/
