
Backpropagation Derivations for Fully-Connected and Convolutional Neural Networks

Backpropagation formulas for fully-connected networks

The four fundamental equations of backpropagation:

$$
\begin{aligned}
\delta_i^{(L)} &= \nabla_{y_i} Cost \cdot \sigma'\!\left(logit_i^{(L)}\right) \\
\delta_i^{(l)} &= \sum_j \delta_j^{(l+1)} w_{ji}^{(l+1)} \, \sigma'\!\left(logit_i^{(l)}\right) \\
\frac{\partial Cost}{\partial bias_i^{(l)}} &= \delta_i^{(l)} \\
\frac{\partial Cost}{\partial w_{ij}^{(l)}} &= \delta_i^{(l)} \, h_j^{(l-1)}
\end{aligned}
$$
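To make the four equations concrete, here is a minimal NumPy sketch (my own illustration, not from the original article) that computes the gradients for a single training example. It assumes sigmoid activations $\sigma$ and the squared-error cost $\frac{1}{2}\|h^{(L)} - target\|^2$, so that $\nabla_{y_i} Cost = h_i^{(L)} - target_i$; the function and variable names are mine:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

def backprop(weights, biases, x, target):
    """Gradients for one example via the four equations above.

    weights[l]: (n_l, n_{l-1}) matrix, biases[l]: (n_l,) vector.
    Cost = 0.5 * ||h^{(L)} - target||^2, so dCost/dy_i = h_i^{(L)} - target_i.
    """
    # Forward pass, storing every logit and activation.
    hs, logits = [x], []
    h = x
    for W, b in zip(weights, biases):
        z = W @ h + b              # logit^{(l)} = W^{(l)} h^{(l-1)} + bias^{(l)}
        logits.append(z)
        h = sigmoid(z)
        hs.append(h)

    # Equation 1: delta at the output layer L.
    delta = (hs[-1] - target) * sigmoid_prime(logits[-1])

    grads_W = [None] * len(weights)
    grads_b = [None] * len(weights)
    for l in range(len(weights) - 1, -1, -1):
        grads_b[l] = delta                    # Equation 3: dCost/dbias_i = delta_i
        grads_W[l] = np.outer(delta, hs[l])   # Equation 4: dCost/dw_ij = delta_i * h_j^{(l-1)}
        if l > 0:
            # Equation 2: pull delta back through the weights of the layer above.
            delta = (weights[l].T @ delta) * sigmoid_prime(logits[l - 1])
    return grads_W, grads_b
```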

Here the superscript $(l)$ denotes the $l$-th layer (there are $L$ layers in total), and $i, j$ are neuron indices.

The goal of the backpropagation formulas is to obtain $\frac{\partial Cost}{\partial bias_i^{(l)}}$ and $\frac{\partial Cost}{\partial w_{ij}^{(l)}}$.

In the course of the derivation, the chain rule gives

$$
\begin{aligned}
\frac{\partial Cost}{\partial bias_i^{(l)}} &= \frac{\partial Cost}{\partial logit_i^{(l)}} \cdot \frac{\partial logit_i^{(l)}}{\partial bias_i^{(l)}} \\
\frac{\partial Cost}{\partial w_{ij}^{(l)}} &= \frac{\partial Cost}{\partial logit_i^{(l)}} \cdot \frac{\partial logit_i^{(l)}}{\partial w_{ij}^{(l)}}
\end{aligned}
$$

and both expressions turn out to need the same quantity, $\frac{\partial Cost}{\partial logit_i^{(l)}}$.

The logit is built from the previous layer's activations $h^{(l-1)}$ (consistent with the fourth equation above):

$$logit_i^{(l)} = w_{ij}^{(l)} h_j^{(l-1)} + \sum_{k\ne j} w_{ik}^{(l)} h_k^{(l-1)} + bias_i^{(l)}$$

Therefore

$$
\begin{aligned}
\frac{\partial logit_i^{(l)}}{\partial bias_i^{(l)}} &= 1 \\
\frac{\partial logit_i^{(l)}}{\partial w_{ij}^{(l)}} &= h_j^{(l-1)}
\end{aligned}
$$
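These two derivatives plug straight into the chain rule above, and the resulting weight gradient $\delta_i^{(l)} h_j^{(l-1)}$ can be sanity-checked with a central finite difference. The sketch below is again my own illustration, reusing `sigmoid` and `backprop` from the block above; the network shape and indices are arbitrary:

```python
def cost(weights, biases, x, target):
    """Forward pass followed by the squared-error cost."""
    h = x
    for W, b in zip(weights, biases):
        h = sigmoid(W @ h + b)
    return 0.5 * np.sum((h - target) ** 2)

# A tiny 3-4-2 network with random parameters and one random example.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
biases = [rng.standard_normal(4), rng.standard_normal(2)]
x, target = rng.standard_normal(3), rng.standard_normal(2)

grads_W, grads_b = backprop(weights, biases, x, target)

# Central difference on one weight in the first layer (0-based index l=0).
eps, l, i, j = 1e-6, 0, 2, 1
weights[l][i, j] += eps
c_plus = cost(weights, biases, x, target)
weights[l][i, j] -= 2 * eps
c_minus = cost(weights, biases, x, target)
weights[l][i, j] += eps  # restore the original weight

print((c_plus - c_minus) / (2 * eps), grads_W[l][i, j])  # should agree closely
```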

The only remaining problem is to compute $\frac{\partial Cost}{\partial logit_i^{(l)}}$ itself, which can be done by recursion:

To keep the formulas compact, write $\frac{\partial Cost}{\partial logit_i^{(l)}}$ as $\delta_i^{(l)}$. Then

$$\delta_i^{(l)} = \frac{\partial Cost}{\partial logit_i^{(l)}} = \sum_j \frac{\partial Cost}{\partial logit_j^{(l+1)}} \cdot \frac{\partial logit_j^{(l+1)}}{\partial logit_i^{(l)}} = \sum_j \delta_j^{(l+1)} \cdot \frac{\partial logit_j^{(l+1)}}{\partial logit_i^{(l)}}$$

Since $logit_j^{(l+1)} = \sum_i w_{ji}^{(l+1)} \sigma\!\left(logit_i^{(l)}\right) + bias_j^{(l+1)}$, the last factor is $\frac{\partial logit_j^{(l+1)}}{\partial logit_i^{(l)}} = w_{ji}^{(l+1)} \sigma'\!\left(logit_i^{(l)}\right)$, which recovers the second of the four equations: $\delta_i^{(l)} = \sum_j \delta_j^{(l+1)} w_{ji}^{(l+1)} \sigma'(logit_i^{(l)})$.
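Because $\frac{\partial logit_i^{(l)}}{\partial bias_i^{(l)}} = 1$, the recursively computed $\delta_i^{(l)}$ equals $\frac{\partial Cost}{\partial bias_i^{(l)}}$, so the recursion itself can be verified by perturbing a hidden-layer bias. Continuing the sketch above (again my own check, not from the article):

```python
# Perturbing a hidden-layer bias exercises the backward recursion: the
# numeric derivative must match delta_i^{(1)} propagated down from layer 2.
l, i = 0, 3  # neuron 3 in the hidden layer (0-based layer index 0)
biases[l][i] += eps
c_plus = cost(weights, biases, x, target)
biases[l][i] -= 2 * eps
c_minus = cost(weights, biases, x, target)
biases[l][i] += eps  # restore

print((c_plus - c_minus) / (2 * eps), grads_b[l][i])  # should agree closely
```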
