Caffe Study Notes 3: Loss and Combining Multiple Losses

Original page: http://caffe.berkeleyvision.org/tutorial/loss.html


Loss

A Caffe model exists to optimize a loss. The loss is computed during the net's forward pass: each layer takes its input blobs and produces output blobs, and the outputs of certain layers are fed into the loss function. The loss most commonly used for classification is SoftmaxWithLoss, which we define in the network like this:

layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "pred"
  bottom: "label"
  top: "loss"
}

The top of a loss layer is a scalar: the loss averaged over the entire batch.
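
How that reduction is performed can be controlled through loss_param. As a minimal sketch (assuming the same pred and label blobs as above, and a Caffe version whose LossParameter supports the normalization field), the following explicitly asks for averaging over the batch size:

layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "pred"
  bottom: "label"
  top: "loss"
  # BATCH_SIZE divides the summed loss by the batch size;
  # other modes include FULL, VALID (the default), and NONE.
  loss_param { normalization: BATCH_SIZE }
}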


Loss weights: assigning weights to multiple losses

For nets with multiple layers producing a loss (e.g., a network that both classifies the input using a SoftmaxWithLoss layer and reconstructs it using a EuclideanLoss layer), loss weights can be used to specify their relative importance.
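
As an illustration (a sketch, not part of the original tutorial; the blob names pred, label, recon, and data are placeholders), such a net could weight classification normally while down-weighting reconstruction:

layer {
  name: "class_loss"
  type: "SoftmaxWithLoss"
  bottom: "pred"
  bottom: "label"
  top: "class_loss"
  loss_weight: 1      # classification is the primary objective
}
layer {
  name: "recon_loss"
  type: "EuclideanLoss"
  bottom: "recon"
  bottom: "data"
  top: "recon_loss"
  loss_weight: 0.1    # reconstruction acts as an auxiliary objective
}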

By convention, Caffe layer types with the suffix Loss contribute to the loss function, but other layers are assumed to be purely used for intermediate computations. However, any layer can be used as a loss by adding a field loss_weight: to a layer definition for each top blob produced by the layer. Layers with the suffix Loss have an implicit loss_weight: 1 for the first top blob (and loss_weight: 0 for any additional tops); other layers have an implicit loss_weight: 0 for all tops. So, the above SoftmaxWithLoss layer could be equivalently written as:

layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "pred"
  bottom: "label"
  top: "loss"
  loss_weight: 1
}

However, any layer able to backpropagate may be given a non-zero loss_weight, allowing one to, for example, regularize the activations produced by some intermediate layer(s) of the network if desired. For non-singleton outputs with an associated non-zero loss, the loss is computed simply by summing over all entries of the blob.
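
One way to realize this (a sketch under the assumption that the net has an intermediate activation blob named ip1) is to attach a Reduction layer with a small loss_weight, turning the squared magnitude of the activations into a penalty term:

layer {
  name: "act_decay"
  type: "Reduction"
  bottom: "ip1"                          # hypothetical intermediate blob
  top: "act_decay"
  reduction_param { operation: SUMSQ }   # sum of squared activations
  loss_weight: 0.01                      # small weight, so it acts as a regularizer
}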

The final loss in Caffe, then, is computed by summing the total weighted loss over the network, as in the following pseudo-code:

loss = 0
for layer in layers:
    for top, loss_weight in zip(layer.tops, layer.loss_weights):
        loss += loss_weight * sum(top)  # sum over all entries of the top blob