TF-Slim provides an easy-to-use mechanism for defining and keeping track of loss functions via the losses module. Consider the simple case where we want to train the VGG network:
```python
import tensorflow as tf

slim = tf.contrib.slim  # alias used below; missing from the original snippet
vgg = tf.contrib.slim.nets.vgg

# Load the images and labels.
images, labels = ...

# Create the model.
predictions, _ = vgg.vgg_16(images)

# Define the loss function and get the total loss.
loss = slim.losses.softmax_cross_entropy(predictions, labels)
```
In this example, we start by creating the model (using TF-Slim's VGG implementation) and add the standard classification loss. Now consider the case of a multi-task model that produces multiple outputs:

```python
# Load the images and labels.
images, scene_labels, depth_labels = ...

# Create the model.
scene_predictions, depth_predictions = CreateMultiTaskModel(images)

# Define the loss functions and get the total loss.
classification_loss = slim.losses.softmax_cross_entropy(scene_predictions, scene_labels)
sum_of_squares_loss = slim.losses.sum_of_squares(depth_predictions, depth_labels)

# The following two lines have the same effect:
total_loss = classification_loss + sum_of_squares_loss
total_loss = slim.losses.get_total_loss(add_regularization_losses=False)
```
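How does get_total_loss know which losses to sum? When a loss is created through TF-Slim, it is added to a special TensorFlow collection of losses, so you can either manage the total loss manually or let TF-Slim collect it for you. A minimal sketch of inspecting that collection, assuming the TF 1.x contrib API:

```python
# Every loss created via slim.losses is recorded in the
# tf.GraphKeys.LOSSES collection; get_losses() reads that collection back.
all_losses = slim.losses.get_losses()
manual_total = tf.add_n(all_losses)  # the same sum get_total_loss() builds
```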
What if you want to let TF-Slim manage the losses for you but you have a custom loss function? loss_ops.py also has a function that adds this loss to TF-Slim's collection. For example:
```python
# Load the images and labels.
images, scene_labels, depth_labels, pose_labels = ...

# Create the model.
scene_predictions, depth_predictions, pose_predictions = CreateMultiTaskModel(images)

# Define the loss functions and get the total loss.
classification_loss = slim.losses.softmax_cross_entropy(scene_predictions, scene_labels)
sum_of_squares_loss = slim.losses.sum_of_squares(depth_predictions, depth_labels)
pose_loss = MyCustomLossFunction(pose_predictions, pose_labels)
slim.losses.add_loss(pose_loss)  # Letting TF-Slim know about the additional loss.

# The following two ways to compute the total loss are equivalent:
regularization_loss = tf.add_n(slim.losses.get_regularization_losses())
total_loss1 = classification_loss + sum_of_squares_loss + pose_loss + regularization_loss

# (Regularization loss is included in the total loss by default.)
total_loss2 = slim.losses.get_total_loss()
```
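Note that MyCustomLossFunction is just a placeholder. Purely as an illustration (the L1 formulation below is an assumption, not part of the original example), such a function only needs to return a scalar tensor:

```python
def MyCustomLossFunction(predictions, labels):
  # Hypothetical custom loss: a plain L1 distance built from core TF ops.
  # Any scalar tensor works here, as long as it is then registered via
  # slim.losses.add_loss so TF-Slim can track it.
  return tf.reduce_mean(tf.abs(predictions - labels))
```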
In this example, we can again either produce the total loss manually, or let TF-Slim know about the additional loss and have TF-Slim handle the bookkeeping.
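From here, the collected total loss is typically handed to the training loop. A minimal sketch, assuming the TF 1.x slim.learning API:

```python
# Build a train op that minimizes the tracked total loss.
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001)
train_op = slim.learning.create_train_op(total_loss2, optimizer)
```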