TF-Slim provides a simple but powerful set of tools, found in learning.py, for training models. These include a train function that repeatedly measures the loss, computes gradients, and saves the model to disk, as well as several convenience functions for manipulating gradients. For example, once we have specified the model, the loss function, and the optimization scheme, we can call slim.learning.create_train_op and slim.learning.train to perform the optimization:
g = tf.Graph()

# Create the model and specify the losses...
...

total_loss = slim.losses.get_total_loss()
optimizer = tf.train.GradientDescentOptimizer(learning_rate)

# create_train_op ensures that each time we ask for the loss, the update_ops
# are run and the gradients being computed are applied too.
train_op = slim.learning.create_train_op(total_loss, optimizer)
logdir = ...  # Where checkpoints are stored.

slim.learning.train(
    train_op,
    logdir,
    number_of_steps=1000,
    save_summaries_secs=300,
    save_interval_secs=600)
In this example, slim.learning.train is given the train_op, which it uses both to compute the loss and to apply the gradient step, while logdir specifies where checkpoints and event files are stored. number_of_steps=1000 caps training at 1000 gradient steps, save_summaries_secs=300 computes and saves summaries every 5 minutes, and save_interval_secs=600 saves a model checkpoint every 10 minutes.

To illustrate this end to end, here is a complete working example that trains the VGG-16 model:

import tensorflow as tf

slim = tf.contrib.slim
vgg = tf.contrib.slim.nets.vgg

...

train_log_dir = ...
if not tf.gfile.Exists(train_log_dir):
  tf.gfile.MakeDirs(train_log_dir)

with tf.Graph().as_default():
  # Set up the data loading:
  images, labels = ...

  # Define the model. vgg_16 returns both the logits and a dict of
  # end points; only the logits are needed here.
  predictions, _ = vgg.vgg_16(images, is_training=True)

  # Specify the loss function:
  slim.losses.softmax_cross_entropy(predictions, labels)

  total_loss = slim.losses.get_total_loss()
  tf.summary.scalar('losses/total_loss', total_loss)

  # Specify the optimization scheme:
  optimizer = tf.train.GradientDescentOptimizer(learning_rate=.001)

  # create_train_op ensures that when we evaluate it to get the loss,
  # the update_ops are run and the gradient updates are computed.
  train_tensor = slim.learning.create_train_op(total_loss, optimizer)

  # Actually runs training.
  slim.learning.train(train_tensor, train_log_dir)
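
The convenience functions for manipulating gradients mentioned above are exposed through arguments to slim.learning.create_train_op, such as clip_gradient_norm and gradient_multipliers. The following is a minimal sketch using those two arguments; the variable name 'my_conv/biases' is a hypothetical placeholder used only for illustration:

# Clip each gradient to a maximum norm of 4 before it is applied:
train_op = slim.learning.create_train_op(
    total_loss,
    optimizer,
    clip_gradient_norm=4)

# Alternatively, scale the gradients of selected variables, e.g. to train
# a bias at twice the effective learning rate. 'my_conv/biases' is a
# hypothetical variable name used only for illustration.
gradient_multipliers = {'my_conv/biases': 2.0}
train_op = slim.learning.create_train_op(
    total_loss,
    optimizer,
    gradient_multipliers=gradient_multipliers)

Either train_op can then be passed to slim.learning.train exactly as above.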