
MNIST Model Testing

There are plenty of tutorials online about training an MNIST model with Caffe, but few explain how to test the trained model. In fact, all you need to do is run the following from the Caffe root directory:

:/caffe-master$ ./build/tools/caffe.bin test \
    -model examples/mnist/lenet_train_test.prototxt \
    -weights examples/mnist/lenet_iter_10000.caffemodel \
    -iterations 100

Explanation of the command:

./build/tools/caffe.bin test : performs prediction only (forward computation); no parameters are updated (no backward computation for learning).

-model examples/mnist/lenet_train_test.prototxt : specifies the model definition text file (prototxt).

-weights examples/mnist/lenet_iter_10000.caffemodel : specifies the pre-trained weights file.

-iterations 100 : specifies the number of test iterations. The number of samples tested is iterations * batch_size, where batch_size is defined in the model prototxt; with batch_size = 100, 100 iterations cover exactly the 10,000 MNIST test samples. An equivalent test through the Python interface is sketched below.
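
For reference, here is a minimal pycaffe sketch of the same test (an assumption: the Python bindings were built for this Caffe tree and are importable). It loads the TEST phase of the same prototxt with the trained weights, runs 100 forward passes, and averages the per-batch accuracy and loss outputs, mirroring what caffe.bin test does.

import caffe

caffe.set_mode_cpu()  # matches the "Use CPU." line in the log below

# Load the TEST-phase net together with the trained weights.
net = caffe.Net('examples/mnist/lenet_train_test.prototxt',
                'examples/mnist/lenet_iter_10000.caffemodel',
                caffe.TEST)

iterations = 100                 # batch_size = 100 in the prototxt, so 100 * 100 = 10000 samples
acc_sum, loss_sum = 0.0, 0.0
for _ in range(iterations):
    out = net.forward()                # one forward pass consumes one batch from mnist_test_lmdb
    acc_sum += float(out['accuracy'])  # 'accuracy' and 'loss' are the net's declared outputs
    loss_sum += float(out['loss'])

print('accuracy = %.4f, loss = %.4f' % (acc_sum / iterations, loss_sum / iterations))

Each net.forward() call advances the Data layer through the LMDB by one batch, so 100 calls cover the whole test set, just like -iterations 100 on the command line.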

The output is shown below:




I1116 10:44:50.146291  3247 caffe.cpp:279] Use CPU.

I1116 10:44:51.683002  3247 net.cpp:322] The NetState phase (1) differed from the phase (0) specified by a rule in layer mnist  //test only: the TRAIN-phase data layer is excluded
I1116 10:44:51.683157  3247 net.cpp:58] Initializing net from parameters:
name: "LeNet"
state {
  phase: TEST
  level: 0
  stage: ""
}
layer {
  name: "mnist"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TEST
  }
  transform_param {
    scale: 0.00390625
  }
  data_param {
    source: "examples/mnist/mnist_test_lmdb"
    batch_size: 100
    backend: LMDB
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 20
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 2
    stride: 2
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 50
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 2
    stride: 2
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool2"
  top: "ip1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 500
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "ip1"
  top: "ip1"
}
layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "ip1"
  top: "ip2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "ip2"
  bottom: "label"
  top: "accuracy"
  include {
    phase: TEST
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip2"
  bottom: "label"
  top: "loss"
}
I1116 10:44:51.683599  3247 layer_factory.hpp:77] Creating layer mnist
I1116 10:44:51.709772  3247 net.cpp:100] Creating Layer mnist
I1116 10:44:51.709861  3247 net.cpp:408] mnist -> data
I1116 10:44:51.709889  3247 net.cpp:408] mnist -> label
I1116 10:44:51.734688  3260 db_lmdb.cpp:35] Opened lmdb examples/mnist/mnist_test_lmdb   //opening the test data

I1116 10:44:51.741736  3247 data_layer.cpp:41] output data size: 100,1,28,28   //the data blob is a 4-D array of shape (100,1,28,28)

I1116 10:44:51.742224  3247 net.cpp:150] Setting up mnist

I1116 10:44:51.742274  3247 net.cpp:157] Top shape: 100 1 28 28 (78400)
I1116 10:44:51.742281  3247 net.cpp:157] Top shape: 100 (100)

I1116 10:44:51.742300  3247 net.cpp:165] Memory required for data: 314000

I1116 10:44:51.742311  3247 layer_factory.hpp:77] Creating layer label_mnist_1_split
I1116 10:44:51.742331  3247 net.cpp:100] Creating Layer label_mnist_1_split
I1116 10:44:51.742339  3247 net.cpp:434] label_mnist_1_split <- label
I1116 10:44:51.742365  3247 net.cpp:408] label_mnist_1_split -> label_mnist_1_split_0
I1116 10:44:51.742377  3247 net.cpp:408] label_mnist_1_split -> label_mnist_1_split_1
I1116 10:44:51.742401  3247 net.cpp:150] Setting up label_mnist_1_split
I1116 10:44:51.742409  3247 net.cpp:157] Top shape: 100 (100)
I1116 10:44:51.742415  3247 net.cpp:157] Top shape: 100 (100)
I1116 10:44:51.742420  3247 net.cpp:165] Memory required for data: 314800
I1116 10:44:51.742426  3247 layer_factory.hpp:77] Creating layer conv1
I1116 10:44:51.742442  3247 net.cpp:100] Creating Layer conv1
I1116 10:44:51.742449  3247 net.cpp:434] conv1 <- data
I1116 10:44:51.742458  3247 net.cpp:408] conv1 -> conv1
I1116 10:44:51.743266  3261 blocking_queue.cpp:50] Waiting for data
I1116 10:44:53.022564  3247 net.cpp:150] Setting up conv1
I1116 10:44:53.022596  3247 net.cpp:157] Top shape: 100 20 24 24 (1152000)
I1116 10:44:53.022616  3247 net.cpp:165] Memory required for data: 4922800
I1116 10:44:53.022687  3247 layer_factory.hpp:77] Creating layer pool1
I1116 10:44:53.022701  3247 net.cpp:100] Creating Layer pool1
I1116 10:44:53.022711  3247 net.cpp:434] pool1 <- conv1
I1116 10:44:53.022718  3247 net.cpp:408] pool1 -> pool1
I1116 10:44:53.022737  3247 net.cpp:150] Setting up pool1
I1116 10:44:53.022745  3247 net.cpp:157] Top shape: 100 20 12 12 (288000)
I1116 10:44:53.022750  3247 net.cpp:165] Memory required for data: 6074800
I1116 10:44:53.022756  3247 layer_factory.hpp:77] Creating layer conv2
I1116 10:44:53.022768  3247 net.cpp:100] Creating Layer conv2
I1116 10:44:53.022774  3247 net.cpp:434] conv2 <- pool1
I1116 10:44:53.022797  3247 net.cpp:408] conv2 -> conv2
I1116 10:44:53.023576  3247 net.cpp:150] Setting up conv2
I1116 10:44:53.023591  3247 net.cpp:157] Top shape: 100 50 8 8 (320000)
I1116 10:44:53.023597  3247 net.cpp:165] Memory required for data: 7354800
I1116 10:44:53.023622  3247 layer_factory.hpp:77] Creating layer pool2
I1116 10:44:53.023629  3247 net.cpp:100] Creating Layer pool2
I1116 10:44:53.023635  3247 net.cpp:434] pool2 <- conv2
I1116 10:44:53.023643  3247 net.cpp:408] pool2 -> pool2
I1116 10:44:53.023665  3247 net.cpp:150] Setting up pool2
I1116 10:44:53.023674  3247 net.cpp:157] Top shape: 100 50 4 4 (80000)
I1116 10:44:53.023679  3247 net.cpp:165] Memory required for data: 7674800
I1116 10:44:53.023685  3247 layer_factory.hpp:77] Creating layer ip1
I1116 10:44:53.023696  3247 net.cpp:100] Creating Layer ip1
I1116 10:44:53.023702  3247 net.cpp:434] ip1 <- pool2
I1116 10:44:53.023710  3247 net.cpp:408] ip1 -> ip1
I1116 10:44:53.026619  3247 net.cpp:150] Setting up ip1
I1116 10:44:53.026635  3247 net.cpp:157] Top shape: 100 500 (50000)
I1116 10:44:53.026641  3247 net.cpp:165] Memory required for data: 7874800
I1116 10:44:53.026666  3247 layer_factory.hpp:77] Creating layer relu1
I1116 10:44:53.026675  3247 net.cpp:100] Creating Layer relu1
I1116 10:44:53.026681  3247 net.cpp:434] relu1 <- ip1
I1116 10:44:53.026688  3247 net.cpp:395] relu1 -> ip1 (in-place)
I1116 10:44:53.026942  3247 net.cpp:150] Setting up relu1
I1116 10:44:53.026954  3247 net.cpp:157] Top shape: 100 500 (50000)
I1116 10:44:53.026960  3247 net.cpp:165] Memory required for data: 8074800
I1116 10:44:53.026980  3247 layer_factory.hpp:77] Creating layer ip2
I1116 10:44:53.026989  3247 net.cpp:100] Creating Layer ip2
I1116 10:44:53.026995  3247 net.cpp:434] ip2 <- ip1
I1116 10:44:53.027017  3247 net.cpp:408] ip2 -> ip2
I1116 10:44:53.027065  3247 net.cpp:150] Setting up ip2
I1116 10:44:53.027086  3247 net.cpp:157] Top shape: 100 10 (1000)
I1116 10:44:53.027091  3247 net.cpp:165] Memory required for data: 8078800
I1116 10:44:53.027112  3247 layer_factory.hpp:77] Creating layer ip2_ip2_0_split
I1116 10:44:53.027120  3247 net.cpp:100] Creating Layer ip2_ip2_0_split
I1116 10:44:53.027125  3247 net.cpp:434] ip2_ip2_0_split <- ip2
I1116 10:44:53.027132  3247 net.cpp:408] ip2_ip2_0_split -> ip2_ip2_0_split_0
I1116 10:44:53.027140  3247 net.cpp:408] ip2_ip2_0_split -> ip2_ip2_0_split_1
I1116 10:44:53.027153  3247 net.cpp:150] Setting up ip2_ip2_0_split
I1116 10:44:53.027173  3247 net.cpp:157] Top shape: 100 10 (1000)
I1116 10:44:53.027179  3247 net.cpp:157] Top shape: 100 10 (1000)
I1116 10:44:53.027184  3247 net.cpp:165] Memory required for data: 8086800
I1116 10:44:53.027189  3247 layer_factory.hpp:77] Creating layer accuracy
I1116 10:44:53.027199  3247 net.cpp:100] Creating Layer accuracy

I1116 10:44:53.027205  3247 net.cpp:434] accuracy <- ip2_ip2_0_split_0    //ip2's output splits in two: split_0 goes to the accuracy layer

I1116 10:44:53.027211  3247 net.cpp:434] accuracy <- label_mnist_1_split_0

I1116 10:44:53.027218  3247 net.cpp:408] accuracy -> accuracy
I1116 10:44:53.027775  3247 net.cpp:150] Setting up accuracy
I1116 10:44:53.027791  3247 net.cpp:157] Top shape: (1)
I1116 10:44:53.027797  3247 net.cpp:165] Memory required for data: 8086804
I1116 10:44:53.027817  3247 layer_factory.hpp:77] Creating layer loss
I1116 10:44:53.027825  3247 net.cpp:100] Creating Layer loss
I1116 10:44:53.027832  3247 net.cpp:434] loss <- ip2_ip2_0_split_1  //and split_1 goes to the loss layer
I1116 10:44:53.027859  3247 net.cpp:434] loss <- label_mnist_1_split_1
I1116 10:44:53.027868  3247 net.cpp:408] loss -> loss
I1116 10:44:53.027899  3247 layer_factory.hpp:77] Creating layer loss
I1116 10:44:53.028553  3247 net.cpp:150] Setting up loss
I1116 10:44:53.028569  3247 net.cpp:157] Top shape: (1)
I1116 10:44:53.028575  3247 net.cpp:160]     with loss weight 1
I1116 10:44:53.028594  3247 net.cpp:165] Memory required for data: 8086808
I1116 10:44:53.028599  3247 net.cpp:226] loss needs backward computation.
I1116 10:44:53.028605  3247 net.cpp:228] accuracy does not need backward computation.
I1116 10:44:53.028611  3247 net.cpp:226] ip2_ip2_0_split needs backward computation.
I1116 10:44:53.028616  3247 net.cpp:226] ip2 needs backward computation.
I1116 10:44:53.028621  3247 net.cpp:226] relu1 needs backward computation.
I1116 10:44:53.028626  3247 net.cpp:226] ip1 needs backward computation.
I1116 10:44:53.028631  3247 net.cpp:226] pool2 needs backward computation.
I1116 10:44:53.028636  3247 net.cpp:226] conv2 needs backward computation.
I1116 10:44:53.028641  3247 net.cpp:226] pool1 needs backward computation.
I1116 10:44:53.028646  3247 net.cpp:226] conv1 needs backward computation.
I1116 10:44:53.028651  3247 net.cpp:228] label_mnist_1_split does not need backward computation.
I1116 10:44:53.028657  3247 net.cpp:228] mnist does not need backward computation.
I1116 10:44:53.028661  3247 net.cpp:270] This network produces output accuracy
I1116 10:44:53.028667  3247 net.cpp:270] This network produces output loss
I1116 10:44:53.028678  3247 net.cpp:283] Network initialization done.
I1116 10:44:53.057978  3247 caffe.cpp:285] Running for 100 iterations.
I1116 10:44:53.126924  3247 caffe.cpp:308] Batch 0, accuracy = 1
I1116 10:44:53.126963  3247 caffe.cpp:308] Batch 0, loss = 0.0061212
I1116 10:44:53.168745  3247 caffe.cpp:308] Batch 1, accuracy = 0.99
I1116 10:44:53.168772  3247 caffe.cpp:308] Batch 1, loss = 0.0133515
I1116 10:44:53.210995  3247 caffe.cpp:308] Batch 2, accuracy = 0.99
I1116 10:44:53.211024  3247 caffe.cpp:308] Batch 2, loss = 0.0117903
I1116 10:44:53.253181  3247 caffe.cpp:308] Batch 3, accuracy = 0.99
I1116 10:44:53.253211  3247 caffe.cpp:308] Batch 3, loss = 0.0317178
I1116 10:44:53.294770  3247 caffe.cpp:308] Batch 4, accuracy = 0.99
I1116 10:44:53.294795  3247 caffe.cpp:308] Batch 4, loss = 0.0568775
I1116 10:44:53.336119  3247 caffe.cpp:308] Batch 5, accuracy = 0.99
I1116 10:44:53.336146  3247 caffe.cpp:308] Batch 5, loss = 0.0253471
I1116 10:44:53.377841  3247 caffe.cpp:308] Batch 6, accuracy = 0.97
I1116 10:44:53.377868  3247 caffe.cpp:308] Batch 6, loss = 0.0566137
I1116 10:44:53.419960  3247 caffe.cpp:308] Batch 7, accuracy = 0.99
I1116 10:44:53.419988  3247 caffe.cpp:308] Batch 7, loss = 0.0153878
I1116 10:44:53.462306  3247 caffe.cpp:308] Batch 8, accuracy = 1
I1116 10:44:53.462332  3247 caffe.cpp:308] Batch 8, loss = 0.00515131
I1116 10:44:53.504509  3247 caffe.cpp:308] Batch 9, accuracy = 0.99
I1116 10:44:53.504535  3247 caffe.cpp:308] Batch 9, loss = 0.0175177
I1116 10:44:53.547013  3247 caffe.cpp:308] Batch 10, accuracy = 0.98
I1116 10:44:53.547040  3247 caffe.cpp:308] Batch 10, loss = 0.0777676
I1116 10:44:53.589557  3247 caffe.cpp:308] Batch 11, accuracy = 0.98
I1116 10:44:53.589584  3247 caffe.cpp:308] Batch 11, loss = 0.0444185
I1116 10:44:53.632716  3247 caffe.cpp:308] Batch 12, accuracy = 0.95
I1116 10:44:53.632742  3247 caffe.cpp:308] Batch 12, loss = 0.148465
I1116 10:44:53.674947  3247 caffe.cpp:308] Batch 13, accuracy = 0.98
I1116 10:44:53.674973  3247 caffe.cpp:308] Batch 13, loss = 0.0589911
I1116 10:44:53.717653  3247 caffe.cpp:308] Batch 14, accuracy = 1
I1116 10:44:53.717679  3247 caffe.cpp:308] Batch 14, loss = 0.00966478
I1116 10:44:53.759479  3247 caffe.cpp:308] Batch 15, accuracy = 0.99
I1116 10:44:53.759534  3247 caffe.cpp:308] Batch 15, loss = 0.0354297
I1116 10:44:53.802633  3247 caffe.cpp:308] Batch 16, accuracy = 0.99
I1116 10:44:53.802661  3247 caffe.cpp:308] Batch 16, loss = 0.0254676
I1116 10:44:53.845255  3247 caffe.cpp:308] Batch 17, accuracy = 0.99
I1116 10:44:53.845299  3247 caffe.cpp:308] Batch 17, loss = 0.0211856
I1116 10:44:53.886400  3247 caffe.cpp:308] Batch 18, accuracy = 0.99
I1116 10:44:53.886426  3247 caffe.cpp:308] Batch 18, loss = 0.0162245
I1116 10:44:53.927749  3247 caffe.cpp:308] Batch 19, accuracy = 0.98
I1116 10:44:53.927774  3247 caffe.cpp:308] Batch 19, loss = 0.0700603
I1116 10:44:53.969600  3247 caffe.cpp:308] Batch 20, accuracy = 0.98
I1116 10:44:53.969645  3247 caffe.cpp:308] Batch 20, loss = 0.0812266
I1116 10:44:54.011916  3247 caffe.cpp:308] Batch 21, accuracy = 0.97
I1116 10:44:54.011941  3247 caffe.cpp:308] Batch 21, loss = 0.0836177
I1116 10:44:54.054926  3247 caffe.cpp:308] Batch 22, accuracy = 0.99
I1116 10:44:54.054952  3247 caffe.cpp:308] Batch 22, loss = 0.0409367
I1116 10:44:54.096271  3247 caffe.cpp:308] Batch 23, accuracy = 0.98
I1116 10:44:54.096295  3247 caffe.cpp:308] Batch 23, loss = 0.0364728
I1116 10:44:54.137509  3247 caffe.cpp:308] Batch 24, accuracy = 0.99
I1116 10:44:54.137537  3247 caffe.cpp:308] Batch 24, loss = 0.0299302
I1116 10:44:54.179723  3247 caffe.cpp:308] Batch 25, accuracy = 0.99
I1116 10:44:54.179762  3247 caffe.cpp:308] Batch 25, loss = 0.0700943
I1116 10:44:54.222487  3247 caffe.cpp:308] Batch 26, accuracy = 0.99
I1116 10:44:54.222512  3247 caffe.cpp:308] Batch 26, loss = 0.110037
I1116 10:44:54.264148  3247 caffe.cpp:308] Batch 27, accuracy = 1
I1116 10:44:54.264174  3247 caffe.cpp:308] Batch 27, loss = 0.0183853
I1116 10:44:54.305537  3247 caffe.cpp:308] Batch 28, accuracy = 0.99
I1116 10:44:54.305567  3247 caffe.cpp:308] Batch 28, loss = 0.0425091
I1116 10:44:54.347069  3247 caffe.cpp:308] Batch 29, accuracy = 0.96
I1116 10:44:54.347095  3247 caffe.cpp:308] Batch 29, loss = 0.137452
I1116 10:44:54.388613  3247 caffe.cpp:308] Batch 30, accuracy = 0.99
I1116 10:44:54.388655  3247 caffe.cpp:308] Batch 30, loss = 0.0188697
I1116 10:44:54.431311  3247 caffe.cpp:308] Batch 31, accuracy = 1
I1116 10:44:54.431339  3247 caffe.cpp:308] Batch 31, loss = 0.00298686
I1116 10:44:54.474185  3247 caffe.cpp:308] Batch 32, accuracy = 1
I1116 10:44:54.474213  3247 caffe.cpp:308] Batch 32, loss = 0.00986821
I1116 10:44:54.516254  3247 caffe.cpp:308] Batch 33, accuracy = 1
I1116 10:44:54.516280  3247 caffe.cpp:308] Batch 33, loss = 0.00496284
I1116 10:44:54.558568  3247 caffe.cpp:308] Batch 34, accuracy = 0.98
I1116 10:44:54.558593  3247 caffe.cpp:308] Batch 34, loss = 0.0732583
I1116 10:44:54.600500  3247 caffe.cpp:308] Batch 35, accuracy = 0.95
I1116 10:44:54.600528  3247 caffe.cpp:308] Batch 35, loss = 0.159537
I1116 10:44:54.643375  3247 caffe.cpp:308] Batch 36, accuracy = 1
I1116 10:44:54.643406  3247 caffe.cpp:308] Batch 36, loss = 0.00401762
I1116 10:44:54.685214  3247 caffe.cpp:308] Batch 37, accuracy = 0.99
I1116 10:44:54.685256  3247 caffe.cpp:308] Batch 37, loss = 0.0490689
I1116 10:44:54.727855  3247 caffe.cpp:308] Batch 38, accuracy = 0.99
I1116 10:44:54.727882  3247 caffe.cpp:308] Batch 38, loss = 0.0382032
I1116 10:44:54.769639  3247 caffe.cpp:308] Batch 39, accuracy = 0.99
I1116 10:44:54.769665  3247 caffe.cpp:308] Batch 39, loss = 0.0403106
I1116 10:44:54.811323  3247 caffe.cpp:308] Batch 40, accuracy = 1
I1116 10:44:54.811364  3247 caffe.cpp:308] Batch 40, loss = 0.0177203
I1116 10:44:54.852814  3247 caffe.cpp:308] Batch 41, accuracy = 0.98
I1116 10:44:54.852843  3247 caffe.cpp:308] Batch 41, loss = 0.0884625
I1116 10:44:54.894551  3247 caffe.cpp:308] Batch 42, accuracy = 1
I1116 10:44:54.894604  3247 caffe.cpp:308] Batch 42, loss = 0.0325108
I1116 10:44:54.936128  3247 caffe.cpp:308] Batch 43, accuracy = 0.99
I1116 10:44:54.936153  3247 caffe.cpp:308] Batch 43, loss = 0.0148244
I1116 10:44:54.978384  3247 caffe.cpp:308] Batch 44, accuracy = 0.99
I1116 10:44:54.978411  3247 caffe.cpp:308] Batch 44, loss = 0.0352546
I1116 10:44:55.020856  3247 caffe.cpp:308] Batch 45, accuracy = 0.98
I1116 10:44:55.020906  3247 caffe.cpp:308] Batch 45, loss = 0.0425254
I1116 10:44:55.063021  3247 caffe.cpp:308] Batch 46, accuracy = 1
I1116 10:44:55.063052  3247 caffe.cpp:308] Batch 46, loss = 0.00386924
I1116 10:44:55.105479  3247 caffe.cpp:308] Batch 47, accuracy = 0.99
I1116 10:44:55.105521  3247 caffe.cpp:308] Batch 47, loss = 0.0165543
I1116 10:44:55.147413  3247 caffe.cpp:308] Batch 48, accuracy = 0.95
I1116 10:44:55.147451  3247 caffe.cpp:308] Batch 48, loss = 0.0724568
I1116 10:44:55.189774  3247 caffe.cpp:308] Batch 49, accuracy = 0.99
I1116 10:44:55.189800  3247 caffe.cpp:308] Batch 49, loss = 0.0166493
I1116 10:44:55.231638  3247 caffe.cpp:308] Batch 50, accuracy = 1
I1116 10:44:55.231664  3247 caffe.cpp:308] Batch 50, loss = 0.000193432
I1116 10:44:55.273180  3247 caffe.cpp:308] Batch 51, accuracy = 1
I1116 10:44:55.273211  3247 caffe.cpp:308] Batch 51, loss = 0.00549245
I1116 10:44:55.315142  3247 caffe.cpp:308] Batch 52, accuracy = 1
I1116 10:44:55.315170  3247 caffe.cpp:308] Batch 52, loss = 0.00735993
I1116 10:44:55.357081  3247 caffe.cpp:308] Batch 53, accuracy = 1
I1116 10:44:55.357106  3247 caffe.cpp:308] Batch 53, loss = 0.00249736
I1116 10:44:55.398986  3247 caffe.cpp:308] Batch 54, accuracy = 1
I1116 10:44:55.399014  3247 caffe.cpp:308] Batch 54, loss = 0.00380324
I1116 10:44:55.441495  3247 caffe.cpp:308] Batch 55, accuracy = 1
I1116 10:44:55.441521  3247 caffe.cpp:308] Batch 55, loss = 0.000834018
I1116 10:44:55.484167  3247 caffe.cpp:308] Batch 56, accuracy = 1
I1116 10:44:55.484194  3247 caffe.cpp:308] Batch 56, loss = 0.0137293
I1116 10:44:55.526473  3247 caffe.cpp:308] Batch 57, accuracy = 1
I1116 10:44:55.526501  3247 caffe.cpp:308] Batch 57, loss = 0.00453061
I1116 10:44:55.567912  3247 caffe.cpp:308] Batch 58, accuracy = 1
I1116 10:44:55.567937  3247 caffe.cpp:308] Batch 58, loss = 0.00564778
I1116 10:44:55.610164  3247 caffe.cpp:308] Batch 59, accuracy = 0.97
I1116 10:44:55.610190  3247 caffe.cpp:308] Batch 59, loss = 0.1065
I1116 10:44:55.651887  3247 caffe.cpp:308] Batch 60, accuracy = 1
I1116 10:44:55.651913  3247 caffe.cpp:308] Batch 60, loss = 0.00507791
I1116 10:44:55.693601  3247 caffe.cpp:308] Batch 61, accuracy = 1
I1116 10:44:55.693630  3247 caffe.cpp:308] Batch 61, loss = 0.00784779
I1116 10:44:55.735671  3247 caffe.cpp:308] Batch 62, accuracy = 1
I1116 10:44:55.735697  3247 caffe.cpp:308] Batch 62, loss = 2.75084e-05
I1116 10:44:55.778391  3247 caffe.cpp:308] Batch 63, accuracy = 1
I1116 10:44:55.778417  3247 caffe.cpp:308] Batch 63, loss = 8.72722e-05
I1116 10:44:55.821347  3247 caffe.cpp:308] Batch 64, accuracy = 1
I1116 10:44:55.821372  3247 caffe.cpp:308] Batch 64, loss = 0.000635555
I1116 10:44:55.863615  3247 caffe.cpp:308] Batch 65, accuracy = 0.96
I1116 10:44:55.863641  3247 caffe.cpp:308] Batch 65, loss = 0.118228
I1116 10:44:55.905485  3247 caffe.cpp:308] Batch 66, accuracy = 0.98
I1116 10:44:55.905514  3247 caffe.cpp:308] Batch 66, loss = 0.0594917
I1116 10:44:55.947041  3247 caffe.cpp:308] Batch 67, accuracy = 0.99
I1116 10:44:55.947067  3247 caffe.cpp:308] Batch 67, loss = 0.0321053
I1116 10:44:55.988890  3247 caffe.cpp:308] Batch 68, accuracy = 1
I1116 10:44:55.988915  3247 caffe.cpp:308] Batch 68, loss = 0.00137083
I1116 10:44:56.030824  3247 caffe.cpp:308] Batch 69, accuracy = 1
I1116 10:44:56.030849  3247 caffe.cpp:308] Batch 69, loss = 0.00270546
I1116 10:44:56.073659  3247 caffe.cpp:308] Batch 70, accuracy = 1
I1116 10:44:56.073686  3247 caffe.cpp:308] Batch 70, loss = 0.00114084
I1116 10:44:56.116312  3247 caffe.cpp:308] Batch 71, accuracy = 1
I1116 10:44:56.116336  3247 caffe.cpp:308] Batch 71, loss = 0.000395572
I1116 10:44:56.158524  3247 caffe.cpp:308] Batch 72, accuracy = 1
I1116 10:44:56.158550  3247 caffe.cpp:308] Batch 72, loss = 0.00495989
I1116 10:44:56.200970  3247 caffe.cpp:308] Batch 73, accuracy = 1
I1116 10:44:56.200997  3247 caffe.cpp:308] Batch 73, loss = 0.000111445
I1116 10:44:56.243125  3247 caffe.cpp:308] Batch 74, accuracy = 1
I1116 10:44:56.243151  3247 caffe.cpp:308] Batch 74, loss = 0.00227753
I1116 10:44:56.284889  3247 caffe.cpp:308] Batch 75, accuracy = 1
I1116 10:44:56.284915  3247 caffe.cpp:308] Batch 75, loss = 0.00349699
I1116 10:44:56.327323  3247 caffe.cpp:308] Batch 76, accuracy = 1
I1116 10:44:56.327347  3247 caffe.cpp:308] Batch 76, loss = 0.000260342
I1116 10:44:56.370226  3247 caffe.cpp:308] Batch 77, accuracy = 1
I1116 10:44:56.370260  3247 caffe.cpp:308] Batch 77, loss = 0.000231701
I1116 10:44:56.413427  3247 caffe.cpp:308] Batch 78, accuracy = 1
I1116 10:44:56.413452  3247 caffe.cpp:308] Batch 78, loss = 0.0027974
I1116 10:44:56.455852  3247 caffe.cpp:308] Batch 79, accuracy = 1
I1116 10:44:56.455878  3247 caffe.cpp:308] Batch 79, loss = 0.00245169
I1116 10:44:56.497237  3247 caffe.cpp:308] Batch 80, accuracy = 0.99
I1116 10:44:56.497263  3247 caffe.cpp:308] Batch 80, loss = 0.0126951
I1116 10:44:56.538885  3247 caffe.cpp:308] Batch 81, accuracy = 1
I1116 10:44:56.538910  3247 caffe.cpp:308] Batch 81, loss = 0.00183047
I1116 10:44:56.580886  3247 caffe.cpp:308] Batch 82, accuracy = 1
I1116 10:44:56.580912  3247 caffe.cpp:308] Batch 82, loss = 0.00443107
I1116 10:44:56.624069  3247 caffe.cpp:308] Batch 83, accuracy = 1
I1116 10:44:56.624095  3247 caffe.cpp:308] Batch 83, loss = 0.0138792
I1116 10:44:56.666620  3247 caffe.cpp:308] Batch 84, accuracy = 0.99
I1116 10:44:56.666646  3247 caffe.cpp:308] Batch 84, loss = 0.018512
I1116 10:44:56.709748  3247 caffe.cpp:308] Batch 85, accuracy = 0.99
I1116 10:44:56.709771  3247 caffe.cpp:308] Batch 85, loss = 0.0235319
I1116 10:44:56.752324  3247 caffe.cpp:308] Batch 86, accuracy = 1
I1116 10:44:56.752351  3247 caffe.cpp:308] Batch 86, loss = 7.07973e-05
I1116 10:44:56.795102  3247 caffe.cpp:308] Batch 87, accuracy = 1
I1116 10:44:56.795130  3247 caffe.cpp:308] Batch 87, loss = 9.06585e-05
I1116 10:44:56.836699  3247 caffe.cpp:308] Batch 88, accuracy = 1
I1116 10:44:56.836726  3247 caffe.cpp:308] Batch 88, loss = 0.000124428
I1116 10:44:56.879022  3247 caffe.cpp:308] Batch 89, accuracy = 1
I1116 10:44:56.879047  3247 caffe.cpp:308] Batch 89, loss = 5.27109e-05
I1116 10:44:56.920341  3247 caffe.cpp:308] Batch 90, accuracy = 0.96
I1116 10:44:56.920367  3247 caffe.cpp:308] Batch 90, loss = 0.0960243
I1116 10:44:56.962316  3247 caffe.cpp:308] Batch 91, accuracy = 1
I1116 10:44:56.962340  3247 caffe.cpp:308] Batch 91, loss = 3.48874e-05
I1116 10:44:57.004240  3247 caffe.cpp:308] Batch 92, accuracy = 1
I1116 10:44:57.004266  3247 caffe.cpp:308] Batch 92, loss = 0.000362797
I1116 10:44:57.046540  3247 caffe.cpp:308] Batch 93, accuracy = 1
I1116 10:44:57.046566  3247 caffe.cpp:308] Batch 93, loss = 0.000916503
I1116 10:44:57.088037  3247 caffe.cpp:308] Batch 94, accuracy = 1
I1116 10:44:57.088065  3247 caffe.cpp:308] Batch 94, loss = 0.00034051
I1116 10:44:57.129601  3247 caffe.cpp:308] Batch 95, accuracy = 1
I1116 10:44:57.129624  3247 caffe.cpp:308] Batch 95, loss = 0.0044833
I1116 10:44:57.171982  3247 caffe.cpp:308] Batch 96, accuracy = 0.98
I1116 10:44:57.172008  3247 caffe.cpp:308] Batch 96, loss = 0.0463236
I1116 10:44:57.213846  3247 caffe.cpp:308] Batch 97, accuracy = 0.98
I1116 10:44:57.213872  3247 caffe.cpp:308] Batch 97, loss = 0.076892
I1116 10:44:57.255475  3247 caffe.cpp:308] Batch 98, accuracy = 1
I1116 10:44:57.255499  3247 caffe.cpp:308] Batch 98, loss = 0.00295341
I1116 10:44:57.297261  3247 caffe.cpp:308] Batch 99, accuracy = 1
I1116 10:44:57.297286  3247 caffe.cpp:308] Batch 99, loss = 0.0056333   //100 batches of 100 samples each: exactly 10,000 test samples
I1116 10:44:57.297293  3247 caffe.cpp:313] Loss: 0.0284559
I1116 10:44:57.297319  3247 caffe.cpp:325] accuracy = 0.9912     //final accuracy
I1116 10:44:57.297343  3247 caffe.cpp:325] loss = 0.0284559 (* 1 = 0.0284559 loss)
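
The last three lines are averages over all 100 test batches: "Loss: 0.0284559" and "loss = 0.0284559" are the mean per-batch SoftmaxWithLoss value (times its loss weight of 1, shown during initialization), and "accuracy = 0.9912" is the mean of the 100 per-batch accuracies. Since every batch holds 100 images, this corresponds to 0.9912 * 10000 = 9912 of the 10,000 test images being classified correctly.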
