
【PyTorch】ResNet / Deep Residual Networks

1 Model Description

Deep Residual Networks (ResNet) are convolutional neural networks proposed by four researchers from Microsoft Research (Kaiming He et al.); ResNet won the image classification and object detection tracks of the 2015 ImageNet Large Scale Visual Recognition Challenge (ILSVRC). Residual networks are easy to optimize and can gain accuracy from considerably increased depth. Their internal residual blocks use skip connections, which alleviate the vanishing-gradient problem that comes with adding depth to a deep neural network.
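Concretely, instead of asking a stack of layers to fit a desired mapping $H(x)$ directly, a residual block fits the residual $\mathcal{F}(x) = H(x) - x$ and adds the input back through an identity shortcut, as in equation (1) of the paper:

$$y = \mathcal{F}(x, \{W_i\}) + x$$

Because the shortcut is an identity mapping, gradients can flow through it unattenuated during backpropagation, which is what makes very deep networks trainable.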

This post uses PyTorch to implement ResNet.

2 Complete Code

# Code for implementing ResNet
# ---------------------------------------------------------------------------- #
# An implementation of https://arxiv.org/pdf/1512.03385.pdf                    #
# See section 4.2 for the model architecture on CIFAR-10                       #
# Some part of the code was referenced from below                              #
# https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py   #
# ---------------------------------------------------------------------------- #

import torch
import torch.nn as nn
import torchvision 
import torchvision.transforms as transforms

# Device configuration 
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Hyper-parameters
num_epochs = 80
batch_size = 100
learning_rate = 0.001

# image preprocessing modules
transform = transforms.Compose([
    transforms.Pad(4), # pad 4 pixels on each side
    transforms.RandomHorizontalFlip(), # random horizontal flip
    transforms.RandomCrop(32), # randomly crop back to 32x32
    transforms.ToTensor()])

# CIFAR-10 dataset
train_dataset = torchvision.datasets.CIFAR10(
    root = 'data/',
    train = True,
    transform=transform,
    download = True
    )
test_dataset = torchvision.datasets.CIFAR10(
    root = 'data/',
    train = False,
    transform = transforms.ToTensor()
)

# Data loader
train_loader = torch.utils.data.DataLoader(
    dataset = train_dataset,
    batch_size = batch_size,
    shuffle = True
)
test_loader = torch.utils.data.DataLoader(
    dataset = test_dataset,
    batch_size = batch_size,
    shuffle = False
)

# 3×3 convolution
def conv3x3(in_channels, out_channels, stride=1):
    return nn.Conv2d(
        in_channels, 
        out_channels, 
        kernel_size = 3, 
        stride = stride, 
        padding = 1,
        bias = False
        )

# Residual block
class ResidualBlock(nn.Module):
    def __init__(self, in_channels, out_channels, stride = 1, downsample = None):
        super(ResidualBlock, self).__init__()
        self.conv1 = conv3x3(in_channels, out_channels, stride)
        self.bn1 = nn.BatchNorm2d(out_channels)
        self.relu = nn.ReLU(inplace = True)
        self.conv2 = conv3x3(out_channels, out_channels)
        self.bn2 = nn.BatchNorm2d(out_channels)
        self.downsample = downsample
    
    def forward(self, x):
        residual = x
        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)
        out = self.conv2(out)
        out = self.bn2(out)
        if self.downsample is not None:
            residual = self.downsample(x)
        out += residual
        out = self.relu(out)
        return out

# ResNet
class ResNet(nn.Module):
    def __init__(self, block, layers, num_classes=10):
        super(ResNet, self).__init__()
        self.in_channels = 16
        self.conv = conv3x3(3, 16)
        self.bn = nn.BatchNorm2d(16)
        self.relu = nn.ReLU(inplace=True)
        self.layer1 = self.make_layer(block, 16, layers[0])
        self.layer2 = self.make_layer(block, 32, layers[1], 2)
        self.layer3 = self.make_layer(block, 64, layers[2], 2)
        self.avg_pool = nn.AvgPool2d(8)
        self.fc = nn.Linear(64, num_classes)

    def make_layer(self, block, out_channels, blocks, stride=1):
        downsample = None
        if (stride != 1) or (self.in_channels != out_channels):
            downsample = nn.Sequential(
                conv3x3(self.in_channels, out_channels, stride=stride),
                nn.BatchNorm2d(out_channels))
        layers = []
        layers.append(block(self.in_channels, out_channels, stride, downsample))
        self.in_channels = out_channels
        for i in range(1, blocks):
            layers.append(block(out_channels, out_channels))
        return nn.Sequential(*layers)

    def forward(self, x):
        out = self.conv(x)
        out = self.bn(out)
        out = self.relu(out)
        out = self.layer1(out)
        out = self.layer2(out)
        out = self.layer3(out)
        out = self.avg_pool(out)
        out = out.view(out.size(0), -1)
        out = self.fc(out)
        return out

model = ResNet(ResidualBlock, [2, 2, 2]).to(device)

# Loss and optimizer
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

# For updating learning rate
def update_lr(optimizer, lr):    
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

# Train the model
total_step = len(train_loader)
curr_lr = learning_rate
for epoch in range(num_epochs):
    for i, (images, labels) in enumerate(train_loader):
        images = images.to(device)
        labels = labels.to(device)
        
        # Forward pass
        outputs = model(images)
        loss = criterion(outputs, labels)
        
        # Backward and optimize
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        
        if (i+1) % 100 == 0:
            print("Epoch [{}/{}], Step [{}/{}] Loss: {:.4f}"
                  .format(epoch+1, num_epochs, i+1, total_step, loss.item()))
    # Decay learning rate
    if (epoch+1) % 20 == 0:
        curr_lr /= 3
        update_lr(optimizer, curr_lr)

# Test the model
model.eval()  # set BatchNorm layers to evaluation mode (use running statistics)
with torch.no_grad():
    correct = 0
    total = 0
    for images, labels in test_loader:
        images = images.to(device)
        labels = labels.to(device)
        outputs = model(images)
        _, predicted = torch.max(outputs, 1)
        total += labels.size(0)
        correct += (predicted == labels).sum().item()

    print('Accuracy of the model on the test images: {} %'.format(100 * correct / total))

# Save the model checkpoint
torch.save(model.state_dict(), 'resnet.ckpt')
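
To reuse the trained weights later, rebuild the model and load the saved state dict. A minimal sketch (assuming the same `resnet.ckpt` file produced by the `torch.save` call above):

# Rebuild the architecture and load the trained weights
model = ResNet(ResidualBlock, [2, 2, 2]).to(device)
model.load_state_dict(torch.load('resnet.ckpt', map_location=device))
model.eval()  # inference mode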

3 Program Output

Training takes a fairly long time (about 5 to 6 hours on a laptop), but the results are quite good.

Epoch [1/80], Step [100/500] Loss: 1.6903
Epoch [1/80], Step [200/500] Loss: 1.3618
Epoch [1/80], Step [300/500] Loss: 1.1689
Epoch [1/80], Step [400/500] Loss: 1.1824
Epoch [1/80], Step [500/500] Loss: 1.0652
Epoch [2/80], Step [100/500] Loss: 1.1884
Epoch [2/80], Step [200/500] Loss: 0.8875
Epoch [2/80], Step [300/500] Loss: 1.2085
Epoch [2/80], Step [400/500] Loss: 0.9510
Epoch [2/80], Step [500/500] Loss: 0.9154
Epoch [3/80], Step [100/500] Loss: 0.9555
Epoch [3/80], Step [200/500] Loss: 0.9859
Epoch [3/80], Step [300/500] Loss: 0.8526
Epoch [3/80], Step [400/500] Loss: 0.9229
Epoch [3/80], Step [500/500] Loss: 0.7979
Epoch [4/80], Step [100/500] Loss: 0.7952
Epoch [4/80], Step [200/500] Loss: 0.7257
Epoch [4/80], Step [300/500] Loss: 0.7596
Epoch [4/80], Step [400/500] Loss: 0.7599
Epoch [4/80], Step [500/500] Loss: 0.7675
Epoch [5/80], Step [100/500] Loss: 0.6990
Epoch [5/80], Step [200/500] Loss: 0.7193
Epoch [5/80], Step [300/500] Loss: 0.6661
Epoch [5/80], Step [400/500] Loss: 0.4435
Epoch [5/80], Step [500/500] Loss: 0.7103
Epoch [6/80], Step [100/500] Loss: 0.5538
Epoch [6/80], Step [200/500] Loss: 0.6042
Epoch [6/80], Step [300/500] Loss: 0.5002
Epoch [6/80], Step [400/500] Loss: 0.7440
Epoch [6/80], Step [500/500] Loss: 0.5007
Epoch [7/80], Step [100/500] Loss: 0.4111
Epoch [7/80], Step [200/500] Loss: 0.7357
Epoch [7/80], Step [300/500] Loss: 0.5855
Epoch [7/80], Step [400/500] Loss: 0.4054
Epoch [7/80], Step [500/500] Loss: 0.5780
Epoch [8/80], Step [100/500] Loss: 0.4854
Epoch [8/80], Step [200/500] Loss: 0.5895
Epoch [8/80], Step [300/500] Loss: 0.6393
Epoch [8/80], Step [400/500] Loss: 0.5335
Epoch [8/80], Step [500/500] Loss: 0.5157
Epoch [9/80], Step [100/500] Loss: 0.7645
Epoch [9/80], Step [200/500] Loss: 0.4416
Epoch [9/80], Step [300/500] Loss: 0.2846
Epoch [9/80], Step [400/500] Loss: 0.5031
Epoch [9/80], Step [500/500] Loss: 0.4851
Epoch [10/80], Step [100/500] Loss: 0.4381
Epoch [10/80], Step [200/500] Loss: 0.5167
Epoch [10/80], Step [300/500] Loss: 0.4316
Epoch [10/80], Step [400/500] Loss: 0.4434
Epoch [10/80], Step [500/500] Loss: 0.2888
Epoch [11/80], Step [100/500] Loss: 0.4379
Epoch [11/80], Step [200/500] Loss: 0.5113
Epoch [11/80], Step [300/500] Loss: 0.5018
Epoch [11/80], Step [400/500] Loss: 0.5127
Epoch [11/80], Step [500/500] Loss: 0.4458
Epoch [12/80], Step [100/500] Loss: 0.4463
Epoch [12/80], Step [200/500] Loss: 0.5314
Epoch [12/80], Step [300/500] Loss: 0.3969
Epoch [12/80], Step [400/500] Loss: 0.4774
Epoch [12/80], Step [500/500] Loss: 0.7450
Epoch [13/80], Step [100/500] Loss: 0.3637
Epoch [13/80], Step [200/500] Loss: 0.5757
Epoch [13/80], Step [300/500] Loss: 0.3526
Epoch [13/80], Step [400/500] Loss: 0.4996
Epoch [13/80], Step [500/500] Loss: 0.4953
Epoch [14/80], Step [100/500] Loss: 0.4351
Epoch [14/80], Step [200/500] Loss: 0.4416
Epoch [14/80], Step [300/500] Loss: 0.4317
Epoch [14/80], Step [400/500] Loss: 0.5490
Epoch [14/80], Step [500/500] Loss: 0.3177
Epoch [15/80], Step [100/500] Loss: 0.4292
Epoch [15/80], Step [200/500] Loss: 0.4585
Epoch [15/80], Step [300/500] Loss: 0.4706
Epoch [15/80], Step [400/500] Loss: 0.4133
Epoch [15/80], Step [500/500] Loss: 0.4701
Epoch [16/80], Step [100/500] Loss: 0.3542
Epoch [16/80], Step [200/500] Loss: 0.3555
Epoch [16/80], Step [300/500] Loss: 0.4832
Epoch [16/80], Step [400/500] Loss: 0.4278
Epoch [16/80], Step [500/500] Loss: 0.3719
Epoch [17/80], Step [100/500] Loss: 0.3845
Epoch [17/80], Step [200/500] Loss: 0.3643
Epoch [17/80], Step [300/500] Loss: 0.3886
Epoch [17/80], Step [400/500] Loss: 0.4406
Epoch [17/80], Step [500/500] Loss: 0.3347
Epoch [18/80], Step [100/500] Loss: 0.3234
Epoch [18/80], Step [200/500] Loss: 0.3385
Epoch [18/80], Step [300/500] Loss: 0.2918
Epoch [18/80], Step [400/500] Loss: 0.3457
Epoch [18/80], Step [500/500] Loss: 0.4655
Epoch [19/80], Step [100/500] Loss: 0.2944
Epoch [19/80], Step [200/500] Loss: 0.2373
Epoch [19/80], Step [300/500] Loss: 0.4930
Epoch [19/80], Step [400/500] Loss: 0.5316
Epoch [19/80], Step [500/500] Loss: 0.5754
Epoch [20/80], Step [100/500] Loss: 0.3018
Epoch [20/80], Step [200/500] Loss: 0.2551
Epoch [20/80], Step [300/500] Loss: 0.3542
Epoch [20/80], Step [400/500] Loss: 0.3928
Epoch [20/80], Step [500/500] Loss: 0.4278
Epoch [21/80], Step [100/500] Loss: 0.1989
Epoch [21/80], Step [200/500] Loss: 0.2735
Epoch [21/80], Step [300/500] Loss: 0.3500
Epoch [21/80], Step [400/500] Loss: 0.2338
Epoch [21/80], Step [500/500] Loss: 0.3547
Epoch [22/80], Step [100/500] Loss: 0.4918
Epoch [22/80], Step [200/500] Loss: 0.3486
Epoch [22/80], Step [300/500] Loss: 0.2192
Epoch [22/80], Step [400/500] Loss: 0.2795
Epoch [22/80], Step [500/500] Loss: 0.2835
Epoch [23/80], Step [100/500] Loss: 0.1862
Epoch [23/80], Step [200/500] Loss: 0.2384
Epoch [23/80], Step [300/500] Loss: 0.2469
Epoch [23/80], Step [400/500] Loss: 0.2181
Epoch [23/80], Step [500/500] Loss: 0.1682
Epoch [24/80], Step [100/500] Loss: 0.2193
Epoch [24/80], Step [200/500] Loss: 0.2292
Epoch [24/80], Step [300/500] Loss: 0.3381
Epoch [24/80], Step [400/500] Loss: 0.3113
Epoch [24/80], Step [500/500] Loss: 0.3083
Epoch [25/80], Step [100/500] Loss: 0.3507
Epoch [25/80], Step [200/500] Loss: 0.2731
Epoch [25/80], Step [300/500] Loss: 0.2989
Epoch [25/80], Step [400/500] Loss: 0.2334
Epoch [25/80], Step [500/500] Loss: 0.2419
Epoch [26/80], Step [100/500] Loss: 0.3797
Epoch [26/80], Step [200/500] Loss: 0.1951
Epoch [26/80], Step [300/500] Loss: 0.2815
Epoch [26/80], Step [400/500] Loss: 0.2826
Epoch [26/80], Step [500/500] Loss: 0.2990
Epoch [27/80], Step [100/500] Loss: 0.3639
Epoch [27/80], Step [200/500] Loss: 0.2609
Epoch [27/80], Step [300/500] Loss: 0.2676
Epoch [27/80], Step [400/500] Loss: 0.1734
Epoch [27/80], Step [500/500] Loss: 0.3016
Epoch [28/80], Step [100/500] Loss: 0.2848
Epoch [28/80], Step [200/500] Loss: 0.2427
Epoch [28/80], Step [300/500] Loss: 0.1968
Epoch [28/80], Step [400/500] Loss: 0.1952
Epoch [28/80], Step [500/500] Loss: 0.1763
Epoch [29/80], Step [100/500] Loss: 0.2007
Epoch [29/80], Step [200/500] Loss: 0.1410
Epoch [29/80], Step [300/500] Loss: 0.4305
Epoch [29/80], Step [400/500] Loss: 0.2089
Epoch [29/80], Step [500/500] Loss: 0.2162
Epoch [30/80], Step [100/500] Loss: 0.1495
Epoch [30/80], Step [200/500] Loss: 0.3787
Epoch [30/80], Step [300/500] Loss: 0.1756
Epoch [30/80], Step [400/500] Loss: 0.3060
Epoch [30/80], Step [500/500] Loss: 0.2241
Epoch [31/80], Step [100/500] Loss: 0.1697
Epoch [31/80], Step [200/500] Loss: 0.1628
Epoch [31/80], Step [300/500] Loss: 0.3507
Epoch [31/80], Step [400/500] Loss: 0.2836
Epoch [31/80], Step [500/500] Loss: 0.2011
Epoch [32/80], Step [100/500] Loss: 0.2271
Epoch [32/80], Step [200/500] Loss: 0.1603
Epoch [32/80], Step [300/500] Loss: 0.2815
Epoch [32/80], Step [400/500] Loss: 0.2119
Epoch [32/80], Step [500/500] Loss: 0.3992
Epoch [33/80], Step [100/500] Loss: 0.2674
Epoch [33/80], Step [200/500] Loss: 0.2192
Epoch [33/80], Step [300/500] Loss: 0.3333
Epoch [33/80], Step [400/500] Loss: 0.3045
Epoch [33/80], Step [500/500] Loss: 0.2048
Epoch [34/80], Step [100/500] Loss: 0.3307
Epoch [34/80], Step [200/500] Loss: 0.2005
Epoch [34/80], Step [300/500] Loss: 0.1421
Epoch [34/80], Step [400/500] Loss: 0.3129
Epoch [34/80], Step [500/500] Loss: 0.2223
Epoch [35/80], Step [100/500] Loss: 0.1297
Epoch [35/80], Step [200/500] Loss: 0.2528
Epoch [35/80], Step [300/500] Loss: 0.1897
Epoch [35/80], Step [400/500] Loss: 0.1646
Epoch [35/80], Step [500/500] Loss: 0.2214
Epoch [36/80], Step [100/500] Loss: 0.2536
Epoch [36/80], Step [200/500] Loss: 0.2118
Epoch [36/80], Step [300/500] Loss: 0.2619
Epoch [36/80], Step [400/500] Loss: 0.2337
Epoch [36/80], Step [500/500] Loss: 0.2445
Epoch [37/80], Step [100/500] Loss: 0.2530
Epoch [37/80], Step [200/500] Loss: 0.3891
Epoch [37/80], Step [300/500] Loss: 0.1512
Epoch [37/80], Step [400/500] Loss: 0.1894
Epoch [37/80], Step [500/500] Loss: 0.1827
Epoch [38/80], Step [100/500] Loss: 0.2923
Epoch [38/80], Step [200/500] Loss: 0.1250
Epoch [38/80], Step [300/500] Loss: 0.1928
Epoch [38/80], Step [400/500] Loss: 0.1417
Epoch [38/80], Step [500/500] Loss: 0.3942
Epoch [39/80], Step [100/500] Loss: 0.2499
Epoch [39/80], Step [200/500] Loss: 0.2135
Epoch [39/80], Step [300/500] Loss: 0.1442
Epoch [39/80], Step [400/500] Loss: 0.1517
Epoch [39/80], Step [500/500] Loss: 0.1288
Epoch [40/80], Step [100/500] Loss: 0.2396
Epoch [40/80], Step [200/500] Loss: 0.1906
Epoch [40/80], Step [300/500] Loss: 0.2148
Epoch [40/80], Step [400/500] Loss: 0.1330
Epoch [40/80], Step [500/500] Loss: 0.2699
Epoch [41/80], Step [100/500] Loss: 0.2349
Epoch [41/80], Step [200/500] Loss: 0.2214
Epoch [41/80], Step [300/500] Loss: 0.2116
Epoch [41/80], Step [400/500] Loss: 0.1685
Epoch [41/80], Step [500/500] Loss: 0.1872
Epoch [42/80], Step [100/500] Loss: 0.2569
Epoch [42/80], Step [200/500] Loss: 0.2714
Epoch [42/80], Step [300/500] Loss: 0.2066
Epoch [42/80], Step [400/500] Loss: 0.1444
Epoch [42/80], Step [500/500] Loss: 0.0552
Epoch [43/80], Step [100/500] Loss: 0.1116
Epoch [43/80], Step [200/500] Loss: 0.1501
Epoch [43/80], Step [300/500] Loss: 0.1773
Epoch [43/80], Step [400/500] Loss: 0.2420
Epoch [43/80], Step [500/500] Loss: 0.1612
Epoch [44/80], Step [100/500] Loss: 0.1579
Epoch [44/80], Step [200/500] Loss: 0.1696
Epoch [44/80], Step [300/500] Loss: 0.1036
Epoch [44/80], Step [400/500] Loss: 0.1548
Epoch [44/80], Step [500/500] Loss: 0.1066
Epoch [45/80], Step [100/500] Loss: 0.1976
Epoch [45/80], Step [200/500] Loss: 0.2174
Epoch [45/80], Step [300/500] Loss: 0.1403
Epoch [45/80], Step [400/500] Loss: 0.1959
Epoch [45/80], Step [500/500] Loss: 0.1865
Epoch [46/80], Step [100/500] Loss: 0.2320
Epoch [46/80], Step [200/500] Loss: 0.1705
Epoch [46/80], Step [300/500] Loss: 0.2263
Epoch [46/80], Step [400/500] Loss: 0.2542
Epoch [46/80], Step [500/500] Loss: 0.1033
Epoch [47/80], Step [100/500] Loss: 0.2302
Epoch [47/80], Step [200/500] Loss: 0.1596
Epoch [47/80], Step [300/500] Loss: 0.1005
Epoch [47/80], Step [400/500] Loss: 0.1006
Epoch [47/80], Step [500/500] Loss: 0.1116
Epoch [48/80], Step [100/500] Loss: 0.1319
Epoch [48/80], Step [200/500] Loss: 0.1520
Epoch [48/80], Step [300/500] Loss: 0.2459
Epoch [48/80], Step [400/500] Loss: 0.1722
Epoch [48/80], Step [500/500] Loss: 0.0748
Epoch [49/80], Step [100/500] Loss: 0.1780
Epoch [49/80], Step [200/500] Loss: 0.2386
Epoch [49/80], Step [300/500] Loss: 0.1755
Epoch [49/80], Step [400/500] Loss: 0.1421
Epoch [49/80], Step [500/500] Loss: 0.1742
Epoch [50/80], Step [100/500] Loss: 0.1197
Epoch [50/80], Step [200/500] Loss: 0.1366
Epoch [50/80], Step [300/500] Loss: 0.1197
Epoch [50/80], Step [400/500] Loss: 0.0538
Epoch [50/80], Step [500/500] Loss: 0.1342
Epoch [51/80], Step [100/500] Loss: 0.2153
Epoch [51/80], Step [200/500] Loss: 0.2324
Epoch [51/80], Step [300/500] Loss: 0.2810
Epoch [51/80], Step [400/500] Loss: 0.1406
Epoch [51/80], Step [500/500] Loss: 0.1157
Epoch [52/80], Step [100/500] Loss: 0.2096
Epoch [52/80], Step [200/500] Loss: 0.2195
Epoch [52/80], Step [300/500] Loss: 0.1312
Epoch [52/80], Step [400/500] Loss: 0.0810
Epoch [52/80], Step [500/500] Loss: 0.1606
Epoch [53/80], Step [100/500] Loss: 0.1212
Epoch [53/80], Step [200/500] Loss: 0.1205
Epoch [53/80], Step [300/500] Loss: 0.1437
Epoch [53/80], Step [400/500] Loss: 0.3666
Epoch [53/80], Step [500/500] Loss: 0.2045
Epoch [54/80], Step [100/500] Loss: 0.1223
Epoch [54/80], Step [200/500] Loss: 0.1315
Epoch [54/80], Step [300/500] Loss: 0.1991
Epoch [54/80], Step [400/500] Loss: 0.2255
Epoch [54/80], Step [500/500] Loss: 0.1454
Epoch [55/80], Step [100/500] Loss: 0.1639
Epoch [55/80], Step [200/500] Loss: 0.1156
Epoch [55/80], Step [300/500] Loss: 0.1618
Epoch [55/80], Step [400/500] Loss: 0.1641
Epoch [55/80], Step [500/500] Loss: 0.2260
Epoch [56/80], Step [100/500] Loss: 0.2281
Epoch [56/80], Step [200/500] Loss: 0.1662
Epoch [56/80], Step [300/500] Loss: 0.2069
Epoch [56/80], Step [400/500] Loss: 0.0737
Epoch [56/80], Step [500/500] Loss: 0.1835
Epoch [57/80], Step [100/500] Loss: 0.1179
Epoch [57/80], Step [200/500] Loss: 0.1045
Epoch [57/80], Step [300/500] Loss: 0.1651
Epoch [57/80], Step [400/500] Loss: 0.0943
Epoch [57/80], Step [500/500] Loss: 0.1596
Epoch [58/80], Step [100/500] Loss: 0.0833
Epoch [58/80], Step [200/500] Loss: 0.2477
Epoch [58/80], Step [300/500] Loss: 0.2278
Epoch [58/80], Step [400/500] Loss: 0.1701
Epoch [58/80], Step [500/500] Loss: 0.1955
Epoch [59/80], Step [100/500] Loss: 0.1032
Epoch [59/80], Step [200/500] Loss: 0.1117
Epoch [59/80], Step [300/500] Loss: 0.1026
Epoch [59/80], Step [400/500] Loss: 0.1879
Epoch [59/80], Step [500/500] Loss: 0.1909
Epoch [60/80], Step [100/500] Loss: 0.1473
Epoch [60/80], Step [200/500] Loss: 0.1257
Epoch [60/80], Step [300/500] Loss: 0.1188
Epoch [60/80], Step [400/500] Loss: 0.1970
Epoch [60/80], Step [500/500] Loss: 0.1073
Epoch [61/80], Step [100/500] Loss: 0.0380
Epoch [61/80], Step [200/500] Loss: 0.1633
Epoch [61/80], Step [300/500] Loss: 0.1887
Epoch [61/80], Step [400/500] Loss: 0.1534
Epoch [61/80], Step [500/500] Loss: 0.1583
Epoch [62/80], Step [100/500] Loss: 0.1815
Epoch [62/80], Step [200/500] Loss: 0.2359
Epoch [62/80], Step [300/500] Loss: 0.0880
Epoch [62/80], Step [400/500] Loss: 0.1913
Epoch [62/80], Step [500/500] Loss: 0.1737
Epoch [63/80], Step [100/500] Loss: 0.2746
Epoch [63/80], Step [200/500] Loss: 0.2172
Epoch [63/80], Step [300/500] Loss: 0.1420
Epoch [63/80], Step [400/500] Loss: 0.0832
Epoch [63/80], Step [500/500] Loss: 0.1406
Epoch [64/80], Step [100/500] Loss: 0.1082
Epoch [64/80], Step [200/500] Loss: 0.2194
Epoch [64/80], Step [300/500] Loss: 0.1298
Epoch [64/80], Step [400/500] Loss: 0.1292
Epoch [64/80], Step [500/500] Loss: 0.2240
Epoch [65/80], Step [100/500] Loss: 0.0852
Epoch [65/80], Step [200/500] Loss: 0.1746
Epoch [65/80], Step [300/500] Loss: 0.1457
Epoch [65/80], Step [400/500] Loss: 0.1916
Epoch [65/80], Step [500/500] Loss: 0.1710
Epoch [66/80], Step [100/500] Loss: 0.1483
Epoch [66/80], Step [200/500] Loss: 0.1403
Epoch [66/80], Step [300/500] Loss: 0.1248
Epoch [66/80], Step [400/500] Loss: 0.2174
Epoch [66/80], Step [500/500] Loss: 0.1166
Epoch [67/80], Step [100/500] Loss: 0.1912
Epoch [67/80], Step [200/500] Loss: 0.1391
Epoch [67/80], Step [300/500] Loss: 0.1057
Epoch [67/80], Step [400/500] Loss: 0.2443
Epoch [67/80], Step [500/500] Loss: 0.0956
Epoch [68/80], Step [100/500] Loss: 0.1503
Epoch [68/80], Step [200/500] Loss: 0.1808
Epoch [68/80], Step [300/500] Loss: 0.1577
Epoch [68/80], Step [400/500] Loss: 0.1655
Epoch [68/80], Step [500/500] Loss: 0.1433
Epoch [69/80], Step [100/500] Loss: 0.0869
Epoch [69/80], Step [200/500] Loss: 0.0797
Epoch [69/80], Step [300/500] Loss: 0.0691
Epoch [69/80], Step [400/500] Loss: 0.2408
Epoch [69/80], Step [500/500] Loss: 0.2299
Epoch [70/80], Step [100/500] Loss: 0.2348
Epoch [70/80], Step [200/500] Loss: 0.0870
Epoch [70/80], Step [300/500] Loss: 0.2738
Epoch [70/80], Step [400/500] Loss: 0.1694
Epoch [70/80], Step [500/500] Loss: 0.0701
Epoch [71/80], Step [100/500] Loss: 0.0986
Epoch [71/80], Step [200/500] Loss: 0.1484
Epoch [71/80], Step [300/500] Loss: 0.1326
Epoch [71/80], Step [400/500] Loss: 0.1700
Epoch [71/80], Step [500/500] Loss: 0.1414
Epoch [72/80], Step [100/500] Loss: 0.1369
Epoch [72/80], Step [200/500] Loss: 0.2127
Epoch [72/80], Step [300/500] Loss: 0.1270
Epoch [72/80], Step [400/500] Loss: 0.1060
Epoch [72/80], Step [500/500] Loss: 0.0910
Epoch [73/80], Step [100/500] Loss: 0.0716
Epoch [73/80], Step [200/500] Loss: 0.1834
Epoch [73/80], Step [300/500] Loss: 0.1633
Epoch [73/80], Step [400/500] Loss: 0.0962
Epoch [73/80], Step [500/500] Loss: 0.2256
Epoch [74/80], Step [100/500] Loss: 0.1159
Epoch [74/80], Step [200/500] Loss: 0.1332
Epoch [74/80], Step [300/500] Loss: 0.0759
Epoch [74/80], Step [400/500] Loss: 0.1058
Epoch [74/80], Step [500/500] Loss: 0.0650
Epoch [75/80], Step [100/500] Loss: 0.1064
Epoch [75/80], Step [200/500] Loss: 0.2050
Epoch [75/80], Step [300/500] Loss: 0.1925
Epoch [75/80], Step [400/500] Loss: 0.1693
Epoch [75/80], Step [500/500] Loss: 0.1059
Epoch [76/80], Step [100/500] Loss: 0.1471
Epoch [76/80], Step [200/500] Loss: 0.2312
Epoch [76/80], Step [300/500] Loss: 0.1288
Epoch [76/80], Step [400/500] Loss: 0.1261
Epoch [76/80], Step [500/500] Loss: 0.0943
Epoch [77/80], Step [100/500] Loss: 0.1281
Epoch [77/80], Step [200/500] Loss: 0.1067
Epoch [77/80], Step [300/500] Loss: 0.1063
Epoch [77/80], Step [400/500] Loss: 0.0741
Epoch [77/80], Step [500/500] Loss: 0.0722
Epoch [78/80], Step [100/500] Loss: 0.1122
Epoch [78/80], Step [200/500] Loss: 0.1469
Epoch [78/80], Step [300/500] Loss: 0.1075
Epoch [78/80], Step [400/500] Loss: 0.1998
Epoch [78/80], Step [500/500] Loss: 0.1542
Epoch [79/80], Step [100/500] Loss: 0.1831
Epoch [79/80], Step [200/500] Loss: 0.0942
Epoch [79/80], Step [300/500] Loss: 0.0833
Epoch [79/80], Step [400/500] Loss: 0.1147
Epoch [79/80], Step [500/500] Loss: 0.2378
Epoch [80/80], Step [100/500] Loss: 0.1526
Epoch [80/80], Step [200/500] Loss: 0.1237
Epoch [80/80], Step [300/500] Loss: 0.0743
Epoch [80/80], Step [400/500] Loss: 0.1042
Epoch [80/80], Step [500/500] Loss: 0.1178
Accuracy of the model on the test images: 88.14 %

The figure below shows how the loss changes as training proceeds: the decline is clearly visible, and in the later stages the loss converges.
(Figure: training loss over the course of training)
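
The original figure is not reproduced here. As a rough sketch, such a curve can be recorded by appending `loss.item()` to a list inside the training loop and plotting it afterwards with matplotlib; the `losses` list and the plotting code below are illustrative additions, not part of the original script:

import matplotlib.pyplot as plt

losses = []  # inside the training loop, add: losses.append(loss.item())

# After training, plot the recorded loss against the global step index
plt.plot(losses)
plt.xlabel('training step')
plt.ylabel('loss')
plt.title('Training loss on CIFAR-10')
plt.show()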
