
A PyTorch Convolutional Neural Network for CIFAR Image Classification: Code and Basic Architecture

I. Notes

1. This code uses the CIFAR10 dataset, which can be loaded with the following snippet (requires `import torchvision`; add `download=True` if the files are not already present at the given path):

```python
train_data = torchvision.datasets.CIFAR10("../input/cifar10-python", train=True, transform=torchvision.transforms.ToTensor())
test_data = torchvision.datasets.CIFAR10("../input/cifar10-python", train=False, transform=torchvision.transforms.ToTensor())
```

2. The network does not support datasets whose images vary in size. If you build your own dataset or load a different one, first resize all images to a uniform shape. 3 × 32 × 32 is recommended; for any other shape, compute the total number of features after nn.Flatten() yourself and replace the 1024 in nn.Linear(1024, 64).
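If you do change the input size, you do not have to compute the flattened feature count by hand: a minimal sketch, assuming the same convolutional stack as the model below, is to pass a dummy batch through it and read off the shape.

```python
import torch
from torch import nn

# The convolutional part of the network, up to and including Flatten
conv_part = nn.Sequential(
    nn.Conv2d(3, 32, 5, padding=2),
    nn.MaxPool2d(2),
    nn.Conv2d(32, 32, 5, padding=2),
    nn.MaxPool2d(2),
    nn.Conv2d(32, 64, 5, padding=2),
    nn.MaxPool2d(2),
    nn.Flatten(),
)

# One dummy image at the expected input size 3 x 32 x 32
dummy = torch.zeros(1, 3, 32, 32)
n_features = conv_part(dummy).shape[1]
print(n_features)  # 1024
```

Swap in your own input shape for `(1, 3, 32, 32)` and use the printed value in the first `nn.Linear`.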

3. The script does not output per-image class predictions on the test set; it only reports accuracy (number correct / total test samples).
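If you do want per-image class names, they can be recovered from the raw network output with `argmax`. A hypothetical sketch, where the `outputs` tensor stands in for the result of `model(imgs)` on one test batch:

```python
import torch

# CIFAR10 class names in label-index order
classes = ['airplane', 'automobile', 'bird', 'cat', 'deer',
           'dog', 'frog', 'horse', 'ship', 'truck']

# Stand-in for model(imgs): two rows of 10 class scores
outputs = torch.tensor([[0.1, 2.5, 0.3, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
                        [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0, 0.1]])

pred_idx = outputs.argmax(1)                  # highest-scoring class index per image
pred_names = [classes[i] for i in pred_idx]
print(pred_names)  # ['automobile', 'ship']
```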

4. TensorBoard logging is supported.
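To view the logged curves, point TensorBoard at the same log directory the script's SummaryWriter uses (adjust the path if yours differs):

```shell
tensorboard --logdir ../model_logs
```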

5. When enabled, the elapsed time is printed every 100 training iterations.

6. No activation functions are included; add them yourself.
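One common way to do this (a sketch, not the only option) is to insert `nn.ReLU()` after each convolution and after the first linear layer:

```python
import torch
from torch import nn

# Same stack as the model below, with ReLU activations added
model1 = nn.Sequential(
    nn.Conv2d(3, 32, 5, padding=2),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(32, 32, 5, padding=2),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(32, 64, 5, padding=2),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(1024, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)

# Sanity check: a batch of 2 images still maps to 2 rows of 10 class scores
out = model1(torch.zeros(2, 3, 32, 32))
print(out.shape)  # torch.Size([2, 10])
```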

7. Since the framework is quite basic, model performance is mediocre: the best test accuracy reached was 68.5%, at epoch 165.

II. Network Architecture

The basic architectural flow is:

Load the data → build minibatches → choose GPU or CPU for training → choose a loss function → build the forward network → choose the SGD optimizer and set its hyperparameters → start iterating → compute the loss → backpropagate → update the parameters → output results → evaluate on the test set

III. Complete Code

The code can be run directly in a Kaggle notebook; just attach the cifar10-python dataset.

```python
import torch
import torchvision
from torch import nn
from torch.utils.data import DataLoader
from torch.utils.tensorboard import SummaryWriter
import time

# Load CIFAR10 from the attached Kaggle dataset
train_data = torchvision.datasets.CIFAR10("../input/cifar10-python", train=True, transform=torchvision.transforms.ToTensor())
test_data = torchvision.datasets.CIFAR10("../input/cifar10-python", train=False, transform=torchvision.transforms.ToTensor())
train_dataloader = DataLoader(train_data, batch_size=64, drop_last=True)
test_dataloader = DataLoader(test_data, batch_size=64, drop_last=True)
# print(len(train_dataloader))  # 781 batches

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
test_data_size = len(test_dataloader) * 64  # drop_last=True, so every batch has exactly 64 samples
print(f'Test set size: {test_data_size}')

writer = SummaryWriter("../model_logs")

loss_fn = nn.CrossEntropyLoss(reduction='mean')
loss_fn = loss_fn.to(device)

time_able = False  # set to True to print the time taken per 100 iterations


class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.model1 = nn.Sequential(
            nn.Conv2d(3, 32, 5, padding=2),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 32, 5, padding=2),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 5, padding=2),
            nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(1024, 64),  # 1024 = 64 channels * 4 * 4 for a 3 x 32 x 32 input
            nn.Linear(64, 10)
        )

    def forward(self, x):
        x = self.model1(x)
        return x


model = Model()
model = model.to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)

epoch = 50
total_train_step = 0
total_test_step = 0

if time_able:
    str_time = time.time()

for i in range(epoch):
    print(f'Epoch {i + 1}')
    # Training
    for data in train_dataloader:
        imgs, targets = data
        imgs = imgs.to(device)
        targets = targets.to(device)
        output = model(imgs)
        loss = loss_fn(output, targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        total_train_step += 1
        if total_train_step % 100 == 0:
            if time_able:
                end_time = time.time()
                print(f'{end_time - str_time:.2f}s for the last 100 iterations')
                str_time = end_time  # reset so the next delta covers only 100 iterations
            print(f'Step {total_train_step}, loss = {loss.item()}')
            writer.add_scalar("train_loss", loss.item(), total_train_step)
    # Evaluation
    total_test_loss = 0
    total_accuracy = 0
    with torch.no_grad():
        for data in test_dataloader:
            imgs, targets = data
            imgs = imgs.to(device)
            targets = targets.to(device)
            outputs = model(imgs)
            loss = loss_fn(outputs, targets)
            total_test_loss = total_test_loss + loss
            accuracy = (outputs.argmax(1) == targets).sum()
            total_accuracy += accuracy
    # loss_fn already averages within each batch, so divide by the number of batches
    total_test_loss = total_test_loss / len(test_dataloader)
    print(f'Loss over the whole test set = {total_test_loss}')
    print(f'Accuracy over the whole test set = {total_accuracy / test_data_size}')
    writer.add_scalar("test_loss", total_test_loss.item(), total_test_step)
    writer.add_scalar("test_accuracy", total_accuracy / test_data_size, total_test_step)
    total_test_step += 1

writer.close()
```
