The computation cost (FLOPs) corresponds to time complexity (how long the model takes to run), while the parameter count corresponds to space complexity (how much GPU memory it occupies).
1 GFLOPs = 10^9 FLOPs, i.e. one billion floating-point operations.
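For a rough sense of how these two quantities arise, the parameter and FLOP counts of a single fully-connected layer can be worked out by hand. A minimal sketch follows; the layer sizes (1024 in, 512 out) are made up purely for illustration.

import torch.nn as nn

# Hypothetical fully-connected layer: 1024 inputs -> 512 outputs
fc = nn.Linear(1024, 512)

# Parameters: weight matrix (512 x 1024) plus bias (512)
params = 512 * 1024 + 512            # 524,800 parameters
# Forward-pass FLOPs (one multiply + one add per weight): about 2 * 1024 * 512
flops = 2 * 1024 * 512               # roughly 1.05 MFLOPs, i.e. ~0.001 GFLOPs

# PyTorch agrees on the parameter count
print(sum(p.numel() for p in fc.parameters()))   # 524800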
Method 1
1. Install the module
pip install thop
2. Compute
import torch
import torchvision
from thop import profile

# Model
print('==> Building model..')
model = torchvision.models.alexnet(pretrained=False)

# Dummy input: batch size 1, one 3 x 224 x 224 image
dummy_input = torch.randn(1, 3, 224, 224)
flops, params = profile(model, (dummy_input,))
print('flops: ', flops, 'params: ', params)
print('flops: %.2f M, params: %.2f M' % (flops / 1000000.0, params / 1000000.0))
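For human-readable output, thop also ships a clever_format helper that converts the raw counts into strings with K/M/G suffixes. A small usage sketch, reusing the flops and params variables from the snippet above:

from thop import clever_format

# Convert raw counts into strings with K/M/G suffixes
flops_str, params_str = clever_format([flops, params], "%.3f")
print('flops: ', flops_str, 'params: ', params_str)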
The first dimension of the input is the batch size. The batch size does not affect the parameter count, while the computation cost scales as a multiple of the batch_size=1 cost.
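To verify this, you can profile the same model with two different batch sizes and compare: the parameter count stays identical while the op count scales with the batch. A quick sketch (the batch size of 8 is an arbitrary choice):

import torch
import torchvision
from thop import profile

model = torchvision.models.alexnet(pretrained=False)

# Profile the same model with batch size 1 and batch size 8
flops_1, params_1 = profile(model, (torch.randn(1, 3, 224, 224),))
flops_8, params_8 = profile(model, (torch.randn(8, 3, 224, 224),))

print(params_1 == params_8)   # True: parameters do not depend on batch size
print(flops_8 / flops_1)      # about 8.0: computation grows linearly with batch size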
Method 2
pip install ptflops
import torchvision
from ptflops import get_model_complexity_info

model = torchvision.models.alexnet(pretrained=False)
flops, params = get_model_complexity_info(model, (3, 224, 224), as_strings=True, print_per_layer_stat=True)
print('flops: ', flops, 'params: ', params)
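Two things worth knowing about ptflops: it reports multiply-accumulate operations (MACs), which are commonly counted as two FLOPs each, and with as_strings=False it returns plain numbers instead of formatted strings, which is handier for logging or comparing models. A small sketch under those assumptions:

import torchvision
from ptflops import get_model_complexity_info

model = torchvision.models.alexnet(pretrained=False)

# as_strings=False returns raw numbers (MACs and parameter count)
macs, params = get_model_complexity_info(model, (3, 224, 224),
                                         as_strings=False,
                                         print_per_layer_stat=False)
print('GMacs: %.3f, params: %.2f M' % (macs / 1e9, params / 1e6))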