This article records the process of running inference with a PyTorch model on the RK3588.
Prerequisites:
Overall steps:
Download the source code
```shell
git clone https://github.com/rockchip-linux/rknn-toolkit2.git
```
Create a new conda environment. Here we use Python 3.6, as in the official tutorial:
```shell
conda create -n deploy python=3.6
```
Install the rknn-toolkit2 dependencies:
```shell
sudo apt-get install libxslt1-dev zlib1g-dev libglib2.0 libsm6 libgl1-mesa-glx libprotobuf-dev gcc
cd rknn-toolkit2-master/doc
pip install -r requirements_cp36-1.5.2.txt  # depends on your Python version and the rknn-toolkit2 version you downloaded
```
Install rknn-toolkit2:
```shell
cd rknn-toolkit2-master/packages
pip install rknn_toolkit2-1.5.2+b642f30c-cp36-cp36m-linux_x86_64.whl  # adjust to match your version
```
Check that the installation succeeded:

```shell
conda activate deploy
python
```

Then, in the Python interpreter:

```python
from rknn.api import RKNN
```

If no error is reported, the installation succeeded.
Install onnx:
```shell
pip install onnx
```
Refer to the code below; after it runs successfully, a model.onnx file is generated in the current directory.
```python
import torch
import torch.onnx

model_path = ""  # path to your .pth model

model = torch.load(model_path, map_location=torch.device('cpu'))
model.eval()

input_names = ['input']
output_names = ['output']

x = torch.randn(1, 3, 224, 224, requires_grad=True)
torch.onnx.export(model, x, 'model.onnx', verbose=True, input_names=input_names, output_names=output_names)
```
To check the ONNX model, install onnxruntime:
```shell
pip install onnxruntime
```
Run the code below to check whether the ONNX model and the .pth model produce the same inference results:
```python
import onnxruntime
from onnxruntime.datasets import get_example
import torch

def to_numpy(tensor):
    return tensor.detach().cpu().numpy() if tensor.requires_grad else tensor.cpu().numpy()

dummy_input = torch.randn(1, 3, 224, 224, device='cpu')

onnx_model = get_example("")  # note: use the absolute path of the ONNX model here
sess = onnxruntime.InferenceSession(onnx_model)

onnx_result = sess.run(None, {'input': to_numpy(dummy_input)})
print(onnx_result)

model_path = ''  # path to your .pth model
model = torch.load(model_path, map_location=torch.device('cpu'))
model.eval()
pytorch_result = model(dummy_input)
print(pytorch_result)
```
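Rather than eyeballing the two printed outputs, you can compare them numerically with a tolerance; floating-point results from onnxruntime and PyTorch rarely match bit-for-bit. A minimal sketch, where the two arrays stand in for `onnx_result[0]` and `to_numpy(pytorch_result)`:

```python
import numpy as np

# Stand-ins for the real outputs: onnx_result[0] and to_numpy(pytorch_result)
onnx_out = np.array([[0.120001, 0.879999]], dtype=np.float32)
torch_out = np.array([[0.12, 0.88]], dtype=np.float32)

# Element-wise comparison with relative/absolute tolerances;
# raises AssertionError (with a diff summary) on mismatch.
np.testing.assert_allclose(onnx_out, torch_out, rtol=1e-3, atol=1e-5)
print('outputs match within tolerance')
```

If this check fails with large differences, the export itself is suspect (wrong input name, model not in `eval()` mode, etc.) and should be fixed before converting to RKNN.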
The conversion script (onnx2rknn.py) is as follows:
```python
from rknn.api import RKNN
import os

if __name__ == '__main__':
    platform = 'rk3588'

    '''step 1: create RKNN object'''
    rknn = RKNN()

    '''step 2: load the .onnx model'''
    rknn.config(target_platform='rk3588')
    print('--> Loading model')
    ret = rknn.load_onnx('')  # path to the ONNX model
    if ret != 0:
        print('load model failed')
        exit(ret)
    print('done')

    '''step 3: build the model'''
    print('--> Building model')
    ret = rknn.build(do_quantization=False)
    if ret != 0:
        print('build model failed')
        exit(ret)
    print('done')

    '''step 4: export and save the .rknn model'''
    RKNN_MODEL_PATH = 'out.rknn'  # name of the output RKNN model
    ret = rknn.export_rknn(RKNN_MODEL_PATH)
    if ret != 0:
        print('Export rknn model failed.')
        exit(ret)
    print('done')

    '''step 5: release the model'''
    rknn.release()
```
Run the script above (remember to activate the deploy environment first):
```shell
conda activate deploy
python onnx2rknn.py
```
After it runs successfully, the .rknn model is generated in the current directory.
Copy the rknn-toolkit2/rknn_toolkit_lite2 directory downloaded on the PC to the board.
Copy the .rknn model converted in the previous step to the board.
Install Miniconda on the board. Download address: Miniconda — miniconda documentation
Open a terminal in the download directory and run the following command to install:
```shell
bash Miniconda3-latest-Linux-aarch64.sh
```
Keep pressing Enter and answer yes when prompted (you need to type yes twice). When finished, check the conda version to confirm the installation succeeded:
```shell
conda --version
```
Create and activate a new environment (python=3.8):
```shell
conda create -n rknnlite python=3.8
conda activate rknnlite
```
Install rknn_toolkit_lite2:
```shell
cd rknn_toolkit_lite2/packages
pip install rknn_toolkit_lite2-1.5.2-cp38-cp38-linux_aarch64.whl  # adjust to match your version
```
Check that the installation succeeded; if no error is reported, it worked:

```shell
python
```

Then, in the Python interpreter:

```python
from rknnlite.api import RKNNLite
```
Test with the official example:
```shell
cd rknn_toolkit_lite2/examples/inference_with_lite
python test.py
```
If model initialization fails with an error, see: rk3588初始化模型出错的解决方案 (a fix for RK3588 model-initialization errors).
Example code:
```python
import cv2
import numpy as np
import platform
from rknnlite.api import RKNNLite

# device tree node for rk356x/rk3588
DEVICE_COMPATIBLE_NODE = '/proc/device-tree/compatible'

INPUT_SIZE = 224

RK3588_RKNN_MODEL = 'yourmodel.rknn'  # change this to the RKNN model converted earlier


if __name__ == '__main__':

    rknn_model = RK3588_RKNN_MODEL
    rknn_lite = RKNNLite()

    # load RKNN model
    print('--> Load RKNN model')
    ret = rknn_lite.load_rknn(rknn_model)
    if ret != 0:
        print('Load RKNN model failed')
        exit(ret)
    print('done')

    ori_img = cv2.imread('./pic/0/t0101d9da1203312457.jpg')
    resized_img = cv2.resize(ori_img, (224, 224))
    img = np.float32(resized_img)
    cv2.normalize(img, img, 1, 0, cv2.NORM_MINMAX)

    # init runtime environment
    print('--> Init runtime environment')
    ret = rknn_lite.init_runtime(core_mask=RKNNLite.NPU_CORE_0)
    if ret != 0:
        print('Init runtime environment failed')
        exit(ret)
    print('done')

    # Inference
    print('--> Running model')
    outputs = rknn_lite.inference(inputs=[img])
    print(outputs)
    print('done')

    rknn_lite.release()
```
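The `cv2.normalize(img, img, 1, 0, cv2.NORM_MINMAX)` call above rescales the image so that its minimum value maps to 0 and its maximum to 1. The equivalent in plain NumPy (using a random stand-in for the decoded image) looks like this:

```python
import numpy as np

# Random stand-in for a decoded 224x224 BGR image
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(224, 224, 3)).astype(np.float32)

# Min-max rescaling to [0, 1], matching what the NORM_MINMAX call produces here
lo, hi = img.min(), img.max()
norm = (img - lo) / (hi - lo)

print(norm.min(), norm.max())  # 0.0 1.0
```

Note that this preprocessing must match whatever normalization the model was trained with (e.g. per-channel mean/std instead of min-max); if the board-side results diverge from the PC-side ONNX results, a preprocessing mismatch is the first thing to check.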