- Loading /home/inference/Amplitudemode_AI/all_model_and_pred/xxx/segment/train3/weights/last.onnx for ONNX OpenCV DNN inference...
- [ERROR:0@3.062] global onnx_importer.cpp:1051 handleNode DNN/ONNX: ERROR during processing node with 2 inputs and 2 outputs: [Split]:(onnx_node!/model.22/Split) from domain='ai.onnx'
- Traceback (most recent call last):
- File "/home/inference/Amplitudemode_AI/all_model_and_pred/AI_Ribfrac_ths/onnx_test_seg/infer-seg.py", line 167, in <module>
- model = AutoBackend(weights="/home/inference/Amplitudemode_AI/all_model_and_pred/xxx/segment/train3/weights/last.onnx", dnn=True)
- File "/home/inference/miniconda3/envs/yolov8/lib/python3.10/site-packages/ultralytics/nn/autobackend.py", line 124, in __init__
- net = cv2.dnn.readNetFromONNX(w)
- cv2.error: OpenCV(4.7.0) /io/opencv/modules/dnn/src/onnx/onnx_importer.cpp:1073: error: (-2:Unspecified error) in function 'handleNode'
- > Node [Split@ai.onnx]:(onnx_node!/model.22/Split) parse error: OpenCV(4.7.0) /io/opencv/modules/dnn/src/layers/slice_layer.cpp:274: error: (-215:Assertion failed) splits > 0 && inpShape[axis_rw] % splits == 0 in function 'getMemoryShapes'
- >
The above is the error output produced when trying to load the model with OpenCV.
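For reference, the failure can be reproduced without going through ultralytics by calling OpenCV's ONNX importer directly; a minimal sketch, assuming the same weights path as in the traceback above:
- import cv2
-
- # With the original export this raises cv2.error, because the Split node
- # emitted by the default opset cannot be parsed by OpenCV 4.7.0 DNN.
- net = cv2.dnn.readNetFromONNX(
-     "/home/inference/Amplitudemode_AI/all_model_and_pred/xxx/segment/train3/weights/last.onnx")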
Next, I searched the issue tracker of the official YOLOv8 project on GitHub; after some trial and error, the search keywords that finally turned up an answer were:
ONNX DNN splits > 0 && inpShape[axis_rw] % splits == 0 in function 'getMemoryShapes
The fix found there is to set the following options when exporting (the key point is adding opset=11):
yolo mode=export model=runs/detect/train/weights/best.pt imgsz=[640,640] format=onnx opset=11
The actual conversion code used was:
- from ultralytics import YOLO
-
- model = YOLO(
- "/home/inference/Amplitudemode_AI/all_model_and_pred/xxx/segment/train3/weights/last.pt")
- success = model.export(format="onnx", opset=11, simplify=True) # export the model to onnx format
- assert success
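To confirm the re-exported model actually fixes the original OpenCV DNN failure, it can be loaded and run through cv2.dnn directly; a minimal sketch, assuming the new last.onnx from the export above, a 640x640 input size, and a hypothetical test image path (test.jpg):
- import cv2
-
- net = cv2.dnn.readNetFromONNX(
-     "/home/inference/Amplitudemode_AI/all_model_and_pred/xxx/segment/train3/weights/last.onnx")
-
- img = cv2.imread("test.jpg")  # any test image; this path is a placeholder
- blob = cv2.dnn.blobFromImage(img, scalefactor=1 / 255.0, size=(640, 640), swapRB=True)
- net.setInput(blob)
-
- # A yolov8-seg export has two output heads (detections + mask prototypes),
- # so request all unconnected output layers rather than just the first one.
- outputs = net.forward(net.getUnconnectedOutLayersNames())
- for out in outputs:
-     print(out.shape)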
Running inference on the converted ONNX model through the official API:
- from ultralytics import YOLO
- model = YOLO("/home/inference/Amplitudemode_AI/all_model_and_pred/xxx/segment/train3/weights/last.onnx") # 模型加载
- results = model.predict(
-     source='/home/inference/tt', imgsz=640, save=True, boxes=False)  # save plotted images without box overlays
Inference ran successfully.
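If the raw predictions are needed rather than just the saved plots, the returned Results objects expose the boxes and masks; a sketch of how I would read them back, assuming the ultralytics Results API (verify the attribute names against your installed version):
- for r in results:
-     # r.boxes holds the detections; r.masks is None if nothing was segmented
-     print(r.boxes.xyxy)  # (N, 4) box coordinates
-     print(r.boxes.conf)  # (N,) confidence scores
-     print(r.boxes.cls)   # (N,) class indices
-     if r.masks is not None:
-         print(r.masks.data.shape)  # (N, H, W) binary masks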