My code
with open("video.txt", 'r', encoding='utf-8') as file:

    #video1 = []
    number1 = []
    number2 = []
    number3 = []
    number4 = []
    for i in file:
        #video1.append(i)
        n1 = ''
        n2 = ''
        t = 0

        for j in i:
            #print(type(j))
            #print(type('[\s]'))
            if j != " " and t == 0:
                n1 = n1 + j
            elif j == " " and t == 0:
                t = 1
            elif j != " " and t == 1:
                n2 = n2 + j
            elif j == " " and t == 1:
                break

        number1.append(int(n1))
        number2.append(int(n2))

for l in range(0, number1[len(number1)-1]):
    number3.append(0)
    number4.append(0)

m = 0
for k in range(0, len(number1)-1):
    number3[number1[k]-1] = number3[number1[k]-1] + 1
for k in range(0, len(number1)-1):
    if number2[k] > m:
        number4[number1[k]-1] = m
# number3 is the per-frame people count, number4 is the cumulative count

import cv2
import numpy as np

lx = 1000
ly = 1000
image = np.zeros([lx, ly, 0], dtype=np.uint8)
#image = np.ascontiguousarray(image)
for k in (1, len(number1)-1):
    # change in people flow
    cv2.line(image, (k - 1, ly - number3[k-1]), (k, ly - number3[k]), (255, 0, 0), 2)
    # cumulative people count
    cv2.line(image, (k - 1, ly - number4[k-1]), (k, ly - number4[k]), (0, 255, 0), 2)

cv2.imshow("image", image)
cv2.waitKey()
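As an aside, the character-by-character split in the reading loop can be done with str.split(); a minimal sketch, assuming each line of video.txt looks like "frame_index count" separated by whitespace (the same assumption the inner loop makes):

number1 = []
number2 = []
with open("video.txt", 'r', encoding='utf-8') as file:
    for line in file:
        parts = line.split()  # splits on any run of whitespace and drops the newline
        if len(parts) >= 2:
            number1.append(int(parts[0]))  # frame index
            number2.append(int(parts[1]))  # value read for that frame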
Main relevant code
import cv2
import numpy as np

lx = 1000
ly = 1000
image = np.zeros([lx, ly, 0], dtype=np.uint8)
#image = np.ascontiguousarray(image)
for k in (1, len(number1)-1):
    # change in people flow
    cv2.line(image, (k - 1, ly - number3[k-1]), (k, ly - number3[k]), (255, 0, 0), 2)
    # cumulative people count
    cv2.line(image, (k - 1, ly - number4[k-1]), (k, ly - number4[k]), (0, 255, 0), 2)

cv2.imshow("image", image)
cv2.waitKey()
My error
Traceback (most recent call last):
  File "F:\建院\大二\srtp天眼识迹\pic\main.py", line 52, in <module>
    cv2.line(image,(k - 1,ly-number3[k-1]),(k,ly-number3[k]),(255,0,0),2)
cv2.error: OpenCV(4.9.0) :-1: error: (-5:Bad argument) in function 'line'
> Overload resolution failed:
>  - Layout of the output array img is incompatible with cv::Mat
>  - Expected Ptr<cv::UMat> for argument 'img'
I found some articles saying the Mat is not contiguous and that np.ascontiguousarray() should be used, but as a beginner I can't make sense of it; inserting it at the commented-out spot in the code gives the same error.
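Judging from the message, the problem here is likely not contiguity: np.zeros([lx, ly, 0], dtype=np.uint8) allocates an array of shape (1000, 1000, 0), i.e. zero channels, and OpenCV cannot map a zero-channel array onto a cv::Mat, so np.ascontiguousarray() cannot help. A minimal sketch of the drawing part with a 3-channel BGR canvas instead (and range() in place of the two-element tuple (1, len(number1)-1), which only yields k = 1 and k = len(number1)-1):

import cv2
import numpy as np

lx = 1000
ly = 1000
# Shape is (rows, cols, channels); the channel count must be 1, 3, or 4
# for OpenCV to accept the array as an image. [lx, ly, 0] creates a
# zero-channel array, which is what makes overload resolution fail.
image = np.zeros([ly, lx, 3], dtype=np.uint8)

# range(...) instead of a tuple, so every frame gets drawn; this assumes
# one x pixel per frame and number3/number4 built as in the code above.
for k in range(1, len(number3)):
    # change in people flow (blue in BGR)
    cv2.line(image, (k - 1, ly - number3[k-1]), (k, ly - number3[k]), (255, 0, 0), 2)
    # cumulative people count (green in BGR)
    cv2.line(image, (k - 1, ly - number4[k-1]), (k, ly - number4[k]), (0, 255, 0), 2)

cv2.imshow("image", image)
cv2.waitKey()

np.zeros always returns a C-contiguous array, so ascontiguousarray() only matters for views produced by slicing or transposing; if grayscale output is enough, a single-channel np.zeros([ly, lx], dtype=np.uint8) would also satisfy cv::Mat.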