
Running a GStreamer pipeline from OpenCV to capture camera frames, process them, and hand them back to GStreamer for display (Qt implementation)

Preface

        I discovered by chance that OpenCV can run a GStreamer pipeline description directly and connect to it through appsink for further processing. After repeated testing, I implemented the following features one after another:

        1. OpenCV runs a GStreamer pipeline and receives the frames through appsink for display.

        2. OpenCV runs a GStreamer pipeline, then pushes the Mat image data back to GStreamer through appsrc for display.

        3. OpenCV processing is added, and the GStreamer video overlay is bound to a Qt QWidget for display.

I. Environment setup and a simple test demo

Notes:

  1. The GStreamer and OpenCV versions must be compatible. For example, I am currently using GStreamer 1.16.3 with OpenCV 4.2.0. I have not checked the exact compatibility matrix, but OpenCV 3.4 definitely does not work with GStreamer 1.16. (A quick way to check whether your OpenCV build has GStreamer support is shown right after these notes.)

  2. On the virtual machine the libraries live in /usr/lib/x86_64-linux-gnu/, while on the Orin they are in /usr/local/lib; the .pro file has to distinguish between the two.
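A quick way to confirm that the installed OpenCV was actually built with GStreamer support is to print the build information. A minimal sketch (my own addition, not part of the original demo):

    // check_gst.cpp - print OpenCV's build information and look for the GStreamer line
    #include <iostream>
    #include <opencv2/opencv.hpp>

    int main() {
        // getBuildInformation() lists every backend compiled into this OpenCV build;
        // the "Video I/O" section should contain a line such as "GStreamer: YES (1.16.3)"
        std::cout << cv::getBuildInformation() << std::endl;
        return 0;
    }

If that line says NO, the pipelines below will fail to open no matter how the pipeline string is written.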

Installing the OpenCV environment:

Option 1: build OpenCV from source. See, for example: "Jetson Orin NX 开发指南(5): 安装 OpenCV 4.6.0 并配置 CUDA 以支持 GPU 加速" (CSDN blog).

Option 2: install the OpenCV development packages via apt:

    sudo apt-get update
    sudo apt-get install libopencv-dev

I used option 2 myself, but with this approach you have to locate the header and library paths on your own.

To install the GStreamer environment, refer to other articles or use the following:

    sudo apt-get install -y libgstreamer1.0-0 \
        gstreamer1.0-plugins-base \
        gstreamer1.0-plugins-good \
        gstreamer1.0-plugins-bad \
        gstreamer1.0-plugins-ugly \
        gstreamer1.0-libav \
        gstreamer1.0-doc \
        gstreamer1.0-tools \
        libgstreamer1.0-dev \
        libgstreamer-plugins-base1.0-dev
    sudo apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev libqt5gstreamer-dev libgtk-3-dev
    sudo apt-get install libpng12-0
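Before involving OpenCV at all, a quick sanity check of the GStreamer installation (assuming gstreamer1.0-tools is installed, as in the list above) is to launch a test pattern from the command line:

    gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink

If a window with the test pattern appears, the basic GStreamer setup is working.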

Source code: have OpenCV run a GStreamer pipeline

.pro file:

    QT += core gui
    greaterThan(QT_MAJOR_VERSION, 4): QT += widgets

    DESTDIR = $$PWD/bin
    CONFIG += c++11
    DEFINES += QT_DEPRECATED_WARNINGS

    SOURCES += \
        main.cpp

    # Orin:
    CONFIG += link_pkgconfig
    PKGCONFIG += opencv4
    LIBS += -L/usr/local/lib -lopencv_core -lopencv_highgui -lopencv_imgproc -lopencv_imgcodecs
    INCLUDEPATH += \
        /usr/include/opencv2/

    # Virtual machine:
    #CONFIG += link_pkgconfig
    #PKGCONFIG += opencv4
    #LIBS += -L/usr/lib/x86_64-linux-gnu/ -lopencv_core -lopencv_highgui -lopencv_imgproc -lopencv_imgcodecs -lopencv_videoio
    #INCLUDEPATH += /usr/include/opencv4/opencv2

main.cpp:

Note: if you do not have a camera, replace v4l2src device=/dev/video0 below with videotestsrc.

    #include <opencv2/opencv.hpp>

    int main() {
        // GStreamer pipeline string
        std::string pipeline = "v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480 ! videoconvert ! video/x-raw,format=BGR ! appsink sync=0 drop=1";

        // Capture the video stream through GStreamer
        cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);

        // Check that the capture opened successfully
        if (!cap.isOpened()) {
            std::cout << "Error opening video stream" << std::endl;
            return -1;
        }

        // Read and display the video stream
        cv::Mat frame;
        while (true) {
            cap >> frame;                       // read a frame from the stream
            if (frame.empty()) break;           // exit the loop on an empty frame
            cv::imshow("Video", frame);         // show the frame
            if (cv::waitKey(1) == 27) break;    // exit on ESC
        }

        // Release the VideoCapture object
        cap.release();
        cv::destroyAllWindows();
        return 0;
    }

If the video plays, everything is set up correctly. (In the pipeline string, sync=0 stops appsink from synchronizing on the clock, and drop=1 allows it to drop old buffers instead of letting them pile up.)

II. OpenCV runs a GStreamer pipeline, receives frames through appsink for display, pushes them back to GStreamer through appsrc, and binds the video overlay to a QWidget

.pro file (adjust the libraries to your own environment):

    QT += core gui # multimediawidgets
    greaterThan(QT_MAJOR_VERSION, 4): QT += widgets

    DESTDIR = $$PWD/bin
    CONFIG += c++11
    DEFINES += QT_DEPRECATED_WARNINGS

    SOURCES += \
        main.cpp

    # Default rules for deployment.
    qnx: target.path = /tmp/$${TARGET}/bin
    else: unix:!android: target.path = /opt/$${TARGET}/bin
    !isEmpty(target.path): INSTALLS += target

    CONFIG += link_pkgconfig
    PKGCONFIG += opencv4
    LIBS += -L/usr/lib/x86_64-linux-gnu/ -lopencv_core -lopencv_highgui -lopencv_imgproc -lopencv_imgcodecs -lopencv_videoio
    INCLUDEPATH += /usr/include/opencv4/opencv2

    #*************************************************
    CONFIG += link_pkgconfig
    # Virtual machine environment: gstreamer ************************
    PKGCONFIG += gstreamer-1.0 gstreamer-plugins-base-1.0 opencv4 #gtk+-3.0
    LIBS += -lX11
    LIBS += -lglib-2.0
    LIBS += -lgobject-2.0
    LIBS += -lgstreamer-1.0      # <gst/gst.h>
    LIBS += -lgstvideo-1.0       # <gst/video/videooverlay.h>
    LIBS += -L/usr/lib/x86_64-linux-gnu/gstreamer-1.0
    LIBS += -lgstautodetect
    LIBS += -lgstaudio-1.0
    LIBS += -lgstapp-1.0
    LIBS += -L/usr/local/lib/ -lgstrtspserver-1.0
    LIBS += -ltesseract
    INCLUDEPATH += \
        /usr/include/glib-2.0 \
        /usr/lib/x86_64-linux-gnu/glib-2.0/include \
        /usr/include/gstreamer-1.0 \
        /usr/lib/x86_64-linux-gnu/gstreamer-1.0/include
    # ************************************

main.cpp:

Implementation approach:

OpenCV pulls frames from a capture pipeline that ends in appsink, optionally processes them, and then pushes the Mat data into a second pipeline that starts with appsrc and ends in an xvimagesink bound to a QWidget. Below are the relevant code fragments for each step:

1. OpenCV runs the GStreamer capture pipeline (the appsrc referenced on the last line is created in step 3):

    // Start capturing video from the camera (v4l2src device=/dev/video0) through appsink
    cv::VideoCapture capture("v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480 ! videoconvert ! video/x-raw,format=BGR ! appsink sync=0 drop=1", cv::CAP_GSTREAMER);
    cv::Mat orinFrame;
    gst_app_src_set_stream_type(GST_APP_SRC(appsrc), GST_APP_STREAM_TYPE_STREAM);

2. OpenCV displays the received frame and pushes the data to appsrc so the next GStreamer pipeline can consume it:

    while (capture.isOpened()) {
        capture.read(orinFrame);
        if (orinFrame.empty()) {
            break;
        }
        cv::imshow("Video", orinFrame); // show the original frame

        // Additional processing:
        cv::Mat frame;
        frame = orinFrame;
        //cv::bitwise_not(orinFrame, frame);                      // invert colors
        //cv::GaussianBlur(orinFrame, frame, cv::Size(5, 5), 0);  // Gaussian blur

        // Create a GStreamer buffer
        GstBuffer* buffer = gst_buffer_new_allocate(NULL, frame.total() * frame.elemSize(), NULL);
        GstMapInfo info;
        gst_buffer_map(buffer, &info, GST_MAP_WRITE);
        memcpy(info.data, frame.data, frame.total() * frame.elemSize());
        gst_buffer_unmap(buffer, &info);

        // Push the buffer to appsrc
        GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer);
        if (ret != GST_FLOW_OK) {
            g_printerr("Error pushing buffer to appsrc: %s\n", gst_flow_get_name(ret));
        }

        // Wait a little to simulate the frame rate
        cv::waitKey(30);
    }
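The buffers above are pushed without timestamps, which usually works because the sink simply renders untimestamped buffers as they arrive; elements that depend on buffer time (for example timeoverlay) will not behave well, though. If needed, each buffer can be stamped manually right before gst_app_src_push_buffer. A minimal sketch (my addition, assuming a fixed 30 fps):

    // Optional: give each buffer a presentation timestamp and a duration (assumes ~30 fps)
    static guint64 frame_count = 0;
    GST_BUFFER_PTS(buffer)      = gst_util_uint64_scale(frame_count, GST_SECOND, 30);
    GST_BUFFER_DURATION(buffer) = gst_util_uint64_scale(1, GST_SECOND, 30);
    frame_count++;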

3. Create the receiving GStreamer pipeline and hand it to a QWidget for display

    There are two ways to create the pipeline here:

        1. The first uses gst_parse_launch and then looks up the element pointers with gst_bin_get_by_name.

        2. The second assembles the pipeline from individual GstElement objects.

    First way:

    // Using gst_parse_launch
    // Create the GStreamer pipeline
    std::string pipeline_str = "appsrc name=source caps=\"video/x-raw,format=BGR,width=640,height=480\" ! videoconvert ! xvimagesink name=vsink2";
    GstElement* pipeline = gst_parse_launch(pipeline_str.c_str(), NULL);

    // Get the appsrc and sink elements
    GstElement* appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "source");
    GstElement* vsink2 = gst_bin_get_by_name(GST_BIN(pipeline), "vsink2");
    Second way:

    // Assembled from elements:
    GstElement* pipeline = gst_pipeline_new("test-pipeline");

    // Video parameters
    int width = 640;
    int height = 480;
    int fps = 30;

    // Create the appsrc element
    GstElement* appsrc = gst_element_factory_make("appsrc", "video-source");

    // Create the sink elements
    GstElement* fpssink = gst_element_factory_make("fpsdisplaysink", "fpssink");
    GstElement* vsink = gst_element_factory_make("xvimagesink", "vsink"); // glimagesink ximagesink xvimagesink
    GstElement* overlay = gst_element_factory_make("timeoverlay", "overlay");
    GstElement* converter = gst_element_factory_make("videoconvert", "converter");

    g_object_set(G_OBJECT(appsrc), "caps",
                 gst_caps_new_simple("video/x-raw",
                                     "format", G_TYPE_STRING, "BGR",
                                     "width", G_TYPE_INT, width,
                                     "height", G_TYPE_INT, height,
                                     "framerate", GST_TYPE_FRACTION, fps, 1,
                                     NULL),
                 "is-live", TRUE,
                 "format", GST_FORMAT_TIME,
                 NULL);

    // Add and link appsrc -> videoconvert -> xvimagesink
    gst_bin_add_many(GST_BIN(pipeline), appsrc, converter, vsink, NULL);
    if (!gst_element_link_many(appsrc, converter, vsink, NULL)) {
        g_printerr("Failed to link appsrc, videoconvert and xvimagesink!\n");
        gst_object_unref(pipeline);
        return -1;
    }
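One ownership detail worth noting (my addition, not from the original code): g_object_set takes its own reference to the caps, so the GstCaps created with gst_caps_new_simple can be released after the call, for example:

    GstCaps* caps = gst_caps_new_simple("video/x-raw",
                                        "format", G_TYPE_STRING, "BGR",
                                        "width", G_TYPE_INT, width,
                                        "height", G_TYPE_INT, height,
                                        "framerate", GST_TYPE_FRACTION, fps, 1,
                                        NULL);
    g_object_set(G_OBJECT(appsrc), "caps", caps, "is-live", TRUE, "format", GST_FORMAT_TIME, NULL);
    gst_caps_unref(caps); // g_object_set keeps the reference it needs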

4. Bind the Qt window and start the pipeline:

    // Bind to the Qt window
    gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(vsink2), xwinid);

    // Start the pipeline
    GstStateChangeReturn state_ret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
    if (state_ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr("Failed to start pipeline!\n");
        gst_object_unref(pipeline);
        return -1;
    }
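If the pipeline reaches PLAYING but nothing shows up in the window, the pipeline bus usually carries an error message that explains why. A small optional check (a sketch, my addition) that can be run right after the state change:

    // Optional: look for an immediate error message on the pipeline bus
    GstBus* bus = gst_element_get_bus(pipeline);
    GstMessage* msg = gst_bus_pop_filtered(bus, GST_MESSAGE_ERROR);
    if (msg) {
        GError* err = nullptr;
        gchar* dbg = nullptr;
        gst_message_parse_error(msg, &err, &dbg);
        g_printerr("Pipeline error: %s\n", err ? err->message : "unknown");
        g_clear_error(&err);
        g_free(dbg);
        gst_message_unref(msg);
    }
    gst_object_unref(bus);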

Notes:

  1. The receiving GStreamer pipeline does not strictly have to be bound to a QWidget; I bind it only to make it easier to add other widgets (buttons, labels, and so on) later.
  2. In the receiving pipeline, the caps must be set on appsrc.
  3. In the receiving pipeline, many plugins such as fpsdisplaysink and timeoverlay do not work properly, and I have not yet tracked down why. (One likely cause is that the buffers pushed from appsrc carry no timestamps, which those elements rely on; see the timestamping sketch in step 2.)

Full code:

    #include <QApplication>
    #include <QMainWindow>
    #include <opencv2/opencv.hpp>
    #include <gst/gst.h>
    #include <gst/app/gstappsrc.h>
    #include <gst/video/videooverlay.h>

    int main(int argc, char *argv[]) {
        QApplication app(argc, argv);

        // Initialize GStreamer
        GMainLoop *loop;
        // Set the GStreamer debug environment variable
        //g_setenv("GST_DEBUG", "4", TRUE);
        gst_init(&argc, &argv);
        loop = g_main_loop_new(NULL, FALSE); // created but not used; Qt's event loop drives the app

        // Create the window
        QWidget *window = new QWidget();
        window->resize(640, 480);
        window->show();
        WId xwinid = window->winId();

    #if 1
        // Using gst_parse_launch
        // Create the GStreamer pipeline
        std::string pipeline_str = "appsrc name=source caps=\"video/x-raw,format=BGR,width=640,height=480\" ! videoconvert ! xvimagesink name=vsink2";
        GstElement* pipeline = gst_parse_launch(pipeline_str.c_str(), NULL);

        // Get the appsrc and sink elements
        GstElement* appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "source");
        GstElement* vsink2 = gst_bin_get_by_name(GST_BIN(pipeline), "vsink2");
        //GstElement* vsink = gst_element_factory_make("glimagesink", "vsink");
        //g_object_set (vsink2, "video-sink", vsink, NULL);
    #else
        // Assembled from elements:
        GstElement* pipeline = gst_pipeline_new("test-pipeline");

        // Video parameters
        int width = 640;
        int height = 480;
        int fps = 30;

        // Create the appsrc element
        GstElement* appsrc = gst_element_factory_make("appsrc", "video-source");

        // Create the sink elements
        GstElement* fpssink = gst_element_factory_make("fpsdisplaysink", "fpssink");
        GstElement* vsink = gst_element_factory_make("xvimagesink", "vsink"); // glimagesink ximagesink xvimagesink
        GstElement* overlay = gst_element_factory_make("timeoverlay", "overlay");
        GstElement* converter = gst_element_factory_make("videoconvert", "converter");

        g_object_set(G_OBJECT(appsrc), "caps",
                     gst_caps_new_simple("video/x-raw",
                                         "format", G_TYPE_STRING, "BGR",
                                         "width", G_TYPE_INT, width,
                                         "height", G_TYPE_INT, height,
                                         "framerate", GST_TYPE_FRACTION, fps, 1,
                                         NULL),
                     "is-live", TRUE,
                     "format", GST_FORMAT_TIME,
                     NULL);

        // Add and link appsrc -> videoconvert -> xvimagesink
        gst_bin_add_many(GST_BIN(pipeline), appsrc, converter, vsink, NULL);
        if (!gst_element_link_many(appsrc, converter, vsink, NULL)) {
            g_printerr("Failed to link appsrc, videoconvert and xvimagesink!\n");
            gst_object_unref(pipeline);
            return -1;
        }
    #endif

        // Bind to the Qt window (when using the #else branch, pass vsink here instead of vsink2)
        gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(vsink2), xwinid);

        // Start the pipeline
        GstStateChangeReturn state_ret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
        if (state_ret == GST_STATE_CHANGE_FAILURE) {
            g_printerr("Failed to start pipeline!\n");
            gst_object_unref(pipeline);
            return -1;
        }

        // Start capturing video and pushing it to appsrc (v4l2src device=/dev/video0)
        cv::VideoCapture capture("v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480 ! videoconvert ! video/x-raw,format=BGR ! appsink sync=0 drop=1", cv::CAP_GSTREAMER);
        cv::Mat orinFrame;
        gst_app_src_set_stream_type(GST_APP_SRC(appsrc), GST_APP_STREAM_TYPE_STREAM);

        while (capture.isOpened()) {
            capture.read(orinFrame);
            if (orinFrame.empty()) {
                break;
            }
            cv::imshow("Video", orinFrame); // show the original frame

            // Additional processing:
            cv::Mat frame;
            frame = orinFrame;
            //cv::bitwise_not(orinFrame, frame);                      // invert colors
            //cv::GaussianBlur(orinFrame, frame, cv::Size(5, 5), 0);  // Gaussian blur

            // Create a GStreamer buffer
            GstBuffer* buffer = gst_buffer_new_allocate(NULL, frame.total() * frame.elemSize(), NULL);
            GstMapInfo info;
            gst_buffer_map(buffer, &info, GST_MAP_WRITE);
            memcpy(info.data, frame.data, frame.total() * frame.elemSize());
            gst_buffer_unmap(buffer, &info);

            // Push the buffer to appsrc
            GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer);
            if (ret != GST_FLOW_OK) {
                g_printerr("Error pushing buffer to appsrc: %s\n", gst_flow_get_name(ret));
            }

            // Wait a little to simulate the frame rate
            cv::waitKey(30);
        }

        // Stop capturing
        capture.release();

        // Keep the pipeline in the PLAYING state while the Qt event loop runs
        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        // Qt main loop
        app.exec();

        // Stop the pipeline
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        return 0;
    }
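A small cleanup note: gst_bin_get_by_name returns a new reference to each element it finds, so strictly speaking those references should also be released before exiting, e.g.:

    // Release the element references obtained with gst_bin_get_by_name
    gst_object_unref(appsrc);
    gst_object_unref(vsink2);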

Result: (screenshot omitted)

III. Extras: grayscale, Gaussian blur, face detection

    #include <QApplication>
    #include <QMainWindow>
    #include <opencv2/opencv.hpp>
    #include <gst/gst.h>
    #include <gst/app/gstappsrc.h>
    #include <gst/video/videooverlay.h>
    #include <opencv2/core.hpp>
    #include <opencv2/dnn.hpp>
    #include <opencv2/imgproc.hpp>
    #include <opencv2/highgui.hpp>

    using namespace std;
    using namespace cv;
    using namespace cv::dnn;

    int main(int argc, char *argv[]) {
        QApplication app(argc, argv);

        // Initialize GStreamer
        GMainLoop *loop;
        // Set the GStreamer debug environment variable
        //g_setenv("GST_DEBUG", "4", TRUE);
        gst_init(&argc, &argv);
        loop = g_main_loop_new(NULL, FALSE); // created but not used; Qt's event loop drives the app

        // Create the window
        QWidget *window = new QWidget();
        window->resize(640, 480);
        window->show();
        WId xwinid = window->winId();

    #if 1
        // Using gst_parse_launch
        // Create the GStreamer pipeline
        std::string pipeline_str = "appsrc name=source caps=\"video/x-raw,format=BGR,width=640,height=480\" ! videoconvert ! xvimagesink name=vsink2";
        GstElement* pipeline = gst_parse_launch(pipeline_str.c_str(), NULL);

        // Get the appsrc and sink elements
        GstElement* appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "source");
        GstElement* vsink2 = gst_bin_get_by_name(GST_BIN(pipeline), "vsink2");
        //GstElement* vsink = gst_element_factory_make("glimagesink", "vsink");
        //g_object_set (vsink2, "video-sink", vsink, NULL);
    #else
        // Assembled from elements:
        GstElement* pipeline = gst_pipeline_new("test-pipeline");

        // Video parameters
        int width = 640;
        int height = 480;
        int fps = 30;

        // Create the appsrc element
        GstElement* appsrc = gst_element_factory_make("appsrc", "video-source");

        // Create the sink elements
        GstElement* fpssink = gst_element_factory_make("fpsdisplaysink", "fpssink");
        GstElement* vsink = gst_element_factory_make("xvimagesink", "vsink"); // glimagesink ximagesink xvimagesink
        GstElement* overlay = gst_element_factory_make("timeoverlay", "overlay");
        GstElement* converter = gst_element_factory_make("videoconvert", "converter");

        g_object_set(G_OBJECT(appsrc), "caps",
                     gst_caps_new_simple("video/x-raw",
                                         "format", G_TYPE_STRING, "BGR",
                                         "width", G_TYPE_INT, width,
                                         "height", G_TYPE_INT, height,
                                         "framerate", GST_TYPE_FRACTION, fps, 1,
                                         NULL),
                     "is-live", TRUE,
                     "format", GST_FORMAT_TIME,
                     NULL);

        // Add and link appsrc -> videoconvert -> xvimagesink
        gst_bin_add_many(GST_BIN(pipeline), appsrc, converter, vsink, NULL);
        if (!gst_element_link_many(appsrc, converter, vsink, NULL)) {
            g_printerr("Failed to link appsrc, videoconvert and xvimagesink!\n");
            gst_object_unref(pipeline);
            return -1;
        }
    #endif

        // Bind to the Qt window (when using the #else branch, pass vsink here instead of vsink2)
        gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(vsink2), xwinid);

        // Start the pipeline
        GstStateChangeReturn state_ret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
        if (state_ret == GST_STATE_CHANGE_FAILURE) {
            g_printerr("Failed to start pipeline!\n");
            gst_object_unref(pipeline);
            return -1;
        }

        // Start capturing video and pushing it to appsrc (v4l2src device=/dev/video0)
        cv::VideoCapture capture("v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480 ! videoconvert ! video/x-raw,format=BGR ! appsink sync=0 drop=1", cv::CAP_GSTREAMER);
        cv::Mat orinFrame;
        gst_app_src_set_stream_type(GST_APP_SRC(appsrc), GST_APP_STREAM_TYPE_STREAM);

        // Load a CNN model (not used here)
        //Net net = readNetFromTensorflow("tensorflow_inception_graph.pb");

        // Load the Haar cascade once, before the loop (reloading it for every frame is far too slow);
        // adjust the path to wherever the haarcascade XML lives on your system
        CascadeClassifier faceCascade;
        if (!faceCascade.load("/usr/share/opencv4/haarcascades/haarcascade_frontalface_alt.xml"))
        {
            std::cerr << "Error loading face cascade file!" << std::endl;
            return -1;
        }

        while (capture.isOpened()) {
            capture.read(orinFrame);
            if (orinFrame.empty()) {
                break;
            }
            cv::imshow("Video", orinFrame); // show the original frame

            // Additional processing:
            cv::Mat frame;
            frame = orinFrame;
            //cv::bitwise_not(orinFrame, frame);                      // invert colors
            //cv::GaussianBlur(orinFrame, frame, cv::Size(5, 5), 0);  // Gaussian blur
            //cv::cvtColor(orinFrame, frame, cv::COLOR_BGR2GRAY);     // grayscale (note: produces a single-channel Mat, which no longer matches the BGR caps of appsrc)

            // Face detection: ************************************************
            //Mat frame(480, 640, CV_8UC3, Scalar(0, 0, 255)); // simulated input data (a solid red image)
            resize(frame, frame, Size(640, 480));
            Mat frameGray;
            cvtColor(frame, frameGray, COLOR_BGR2GRAY);
            equalizeHist(frameGray, frameGray);
            std::vector<Rect> faces;
            faceCascade.detectMultiScale(frameGray, faces);
            for (const auto& face : faces)
            {
                rectangle(frame, face, Scalar(255, 0, 0), 2); // draw the face bounding box
            }
            //*****************************************************************

            // Create a GStreamer buffer
            GstBuffer* buffer = gst_buffer_new_allocate(NULL, frame.total() * frame.elemSize(), NULL);
            GstMapInfo info;
            gst_buffer_map(buffer, &info, GST_MAP_WRITE);
            memcpy(info.data, frame.data, frame.total() * frame.elemSize());
            gst_buffer_unmap(buffer, &info);

            // Push the buffer to appsrc
            GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer);
            if (ret != GST_FLOW_OK) {
                g_printerr("Error pushing buffer to appsrc: %s\n", gst_flow_get_name(ret));
            }

            // Wait a little to simulate the frame rate
            cv::waitKey(30);
        }

        // Stop capturing
        capture.release();

        // Keep the pipeline in the PLAYING state while the Qt event loop runs
        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        // Qt main loop
        app.exec();

        // Stop the pipeline
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        return 0;
    }

Result: (screenshot omitted)

IV. More to come

        Time is limited, so this part will be filled in later. If you have ideas, you are welcome to discuss them with me!
