Stream OpenCV frames via RTSP using GStreamer (C++)

Hi everyone,

I am using OpenCV to read video from a USB camera.
I want to send the frames via RTSP using GStreamer.
This is my code for reading the video with OpenCV:

#include <iostream>
#include <string>
#include <unistd.h>

#include <opencv2/opencv.hpp>
#include <gst/gst.h>

int main(int argc, char *argv[]) {

    cv::VideoCapture cap("/dev/video2", cv::CAP_V4L2); 
    if (!cap.isOpened()) {
        std::cerr << "Error: Unable to open camera." << std::endl;
        return -1;
    }
    
    cv::Mat frame;
    while (true) {
        cap >> frame;
        if (frame.empty()) {
            break;
        }

        // send "frame" via rtsp (rtsp://localhost:8554/live)

        //imshow("Video Player", frame);
    }

    return 0;
}

Can you help me solve this problem?
Thanks

Could you tell us a bit more about the following?

  • Do you need to process frames with OpenCV? (A pure GStreamer pipeline may be more efficient for just streaming your camera.)
  • Do you already have an RTSP server running, or do you want to create an RTSP server just for that application? In the latter case you may install gst-rtsp-server and build the test-launch example for your GStreamer version; a usage sketch follows this list.
  • Assuming you've installed the v4l-utils package, what does the following give: v4l2-ctl --device=/dev/video2 --list-formats-ext ?
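For reference, serving the camera with test-launch and no OpenCV at all could look roughly like this (a sketch; the device path and encoder options are assumptions to adapt to your case):

./test-launch "( v4l2src device=/dev/video2 ! videoconvert ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"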

I want to use OpenCV to stream. I have tried from the command line and that works. Because I am using 2 cameras, I want to use OpenCV to read both cameras and stream them.

With GStreamer you can use compositor for stitching or composing the two cameras; if you don't want to further process frames with OpenCV, this might be better. See the sketch below.
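For example, composing two cameras side by side could look roughly like this (a sketch; the device paths, resolutions and display sink are assumptions to adapt to your setup):

gst-launch-1.0 compositor name=comp sink_1::xpos=640 ! videoconvert ! autovideosink \
  v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480 ! videoconvert ! comp.sink_0 \
  v4l2src device=/dev/video2 ! video/x-raw,width=640,height=480 ! videoconvert ! comp.sink_1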

For more precise advice, please also reply to points 2 and 3.

Could you give an example for the task: "read frames from the camera and stream them by RTSP (rtsp://127.0.0.1:8554/test)"? I searched but didn't find an example for this problem.

So I suppose you don't already have an RTSP server running. You may try the following example. Note that this requires that you have installed the gst-rtsp-server development package (such as sudo apt install libgstrtspserver-1.0-dev on Ubuntu), and that your OpenCV build has GStreamer support (the Python code below checks that; from C++ you can inspect cv::getBuildInformation()):

C++ version

#include <iostream>
#include <thread>
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>
#include <opencv2/opencv.hpp>
#include <opencv2/videoio.hpp>


int main() {

    /********************************************************
     * First create a RTSP server that would read RTP/H264
     * from localhost UDP port 5004 and stream as RTSP to
     * rtsp://127.0.0.1:8554/test
     ********************************************************/
    gst_init(NULL, NULL);
    GMainLoop *serverloop = g_main_loop_new(NULL, FALSE);
    GstRTSPServer *server = gst_rtsp_server_new();
    GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points(server);
    GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new();
    gst_rtsp_media_factory_set_launch(factory, "( udpsrc port=5004 ! application/x-rtp,encoding-name=H264 ! rtph264depay ! h264parse ! rtph264pay name=pay0 )");
    gst_rtsp_mount_points_add_factory(mounts, "/test", factory);
    gst_rtsp_server_attach(server, NULL);
    std::thread serverloopthread(g_main_loop_run, serverloop);
    std::cout << "stream ready at rtsp://127.0.0.1:8554/test" << std::endl;



    /********************************************************
     * Now RTSP server is running in its own thread, let's 
     * create an opencv application reading from the  
     * camera,encoding into H264 and sending as RTP/H264 
     * to localhost UDP/5004
     ********************************************************/
    //cv::VideoCapture camera("videotestsrc is-live=1 ! video/x-raw,format=BGR,width=640,height=480,framerate=30/1 ! queue ! appsink drop=1", cv::CAP_GSTREAMER);
    cv::VideoCapture camera(0);
    if (!camera.isOpened()) {
        std::cerr << "Failed to open camera. Exiting" << std::endl;
        g_main_loop_quit(serverloop);
        serverloopthread.join();
        return -1;
    }

    float fps = camera.get(cv::CAP_PROP_FPS);
    int w = camera.get(cv::CAP_PROP_FRAME_WIDTH);
    int h = camera.get(cv::CAP_PROP_FRAME_HEIGHT);
    if (w % 4) {
        std::cerr << "Width is not a multiple of 4. Unsupported" << std::endl;
        g_main_loop_quit(serverloop);
        serverloopthread.join();
        return -1;
    }
    if (h % 2) {
        std::cerr << "Height is not a multiple of 2. Unsupported" << std::endl;
        g_main_loop_quit(serverloop);
        serverloopthread.join();
        return -1;
    }

    cv::VideoWriter rtph264_writer("appsrc ! queue ! videoconvert ! video/x-raw,format=I420 ! x264enc key-int-max=30 insert-vui=1 tune=zerolatency ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=5004", cv::CAP_GSTREAMER, 0, fps, cv::Size(w, h));
    if (!rtph264_writer.isOpened()) {
        std::cerr << "Failed to open writer. Exiting" << std::endl;
        g_main_loop_quit(serverloop);
        serverloopthread.join();
        return -1;
    }

    while (true) {
        cv::Mat frame;
        camera >> frame;
        if (frame.empty()) {
            break;
        }
        rtph264_writer.write(frame);
    }

    rtph264_writer.release();
    camera.release();

    // Stop the RTSP server loop and join its thread before returning;
    // destroying a still-joinable std::thread would call std::terminate().
    g_main_loop_quit(serverloop);
    serverloopthread.join();

    return 0;
}

Built with:

gcc -Wall -o test test.cpp $(pkg-config --cflags --libs gstreamer-1.0) $(pkg-config --cflags --libs gstreamer-rtsp-server-1.0) $(pkg-config --cflags --libs opencv4)  -lstdc++

Python version

#######################################################
# First create a RTSP server that would read RTP/H264
# from localhost UDP port 5004 and stream as RTSP to
# rtsp://127.0.0.1:8554/test
#######################################################
import gi
gi.require_version('Gst','1.0')
gi.require_version('GstVideo','1.0')
gi.require_version('GstRtspServer','1.0')
from gi.repository import GLib, Gst, GstVideo, GstRtspServer
from threading import Thread

Gst.init(None)
serverloop = GLib.MainLoop()
server = GstRtspServer.RTSPServer()
mounts = server.get_mount_points()
factory = GstRtspServer.RTSPMediaFactory()
factory.set_launch('( udpsrc port=5004 ! application/x-rtp,encoding-name=H264 ! rtph264depay ! h264parse ! rtph264pay name=pay0 )')
mounts.add_factory("/test", factory)
server.attach(None)
server_loop_thread = Thread(target=serverloop.run)
print("stream ready at rtsp://127.0.0.1:8554/test")
server_loop_thread.start()



######################################################
# Now RTSP server is running in its own thread, let's 
# create an opencv application reading from the  
# camera,encoding into H264 and sending as RTP/H264 
# to localhost UDP/5004
######################################################
import cv2
import re
print('GStreamer support: %s' % re.search(r'GStreamer\:\s+(.*)', cv2.getBuildInformation()).group(1))
print('FFMPEG support: %s' % re.search(r'FFMPEG\:\s+(.*)', cv2.getBuildInformation()).group(1))
print('V4L/V4L2 support: %s' % re.search(r'v4l/v4l2\:\s+(.*)', cv2.getBuildInformation()).group(1))

# Initialize the camera
#camera = cv2.VideoCapture('videotestsrc is-live=1 ! video/x-raw,format=BGR,width=640,height=480,framerate=30/1 ! queue ! appsink drop=1', cv2.CAP_GSTREAMER)
camera = cv2.VideoCapture(0) 
if not camera.isOpened():
	print('Failed to open camera. Exiting')
	serverloop.quit()
	quit()
fps = float(camera.get(cv2.CAP_PROP_FPS))
w = int(camera.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(camera.get(cv2.CAP_PROP_FRAME_HEIGHT))
size = (w, h)
print(f"Camera opened: framing {(w)}x{(h)} @{fps} fps")
if w%4:
	print("Width is not a multiple of 4. Unsupported")
	serverloop.quit()
	quit()
if h%2:
	print("Height is not a multiple of 2. Unsupported")
	serverloop.quit()
	quit()

# Create a VideoWriter to localhost port 5004 as RTP/H264
rtph264_writer = cv2.VideoWriter('appsrc !  queue !  videoconvert ! video/x-raw,format=I420 ! x264enc key-int-max=30 insert-vui=1 tune=zerolatency ! queue ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=5004', cv2.CAP_GSTREAMER, 0, fps, (w,h))
if not rtph264_writer.isOpened():
	print('Failed to open writer. Exiting')
	serverloop.quit()
	quit()
print('Writer opened, streaming RTP/H264 to localhost:5004')

# Forever loop
while True:
	# Read a frame from the camera
	ret, frame = camera.read()

	if not ret:
		break

	# Write the frame to the RTP stream
	rtph264_writer.write(frame)

# Clean up
rtph264_writer.release()
camera.release()

# Stop the RTSP server loop so its thread (and the script) can exit
serverloop.quit()
server_loop_thread.join()

Don't worry about the OpenCV warnings about duration and position.

When your app is running, from another terminal you would test with:

gst-play-1.0 rtsp://127.0.0.1:8554/test

# Or for better latency:
gst-launch-1.0 playbin uri=rtsp://127.0.0.1:8554/test uridecodebin0::source::latency=0

Hi, I am working on the same scenario, but can we use GStreamer directly to write the buffers to appsrc, with the same pipeline you have suggested? I am using the code below to push:

data = frame.tobytes()                                # raw bytes of the OpenCV frame
buf = Gst.Buffer.new_allocate(None, len(data), None)  # allocate a GstBuffer of the same size
buf.fill(0, data)                                     # copy the frame data into it
source.emit('push-buffer', buf)                       # push it into the appsrc element

but when I try to connect with gst-play, the logs below show up:

Now playing rtsp://127.0.0.1:8554/test
Pipeline is live.
ERROR Unhandled error for rtsp://127.0.0.1:8554/test
ERROR debug information: …/gst/rtsp/gstrtspsrc.c(6795): gst_rtspsrc_send (): /GstPlayBin:playbin/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source:
Service Unavailable (503)
Reached end of play list.

Is there anything I am missing?

This isn't enough to specify a raw video frame: it's missing at least the width, height and framerate, and if your buffer has a rowstride other than the expected GStreamer default for the given width, you may also have to attach a GstVideoMeta to the buffers. See the sketch below.
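One way to fix this is to set explicit caps on the appsrc before pushing buffers (a minimal sketch; the element name source and the 640x480 @ 30 fps values are assumptions to replace with your own):

caps = Gst.Caps.from_string('video/x-raw,format=BGR,width=640,height=480,framerate=30/1')
source.set_property('caps', caps)               # tell appsrc exactly what the raw buffers contain
source.set_property('format', Gst.Format.TIME)  # interpret buffer timestamps as running time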

As far as I understand, the width, height and framerate are given by the VideoWriter arguments: ..., fps, cv::Size(w, h).
This allows the VideoWriter to adapt to the VideoCapture resolution and framerate.
This works fine in my case.
Though this caps string may be omitted, as BGR is the default format for color frames with VideoWriter; I'll edit my example and remove that.
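For illustration, the VideoWriter arguments should translate into caps on its internal appsrc roughly like this (a sketch, assuming a 640x480 @ 30 fps capture):

rtph264_writer = cv2.VideoWriter(pipeline, cv2.CAP_GSTREAMER, 0, 30.0, (640, 480))
# => appsrc caps: video/x-raw,format=BGR,width=640,height=480,framerate=30/1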