Hi,
I want to stream frames read from a RealSense camera. For that I'm using EmguCV (OpenCV) with a GStreamer pipeline, and I have discovered MediaMTX, which I'm using as the RTSP server.
I'm using qsvh264enc instead of x264enc because my laptop has Intel UHD 620 graphics, and with that element the encoding seems to run on the GPU (Quick Sync). I have also seen that I can mark the source as live with is-live. This is how I create the VideoWriter with the GStreamer backend:
// Pipeline: appsrc -> convert -> Quick Sync H.264 encode -> publish to MediaMTX over RTSP
var command = "appsrc is-live=true ! videoconvert ! queue ! qsvh264enc low-latency=true target-usage=7 ! h264parse ! rtspclientsink location=rtsp://localhost:8554/mystream";
var size = new Size(1280, 720);

// Select the GStreamer backend explicitly for the writer
var backends = CvInvoke.WriterBackends;
var backend = backends.FirstOrDefault(back => back.Name == "GSTREAMER");

_videoWriter = new VideoWriter(command, backend.ID, 0, fps: 30, size, isColor: true);
if (!_videoWriter.IsOpened)
{
    Console.WriteLine("Writer not opened!!");
}
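For context, I push frames roughly like this (simplified sketch: I use the Intel.RealSense .NET wrapper with a 1280x720 BGR8 color stream at 30 fps; the _streaming stop flag is just illustrative):

using Intel.RealSense;
using Emgu.CV;
using Emgu.CV.CvEnum;

using var rsPipe = new Pipeline();
var cfg = new Config();
cfg.EnableStream(Stream.Color, 1280, 720, Format.Bgr8, 30);  // match the writer's 1280x720 @ 30 fps
rsPipe.Start(cfg);

while (_streaming)  // _streaming: illustrative stop flag
{
    using var frames = rsPipe.WaitForFrames();
    using var color = frames.ColorFrame;

    // Wrap the RealSense color buffer in a Mat (no copy) and hand it to the GStreamer writer.
    using var mat = new Mat(color.Height, color.Width, DepthType.Cv8U, 3, color.Data, color.Stride);
    _videoWriter.Write(mat);
}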
With that pipeline it seems to work, but only if I set network-caching in VLC to around 500 ms. If I leave it at the default, I get noticeable latency.
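Concretely, playback looks acceptable when I start VLC with something like this (same stream path as in the pipeline above):

vlc --network-caching=500 rtsp://localhost:8554/mystream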
What would be the best pipeline configuration to reduce latency as much as possible on the viewer side?
Thanks for your help. Regards,