Hi all, I have developed an SRT streaming pipeline for RGB-D sensor data. Here is the server side:
std::stringstream pipe_ss;
pipe_ss << "appsrc name=colorsrc format=3 is-live=true stream-type=0 ! video/x-raw, format=BGRA";
pipe_ss << ", width=" << color.width;
pipe_ss << ", height=" << color.height;
pipe_ss << ", framerate=" << fps << "/1";
pipe_ss << " ! videoconvert ! ";
pipe_ss << "openh264enc name=h264enc bitrate=10000000 complexity=0 rate-control=0 gop-size=5 ! h264parse name=h264parse ! ";
pipe_ss << "video/x-h264 ! queue max-size-bytes=0 max-size-buffers=1000 max-size-time=0 ! mux.video_0 ";
max-size-time=0 leaky=downstream! mux.video_0 ";
// depth stream
pipe_ss << "appsrc name=depthsrc format=3 is-live=true ! video/x-raw, format=GRAY16_LE";
pipe_ss << ", width=" << depth.width;
pipe_ss << ", height=" << depth.height;
pipe_ss << ", framerate=" << fps << "/1";
pipe_ss << " ! rawvideoparse format=gray16-le width=" << depth.width;
pipe_ss << " height=" << depth.height;
pipe_ss << " framerate=" << fps << "/1";
pipe_ss << " ! queue max-size-bytes=0 max-size-buffers=1000 max-size-time=0 leaky=downstream ! mux.video_1 ";
pipe_ss << "matroskamux name=mux streamable=true ! queue max-size-bytes=0 max-size-buffers=1000 max-size-time=0 leaky=downstream ! ";
pipe_ss << "srtsink auto-reconnect=true latency=200 wait-for-connection=true uri=srt://" << host_ip << ":" << stream_port << "?mode=caller";
and here is the client side:
std::stringstream ss;
ss << "srtsrc uri=srt://" << host_ip << ":" << stream_port << "?mode=listener latency=200";
ss << " ! matroskaparse ! matroskademux name=demux ";
// color stream
ss << "demux.video_0 ! queue ! h264parse name=h264parse ! openh264dec name=h264dec ! videoconvert ";
ss << "! video/x-raw, format=BGRA, framerate=" << fps << "/1 ";
ss << "! appsink name=colorsnk sync=true ";
// depth stream
ss << "demux.video_1 ! queue ! rawvideoparse format=gray16-le";
ss << " width=" << depth.width;
ss << " height=" << depth.height;
ss << " framerate=" << fps << "/1";
ss << " ! video/x-raw, format=GRAY16_LE ! appsink name=depthsnk sync=true ";
So what I am trying to do is mux the RGB and depth streams into a Matroska container on the server side and demux them on the client side.
While testing on the same PC, the above pipelines work really well. When I move the server to another PC, streaming is OK for a few seconds but then suddenly stops. Setting GST_DEBUG to 3 or 4 does not provide any meaningful information (at least nothing I could use to pinpoint the issue).
The server PC runs Windows 10, and the client PC runs Windows 11.
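For what it's worth, a bus check like the sketch below could be attached to both pipelines to surface ERROR/EOS messages that GST_DEBUG alone doesn't make obvious (check_bus is a placeholder name):

#include <gst/gst.h>

// Simplified sketch: drain pending ERROR / EOS messages from the pipeline bus.
void check_bus(GstElement *pipeline)
{
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg;
    while ((msg = gst_bus_pop_filtered(bus,
                    (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS)))) {
        if (GST_MESSAGE_TYPE(msg) == GST_MESSAGE_ERROR) {
            GError *err = nullptr;
            gchar *dbg = nullptr;
            gst_message_parse_error(msg, &err, &dbg);
            g_printerr("ERROR from %s: %s\n%s\n",
                       GST_OBJECT_NAME(msg->src), err->message, dbg ? dbg : "");
            g_clear_error(&err);
            g_free(dbg);
        } else {
            g_printerr("EOS received\n");
        }
        gst_message_unref(msg);
    }
    gst_object_unref(bus);
}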
Any insights are really appreciated. Thanks in advance!