I’m streaming raw video frames in BGR format using GStreamer in C++. Below are the sender (C++) and the receiver (run in a terminal):
Sender (C++)
// Define pipeline
std::string pipeline = "appsrc name=mysource ! queue ! video/x-raw,format=BGR,width=1280,height=720 ! queue ! rtpvrawpay ! udpsink auto-multicast=0 host=Receiver_IP port=5004";
// Configure appsrc
g_object_set(G_OBJECT(appsrc_), "caps", gst_caps_new_simple("video/x-raw", "format", G_TYPE_STRING, "BGR", "width", G_TYPE_INT, width_, "height", G_TYPE_INT, height_, "framerate", GST_TYPE_FRACTION, 5, 1, nullptr), nullptr);
Receiver (terminal)
$ gst-launch-1.0 udpsrc address=<Receiver_IP> port=5004 caps="application/x-rtp, media=(string)video, encoding-name=(string)RAW, sampling=(string)BGR, width=(string)1280, height=(string)720, framerate=(string)5/1, depth=(string)8, payload=(int)96" ! queue ! rtpvrawdepay ! videoconvert ! autovideosink
When I run both the sender and the receiver on my laptop (13th Gen Intel i7, 32 GB RAM), latency is negligible (essentially real-time). However, when I move the sender and receiver to separate devices connected over WiFi (a local network), I observe a significant latency of roughly 6-7 seconds. For context, the sender is a small device with a dual-core x86 Intel processor, 4 GB RAM, and standard WiFi. While streaming, CPU usage on both of the sender's cores reaches 80-90%, and htop shows multiple GStreamer instances running.
The first idea that comes to mind for reducing this latency to a reasonable value is downscaling the video frames. What other tips and tricks would you suggest to minimize latency in this setup?
Thanks for your suggestions