We are currently working on a project where we need to stream audio from an embedded board to multiple mobile phones.
As part of assessing audio quality we also need to measure the network latency. Can anyone please provide some pointers on how we can proceed with measuring network latency in such a setup?
We already tried Wireshark and tcpdump captures. With this method we can measure latency on Ubuntu machines, but these tools cannot be used on Android phones since they require rooting.
So it looks like we need to send a timestamp over the RTP headers. Is there any example code available for sending timestamps over RTP/UDP headers in GStreamer?
If you’re using webrtc, you can use the get-stats signal on webrtcbin and look at round-trip-time. Otherwise, the answer depends on what protocol you’re using, what elements you’re using, etc.
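A rough sketch of what reading that stat could look like from Python (assuming the GStreamer Python bindings; `structure_to_dict` is a hypothetical helper you would write yourself, and `find_rtt` is our own name):

```python
# Hedged sketch: reading round-trip-time from webrtcbin's get-stats signal.
# Assumes a recent GStreamer with Python bindings; "webrtc" below stands
# for the webrtcbin element in your own pipeline.

def find_rtt(stats):
    """Recursively search a stats mapping for a 'round-trip-time' field.

    Works on plain dicts so the logic is testable without GStreamer;
    in real code you would first convert the Gst.Structure reply
    into a dict like this (structure_to_dict below is hypothetical).
    """
    for key, value in stats.items():
        if key == "round-trip-time":
            return value
        if isinstance(value, dict):
            found = find_rtt(value)
            if found is not None:
                return found
    return None


def on_stats(promise, webrtc):
    # Called when the stats promise is resolved.
    reply = promise.get_reply()        # a Gst.Structure
    stats = structure_to_dict(reply)   # hypothetical helper, not shown
    rtt = find_rtt(stats)
    if rtt is not None:
        print(f"round-trip-time: {rtt:.3f} s")


def request_stats(webrtc):
    # webrtcbin exposes "get-stats" as an action signal; passing None
    # for the pad requests stats for all streams.
    from gi.repository import Gst  # imported lazily so find_rtt stays testable
    promise = Gst.Promise.new_with_change_func(on_stats, webrtc)
    webrtc.emit("get-stats", None, promise)
```

The round-trip-time field typically shows up inside the remote-inbound-rtp stats, hence the recursive search.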
Thanks for your response. We will also try out WebRTC.
Currently we are using RTP over UDP to keep things simple, since the number of clients is high and we are streaming from an embedded board.
Are there any GStreamer code examples that include a timestamp in the RTP/UDP header so we can measure latency at the receiving end by comparing timestamps?
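To be concrete, the comparison we have in mind is roughly the following (plain Python, independent of GStreamer; all names are ours, and one-way latency like this is only meaningful if the sender and receiver clocks are synchronized, e.g. via NTP):

```python
import struct
import time

# Sketch of the timestamp comparison we have in mind. The 8-byte value
# would travel with each packet (e.g. in an RTP header extension or in
# the payload); how it gets attached to the packet is out of scope here.

def make_timestamp() -> bytes:
    """Encode the current wall-clock time as a big-endian uint64 of microseconds."""
    return struct.pack("!Q", time.time_ns() // 1000)

def latency_us(payload: bytes) -> int:
    """Microseconds elapsed since the timestamp in `payload` was created."""
    (sent_us,) = struct.unpack("!Q", payload)
    return time.time_ns() // 1000 - sent_us
```

On the GStreamer side this could presumably be injected with a pad probe or the RTP header-extension API, but the measurement itself is just this subtraction.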
For RTP over UDP you have to use rtpbin and make sure that you are also sending and receiving RTCP. The documentation I linked above is a good resource, and so are the tests and examples in gst-plugins-good:
See also: rtpbin_buffer_list.c
Then you can use the stats of each RTP session (one for each RTP stream being sent or received, so in your case there would be two).
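For example, a sender/receiver pipeline pair with rtpbin and RTCP wired up in both directions might look roughly like this (untested sketch; the addresses, ports, and the choice of Opus are placeholders for your actual setup):

```shell
# Sender (sketch): RTP out on 5000, RTCP out on 5001, RTCP back in on 5005
gst-launch-1.0 rtpbin name=rtpbin \
  audiotestsrc ! audioconvert ! opusenc ! rtpopuspay ! rtpbin.send_rtp_sink_0 \
  rtpbin.send_rtp_src_0 ! udpsink host=192.168.1.50 port=5000 \
  rtpbin.send_rtcp_src_0 ! udpsink host=192.168.1.50 port=5001 sync=false async=false \
  udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0

# Receiver (sketch): RTP in on 5000, RTCP in on 5001, RTCP back out on 5005
gst-launch-1.0 rtpbin name=rtpbin \
  udpsrc port=5000 caps="application/x-rtp,media=audio,encoding-name=OPUS,payload=96,clock-rate=48000" ! \
    rtpbin.recv_rtp_sink_0 \
  rtpbin. ! rtpopusdepay ! opusdec ! audioconvert ! autoaudiosink \
  udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
  rtpbin.send_rtcp_src_0 ! udpsink host=192.168.1.40 port=5005 sync=false async=false
```

The key point is the second pair of udpsrc/udpsink elements: without the RTCP path in both directions, rtpbin cannot compute round-trip statistics.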
Thanks for the inputs. We tried rtpbin and obtained the RTT from RTCP packets. However, the RTT values are not consistent: they keep varying from 400 to 1600+ even though there is not much traffic on the network.
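One thing we are now double-checking on our side is the unit of those values. Per RFC 3550 the round-trip time is computed as A - LSR - DLSR, all in NTP "short" format (16.16 fixed point, i.e. 1/65536 of a second per unit). If the stats field we are reading is in those raw units (an assumption on our part), then 400 to 1600 would correspond to only about 6 to 24 ms, which would be quite normal. A small sanity-check sketch of the arithmetic (all names are ours):

```python
# RFC 3550 round-trip-time arithmetic, as a sanity check on units.
# A (packet arrival), LSR (last SR timestamp) and DLSR (delay since
# last SR) are all in NTP "short" format: 1 unit == 1/65536 second.

NTP_SHORT_UNIT = 1.0 / 65536.0

def rtt_seconds(arrival_ntp_short: int, lsr: int, dlsr: int) -> float:
    """RTT = A - LSR - DLSR, converted from NTP short units to seconds."""
    return (arrival_ntp_short - lsr - dlsr) * NTP_SHORT_UNIT

def ntp_short_to_ms(raw: int) -> float:
    """Interpret a raw value as NTP short units and return milliseconds."""
    return raw * NTP_SHORT_UNIT * 1000.0
```

With this interpretation, `ntp_short_to_ms(400)` is about 6.1 ms and `ntp_short_to_ms(1600)` about 24.4 ms.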
Below are the pipelines we are using: