Muxing metadata over UDP Stream

Hi,

I am working on a project that would send metadata from my drone together with my video.
I have a CSI camera connected to a Jetson NX, and I was using OpenCV with a GStreamer pipeline until now, based on this post:

But I want to mux tracking-box data into the stream as metadata so I can draw it later on the ground. From searching around I found that the MPEG-TS (.ts) format is used for that. However, I failed to find any working example of using it with GStreamer on my Ubuntu PC, let alone on the Jetson NX.

I checked and GStreamer has some elements for doing that:

gst-inspect-1.0 | grep mpegts
mpegtsdemux:  tsparse: MPEG transport stream parser
mpegtsdemux:  tsdemux: MPEG transport stream demuxer
mpegtsmux:  mpegtsmux: MPEG Transport Stream Muxer
typefindfunctions: video/mpegts: ts, mts
libav:  avmux_mpegts: libav MPEG-TS (MPEG-2 Transport Stream) muxer (not recommended, use mpegtsmux instead)

It would be perfect if I could use it in OpenCV like in that post, as I want to add data that’s in my code.

Can someone help me achieve that?

For muxing into MPEG-TS, KLV metadata is probably the most convenient option. There are existing standards for that, such as MISB ST 0903.6 (Video Moving Target Indicator Metadata) and other MISB standards.
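At the byte level, KLV is just a 16-byte universal label key, a BER-encoded length, and the value bytes. A minimal sketch of the packing (the key below is a placeholder for illustration, not a real MISB universal label):

```python
def ber_length(n: int) -> bytes:
    """Encode a length in BER: short form below 128, long form otherwise."""
    if n < 128:
        return bytes([n])
    payload = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return bytes([0x80 | len(payload)]) + payload

def klv_pack(key: bytes, value: bytes) -> bytes:
    """Build one KLV triplet: 16-byte key + BER length + value."""
    assert len(key) == 16, "MISB universal labels are 16 bytes"
    return key + ber_length(len(value)) + value

# Placeholder key for illustration only -- use the real MISB ST 0903
# VMTI universal label in production.
DEMO_KEY = bytes(range(16))
packet = klv_pack(DEMO_KEY, b"\x01\x02\x03")
```

A MISB local set (like ST 0903) then nests tag/length/value items inside the value part, but the outer framing stays exactly this key-length-value shape.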

You can also embed custom metadata directly in the H.264/H.265 bitstream before muxing it into MPEG-TS, by inserting custom SEI messages into the video bitstream.

It all depends a bit on your interoperability requirements.

Also note that the link referenced in the original post was using RTP/H.264 over UDP.
If you mux KLV metadata into MPEG-TS, you may want to switch to RTP/MP2T.

I may not be able to help much further with this, but as a starting point for KLV you can try a web search for "gstreamer metadata mux demux narkive", which should turn up some useful pointers.

Sure, that MISB standard looks fine to me; it's always good to follow a standard, I think. KLV also seems easier for me and more flexible for future data I might add.
But how do I do that in code? I've been trying different approaches but I can't get it to work :confused:

It’s hard to help if you don’t share details about what you’re trying to do :slight_smile:

If you’re using MPEG-TS, you can feed KLV metadata into the muxer via an appsrc element. You may need to do the timestamping manually, or let the appsrc do it if it’s a live pipeline. You’ll also need to configure the appsrc to operate in TIME format.
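The manual-timestamping arithmetic is just frame index over framerate, in nanoseconds. A pure-Python sketch of that, with the surrounding PyGObject/GStreamer calls shown only in comments (element and property names are the standard GstAppSrc API, but this is an outline, not a tested pipeline):

```python
GST_SECOND = 1_000_000_000  # nanoseconds, same unit as Gst.SECOND

def klv_pts(frame_index: int, fps_num: int, fps_den: int = 1) -> int:
    """PTS in nanoseconds for the KLV buffer matching video frame N."""
    return frame_index * GST_SECOND * fps_den // fps_num

# In the real pipeline you would then do roughly:
#   buf = Gst.Buffer.new_wrapped(klv_bytes)
#   buf.pts = buf.dts = klv_pts(i, 30)
#   buf.duration = GST_SECOND // 30
#   appsrc.emit("push-buffer", buf)
# with the appsrc configured beforehand as:
#   appsrc.set_property("format", Gst.Format.TIME)
#   appsrc.set_property("caps",
#       Gst.Caps.from_string("meta/x-klv, parsed=true"))
```

Matching the KLV buffer's PTS to the video frame's PTS is what lets the muxer (and later the receiver) associate each tracking box with the right frame.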

If you’re using RTP instead of MPEG-TS, you can send KLV as a separate RTP data stream using the rtpklvpay and rtpklvdepay elements, and send the H.264 data in its own RTP stream. These can be sent from/to the same ports if needed; you would then use different payload type (pt) numbers so the streams can be peeled apart again on the receiver side.
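A receiver-side sketch of the two-stream variant, using separate ports to keep it gst-launch-friendly (ports and payload types are placeholders; on the sender side the KLV leg would normally be fed from appsrc in code, since rtpklvpay expects one complete KLV unit per buffer):

```shell
# Hypothetical receiver: video on port 5000, KLV on port 5001.
# SMPTE336M is the RTP encoding name for KLV (RFC 6597).
gst-launch-1.0 \
  udpsrc port=5000 \
    caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" \
    ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink \
  udpsrc port=5001 \
    caps="application/x-rtp,media=application,clock-rate=90000,encoding-name=SMPTE336M,payload=97" \
    ! rtpklvdepay ! fakesink dump=true
```

In an application you would replace the fakesink with an appsink and parse the KLV buffers there to draw the boxes.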


Hey, I’m experimenting with synchronizing data with video frames over WebRTC. Is it possible to send KLV using RTP and decode it on a WebRTC peer, to be used as a data stream alongside the video frames? I would be using GStreamer's WebRTC support to send the video and associated data, and the WebRTC JS API to receive and display it in a browser.