Trying to multiplex sources with differing segment start times


I’m trying to solve an issue I’ve come across with GstSegments and segment events, but I’m just confused as to what I should actually be doing.

Currently, I have a muxer element that I’ve written, which feeds buffers into a sink element that is also of my own creation. I’m trying to multiplex multiple audio and video feeds into a single transport connection. However, the segment start times of the source pipelines don’t match, which causes render problems in the sink: buffers from the “behind” source pipeline are clipped and dropped because of their PTS/DTS values.

+--------------+ +--------------+ +---------+ +------------+  +--- Segment start:        
|              | |              | |         | |            |  v    1000:00:00.000000     
| videotestsrc +-> videoconvert +-> x264enc +-> rtph264pay +----+            |           
|              | |              | |         | |            |    |  +-------+ | +--------+
+--------------+ +--------------+ +---------+ +------------+    +-->       | v |        |
                                                                   | mymux +---> mysink |
                 +--------------+ +---------+ +------------+    +-->       |   |        |
                 |              | |         | |            |    |  +-------+   +--------+
                 | audiotestsrc +-> opusenc +-> rtpopuspay +----+                        
                 |              | |         | |            |  ^     Segment start:       
                 +--------------+ +---------+ +------------+  +---- 0:00:00.000000       

When I start using actual video and audio sources, I want them to be able to sync, so I’d assume I need to make the segment events that the respective pipelines create share the same start time, but I can’t figure out how to set the segment start time on the pipelines. Or is this not the right approach? Should I instead be applying offsets to the PTS/DTS values in the muxer so they match?

All sources are live with no seeking ability (at least, I don’t envision there ever being a need to perform actual seeking). I’m a bit lost; what should I do?

An element that combines multiple streams should always compare the timestamps of those streams in “running time”, which is what is synchronized across streams. Use gst_segment_to_running_time{,_full}() to convert a buffer timestamp to running time.
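To illustrate, here is a minimal, self-contained sketch (not from the original post; the segment values are illustrative) showing why running time makes two streams with different segment starts directly comparable:

```c
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  GstSegment video_seg, audio_seg;
  GstClockTime v, a;

  gst_init (&argc, &argv);

  /* Mimic the situation in the diagram: one segment starting at
   * 1000 hours, the other at zero. */
  gst_segment_init (&video_seg, GST_FORMAT_TIME);
  video_seg.start = (guint64) 1000 * 3600 * GST_SECOND;
  gst_segment_init (&audio_seg, GST_FORMAT_TIME);
  audio_seg.start = 0;

  /* A buffer 20 ms into each segment maps to the same running time,
   * even though the raw PTS values differ by 1000 hours. */
  v = gst_segment_to_running_time (&video_seg, GST_FORMAT_TIME,
      video_seg.start + 20 * GST_MSECOND);
  a = gst_segment_to_running_time (&audio_seg, GST_FORMAT_TIME,
      audio_seg.start + 20 * GST_MSECOND);

  g_assert (v == a);
  return 0;
}
```

With a freshly initialized segment (rate 1.0, base 0), running time is simply `position - segment->start`, which is why both calls above yield 20 ms.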

The 1000-hour offset is there so that x264enc can produce buffers timestamped before the initial start time; it’s a consequence of allowing negative DTS values.

Look at most other muxers in GStreamer for the general idea of how to mux multiple streams together.

Thanks for the pointer. I’ve been able to take another look at this, and yes, your suggestion to use gst_segment_to_running_time is exactly what I was after. All the examples I could find were wrapped up in aggregator pads or RTP payloader base classes, which made it hard to see what was actually needed. Effectively, I’m fetching the segment event from the sink pad that delivered the buffer with gst_pad_get_sticky_event, parsing the segment with gst_event_parse_segment, and then doing GST_BUFFER_PTS (buf) = gst_segment_to_running_time (segment, GST_FORMAT_TIME, GST_BUFFER_PTS (buf)); which means the buffers get passed along immediately.
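Spelled out as a helper (a sketch, not the actual element code; the function name is made up, and the gst_buffer_make_writable and unref calls are additions beyond what was described, since the sticky event getter returns a reference and buffer metadata shouldn’t be written in place on a shared buffer):

```c
#include <gst/gst.h>

/* Rewrite a buffer's PTS from stream time to running time, using the
 * sticky segment event on the sink pad that delivered the buffer. */
static GstBuffer *
pts_to_running_time (GstPad *sinkpad, GstBuffer *buf)
{
  GstEvent *event;
  const GstSegment *segment = NULL;

  /* The segment event is sticky, so it can be fetched at any time
   * after it has arrived on the pad. */
  event = gst_pad_get_sticky_event (sinkpad, GST_EVENT_SEGMENT, 0);
  if (event == NULL)
    return buf;                 /* no segment yet; pass through */

  gst_event_parse_segment (event, &segment);

  buf = gst_buffer_make_writable (buf);
  GST_BUFFER_PTS (buf) =
      gst_segment_to_running_time (segment, GST_FORMAT_TIME,
          GST_BUFFER_PTS (buf));

  gst_event_unref (event);
  return buf;
}
```

The same treatment would apply to GST_BUFFER_DTS if downstream cares about it.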

This also obviously required me to create my own stream-start event and segment event when the src pad of my element is linked, and I’m not forwarding the stream-start or segment events received on the sink pads. The next element is a sink that doesn’t need any of that information anyway; it’s just handed buffers and immediately payloads them over the network.
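Creating those events might look roughly like this (a sketch under the assumptions above; the function name and the "mux" stream-id suffix are arbitrary):

```c
#include <gst/gst.h>

/* Push a manufactured stream-start and segment event out the muxer's
 * src pad, e.g. when the pad is linked or the element starts. */
static void
push_initial_events (GstElement *mux, GstPad *srcpad)
{
  GstSegment segment;
  gchar *stream_id;

  /* Stream-start must precede the segment event. */
  stream_id = gst_pad_create_stream_id (srcpad, mux, "mux");
  gst_pad_push_event (srcpad, gst_event_new_stream_start (stream_id));
  g_free (stream_id);

  /* A default [0, -1] TIME segment: the outgoing buffer timestamps
   * are already running times after the per-buffer conversion. */
  gst_segment_init (&segment, GST_FORMAT_TIME);
  gst_pad_push_event (srcpad, gst_event_new_segment (&segment));
}
```

Pushing a default TIME segment works here because, after the PTS rewrite, running time and stream time coincide on the src pad.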