Hello,
I’m trying to solve an issue I’ve come across with GstSegments and segment events, but I’m just confused as to what I should actually be doing.
Currently, I have a muxer element that I’ve written which feeds buffers into a sink element, also of my own creation. I’m trying to multiplex multiple audio and video feeds into a single transport connection. However, the segment start times for the pipelines don’t match, which is causing rendering problems in the sink: buffers from the “behind” source pipeline are being clipped and dropped because of their PTS/DTS values.
+--------------+   +--------------+   +---------+   +------------+     Segment start:
| videotestsrc +-->| videoconvert +-->| x264enc +-->| rtph264pay +-+   1000:00:00.000000
+--------------+   +--------------+   +---------+   +------------+ |
                                                                   |  +-------+    +--------+
                                                                   +->|       |    |        |
                                                                      | mymux +--->| mysink |
                                                                   +->|       |    |        |
                                                                   |  +-------+    +--------+
+--------------+   +---------+   +------------+                   |
| audiotestsrc +-->| opusenc +-->| rtpopuspay +--------------------+   Segment start:
+--------------+   +---------+   +------------+                       0:00:00.000000
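For reference, this is roughly what I understand to be happening inside the sink when it clips. It’s a simplified sketch rather than my actual code (handle_buffer is just a placeholder name): a buffer whose timestamps fall entirely outside the GstSegment the sink is holding fails gst_segment_clip() and gets thrown away, which is what the buffers from the branch with the other segment start run into.

#include <gst/gst.h>

/* Simplified sketch of segment clipping in a sink, not my real code:
 * if the buffer's timestamps fall outside the GstSegment the sink
 * last received, gst_segment_clip() fails and the buffer is dropped. */
static GstFlowReturn
handle_buffer (const GstSegment * segment, GstBuffer * buf)
{
  guint64 start = GST_BUFFER_PTS (buf);
  guint64 stop = start;
  guint64 clip_start, clip_stop;

  if (GST_BUFFER_DURATION_IS_VALID (buf))
    stop += GST_BUFFER_DURATION (buf);

  /* e.g. a PTS near 0:00:01 tested against a segment starting at
   * 1000:00:00 is completely outside the segment */
  if (!gst_segment_clip (segment, GST_FORMAT_TIME, start, stop,
          &clip_start, &clip_stop)) {
    gst_buffer_unref (buf);
    return GST_FLOW_OK;         /* buffer silently dropped */
  }

  /* ... otherwise render the (possibly trimmed) buffer ... */
  gst_buffer_unref (buf);
  return GST_FLOW_OK;
}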
When I start using actual video and audio sources, I want them to be able to sync, so I’d assume I need to make the segment events that the respective pipelines create match in start time, but I can’t figure out how to set the segment start time on the pipelines. Or is this not the right approach? Should I instead be applying offsets to the PTS/DTS values in the muxer so that they match?
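For what it’s worth, this is the kind of thing I had in mind for the “offsets in the muxer” option: cache the GstSegment each sink pad receives and rewrite buffer timestamps to running time before multiplexing, so both branches land on a common timeline regardless of their segment starts. It’s only a rough sketch under my own assumptions (MyMuxPad, my_mux_sink_event and my_mux_sink_chain are made-up names, and DTS, latency and error handling are left out).

#include <gst/gst.h>

typedef struct
{
  GstSegment segment;           /* last segment seen on this sink pad */
} MyMuxPad;

static gboolean
my_mux_sink_event (GstPad * pad, GstObject * parent, GstEvent * event)
{
  MyMuxPad *mpad = gst_pad_get_element_private (pad);

  if (GST_EVENT_TYPE (event) == GST_EVENT_SEGMENT) {
    const GstSegment *segment;

    /* remember the upstream segment instead of forwarding it as-is;
     * the muxer pushes its own 0-based TIME segment downstream */
    gst_event_parse_segment (event, &segment);
    gst_segment_copy_into (segment, &mpad->segment);
    gst_event_unref (event);
    return TRUE;
  }
  return gst_pad_event_default (pad, parent, event);
}

static GstFlowReturn
my_mux_sink_chain (GstPad * pad, GstObject * parent, GstBuffer * buf)
{
  MyMuxPad *mpad = gst_pad_get_element_private (pad);
  GstClockTime rt;

  /* map the branch-local PTS into running time; both branches share
   * the same running-time origin, so their buffers become comparable
   * (DTS would need the same treatment) */
  rt = gst_segment_to_running_time (&mpad->segment, GST_FORMAT_TIME,
      GST_BUFFER_PTS (buf));

  buf = gst_buffer_make_writable (buf);
  GST_BUFFER_PTS (buf) = rt;

  /* ... interleave with the other pad's buffers and push downstream
   * (ownership of buf passes to whatever does that) ... */
  return GST_FLOW_OK;
}

The other variant I was wondering about is simply calling gst_pad_set_offset() on one of the pads to shift its running time by a fixed amount, but I don’t know if that’s the intended tool here or if the segment events themselves are supposed to be made to match.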
All sources are live with no seeking ability (at least, I don’t envision there ever being a need to perform an actual seek). I’m a bit lost; what should I do?