Record and re-stream multiple RTSP streams in sync

I’ve found a few posts that are related to what I want to do, but haven’t seen a real solution. Basically, I want to record two or more live RTSP streams to MP4 files with splitmuxsink. Then I want to be able to stream two or more of those recordings over WebRTC, in sync with each other. They are security cameras, so they need to match pretty closely.

Currently, I’m creating a separate WebRTC pipeline for each file. The streams are out of sync by multiple seconds, which makes sense for two reasons: (1) the files didn’t necessarily start recording at exactly the same time, and (2) the WebRTC playback pipelines don’t start at exactly the same time.

The only thing I can think to try is one big pipeline with multiple webrtc elements (one per file), hoping that the shared clock will synchronize everything. Before I put the effort into testing that, I wanted to ask: does that seem possible?

With WebRTC, if you want things to be synchronized, you need to use a single webrtcbin (or webrtcsink) element for all the streams. They will appear on the web side as multiple streams in the same RTCPeerConnection.

On the GStreamer side, if your files are not synchronized, you may want to set the “offset” property on the pad after the decoder to give the pipeline the right synchronization; otherwise it will assume that they all start at the same time.

Thanks, I’ll give that a try.

I’m making some progress. The problem now is how to know which stream is which.

I see various fields on the client side (web browser) that might be of use. MediaStreamTrack has id and label, both of which are webrtctransceiver1, webrtctransceiver2, etc. RTCRtpTransceiver has a mid field such as video0 or video1.

My gut says label on MediaStreamTrack is the place to do this, and I think that’s defined by the msid in the SDP. Assuming I’m on the right track, I don’t really understand how to set it. I imagine that before I send the SDP to the client, I can replace the webrtctransceiver part of the msid values with my own identifiers. The problem is knowing which stream each one represents, unless they’re guaranteed to be in the same order as the elements in my pipeline: for instance, the first splitmuxsrc matches up to webrtctransceiver1, the second one matches up to webrtctransceiver2, etc.
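For reference, here’s the kind of client-side mapping I have in mind. This is just a sketch: it assumes each media section of the SDP carries one `a=mid:` line and one `a=msid:<stream id> <track id>` line, and the function name is my own.

```javascript
// Walk the SDP and build a map from each media section's mid
// (e.g. "video0") to the stream id announced in its msid line.
function midToMsid(sdp) {
  const map = {};
  let currentMid = null;
  for (const line of sdp.split(/\r?\n/)) {
    if (line.startsWith('a=mid:')) {
      currentMid = line.slice('a=mid:'.length).trim();
    } else if (line.startsWith('a=msid:') && currentMid !== null) {
      // a=msid:<stream id> <track id> -- keep only the stream id
      map[currentMid] = line.slice('a=msid:'.length).trim().split(' ')[0];
    }
  }
  return map;
}
```

Then in the `ontrack` handler I can look up `event.transceiver.mid` in that map to find my own identifier.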

Yes, they’re numbered in the order in which you create them.

Looks like it’s working. Thank you so much for your help.

One minor hiccup: the synchronization sort of works out of the box. By “sort of” I mean that all streams start at the latest start time of all the videos. It probably isn’t the end of the world, but it would be nice to:

  1. Start at the earliest start time of any of the videos.
  2. End at the latest end time of any of the videos.
  3. If a stream doesn’t have video for a certain time period, show something else. Perhaps using fallbackswitch.
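For (3), something like this per-camera fragment is what I have in mind. It’s untested; the `fs.sink`/`fs.fallback_sink` pad names and the `timeout` property come from my reading of the gst-plugins-rs fallbackswitch docs and may differ between versions, and `cam1-*.mp4` is a placeholder location.

```shell
# Decoded video feeds the main input of fallbackswitch; if no buffer
# arrives within the timeout (1s here, in nanoseconds), the live
# test-pattern branch is shown instead.
gst-launch-1.0 \
  fallbackswitch name=fs timeout=1000000000 ! videoconvert ! autovideosink \
  splitmuxsrc location="cam1-*.mp4" ! parsebin ! avdec_h264 ! fs.sink \
  videotestsrc is-live=true pattern=black ! fs.fallback_sink
```

In the real pipeline the fallbackswitch output would be re-encoded and sent through rtph264pay into webrtcbin rather than to autovideosink, since fallbackswitch operates on decoded buffers.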

And for reference, here is my current pipeline:

    webrtcbin name=webrtcbin
    splitmuxsrc name=02847ad5-c952-4b01-a9f1-2af39083a66f
        02847ad5-c952-4b01-a9f1-2af39083a66f.video_0 ! parsebin ! video/x-h264 ! rtph264pay ! webrtcbin.
    splitmuxsrc name=0e20f3ae-2d02-498f-b757-a6d424c0b807
        0e20f3ae-2d02-498f-b757-a6d424c0b807.video_0 ! parsebin ! video/x-h264 ! rtph264pay ! webrtcbin.
    splitmuxsrc name=14078e8b-e6ba-407d-856c-787f5456c6ee
        14078e8b-e6ba-407d-856c-787f5456c6ee.video_0 ! parsebin ! video/x-h264 ! rtph264pay ! webrtcbin.
    splitmuxsrc name=74cf888a-a871-4dab-865c-e6e5dde42a5a
        74cf888a-a871-4dab-865c-e6e5dde42a5a.video_0 ! parsebin ! video/x-h264 ! rtph264pay ! webrtcbin.
    splitmuxsrc name=dabf98dc-6335-4170-ac47-024278ec795a
        dabf98dc-6335-4170-ac47-024278ec795a.video_0 ! parsebin ! video/x-h264 ! rtph264pay ! webrtcbin.

I’ve been trying to set an offset, but it seems to do something odd. Here is my implementation (note that I’m using GStreamer-Sharp):

    private void AddOffsets()
    {
        // Earliest recording start across all streams; used as the zero point.
        var earliest =
            RecordingViewRequest.RecordingViewRequestItems.Min(p =>
                p.RecordingSearchResults.RecordingFiles.Min(t => t.StartDate));

        foreach (var item in RecordingViewRequest.RecordingViewRequestItems)
        {
            var start = item.RecordingSearchResults.RecordingFiles.Min(p => p.StartDate);

            // Pad offsets are expressed in nanoseconds.
            var offset = (long)(start - earliest).TotalNanoseconds;

            var splitMuxSrc = Pipeline!.GetElementByName(item.StreamID);

            // Apply the offset to src pads that appear later...
            splitMuxSrc.PadAdded += (_, args) =>
            {
                if (args.NewPad.Direction == PadDirection.Src)
                {
                    Logger.LogWarning("Setting offset: {offset}", offset);
                    args.NewPad.Offset = offset;
                }
            };

            // ...as well as to any src pads that already exist.
            foreach (Pad pad in splitMuxSrc.Pads)
            {
                if (pad.Direction == PadDirection.Src)
                {
                    Logger.LogWarning("Setting offset: {offset}", offset);
                    pad.Offset = offset;
                }
            }
        }
    }

What seems to be happening is that it takes whichever video starts first and replicates it across all streams.