Video play via Decklink with live monitoring via WebRTC

I’m building something that takes an H.264/H.265 video with audio and plays it out via a Decklink card. Everything works fine; however, I also want a way to remotely monitor what is happening live on a webpage. I’ve got WebRTC working for this, and it works OK. However, I believe my issues are around clocking and buffering: in the paused state, the Decklink element will show the first buffer once it’s buffered, but WebRTC will not. If I don’t sync with the clock, I can get it to somewhat work, in that it plays whatever is currently in the buffers and then pauses.

Is there a way to make this work?

I’ve tried various things – but currently I am using proxysink/proxysrc and a second pipeline to play the video out over WebRTC. Here’s a basic overview of the two pipelines:

gst-launch-1.0 urisourcebin name="filesrc0" uri="file://./video.ts" ! \
  decodebin3 name="dbin" dbin.video_0 ! queue name="q-dbin" ! \
  tee name="vidsrc" ! multiqueue max-size-bytes=104857600 max-size-buffers=0 max-size-time=1000000000 name="mq" mq. ! \
  videoscale ! video/x-raw,width=1920,height=1080 ! queue ! \
  videoconvert ! queue name="q-dl-vsink0" ! \
  decklinkvideosink name="dl-vsink0" device-number=0 mode=1080p5994 sync=true \
  vidsrc. ! mq. mq. ! \
  identity sync=true ! queue leaky=2 ! proxysink name=psink

gst-launch-1.0 proxysrc name=psrc ! queue ! videoconvert ! \
  queue ! videoscale ! video/x-raw,width=274,height=154 ! queue ! \
  webrtcsink run-signalling-server=false do-retransmission=false meta="meta,name=channelName" congestion-control=disabled
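One caveat worth noting with the proxy approach: proxysink and proxysrc only connect *within a single process* – they are tied together programmatically by setting proxysrc’s "proxysink" property, not by name lookup between separate gst-launch invocations. A minimal sketch of that (assuming the GStreamer Python bindings; fakesink/videotestsrc stand in for the real sinks/sources here):

```python
# Sketch: linking two pipelines in one process via proxysink/proxysrc.
# The element and pipeline contents are placeholders, not my real pipelines.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

play = Gst.parse_launch("videotestsrc is-live=true ! proxysink name=psink")
mon = Gst.parse_launch("proxysrc name=psrc ! queue ! fakesink")

# The actual link: hand the proxysink element to the proxysrc.
psink = play.get_by_name("psink")
psrc = mon.get_by_name("psrc")
psrc.set_property("proxysink", psink)

# For the two pipelines to stay in sync they should share a clock.
clock = Gst.SystemClock.obtain()
play.use_clock(clock)
mon.use_clock(clock)

mon.set_state(Gst.State.PLAYING)
play.set_state(Gst.State.PLAYING)
```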

For anyone finding this and hoping for an answer, it’s quite simple really: proxysink forwards events downstream. What I was actually looking for was intervideosink/intervideosrc. intervideosrc will keep outputting whatever the last buffer was, and I just set the timeout to -1 so I never get a black frame.

It’s a bit easier to use, too, since the two sides link automatically via the channel name.
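As a rough sketch (untested, and trimmed down from my real pipelines above), the proxy pair can be replaced with intervideosink/intervideosrc like this – the shared channel name does the linking:

```shell
# Playout pipeline: tee one branch into an intervideosink on channel "monitor"
gst-launch-1.0 urisourcebin uri="file://./video.ts" ! \
  decodebin3 name=dbin dbin.video_0 ! queue ! tee name=vidsrc \
  vidsrc. ! queue ! videoscale ! video/x-raw,width=1920,height=1080 ! videoconvert ! \
    decklinkvideosink device-number=0 mode=1080p5994 sync=true \
  vidsrc. ! queue leaky=2 ! videoconvert ! \
    intervideosink channel=monitor

# Monitoring pipeline: intervideosrc repeats the last buffer it received;
# timeout=-1 (as noted above) keeps it from cutting to black frames
gst-launch-1.0 intervideosrc channel=monitor timeout=-1 ! queue ! \
  videoconvert ! videoscale ! video/x-raw,width=274,height=154 ! queue ! \
  webrtcsink run-signalling-server=false congestion-control=disabled
```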