What exactly does the qos property on appsink do? (rust)

This should be an easy question, but I’m struggling to find the answer. appsink in gstreamer-rs has a qos property, but I can’t find anything about it in the actual GStreamer docs. Setting it to true does magically fix my issues, so I’m curious what it actually does…

What issue does it fix?

In GStreamer, Quality-of-Service (QoS) is a feedback mechanism: sink elements that sync their output to the pipeline clock (sync=true) send QoS events upstream for each buffer they receive, reporting how early or late that buffer arrived.

This is useful for playback pipelines: the pipeline can take measures when it’s not able to get data to the sink fast enough, e.g. skip some data to catch up after a temporary CPU spike elsewhere on the system.

This is usually enabled for video sinks, so that video decoders and other converter/effect elements can drop buffers they know will arrive too late anyway (video sinks will accept and render late buffers up to a point, as specified by the max-lateness property).
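
If you want to watch this happening from application code: elements that drop data because of QoS (e.g. a video decoder) post a QOS message on the pipeline bus. A minimal sketch, assuming you already have a gst::Pipeline:

use gstreamer as gst;
use gst::prelude::*;

// Sketch: print a line for every QOS message on the bus. Elements that
// drop data as a result of QoS (e.g. the video decoder) post these.
fn watch_qos(pipeline: &gst::Pipeline) {
    let bus = pipeline.bus().expect("pipeline has a bus");
    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        match msg.view() {
            gst::MessageView::Qos(_) => {
                println!("QOS from {:?}", msg.src().map(|s| s.path_string()));
            }
            gst::MessageView::Eos(_) | gst::MessageView::Error(_) => break,
            _ => (),
        }
    }
}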

I have a pipeline with a filesrc and a decodebin that feeds into an appsink, which then copies the buffer somewhere else (not really important). But for some files the video part of the pipeline can’t keep up, i.e. after a while it massively lags behind the audio. I first thought my appsink callback was just too slow, but I’ve checked: the callback finishes in less than 0.5 ms. The command line version (with autovideosink instead of the appsink) works fine.

So my pipeline looks something like this:

filesrc location={path} ! decodebin name=dec 
  dec. ! queue ! videoconvert ! queue ! appsink name=video caps=video/x-raw,format=RGBA
  dec. ! queue ! audioconvert ! audioresample ! autoaudiosink
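
On the Rust side it’s built roughly like this (simplified sketch using recent gstreamer/gstreamer-app crate versions; the actual copy step is stubbed out):

use gstreamer as gst;
use gstreamer_app as gst_app;
use gst::prelude::*;

fn build_pipeline() -> Result<gst::Pipeline, Box<dyn std::error::Error>> {
    gst::init()?;

    let pipeline = gst::parse::launch(
        "filesrc location={path} ! decodebin name=dec \
         dec. ! queue ! videoconvert ! queue ! appsink name=video caps=video/x-raw,format=RGBA \
         dec. ! queue ! audioconvert ! audioresample ! autoaudiosink",
    )?
    .downcast::<gst::Pipeline>()
    .unwrap();

    // Grab the appsink by the name given in the pipeline description.
    let appsink = pipeline
        .by_name("video")
        .unwrap()
        .downcast::<gst_app::AppSink>()
        .unwrap();

    appsink.set_callbacks(
        gst_app::AppSinkCallbacks::builder()
            .new_sample(|sink| {
                let sample = sink.pull_sample().map_err(|_| gst::FlowError::Eos)?;
                let buffer = sample.buffer().ok_or(gst::FlowError::Error)?;
                let map = buffer.map_readable().map_err(|_| gst::FlowError::Error)?;
                // ... copy map.as_slice() somewhere else; this takes < 0.5 ms ...
                let _ = map.as_slice();
                Ok(gst::FlowSuccess::Ok)
            })
            .build(),
    );

    Ok(pipeline)
}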

Setting qos=true somehow fixed my issue. It looks like it is enabled by default in the command line tools, but not in the Rust bindings?
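
For reference, the whole fix is this one line on the appsink from the sketch above:

// Make the appsink send QoS events upstream (off by default).
appsink.set_property("qos", true);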

I’m not sure what you mean by that. It’s an element property, and whether it’s enabled by default varies from element to element. For video sinks it’s usually enabled by default, whereas for appsink it’s not.
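
You can check the default for any element with gst-inspect-1.0, or from Rust (minimal sketch):

use gstreamer as gst;
use gst::prelude::*;

fn main() {
    gst::init().unwrap();
    // Read the default of the `qos` property from a freshly created appsink.
    let appsink = gst::ElementFactory::make("appsink").build().unwrap();
    println!("appsink qos default: {}", appsink.property::<bool>("qos")); // false
}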

From your description it sounds like the video decoder and/or the videoconvert are too slow and can’t keep up.

What kind of machine/hardware are you running this on?

What’s the format of the video? (codec/resolution/framerate)

What video decoder is picked in your case?

It’s an i7-14700 with 32 GB of RAM and a Radeon RX 6600, so I think the system should be more than capable of decoding the video in real time. Regarding the decoder: is there a good way to get diagnostics from a GStreamer pipeline in Rust?

Video file information:
video #1: video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3.2, profile=(string)high, codec_data=(buffer)01640020ffe1002a67640020ac2c85014016ec05a808080a000003000200000300c9c8c000e816000135735ef701f088451601000468f92bcb, width=(int)1280, height=(int)720, framerate=(fraction)50/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)bt709, coded-picture-structure=(string)frame, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true
  Tags:
    video codec: H.264 / AVC
    maximum bitrate: 5064690
    bitrate: 5064690
    container-specific-track-id: 1

  Codec:
    video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3.2, profile=(string)high, codec_data=(buffer)01640020ffe1002a67640020ac2c85014016ec05a808080a000003000200000300c9c8c000e816000135735ef701f088451601000468f92bcb, width=(int)1280, height=(int)720, framerate=(fraction)50/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)bt709, coded-picture-structure=(string)frame, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true
  Stream ID: 2f842f4519d1d40d04467f9551d6f34cc077b19c775bfaaca94ca61d0d69d33d/001
  Width: 1280
  Height: 720
  Depth: 24
  Frame rate: 50/1
  Pixel aspect ratio: 1/1
  Interlaced: false
  Bitrate: 5064690
  Max bitrate: 5064690
audio #2: audio/mpeg, mpegversion=(int)4, framed=(boolean)true, stream-format=(string)raw, level=(string)2, base-profile=(string)lc, profile=(string)lc, codec_data=(buffer)1190, rate=(int)48000, channels=(int)2
  Tags:
    audio codec: MPEG-4 AAC audio
    maximum bitrate: 130994
    bitrate: 127999
    language code: en
    container-specific-track-id: 2

  Codec:
    audio/mpeg, mpegversion=(int)4, framed=(boolean)true, stream-format=(string)raw, level=(string)2, base-profile=(string)lc, profile=(string)lc, codec_data=(buffer)1190, rate=(int)48000, channels=(int)2
  Stream ID: 2f842f4519d1d40d04467f9551d6f34cc077b19c775bfaaca94ca61d0d69d33d/002
  Language: en
  Channels: 2 (front-left, front-right)
  Sample rate: 48000
  Depth: 16
  Bitrate: 127999
  Max bitrate: 130994

Another possibility is that the default queue sizes are too small (though that seems unlikely at first glance given the stream info above), or perhaps the interleaving of the file is bad (what’s the container?).

Does using decodebin3 or setting the audio queue to queue max-size-time=0 max-size-buffers=0 max-size-bytes=0 change anything?
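
i.e. something like this (untested):

filesrc location={path} ! decodebin3 name=dec 
  dec. ! queue ! videoconvert ! queue ! appsink name=video caps=video/x-raw,format=RGBA
  dec. ! queue max-size-time=0 max-size-buffers=0 max-size-bytes=0 ! audioconvert ! audioresample ! autoaudiosink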

You could check the GST_DEBUG=videodecoder:LOG logs to see what decoder is being instantiated (the element name in the log will be a giveaway).
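
If you’d rather do it from Rust, here is a sketch that logs every element added anywhere inside the pipeline; connect it before setting the pipeline to PLAYING, since decodebin only adds the decoder once it has identified the streams:

use gstreamer as gst;
use gst::prelude::*;

// Sketch: log every element that gets added inside the pipeline,
// including elements decodebin creates internally (the decoder).
fn log_added_elements(pipeline: &gst::Pipeline) {
    pipeline.connect_deep_element_added(|_pipeline, _bin, element| {
        let factory = element
            .factory()
            .map(|f| f.name().to_string())
            .unwrap_or_default();
        println!("added: {} (factory {})", element.name(), factory);
    });
}

You can also dump the whole pipeline graph with gst::debug_bin_to_dot_file() once it’s running (the file is only written if the GST_DEBUG_DUMP_DOT_DIR environment variable is set), which shows exactly which elements decodebin instantiated.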