I want to eliminate the latency occurring in the queue element.

Currently we are using GstShark to measure per-element latency, and the queue element shows about 16 ms of latency.
Our expectation is that a queue should pass buffers through without adding latency.

Is there any solution to this problem?


Let me guess: you have a video sink after that queue and your stream is 60 fps? Then it's 100% normal.

Downstream (the sink) waits for the buffer's target running time before rendering it. At 60 fps, that interval is 16.6 ms.

It also means that your frames are arriving "faster" than real time (i.e. before their target running time).
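The arithmetic behind that figure is just the frame interval at 60 fps; a sink that renders on running time holds each buffer for roughly this long, which shows up upstream as queue "latency":

```python
# Frame interval for a given frame rate. A clock-synced sink holds each
# buffer until its running time arrives, so an upstream queue appears to
# "add" about one frame interval of latency in per-element measurements.
def frame_interval_ms(fps: float) -> float:
    return 1000.0 / fps

print(frame_interval_ms(60))  # ~16.67 ms, matching the ~16 ms GstShark reports
```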


When the max-size-buffers, max-size-bytes, and max-size-time properties of the queue element are all set to 1, the latency in the queue drops to almost 0.
Is this the correct way to configure the queue?
Also, is there any risk of functional problems from setting everything to 1?

By doing that you are essentially restricting the queue to a single buffer at a time.

  • If you are dealing with non-live sources, you just need to set max-size-buffers to 1, and all the others to 0 (unlimited).
  • If you are dealing with live sources, you need to set a limit in time for the overall pipeline latency calculations to work properly. Set the max-size-time to at least a frame duration and the other limits to 0 (unlimited).

Note that this throttles the rate at which upstream can process data.
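As a sketch, the two cases above could look like the following gst-launch-1.0 pipelines. The sources, sink, and file name are illustrative placeholders, not taken from the original thread; note that max-size-time is in nanoseconds:

```shell
# Non-live source: cap the queue at one buffer, leave the byte and
# time limits at 0 (unlimited).
gst-launch-1.0 filesrc location=video.mp4 ! decodebin ! \
    queue max-size-buffers=1 max-size-bytes=0 max-size-time=0 ! \
    autovideosink

# Live source: bound the queue in time instead, at least one frame
# duration (one 60 fps frame = 1/60 s ≈ 16666667 ns), with the other
# limits at 0 (unlimited), so pipeline latency calculations still work.
gst-launch-1.0 v4l2src ! \
    queue max-size-buffers=0 max-size-bytes=0 max-size-time=16666667 ! \
    autovideosink
```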