Does setting pad offset require mixer latency?

I think the important part to understand about latency in GStreamer is that in a live pipeline the running time of a buffer is its capture time. At that point the buffer is already exactly on time, and delaying it any further would make it late. The latency then gives an additional budget for the buffer to be processed further downstream until it arrives at its destination, so each element on the way adds its own latency to the (minimum!) latency reported in the latency query. That includes the source, which needs a moment between capturing a buffer and actually having it ready to send downstream.
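As a rough illustration, this is the usual pattern an element follows when handling the LATENCY query on its source pad: let upstream answer first, then add its own worst-case processing delay. The element type, the stored sinkpad field and the 10 ms figure below are placeholders, not something from the original question.

```c
#include <gst/gst.h>

/* Hypothetical filter element; only the fields needed for the sketch. */
typedef struct {
  GstElement parent;
  GstPad *sinkpad;
  GstPad *srcpad;
} MyFilter;

static gboolean
my_filter_src_query (GstPad * pad, GstObject * parent, GstQuery * query)
{
  MyFilter *filter = (MyFilter *) parent;

  if (GST_QUERY_TYPE (query) == GST_QUERY_LATENCY) {
    gboolean live;
    GstClockTime min, max;

    /* Let upstream (ultimately the live source) fill in its part first. */
    if (!gst_pad_peer_query (filter->sinkpad, query))
      return FALSE;

    gst_query_parse_latency (query, &live, &min, &max);

    /* Assume this element needs up to 10 ms from receiving a buffer to
     * pushing it out again, so that worst case is added to the minimum
     * latency reported downstream. */
    min += 10 * GST_MSECOND;
    if (max != GST_CLOCK_TIME_NONE)
      max += 10 * GST_MSECOND;

    gst_query_set_latency (query, live, min, max);
    return TRUE;
  }

  return gst_pad_query_default (pad, parent, query);
}
```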

When the latency on a pipeline is configured (which happens automatically by default), all sinks are queried for their latency, and the maximum of all minimum latencies is then configured on the whole pipeline. Streams with a lower latency are delayed inside their sink by as much as necessary to keep them in sync with the stream with the highest latency.
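If you want to see which values the pipeline ended up distributing, the application can run the same query against the pipeline once it is playing. A minimal sketch; the pipeline description is just an arbitrary live example:

```c
#include <gst/gst.h>

int
main (int argc, char ** argv)
{
  GstElement *pipeline;
  GstQuery *query;
  gboolean live;
  GstClockTime min, max;

  gst_init (&argc, &argv);

  pipeline = gst_parse_launch (
      "audiotestsrc is-live=true ! audioconvert ! autoaudiosink", NULL);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  gst_element_get_state (pipeline, NULL, NULL, GST_CLOCK_TIME_NONE);

  /* The pipeline answers with the values it distributed to its sinks:
   * min is the maximum of all the sinks' minimum latencies. */
  query = gst_query_new_latency ();
  if (gst_element_query (pipeline, query)) {
    gst_query_parse_latency (query, &live, &min, &max);
    g_print ("live: %d, min: %" GST_TIME_FORMAT ", max: %" GST_TIME_FORMAT "\n",
        live, GST_TIME_ARGS (min), GST_TIME_ARGS (max));
  }
  gst_query_unref (query);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```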

The minimum latency in the latency query is also somewhat confusingly named. It is not the minimum latency that is introduced but the maximum latency that is introduced in the worst case; in other words, it is the minimum latency that downstream has to compensate for so that buffers can still be on time and are not all considered too late.
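"Compensate" here means that a sink does not render a buffer before its capture time plus the configured latency has passed on the pipeline clock. A simplified sketch of that calculation; GstBaseSink additionally takes things like render-delay and ts-offset into account:

```c
#include <gst/gst.h>

/* Clock time at which a sink presents a buffer, simplified. */
static GstClockTime
render_time_on_the_clock (GstClockTime base_time,
    GstClockTime buffer_running_time, GstClockTime configured_latency)
{
  /* Capture time expressed on the pipeline clock ... */
  GstClockTime on_time = base_time + buffer_running_time;

  /* ... plus the budget granted for processing along the way. Waiting
   * until this clock time keeps streams with less accumulated latency
   * in sync with the slowest stream. */
  return on_time + configured_latency;
}
```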

The maximum latency in the latency query, on the other hand, is the amount of buffering that the elements along the way can provide. Delaying a stream to match a higher overall latency requires buffering somewhere (e.g. if the video stream has 200ms of latency and the audio stream 100ms, the audio has to be delayed by another 100ms to keep both streams in sync), so the maximum latency always needs to be at least as high as the configured latency (the maximum of all minimum latencies).
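A small sketch of the arithmetic behind that example. The 200ms and 100ms minimum latencies are the ones from the text; the maximum latencies are made-up values, chosen here so the check fails and shows what happens when there is not enough buffering:

```c
#include <gst/gst.h>

int
main (void)
{
  GstClockTime video_min = 200 * GST_MSECOND, video_max = 400 * GST_MSECOND;
  GstClockTime audio_min = 100 * GST_MSECOND, audio_max = 150 * GST_MSECOND;

  /* The latency configured on the pipeline is the maximum of all
   * minimum latencies: 200 ms here. */
  GstClockTime configured = MAX (video_min, audio_min);

  /* Each branch must be able to buffer at least that much, so the
   * combined maximum latency (the minimum of the individual maximums,
   * 150 ms here) must not be smaller than the configured latency. */
  GstClockTime max_buffering = MIN (video_max, audio_max);

  if (max_buffering < configured)
    g_print ("not enough buffering on the audio branch: "
        "e.g. add a queue so it can be delayed by another 100 ms\n");

  return 0;
}
```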