Hello everyone!
We run a pipeline with a custom filter element that drops buffers based on some condition. When a buffer is dropped, we push a gap event out of our filter's src pad:
```rust
let gap_event = Gap::builder(pts).duration(buffer.duration()).build();
self.obj().src_pads().first().unwrap().push_event(gap_event);
```
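For context, the drop-or-forward logic looks roughly like this (a condensed, self-contained sketch, not our exact code; `drop_or_forward` and `should_drop` are stand-ins for our real handler and condition):

```rust
use gstreamer as gst;
use gst::prelude::*;

// Sketch: either the buffer is pushed downstream, or it is replaced
// by a gap event covering its timestamps.
fn drop_or_forward(
    srcpad: &gst::Pad,
    buffer: gst::Buffer,
) -> Result<gst::FlowSuccess, gst::FlowError> {
    if should_drop(&buffer) {
        if let Some(pts) = buffer.pts() {
            // One gap event per dropped buffer, so downstream
            // knows that time keeps advancing.
            let gap_event = gst::event::Gap::builder(pts)
                .duration(buffer.duration())
                .build();
            let _ = srcpad.push_event(gap_event);
        }
        Ok(gst::FlowSuccess::Ok)
    } else {
        srcpad.push(buffer)
    }
}

fn should_drop(_buffer: &gst::Buffer) -> bool {
    true // placeholder for our real condition
}
```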
After the filter element, we have a jpegenc to save a frame snapshot:

```
... ! custom-filter ! jpegenc ! ...
```
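(For a concrete shape, with placeholder source and sink elements standing in for our real ones, the pipeline is along the lines of:)

```
gst-launch-1.0 videotestsrc ! custom-filter ! jpegenc ! multifilesink location=frame-%05d.jpg
```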
With memory profiling, we see that memory usage grows without bound if we filter out all buffers. The allocation that is never freed comes from gst_event_new_gap. After looking into the code, we saw that the GstVideoEncoder base class (subprojects/gst-plugins-base/gst-libs/gst/video/gstvideoencoder.c) stores events, probably to send them out together with the next frame:
```c
encoder->priv->current_frame_events =
    g_list_prepend (encoder->priv->current_frame_events, event);
```
But the next frame never arrives because of our custom filter, so this list grows without bound unless our filter lets another frame through, in which case memory usage drops back.
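One can confirm this from outside the element by counting gap events arriving at the encoder's sink pad with a pad probe (a diagnostic sketch, not from our actual code; `install_gap_probe` and the logging interval are illustrative, and it assumes a handle to the jpegenc instance):

```rust
use gstreamer as gst;
use gst::prelude::*;
use std::sync::atomic::{AtomicU64, Ordering};

static GAP_COUNT: AtomicU64 = AtomicU64::new(0);

// Diagnostic sketch: count gap events flowing into the encoder's sink
// pad. With every buffer dropped upstream, this counter keeps growing
// while no buffers pass, matching the growth of `current_frame_events`.
fn install_gap_probe(encoder: &gst::Element) {
    let sinkpad = encoder.static_pad("sink").expect("encoder has a sink pad");
    let _ = sinkpad.add_probe(gst::PadProbeType::EVENT_DOWNSTREAM, |_pad, info| {
        if let Some(gst::PadProbeData::Event(ref event)) = info.data {
            if event.type_() == gst::EventType::Gap {
                let n = GAP_COUNT.fetch_add(1, Ordering::Relaxed) + 1;
                if n % 1000 == 0 {
                    println!("{} gap events reached the encoder", n);
                }
            }
        }
        gst::PadProbeReturn::Ok
    });
}
```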
In our case, the stream is very sparse: it is possible that all frames end up filtered out, and the pipeline may run for months. In that case this behaves just like a memory leak, even though, strictly speaking, it isn't one.
Maybe we are misusing GStreamer, though. What is the correct way to drop a lot of buffers in a filter element (followed by a jpegenc) without memory usage growing?