Framerate uncontrolled with NV12 data source

I’m trying to use appsrc to grab NV12-formatted data from a memory-mapped device file. I’ve run into an issue where the framerate is not respected and appsrc’s ‘need-data’ callback is called as fast as the system will allow. This doesn’t seem to be a problem with appsrc itself, but specifically with the format: if the source format is BGRx, the pipeline runs at the correct speed.

Here are a couple of pipelines that demonstrate the issue.

Doesn’t work, get very high FPS:
gst-launch-1.0 multifilesrc location=/dev/encode_framebuffer ! rawvideoparse format=23 width=1024 height=768 framerate=30/1 ! omxh264enc ! fakesink

Works, get 30FPS:
gst-launch-1.0 multifilesrc location=/dev/fb0 ! rawvideoparse format=8 width=1024 height=768 framerate=30/1 ! videoconvert ! video/x-raw,format=NV12,width=1024,height=768,framerate=30/1 ! omxh264enc ! fakesink

Adding videorate and caps after it has no effect. I’ve also tried adding the videoconvert into the NV12 pipeline, but that didn’t change anything either. I created a proof-of-concept C++ application that uses the vaapih264enc element, and the same thing occurs. I can upload that if needed.

How can I get the NV12 pipeline to run at 30FPS?

Is it possible that the RGB pipeline only runs at (or around) the desired framerate because the software video conversion slows everything down?

appsrc with a need-data callback isn’t really well suited to steady-framerate capture.

If the source doesn’t throttle the data automatically, as a video capture device usually would by only producing data N times per second (max), then you need to throttle elsewhere in the pipeline.

Perhaps a simple fakesink sync=true will do the trick for you (it depends a bit on the behaviour of the encoder), or maybe a clocksync (or identity sync=true) element after the rawvideoparse will help; in that case you should also configure a min-latency on the appsrc, something like one frame duration (1/framerate) in nanoseconds.
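
For illustration, here is a rough sketch in C of that second suggestion, with appsrc in place of the multifilesrc from the test pipelines above (untested; the rawvideoparse settings, format=23 (NV12), 30/1 framerate and omxh264enc are all carried over from this thread, and the buffer-pushing code is omitted):

#include <gst/gst.h>

int
main (int argc, char **argv)
{
  GMainLoop *loop;
  GstElement *pipeline, *src;
  GError *err = NULL;

  gst_init (&argc, &argv);

  /* Same chain as the gst-launch test above, but with appsrc as the
   * source and identity sync=true doing the throttling. */
  pipeline = gst_parse_launch (
      "appsrc name=src format=time "
      "! rawvideoparse format=23 width=1024 height=768 framerate=30/1 "
      "! identity sync=true ! omxh264enc ! fakesink", &err);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", err->message);
    g_clear_error (&err);
    return 1;
  }

  /* min-latency of one frame duration (1/30 s) in nanoseconds */
  src = gst_bin_get_by_name (GST_BIN (pipeline), "src");
  g_object_set (src, "min-latency", (gint64) (GST_SECOND / 30), NULL);

  /* Connect the need-data callback and push NV12 frames from the
   * memory-mapped device here, as in the original application. */

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (src);
  gst_object_unref (pipeline);
  g_main_loop_unref (loop);
  return 0;
}

Build with something like gcc demo.c -o demo $(pkg-config --cflags --libs gstreamer-1.0).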

Alternatively, you could write a small custom source element based on GstPushSrc; then you can make the base class call your create function to capture a frame according to your desired timing.
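
In case it’s useful, below is a very rough outline of what such an element could look like (untested and hypothetical: the element name, the hard-coded 1024x768 NV12 at 30/1 caps and the empty-buffer create() are placeholders for the real memory-mapped capture):

#include <gst/gst.h>
#include <gst/base/gstpushsrc.h>

/* 1024x768 NV12 is width * height * 3/2 bytes per frame */
#define FRAME_SIZE (1024 * 768 * 3 / 2)
#define FPS 30

typedef struct {
  GstPushSrc parent;
  guint64 frame_count;          /* frames produced so far */
} MyFbSrc;

typedef struct {
  GstPushSrcClass parent_class;
} MyFbSrcClass;

G_DEFINE_TYPE (MyFbSrc, my_fb_src, GST_TYPE_PUSH_SRC);

static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src",
    GST_PAD_SRC, GST_PAD_ALWAYS,
    GST_STATIC_CAPS
    ("video/x-raw,format=NV12,width=1024,height=768,framerate=30/1"));

static GstFlowReturn
my_fb_src_create (GstPushSrc * psrc, GstBuffer ** buf)
{
  MyFbSrc *self = (MyFbSrc *) psrc;
  GstBuffer *b;

  /* A real element would copy (or wrap) the current frame from the
   * memory-mapped device here; this sketch just allocates an empty one. */
  b = gst_buffer_new_allocate (NULL, FRAME_SIZE, NULL);

  /* Timestamp frame n at n/30 s. Because the source reports itself as
   * live, the base class waits for that running time on the clock before
   * pushing, which paces the create() calls to roughly 30 per second. */
  GST_BUFFER_PTS (b) = gst_util_uint64_scale (self->frame_count, GST_SECOND, FPS);
  GST_BUFFER_DURATION (b) = gst_util_uint64_scale (1, GST_SECOND, FPS);
  self->frame_count++;

  *buf = b;
  return GST_FLOW_OK;
}

static void
my_fb_src_class_init (MyFbSrcClass * klass)
{
  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
  GstPushSrcClass *push_class = GST_PUSH_SRC_CLASS (klass);

  gst_element_class_add_static_pad_template (element_class, &src_template);
  gst_element_class_set_static_metadata (element_class,
      "Framebuffer source", "Source/Video",
      "Produces NV12 frames from a memory-mapped device", "example");

  push_class->create = my_fb_src_create;
}

static void
my_fb_src_init (MyFbSrc * self)
{
  self->frame_count = 0;

  /* Pseudo-live source in TIME format, so basesrc syncs each buffer to
   * its timestamp against the pipeline clock. */
  gst_base_src_set_live (GST_BASE_SRC (self), TRUE);
  gst_base_src_set_format (GST_BASE_SRC (self), GST_FORMAT_TIME);
}

After gst_init() the application can register it with gst_element_register (NULL, "myfbsrc", GST_RANK_NONE, my_fb_src_get_type ()) and then use it in a pipeline description like any other source.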

@tpm thanks for the quick response!

Adding an identity sync=true element after the rawvideoparse and setting the min-latency property on the appsrc did the trick. Thank you so much!