I tried the QEMU D-Bus display instead, which appears to be the better option going forward.
However, I cannot get GStreamer to work with DMA Buffers, as per the issue.
Let’s continue that discussion here.
We deliberately don’t support implicit modifiers (0xffffffffffffff); please specify your modifiers explicitly.
How can I get that modifier? 0xffffffffffffff is what I got from QEMU.
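For reference, my understanding of what that value means: DRM format modifiers put a vendor ID in the top 8 bits and a vendor-specific layout code in the low 56 bits, and 0xffffffffffffff is DRM_FORMAT_MOD_INVALID, i.e. “the layout is implicit / negotiated out of band”. A small sketch (the constants come from drm_fourcc.h; the describe_modifier helper is just for illustration):

```python
# DRM format modifier layout (per drm_fourcc.h):
# bits 63:56 = vendor ID, bits 55:0 = vendor-specific layout code.
DRM_FORMAT_MOD_INVALID = (1 << 56) - 1   # "implicit" / unknown layout
DRM_FORMAT_MOD_LINEAR = 0                # plain row-major layout

def describe_modifier(mod: int) -> str:
    # Hypothetical helper, just to decode what QEMU handed over.
    if mod == DRM_FORMAT_MOD_INVALID:
        return "INVALID (implicit modifier; layout negotiated out of band)"
    if mod == DRM_FORMAT_MOD_LINEAR:
        return "LINEAR (plain row-major layout)"
    vendor = mod >> 56
    code = mod & ((1 << 56) - 1)
    return f"vendor {vendor:#x}, code {code:#x}"

print(describe_modifier(0xffffffffffffff))
# -> INVALID (implicit modifier; layout negotiated out of band)
```

So QEMU is effectively saying “ask the driver”, which is exactly what the explicit-modifier policy rejects.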
As for the attempt with “linear”: your GL stack reports that it does not support it, so that is not a bug either.
I would expect to see an explicit reference to that in the logs. Can you please point me to where it actually says so? I added an additional debug message to gst_glimage_sink_get_caps, and it looks like it only supports these caps:
video/x-raw(memory:GLMemory), format=(string)RGBA, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], texture-target=(string){ 2D, external-oes };
video/x-raw(memory:GLMemory, meta:GstVideoOverlayComposition), format=(string)RGBA, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], texture-target=(string){ 2D, external-oes }
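Reading those caps back, the only memory feature advertised is memory:GLMemory; no memory:DMABuf feature appears at all. A tiny illustration parsing the strings above (hand-rolled string slicing, not the real GstCaps parser):

```python
# Parse the caps string logged above (abbreviated to the relevant parts)
# and list the memory features each structure advertises.
caps = ("video/x-raw(memory:GLMemory), format=(string)RGBA; "
        "video/x-raw(memory:GLMemory, meta:GstVideoOverlayComposition), "
        "format=(string)RGBA")

structures = [s.strip() for s in caps.split(";")]
# The caps features sit between the first pair of parentheses.
features = [s[s.find("(") + 1:s.find(")")] for s in structures]

print(features)
# -> ['memory:GLMemory', 'memory:GLMemory, meta:GstVideoOverlayComposition']
print(any("memory:DMABuf" in f for f in features))
# -> False: no DMABuf path is advertised, so DMABuf caps cannot intersect
```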
That’s unfortunate and really strange. How is it even supposed to work, then? It also contradicts what the docs and gst-inspect-1.0 say about glimagesink. Are there specific hardware requirements for glimagesink to support DMA buffers? It’s not running in a virtualized environment, by the way, but as a host app.
Do you have any suggestions on how to make it work? Down the road I will need to encode the video instead of just displaying it, so it seems I need an element that can accept DMA buffers and convert them to whatever the next element needs (e.g. to system memory via mmap+read+unmap or similar). Is there anything like that?