Hey, I am trying to write a Flutter integration with GStreamer.
The options to render "native" stuff in Flutter are:
- Pass a pixel buffer pointer (from RAM) to Flutter, which will then do the magic and create an OpenGL texture out of it. This is a convenience helper AFAICT. (See the sketch after this list for how I imagine getting those pixels out of GStreamer.)
- Draw using OpenGL yourself.
- There is the new Impeller engine, which uses Vulkan.
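For option 1, here is a minimal sketch (C, GStreamer side only) of how I imagine getting at the raw pixels: an appsink whose caps are forced to RGBA, with the mapped buffer handed to whatever pixel-buffer texture mechanism the Flutter embedder provides. The `push_frame_to_flutter` call is a placeholder I made up, not a real Flutter API.

```c
#include <gst/gst.h>
#include <gst/app/gstappsink.h>

/* Placeholder for the Flutter-side hand-off; NOT a real Flutter API. */
extern void push_frame_to_flutter(const guint8 *pixels, gsize size,
                                  gpointer user_data);

/* "new-sample" callback on an appsink. The pipeline is assumed to contain
 * something like: ... ! videoconvert ! video/x-raw,format=RGBA ! appsink */
static GstFlowReturn on_new_sample(GstAppSink *sink, gpointer user_data) {
  GstSample *sample = gst_app_sink_pull_sample(sink);
  if (!sample)
    return GST_FLOW_ERROR;

  GstBuffer *buffer = gst_sample_get_buffer(sample);
  GstMapInfo map;
  if (gst_buffer_map(buffer, &map, GST_MAP_READ)) {
    /* map.data points at the decoded RGBA frame in system RAM. */
    push_frame_to_flutter(map.data, map.size, user_data);
    gst_buffer_unmap(buffer, &map);
  }

  gst_sample_unref(sample);
  return GST_FLOW_OK;
}
```

The cost here is a full CPU-side copy per frame, which is exactly why option 2 looks more attractive to me.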
I would love to hear what you think would be the best path.
This is what I think (respectively):
- Totally possible, although I don't really know how to get at a video frame's pixels; it's probably doable, though (something like the appsink sketch above is my best guess).
- I think this would be ideal, but only if GStreamer already stores decoded frames on the GPU; otherwise it would be the same as option 1. (It seems like the GTK4 integration does this.) See the sketch after this list for how I'd check whether a frame actually ended up in GL memory.
- Impeller is not fully cross-platform yet, so I ditched that.
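For option 2, this is roughly what I mean by "stores decoded frames on the GPU": if the appsink's caps request video/x-raw(memory:GLMemory) (e.g. with glupload upstream), the buffer's memory can be a GstGLMemory and the GL texture id can be read straight off it instead of copying pixels through RAM. This is only a sketch under that assumption; the Flutter side would still need to consume an external GL texture, and I haven't verified that part.

```c
#include <gst/gst.h>
#include <gst/gl/gl.h>

/* Given a sample pulled from an appsink whose caps request
 * video/x-raw(memory:GLMemory), check whether the frame actually lives on
 * the GPU and fetch its GL texture id. Returns FALSE if the frame is plain
 * system memory, in which case option 1 (pixel copy) is the fallback. */
static gboolean try_get_gl_texture_id(GstSample *sample, guint *tex_id_out) {
  GstBuffer *buffer = gst_sample_get_buffer(sample);
  GstMemory *mem = gst_buffer_peek_memory(buffer, 0);

  if (!gst_is_gl_memory(mem))
    return FALSE;

  *tex_id_out = gst_gl_memory_get_texture_id((GstGLMemory *) mem);
  return TRUE;
}
```

If that returns a texture id, drawing with OpenGL yourself (or handing the texture to Flutter) would avoid the per-frame copy entirely, which is what the GTK4 sink seems to do.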