I’m deploying GStreamer to the Oculus Quest (essentially an Android device) as a native plugin for a Unity 3D application. My plugin mostly works: videotestsrc works, RTP video streaming works, and H.264 decoding works. My problem starts when I try to use hardware video decoding through decodebin or decodebin3. By default, both of those elements select a software decoder. After some research, I found a way to raise the rank of the OMX decoder elements so that decodebin and decodebin3 will prefer them. Doing so did result in omxqcomvideodecoderavc being instantiated to decode the video, but the pipeline then failed with “Internal GStreamer error: code not implemented. Please file a bug at…”.
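For reference, the rank bump is done from the plugin’s native code, roughly like this (a minimal sketch; the helper name is mine, and I’m using GST_RANK_PRIMARY + 1 so the OMX element outranks the software decoder):

#include <gst/gst.h>

/* Raise the rank of the Qualcomm OMX H.264 decoder so that decodebin and
 * decodebin3 prefer it over the software decoder. Called once after gst_init(). */
static void prefer_omx_decoder (void)
{
  GstPluginFeature *feature =
      gst_registry_lookup_feature (gst_registry_get (), "omxqcomvideodecoderavc");

  if (feature != NULL) {
    gst_plugin_feature_set_rank (feature, GST_RANK_PRIMARY + 1);
    gst_object_unref (feature);
  } else {
    g_warning ("omxqcomvideodecoderavc is not in the registry");
  }
}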
Here’s the latest pipeline I’m using:
udpsrc port=7331 ! application/x-rtp,media=video,payload=96 ! rtph264depay ! h264parse ! decodebin3 name=decoder ! video/x-raw(memory:GLMemory) ! gldownload name=downloader ! video/x-raw ! videoconvert name=converter ! video/x-raw,format=RGB ! appsink name=videoSink sync=0
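For context, the plugin builds that string with gst_parse_launch and pulls RGB frames from the appsink named videoSink. A simplified sketch of that part (error handling and the actual Unity texture upload omitted; the function name is just for illustration):

#include <gst/gst.h>
#include <gst/app/gstappsink.h>

/* Parse the pipeline string above, start it, and pull one decoded RGB frame.
 * The real plugin pulls a sample per Unity update and copies it into a texture. */
static void run_pipeline (const gchar *pipeline_str)
{
  GError *error = NULL;
  GstElement *pipeline = gst_parse_launch (pipeline_str, &error);
  GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), "videoSink");

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  GstSample *sample = gst_app_sink_pull_sample (GST_APP_SINK (sink));
  if (sample != NULL) {
    GstBuffer *buffer = gst_sample_get_buffer (sample);
    GstMapInfo map;
    if (gst_buffer_map (buffer, &map, GST_MAP_READ)) {
      /* map.data points at the raw RGB pixels */
      gst_buffer_unmap (buffer, &map);
    }
    gst_sample_unref (sample);
  }

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (sink);
  gst_object_unref (pipeline);
}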
After reading in the GStreamer 1.24 release notes that OMX had been removed, I downgraded to 1.22. However, I get the same results with either version, which really doesn’t make a lot of sense to me.
The GStreamer output asked me to, so I’ve filed an issue on the tracker here:
However, I realize I may be doing something wrong, so if anyone has any suggestions, I’m all ears. Right now I’m wondering whether the required modules were stripped out of my libgstreamer_android.so, or something like that…
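To rule that out, I’m planning to check the registry at runtime, something like this (a sketch; “omx” is the plugin name I expect gst-omx to register under, and the output would need to be routed to however the plugin already logs on Android):

#include <gst/gst.h>

/* Quick runtime check for whether the OMX plugin and the Qualcomm decoder
 * actually made it into libgstreamer_android.so. */
static void check_omx_present (void)
{
  GstPlugin *plugin = gst_registry_find_plugin (gst_registry_get (), "omx");
  GstElementFactory *factory = gst_element_factory_find ("omxqcomvideodecoderavc");

  g_print ("omx plugin: %s, omxqcomvideodecoderavc: %s\n",
      plugin ? "present" : "missing",
      factory ? "present" : "missing");

  if (plugin)
    gst_object_unref (plugin);
  if (factory)
    gst_object_unref (factory);
}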