gst_sample_get_caps() returns NULL with compositor but OK with videomixer

We are combining two camera sources on a Raspberry Pi into a single stream using the following pipeline.

libcamerasrc camera-name=%s ! videorate ! 
video/x-raw,width=640,height=480,framerate=10/1,format=RGBx ! m.sink_0 

libcamerasrc camera-name=%s ! videorate ! 
video/x-raw,width=640,height=480,framerate=10/1,format=RGBx ! m.sink_1 

videomixer name=m sink_1::xpos=640 ! 
video/x-raw,width=1280,height=480,format=BGR ! videoconvert ! videoscale ! appsink max-buffers=1 drop=true sync=false

This works fine; however, the GStreamer documentation states that “videomixer” is deprecated and recommends using “compositor” instead. If we replace videomixer with compositor we get the following error.

(client:2108): GStreamer-CRITICAL **: 15:18:30.458: gst_sample_get_caps: assertion 'GST_IS_SAMPLE (sample)' failed
[ERROR:0@32.931] global cap_gstreamer.cpp:934 retrieveVideoFrame GStreamer: gst_sample_get_caps() returns NULL

Using “glvideomixer” also produces this same error.

Is this a bug or is some additional configuration required when switching to compositor?

You need to share a bit more code. The real problem here is that the sample you retrieve is NULL or otherwise invalid:
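
For illustration, a minimal sketch of the kind of check meant here (the handler name and structure are assumptions, since your actual code isn't shown): gst_app_sink_pull_sample() returns NULL on EOS or when the pipeline is shut down, and passing that NULL on to gst_sample_get_caps() produces exactly the assertion you are seeing.

#include <gst/gst.h>
#include <gst/app/gstappsink.h>

/* Hypothetical appsink frame handler: verify the sample before querying it. */
static void
handle_frame (GstAppSink *appsink)
{
  GstSample *sample = gst_app_sink_pull_sample (appsink);

  if (sample == NULL) {
    /* EOS, flushing or shutdown: there is no sample to query. */
    g_warning ("no sample available from appsink");
    return;
  }

  GstCaps *caps = gst_sample_get_caps (sample);
  GstBuffer *buffer = gst_sample_get_buffer (sample);
  /* ... use caps and buffer ... */

  gst_sample_unref (sample);
}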

It's not a code issue, but rather something in GStreamer itself. If I modify the pipeline to write a video file to disk and run it with gst-launch-1.0, I get the following results.

Using “videomixer”:

gst-launch-1.0 libcamerasrc camera-name=/base/soc/i2c0mux/i2c@1/ov5647@36 ! videorate ! video/x-raw,width=640,height=480,framerate=10/1,format=RGBx ! m.sink_0 libcamerasrc camera-name=/base/soc/i2c0mux/i2c@0/ov5647@36 ! videorate ! video/x-raw,width=640,height=480,framerate=10/1,format=RGBx ! m.sink_1 videomixer name=m sink_1::xpos=640 ! video/x-raw,width=1280,height=480,format=BGR ! videoconvert ! videoscale ! x264enc ! flvmux ! filesink location=/tmp/testvid.flv

Output (it successfully writes to /tmp/testvid.flv; I hit Ctrl-C to exit):

Setting pipeline to PAUSED ...
[8:19:28.684614390] [2871]  INFO Camera camera_manager.cpp:297 libcamera v0.0.5+83-bde9b04f
[8:19:28.720661831] [2872]  INFO RPI vc4.cpp:437 Registered camera /base/soc/i2c0mux/i2c@0/ov5647@36 to Unicam device /dev/media0 and ISP device /dev/media3
[8:19:28.731953746] [2872]  INFO RPI vc4.cpp:437 Registered camera /base/soc/i2c0mux/i2c@1/ov5647@36 to Unicam device /dev/media1 and ISP device /dev/media4
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
[8:19:28.794839988] [2878]  INFO Camera camera.cpp:1033 configuring streams: (0) 640x480-XBGR8888
[8:19:28.795061726] [2879]  INFO Camera camera.cpp:1033 configuring streams: (0) 640x480-XBGR8888
[8:19:28.795554295] [2872]  INFO RPI vc4.cpp:565 Sensor: /base/soc/i2c0mux/i2c@1/ov5647@36 - Selected sensor format: 640x480-SGBRG10_1X10 - Selected unicam format: 640x480-pGAA
[8:19:28.797172036] [2872]  INFO RPI vc4.cpp:565 Sensor: /base/soc/i2c0mux/i2c@0/ov5647@36 - Selected sensor format: 640x480-SGBRG10_1X10 - Selected unicam format: 640x480-pGAA
Redistribute latency...
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:08.411845431
Setting pipeline to NULL ...
Freeing pipeline ...

Using “compositor”:

gst-launch-1.0 libcamerasrc camera-name=/base/soc/i2c0mux/i2c@1/ov5647@36 ! videorate ! video/x-raw,width=640,height=480,framerate=10/1,format=RGBx ! m.sink_0 libcamerasrc camera-name=/base/soc/i2c0mux/i2c@0/ov5647@36 ! videorate ! video/x-raw,width=640,height=480,framerate=10/1,format=RGBx ! m.sink_1 compositor name=m sink_1::xpos=640 ! video/x-raw,width=1280,height=480,format=BGR ! videoconvert ! videoscale ! x264enc ! flvmux ! filesink location=/tmp/testcomp.flv

Output:

Setting pipeline to PAUSED ...
[8:20:20.403698494] [2893]  INFO Camera camera_manager.cpp:297 libcamera v0.0.5+83-bde9b04f
[8:20:20.448272486] [2894]  INFO RPI vc4.cpp:437 Registered camera /base/soc/i2c0mux/i2c@0/ov5647@36 to Unicam device /dev/media0 and ISP device /dev/media3
[8:20:20.459615362] [2894]  INFO RPI vc4.cpp:437 Registered camera /base/soc/i2c0mux/i2c@1/ov5647@36 to Unicam device /dev/media1 and ISP device /dev/media4
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
Redistribute latency...
[8:20:20.528974416] [2902]  INFO Camera camera.cpp:1033 configuring streams: (0) 640x480-XBGR8888
[8:20:20.529747056] [2894]  INFO RPI vc4.cpp:565 Sensor: /base/soc/i2c0mux/i2c@0/ov5647@36 - Selected sensor format: 640x480-SGBRG10_1X10 - Selected unicam format: 640x480-pGAA
[8:20:20.531147762] [2901]  INFO Camera camera.cpp:1033 configuring streams: (0) 640x480-XBGR8888
[8:20:20.531654183] [2894]  INFO RPI vc4.cpp:565 Sensor: /base/soc/i2c0mux/i2c@1/ov5647@36 - Selected sensor format: 640x480-SGBRG10_1X10 - Selected unicam format: 640x480-pGAA
Redistribute latency...
Redistribute latency...
WARNING: from element /GstPipeline:pipeline0/GstCompositor:m: GStreamer error: clock problem.
Additional debug info:
../libs/gst/base/gstaggregator.c(2069): gst_aggregator_query_latency_unlocked (): /GstPipeline:pipeline0/GstCompositor:m:
Impossible to configure latency: max 0:00:00.119710000 < min 0:00:00.120647000. Add queues or other buffering elements.
WARNING: from element /GstPipeline:pipeline0/GstCompositor:m: GStreamer error: clock problem.
Additional debug info:
../libs/gst/base/gstaggregator.c(2069): gst_aggregator_query_latency_unlocked (): /GstPipeline:pipeline0/GstCompositor:m:
Impossible to configure latency: max 0:00:00.119710000 < min 0:00:00.120436000. Add queues or other buffering elements.
WARNING: from element /GstPipeline:pipeline0/GstCompositor:m: GStreamer error: clock problem.

The warning and additional debug info in the last few lines repeat for the rest of the output.

A critical warning is always a code issue, either in your code or in GStreamer itself. As this is from the handling of the appsink samples, it’s most likely in your code.

This is a different problem from the one in your initial message (it has nothing to do with gst_sample_get_caps() and there's no appsink involved here), but it says quite clearly what's wrong. You'll need to add some queue elements of suitable size upstream of the compositor to allow the pipeline to compensate for the latency.

Adding queue elements gets rid of the error, thanks. However, the output is now just a black screen.

gst-launch-1.0 compositor name=comp sink_1::xpos=640 ! videoconvert ! videoscale ! x264enc ! flvmux ! filesink location=/tmp/testcomp.flv \
libcamerasrc camera-name=/base/soc/i2c0mux/i2c@1/ov5647@36 !  videorate ! video/x-raw,width=640,height=480,framerate=10/1,format=RGBx ! queue ! comp.sink_0 \
libcamerasrc camera-name=/base/soc/i2c0mux/i2c@0/ov5647@36 !  videorate ! video/x-raw,width=640,height=480,framerate=10/1,format=RGBx ! queue ! comp.sink_1

Does it work with a single stream and without the compositor in the middle?

As mentioned earlier in the thread, it works with the deprecated “videomixer”, and a straight pipeline to a single camera works too.

My issues are with “compositor”.

This sounds like a latency problem, then. From the debug logs (GST_DEBUG=aggregator:9,videoaggregator:9,compositor:9) you should be able to see why compositor is discarding the frames. They probably all arrive too late, meaning that the source is not reporting the proper latency.
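
If it helps, here is a minimal sketch (not code from your application; it assumes a GstElement pointer to your pipeline, queried while it is PLAYING) of printing the latency the pipeline actually reports, which you can compare against the min/max values in compositor's warnings:

#include <gst/gst.h>

/* Query and print the latency currently reported by the pipeline. If the
 * reported minimum does not cover the real delay of the camera buffers,
 * the aggregator considers the frames late and drops them. */
static void
print_pipeline_latency (GstElement *pipeline)
{
  GstQuery *query = gst_query_new_latency ();

  if (gst_element_query (pipeline, query)) {
    gboolean live = FALSE;
    GstClockTime min_latency = 0, max_latency = 0;

    gst_query_parse_latency (query, &live, &min_latency, &max_latency);
    g_print ("live: %d, min latency: %" GST_TIME_FORMAT
             ", max latency: %" GST_TIME_FORMAT "\n",
             live, GST_TIME_ARGS (min_latency), GST_TIME_ARGS (max_latency));
  }

  gst_query_unref (query);
}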

videomixer would be working fine because it simply does not have any support for live streams, which comes with a whole set of other problems.

Thanks for your help.

I have attached a log of the debug/trace output. Does that tell you anything about the problem?

https://raw.githubusercontent.com/swdee/gstreamer-log/main/comp-log.txt - terminal output with ANSI control characters.

https://raw.githubusercontent.com/swdee/gstreamer-log/main/comp-ansi-stripped.txt - plain text with the ANSI control characters removed

Interestingly, when I change from “libcamerasrc” to “videotestsrc”, the recorded MP4 file's thumbnail shows the two test patterns, but video playback is black.

gst-launch-1.0 compositor name=comp sink_1::xpos=640 ! queue ! videoconvert ! videoscale ! x264enc ! mp4mux ! filesink location=/tmp/testcomp.mp4 -e \
videotestsrc pattern=snow ! videorate ! video/x-raw,width=640,height=480,framerate=10/1,format=RGBx ! queue ! comp.sink_0 \
videotestsrc pattern=pinwheel ! videorate ! video/x-raw,width=640,height=480,framerate=10/1,format=RGBx ! queue ! comp.sink_1

A single test stream works fine, e.g.:

gst-launch-1.0 videotestsrc pattern=snow ! videorate ! video/x-raw,width=640,height=480,framerate=10/1 !  videoconvert ! videoscale ! x264enc ! mp4mux ! filesink location=/tmp/testcomp-sing.mp4 -e