Gstreamer with GigE Vision Camera - Wonky Image

Hi there,

I have an Imperx GigE Vision camera and I am trying to create a simple pipeline that listens to its UDP port and displays the video stream, but the image is not displaying correctly. I verified that the image is coming through correctly in the Imperx utility, in the Imperx sample python scripts, and also using Aravis Viewer.

I’ve tried a zillion different combinations of caps and elements, pixel formats, buffer sizes, etc., so I obviously won’t post all the things I’ve tried. Basically, plus or minus some variations, my pipeline looks like this. I currently have the camera outputting just 8-bit grayscale:

gst-launch-1.0 -e -v udpsrc port=62000 ! rawvideoparse width=2064 height=1544 framerate=10/1 format=25 ! queue ! "video/x-raw, format=(string)GRAY8" ! videoconvert ! ximagesink

I’m getting an output that looks like this - slanted and somewhat repeated.

Most of the examples I’ve found online with udpsrc also use RTP, but I’m fairly confident this camera does not use RTP (it’s not mentioned anywhere in the manual, and my attempts to run a pipeline with RTP-related elements have all failed with “not a valid RTP payload” or similar).

I WAS able to use the Aravis plugin to get the pipeline to work, but it doesn’t seem to support color (all color options in Aravis Viewer are grayed out). And I do think this should work with just a udpsrc (right?)

Here’s the verbose output from my pipeline cited above

/GstPipeline:pipeline0/GstRawVideoParse:rawvideoparse0.GstPad:src: caps = video/x-raw, format=(string)GRAY8, width=(int)2064, height=(int)1544, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-raw, format=(string)GRAY8, width=(int)2064, height=(int)1544, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-raw, format=(string)GRAY8, width=(int)2064, height=(int)1544, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)GRAY8, width=(int)2064, height=(int)1544, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)2064, height=(int)1544, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1, format=(string)BGRx
/GstPipeline:pipeline0/GstXImageSink:ximagesink0.GstPad:sink: caps = video/x-raw, width=(int)2064, height=(int)1544, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1, format=(string)BGRx
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)GRAY8, width=(int)2064, height=(int)1544, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)GRAY8, width=(int)2064, height=(int)1544, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1

It does seem like there’s some inexplicable conversion to BGRx that I was trying unsuccessfully to make go away… maybe that’s an issue?

And the verbose output from this pipeline (which does work):
gst-launch-1.0 -e -v aravissrc ! videoconvert ! queue ! xvimagesink

/GstPipeline:pipeline0/GstAravis:aravis0.GstPad:src: caps = video/x-raw, format=(string)GRAY8, width=(int)2064, height=(int)1544, framerate=(fraction)10/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)2064, height=(int)1544, framerate=(fraction)10/1, format=(string)YV12
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-raw, width=(int)2064, height=(int)1544, framerate=(fraction)10/1, format=(string)YV12
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-raw, width=(int)2064, height=(int)1544, framerate=(fraction)10/1, format=(string)YV12
/GstPipeline:pipeline0/GstXvImageSink:xvimagesink0.GstPad:sink: caps = video/x-raw, width=(int)2064, height=(int)1544, framerate=(fraction)10/1, format=(string)YV12
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)GRAY8, width=(int)2064, height=(int)1544, framerate=(fraction)10/1

I’ve been working on this for days and would really appreciate any guidance or suggestions for debugging.

This looks like the stride of the frames is more than 2064 bytes. Try with something like rawvideoparse plane-strides='<2080>' maybe, and experiment a bit with the value until it looks ok. I’m assuming you don’t have any docs or anything to actually look up the correct value.
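To see why a wrong stride shows up as a slant, here’s a quick sketch (the 2080 figure is just a guess, as above): if the sender pads each row out to the stride but the parser reads rows of exactly `width` bytes, every parsed row starts a few bytes too early, and the error grows row by row.

```python
# How a stride mismatch shears an image: each parsed row drifts by
# (true_stride - assumed_stride) bytes relative to the real row start.
width, height = 2064, 1544
true_stride = 2080  # hypothetical padded stride (e.g. 16-byte alignment)

def row_misalignment(row, true_stride, assumed_stride):
    """Byte offset error of `row` when parsed with the wrong stride."""
    return row * (true_stride - assumed_stride)

# Row 0 lines up, but every subsequent row drifts 16 bytes further,
# which shows up on screen as a progressive horizontal slant.
print(row_misalignment(0, true_stride, width))    # 0
print(row_misalignment(1, true_stride, width))    # 16
print(row_misalignment(100, true_stride, width))  # 1600
```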

hi @slomo - thanks for the suggestion!

I was able to use the debug option (see below) while running the Aravis pipeline to get some metadata, such as plane strides and plane offsets. I think I’m getting closer but the image still looks weird. Any further suggestions? Do I need to define more udpsrc parameters?

Ran the Aravis pipeline with the DEBUG option:
GST_DEBUG=2,videometa:7 gst-launch-1.0 -e -v aravissrc ! videoconvert ! queue ! xvimagesink

Result: there are 3 planes, and it seems like there is some padding at the end.

0:00:00.587224097 15355 0x56100b797760 LOG                videometa gstvideometa.c:336:gst_buffer_add_video_meta_full: plane 1, offset 3186816, stride 1032
0:00:00.587232081 15355 0x56100b797760 LOG                videometa gstvideometa.c:336:gst_buffer_add_video_meta_full: plane 2, offset 3983520, stride 1032
0:00:00.587244414 15355 0x56100b797760 LOG                videometa gstvideometa.c:453:gst_video_meta_set_alignment: Set alignment on meta: padding 0-0x0-0
0:00:00.587807980 15355 0x56100b797760 LOG                videometa gstvideometa.c:336:gst_buffer_add_video_meta_full: plane 0, offset 0, stride 2064
0:00:00.587827224 15355 0x56100b797760 LOG                videometa gstvideometa.c:336:gst_buffer_add_video_meta_full: plane 1, offset 3186816, stride 1032
0:00:00.587833611 15355 0x56100b797760 LOG                videometa gstvideometa.c:336:gst_buffer_add_video_meta_full: plane 2, offset 3983520, stride 1032
0:00:00.587843343 15355 0x56100b797760 LOG                videometa gstvideometa.c:453:gst_video_meta_set_alignment: Set alignment on meta: padding 0-0x0-0
Redistribute latency...
0:00:12.048830824 15355 0x56100b797760 LOG                videometa gstvideometa.c:336:gst_buffer_add_video_meta_full: plane 0, offset 0, stride 2064
0:00:12.048851921 15355 0x56100b797760 LOG                videometa gstvideometa.c:336:gst_buffer_add_video_meta_full: plane 1, offset 3186816, stride 1032
0:00:12.048857970 15355 0x56100b797760 LOG                videometa gstvideometa.c:336:gst_buffer_add_video_meta_full: plane 2, offset 3983520, stride 1032
0:00:12.048868656 15355 0x56100b797760 LOG                videometa gstvideometa.c:453:gst_video_meta_set_alignment: Set alignment on meta: padding 0-0x0-0

So after that, I tried running with the plane-offsets and plane-strides parameters, using the values from above:

gst-launch-1.0 -e -v udpsrc port=62000 ! rawvideoparse plane-strides="<2064, 1032, 1032>" plane-offsets="<0,3186816,3983520>" width=2064 height=1544 format=25 framerate=10/1 ! queue ! "video/x-raw, format=(string)GRAY8" ! videoconvert ! jpegenc snapshot=TRUE ! queue ! filesink location=/home/mtolfa/test-media/output-grey.jpg

The resulting image now looks like this:

If you have 3 planes then it’s unlikely that this is format=25 (i.e. GRAY8). This is probably I420 or so. You need to configure the correct format on rawvideoparse.
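As a sanity check (these numbers come straight from the videometa log above), the logged offsets match a three-plane 4:2:0 layout (I420/YV12) at 2064×1544 exactly: a full-resolution luma plane followed by two half-resolution chroma planes.

```python
# Verify that the plane offsets/strides from the videometa log match
# a 4:2:0 layout at 2064x2064... er, 2064x1544: one full-res Y plane
# plus two 2x2-subsampled chroma planes.
width, height = 2064, 1544

y_stride = width                      # 2064, as logged for plane 0
y_size = y_stride * height            # size of the Y plane
uv_stride = width // 2                # 1032, as logged for planes 1 and 2
uv_size = uv_stride * (height // 2)   # size of each chroma plane

plane1_offset = y_size                # where the first chroma plane starts
plane2_offset = y_size + uv_size      # where the second chroma plane starts

print(plane1_offset, plane2_offset)   # matches the logged 3186816, 3983520
```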

Huh, I see. The camera is definitely configured to output 8-bit grayscale, so I’m a bit confused (their documentation appears to be lacking), but thank you! I will try a few different things.

GigE Vision uses GVSP, so the data coming over UDP isn’t just video data, there’s a leader and trailer packet, and each packet also has a header. Wireshark comes with a GVSP dissector, so I’d suggest listening to your traffic with that to learn more about your stream (right click on a packet on port 62000 or whatever you’re using, click Decode As…, change Current to GVSP, and look for pixel_format in the image data leader packet). I don’t see how you can get away with just using rawvideoparse, especially in the general case, unless you are ok with the GVSP header being present every N bytes (N==packet size).
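As rough arithmetic for why those headers matter (the 8-byte GVSP packet header is the common GEV 1.x, non-extended-ID case, and the MTU here is just the standard 1500; check Wireshark for your camera’s actual values): even a small per-packet header accumulates to tens of kilobytes of stray bytes per frame, all of which rawvideoparse would interpret as pixel data.

```python
import math

# Rough arithmetic: per-packet GVSP headers embedded in a raw frame.
# Assumes an 8-byte GVSP header (GEV 1.x, non-extended block IDs) and a
# standard 1500-byte MTU -- illustrative values, not measured ones.
frame_bytes = 2064 * 1544                 # one GRAY8 frame
udp_payload = 1500 - 20 - 8               # MTU minus IPv4 and UDP headers
image_bytes_per_packet = udp_payload - 8  # minus the GVSP packet header

packets = math.ceil(frame_bytes / image_bytes_per_packet)
stray_header_bytes = packets * 8  # header bytes udpsrc hands over as "pixels"

print(packets, stray_header_bytes)
```

And that is before counting the leader and trailer packets, which carry no image data at all.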

I would suggest determining the pixel format, and adding support to Aravis.


Hi @joshtp
thanks for your suggestion. I decoded as GVSP in Wireshark as you suggested, but unfortunately there does not appear to be a gvsp.pixel parameter.

Also, I do have some confidence the pixel format is 8-bit grayscale since I set it in the camera’s utility (I seem to be unable to upload a screenshot of the utility, so you’ll just have to trust me on that).

The packet you show is a payload packet, usually image data. Look before that for the image leader packet, which has format==0x01 and should contain the pixel format and resolution.

@joshtp ah, I see. Yes, there’s a leader and a trailer packet.

Here’s the leader packet:

Ok, so it certainly should work in Aravis. Just using udpsrc won’t work unless you can fine-tune the packet size or image size carefully; even then you’d have to crop out the undesired packets and the payload packet headers, and you’d still have syncing issues to worry about, dropped frames, etc. In other words, not worth it.

Revisit Aravis. Mono8 must be the default of your camera; try changing to color by using something like:

gst-launch-1.0 -e -v --gst-debug=aravissrc:9 aravissrc ! video/x-raw,format=RGB ! videoconvert ! queue ! xvimagesink

Change RGB to whatever format your camera supports (noting that GEV pixel format names don’t correspond directly to GStreamer pixel format names, e.g. Mono8 == GRAY8). See here for the mapping between pixel formats. And note the line above should show some helpful debug output.
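A few of the common GEV-to-GStreamer pairs, as a hedged illustration (this is only a hand-picked subset; the authoritative mapping is in the Aravis source linked above, and you should verify which formats your camera actually advertises):

```python
# Illustrative subset of the GigE Vision -> GStreamer pixel-format mapping.
# GEV names (Mono8, BayerRG8, ...) do not match GStreamer caps names directly.
GEV_TO_GST = {
    "Mono8":    ("video/x-raw",   "GRAY8"),
    "Mono16":   ("video/x-raw",   "GRAY16_LE"),
    "RGB8":     ("video/x-raw",   "RGB"),
    "BayerRG8": ("video/x-bayer", "rggb"),
    "BayerGR8": ("video/x-bayer", "grbg"),
    "BayerGB8": ("video/x-bayer", "gbrg"),
    "BayerBG8": ("video/x-bayer", "bggr"),
}

def caps_for(gev_name):
    """Build a GStreamer caps string for a given GEV pixel format name."""
    media_type, fmt = GEV_TO_GST[gev_name]
    return f"{media_type},format={fmt}"

print(caps_for("Mono8"))     # video/x-raw,format=GRAY8
print(caps_for("BayerRG8"))  # video/x-bayer,format=rggb
```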

thanks @joshtp .

I can only get Aravis to work with Mono, and now I’m realizing this is probably because I don’t have the bayer2rgb plugin. I am using GStreamer core 1.20 and gst-plugins-bad version 1.20.3 (which doesn’t appear to have the bayer2rgb plugin). Maybe it’s deprecated? Can I install it standalone? Documentation here is quite sparse.

Anyway, I also have a Docker container with GStreamer core 1.16 installed, and it does have the bayer2rgb plugin. I think my plan is to install the Aravis plugin in the container and see if that enables the Bayer options. Right now, running the pipeline you suggested doesn’t work with video/x-bayer etc., nor does launching Aravis Viewer (“gstreamer bayer plugin missing”).

We can close this question if you’re growing weary of it, since I think we’ve sort of figured out what the problem(s) are, but I do appreciate your support a lot!!

bayer2rgb isn’t deprecated; I’m not sure why you’re missing it, as all my binary distributions of gst-plugins-bad have included it. You can certainly build it from source if you’d like.

When using a capsfilter be sure to specify the format, like video/x-bayer, format=grbg, making sure it’s a format supported by your camera (all cameras I’ve seen only support one Bayer format).

Yay, it’s working! I installed the bayer2rgb plugin.

thanks so much!
