WebRTC setup for ZED Mini, Jetson

I am setting up WebRTC hosting of a stereo video stream from a ZED Mini on a Jetson. I already have the signaling server, and I previously had sender.html and receiver.html working.

Now I want to use GStreamer for the sender because it makes it easy to use NVENC to accelerate the video encoding. So far I haven't been able to get it working. Does anyone have experience with this, or any example code available?

My main blocker is that I am not able to set up the pipeline correctly. Based on my preliminary knowledge of zed-gstreamer, I should use the following elements:

  1. zedsrc to capture the camera
  2. videoconvert1: convert the raw stream to BGRA
  3. videoconvert2: convert BGRA to I420 for compatibility with NVENC
  4. webrtcbin to send the stream out
  5. fakesink to set up a buffer and clean it up regularly
  6. I want stereo video, with left and right both at 720p and 60 fps

I wonder if there is a simple pipeline string to set this up?
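For reference, this is roughly the shape I have in mind, directly from the list above (untested, I haven't figured out the right zedsrc properties for side-by-side stereo yet, and I know webrtcbin will still need application code for the signaling, this is just the media path):

gst-launch-1.0 zedsrc camera-fps=60 \
  ! videoconvert ! video/x-raw,format=BGRA \
  ! videoconvert ! video/x-raw,format=I420 \
  ! nvv4l2h264enc \
  ! h264parse ! rtph264pay ! webrtcbin name=sendonly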

Thanks!

I'm not sure I correctly understand what you're trying to achieve, sorry, but in case it helps, here is what I'd advise for the Jetson side:

  1. Check what modes your camera provides. Assuming the ZED Mini is /dev/video0, see what this gives:
v4l2-ctl --device=/dev/video0 --list-formats-ext

such as (here using a ZED camera):

ioctl: VIDIOC_ENUM_FMT
	Type: Video Capture

	[0]: 'YUYV' (YUYV 4:2:2)
		Size: Discrete 2560x720
...
			Interval: Discrete 0.033s (30.000 fps)
...

So the camera provides concatenated (left and right, 1280+1280=2560) 2560x720 video at 30 fps. The YUYV format is referred to as YUY2 in GStreamer.

  2. For encoding, most Jetson models (all but the Orin Nano) embed an NVENC HW encoder, so instead of the GPU-based nvenc there are GStreamer elements such as nvv4l2h264enc (and nvv4l2decoder for the NVDEC HW decoder). These elements use video frames from NVMM memory (DMA-able buffers). You can use the nvvidconv element to copy frames from system memory into NVMM memory while converting the format with the Jetson VIC HW. So you may first try to just encode H.264 video to a file for a few seconds and then stop (Ctrl-C once):
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=2560,height=720,framerate=30/1 ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 insert-vui=1 ! queue ! h264parse ! matroskamux ! filesink location=testZed.mkv -e

and check the result with (assuming your Jetson has a display and a working GUI):

gst-launch-1.0 filesrc location=testZed.mkv ! matroskademux ! h264parse ! nvv4l2decoder ! nv3dsink

You may have to adjust the encoder settings (bitrate, profile, etc.) for your needs.
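As a sketch, assuming a recent L4T release (the exact properties and enum values vary between releases, so check gst-inspect-1.0 nvv4l2h264enc before relying on these), you could replace the encoder element in the pipeline above with something like:

# illustrative values only: ~8 Mbit/s CBR, High profile, max encoder clocks
... ! nvv4l2h264enc bitrate=8000000 control-rate=1 profile=4 maxperf-enable=1 \
      insert-sps-pps=1 idrinterval=30 insert-vui=1 ! ...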

  3. If it's OK so far, you may try streaming as well. Note that for the webrtcbin GStreamer plugin, the nice plugin may be mandatory (see gst-inspect-1.0 nice); if it is missing you may have to build a recent version of GStreamer with it enabled. Be sure to have libnice installed (sudo apt search libnice) first.
    Then you may try something like:
gst-launch-1.0 \
v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=2560,height=720,framerate=30/1 ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 insert-vui=1 ! tee name=h264video  \
h264video. ! queue ! h264parse ! matroskamux ! filesink location=testZed.mkv \
h264video. ! queue ! h264parse ! rtph264pay ! queue leaky=2 ! webrtcbin name=sendonly   

Someone more skilled might help for the webrtc/server setup.
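Note that gst-launch-1.0 alone cannot complete the WebRTC negotiation: webrtcbin has to be driven from application code that exchanges the SDP offer/answer and ICE candidates with your signaling server. Below is a rough Python/PyGObject sketch of the webrtcbin side only, using the same media path as the gst-launch string above without the recording branch. The send_to_signaling_server function and the answer/ICE handling are placeholders for your own signaling protocol, and the rtph264pay pt/config-interval settings are just common defaults.

import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstWebRTC', '1.0')
gi.require_version('GstSdp', '1.0')
from gi.repository import Gst, GstWebRTC, GstSdp, GLib

Gst.init(None)

# Same media path as the gst-launch string above, minus the file recording branch
PIPELINE = (
    'v4l2src device=/dev/video0 '
    '! video/x-raw,format=YUY2,width=2560,height=720,framerate=30/1 '
    '! nvvidconv ! video/x-raw(memory:NVMM),format=NV12 '
    '! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 insert-vui=1 '
    '! h264parse ! rtph264pay config-interval=-1 pt=96 '
    '! queue leaky=2 ! webrtcbin name=sendonly'
)

pipe = Gst.parse_launch(PIPELINE)
webrtc = pipe.get_by_name('sendonly')

def send_to_signaling_server(msg):
    # Placeholder: forward msg (a dict) to the remote peer over your existing signaling channel
    print('would send:', msg)

def on_offer_created(promise, _):
    # Called once webrtcbin has produced the local SDP offer
    promise.wait()
    offer = promise.get_reply().get_value('offer')
    webrtc.emit('set-local-description', offer, Gst.Promise.new())
    send_to_signaling_server({'type': 'offer', 'sdp': offer.sdp.as_text()})

def on_negotiation_needed(element):
    # Ask webrtcbin to create an offer; the reply arrives in on_offer_created
    promise = Gst.Promise.new_with_change_func(on_offer_created, None)
    element.emit('create-offer', None, promise)

def on_ice_candidate(element, mlineindex, candidate):
    # Forward local ICE candidates to the remote peer
    send_to_signaling_server({'ice': {'candidate': candidate, 'sdpMLineIndex': mlineindex}})

webrtc.connect('on-negotiation-needed', on_negotiation_needed)
webrtc.connect('on-ice-candidate', on_ice_candidate)

# When the browser's SDP answer arrives from your signaling server, apply it with:
#   res, sdpmsg = GstSdp.SDPMessage.new()
#   GstSdp.sdp_message_parse_buffer(answer_sdp.encode(), sdpmsg)
#   answer = GstWebRTC.WebRTCSessionDescription.new(GstWebRTC.WebRTCSDPType.ANSWER, sdpmsg)
#   webrtc.emit('set-remote-description', answer, Gst.Promise.new())
# and apply remote ICE candidates with:
#   webrtc.emit('add-ice-candidate', sdp_mline_index, candidate_str)

pipe.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()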


Thanks, it worked! Do you know the equivalent command using the zed-gstreamer plugin, e.g. zedsrc? I'd like to have access to some of the low-level controls of the ZED SDK.

Again, I'm not sure; it would depend on what you're trying to do and how accurate you need the video to be.

You may use the zedsrc property stream-type=4, which sends the concatenated image_mean (top) and depth image (bottom, which seems to be mainly in the blue component). Then you can record and stream with:

gst-launch-1.0 \
zedsrc camera-resolution=3 camera-fps=15 stream-type=4 ! queue ! videoconvert ! queue ! video/x-raw,format=RGBA ! nvvidconv ! nvv4l2h264enc idrinterval=15 insert-sps-pps=1 insert-vui=1 ! h264parse ! tee name=H264_video \
H264_video. ! queue leaky=2 ! h264parse ! matroskamux ! filesink location=test_zed_h264.mkv \
H264_video. ! queue leaky=2 ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=5004

The RTP/H264 stream over UDP can be received with:

gst-launch-1.0 udpsrc address=127.0.0.1 port=5004 ! application/x-rtp,encoding-name=H264 ! rtpjitterbuffer ! rtph264depay ! h264parse ! decodebin ! queue ! autovideoconvert ! autovideosink