WebRTCSink with Jetson Nano

I am trying to use the webrtcsink element on a Jetson Nano to display a live camera feed in a client computer’s Google Chrome browser. However, I can’t seem to get webrtcsink to show up as a “remote stream” in the demo gstwebrtc-api web application.

I’m using GStreamer version 1.26.0, built from source on the Jetson Nano, and my pipeline is:

GST_DEBUG=webrtc*:9 WEBRTCSINK_SIGNALLING_SERVER_LOG=debug gst-launch-1.0 videotestsrc is-live=true ! nvvidconv ! nvv4l2h264enc ! h264parse ! webrtcsink name=ws meta="meta,name=gst-stream" run-signalling-server=true video-caps=video/x-h264
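For anyone reproducing this: a quick sanity check that the gst-plugins-rs build actually installed the element (and a way to see which properties your version exposes) is:

gst-inspect-1.0 webrtcsink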

I cloned the gstwebrtc-api repository and ran npm start on the client machine. I changed index.html to point at the other machine’s signalling server (i.e. signalingServerUrl: "ws://192.168.xx.xx:8443").
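For reference, the client setup looked roughly like this. I’m assuming the gstwebrtc-api demo that ships inside the gst-plugins-rs repository; adjust the paths if your checkout lives elsewhere:

git clone https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs.git
cd gst-plugins-rs/net/webrtc/gstwebrtc-api
npm install
# edit index.html so the API points at the Jetson, e.g.
#   signalingServerUrl: "ws://192.168.xx.xx:8443"
npm start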

On the jetson device, I get the following:

Setting pipeline to PAUSED ...
Opening in BLOCKING MODE
2023-03-02T20:16:14.883879Z DEBUG ThreadId(01) spawn: gst_plugin_webrtc_signalling::server: new
2023-03-02T20:16:14.883938Z DEBUG ThreadId(01) spawn:new: gst_plugin_webrtc_signalling::handlers: new
2023-03-02T20:16:14.883959Z DEBUG ThreadId(01) spawn:new: gst_plugin_webrtc_signalling::handlers: close time.busy=3.02µs time.idle=19.0µs
2023-03-02T20:16:14.884018Z DEBUG ThreadId(01) spawn: gst_plugin_webrtc_signalling::server: close time.busy=71.0µs time.idle=75.2µs
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 66, Level = 0
NVMEDIA_ENC: bBlitMode is set to TRUE
Redistribute latency...
Redistribute latency...
2023-03-02T20:16:15.359904Z DEBUG ThreadId(01) accept_async: gst_plugin_webrtc_signalling::server: new
2023-03-02T20:16:15.363019Z DEBUG ThreadId(01) accept_async: tungstenite::handshake::server: Server handshake done.
2023-03-02T20:16:15.364066Z  INFO ThreadId(01) accept_async: gst_plugin_webrtc_signalling::server: New WebSocket connection this_id=ebcd44c6-9cc9-456c-bbbb-8abf358daf4e
2023-03-02T20:16:15.364271Z DEBUG ThreadId(01) accept_async: gst_plugin_webrtc_signalling::server: close time.busy=3.55ms time.idle=846µs
2023-03-02T20:16:15.406678Z  INFO ThreadId(01) gst_plugin_webrtc_signalling::server: Received message Ok(Text(Utf8Bytes(b"{\"type\":\"setPeerStatus\",\"roles\":[\"listener\"],\"meta\":{\"name\":\"WebClient-1744660204937\"}}")))
2023-03-02T20:16:15.407648Z DEBUG ThreadId(01) set_peer_status{peer_id="ebcd44c6-9cc9-456c-bbbb-8abf358daf4e" status=PeerStatus { roles: [Listener], meta: Some(Object {"name": String("WebClient-1744660204937")}), peer_id: None }}: gst_plugin_webrtc_signalling::handlers: new
2023-03-02T20:16:15.408096Z  INFO ThreadId(01) set_peer_status{peer_id="ebcd44c6-9cc9-456c-bbbb-8abf358daf4e" status=PeerStatus { roles: [Listener], meta: Some(Object {"name": String("WebClient-1744660204937")}), peer_id: None }}: gst_plugin_webrtc_signalling::handlers: registered as a producer peer_id=ebcd44c6-9cc9-456c-bbbb-8abf358daf4e
2023-03-02T20:16:15.408473Z DEBUG ThreadId(01) set_peer_status{peer_id="ebcd44c6-9cc9-456c-bbbb-8abf358daf4e" status=PeerStatus { roles: [Listener], meta: Some(Object {"name": String("WebClient-1744660204937")}), peer_id: None }}: gst_plugin_webrtc_signalling::handlers: close time.busy=530µs time.idle=294µs
2023-03-02T20:16:15.412198Z  INFO ThreadId(01) gst_plugin_webrtc_signalling::server: Received message Ok(Text(Utf8Bytes(b"{\"type\":\"list\"}")))
2023-03-02T20:16:15.412742Z DEBUG ThreadId(01) list_producers{peer_id="ebcd44c6-9cc9-456c-bbbb-8abf358daf4e"}: gst_plugin_webrtc_signalling::handlers: new
2023-03-02T20:16:15.413029Z DEBUG ThreadId(01) list_producers{peer_id="ebcd44c6-9cc9-456c-bbbb-8abf358daf4e"}: gst_plugin_webrtc_signalling::handlers: close time.busy=31.2µs time.idle=287µs
2023-03-02T20:16:45.416057Z  INFO ThreadId(01) gst_plugin_webrtc_signalling::server: Received message Ok(Pong(b""))
2023-03-02T20:17:15.420336Z  INFO ThreadId(01) gst_plugin_webrtc_signalling::server: Received message Ok(Pong(b""))
2023-03-02T20:17:45.421880Z  INFO ThreadId(01) gst_plugin_webrtc_signalling::server: Received message Ok(Pong(b""))
2023-03-02T20:18:15.423618Z  INFO ThreadId(01) gst_plugin_webrtc_signalling::server: Received message Ok(Pong(b""))

I added some log statements on the client side, and the producer list that comes back from the signalling server is empty.

It seems like the webrtcsink element isn’t registered as a producer? I’m just not sure why it’s not showing up in remote streams.

Looking through the implementation of webrtcsink, I cannot find where it registers itself as a producer. It’s possible that I misunderstand the WebRTC protocol, but I expected the standalone webrtcsink with its built-in signaller to be a producer.

Some more information:

Based on the documentation for webrtcsink, I expected to be able to pass a video/x-h264 stream to the sink pad and have it recognize that it only needs to payload the stream, not encode it. However, the GitLab README states: “It is important to note that at this moment, encoding is not shared between consumers. While this is not on the roadmap at the moment, nothing in the design prevents implementing this optimization.” Which makes me think that webrtcsink has to control the encoder?
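The pad templates answer at least part of this. On my build, inspecting the element shows the video sink pad accepting encoded caps such as video/x-h264 alongside video/x-raw, which suggests pre-encoded input is meant to be supported:

gst-inspect-1.0 webrtcsink | grep -A 20 "Pad Templates"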

Either way, I figured out that on the Jetson, this pipeline works successfully:

GST_DEBUG=webrtc*:9 WEBRTCSINK_SIGNALLING_SERVER_LOG=debug gst-launch-1.0 videotestsrc is-live=true ! nvvidconv ! webrtcsink meta="meta,name=gst-stream" run-signalling-server=true congestion-control=disabled

However, simply adding an x264enc element in front of webrtcsink breaks things. The producer still shows up in the web interface under remote streams, but when I click to start the stream, the GStreamer pipeline outputs a ton of information and ultimately fails to link with:

webrtcsink net/webrtc/src/webrtcsink/imp.rs:3640:gstrswebrtc::webrtcsink::imp::BaseWebRTCSink::on_remote_description_set:<webrtcsink0> Failed to connect input stream video_0 for session 5e1211de-8e2f-4bbb-b67c-178d4e63d431: Failed to link elements 'capsfilter5' and 'capsfilter3'
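For what it’s worth, if I were chasing this particular failure I would start by pinning the encoder output caps explicitly, on the assumption (untested) that the capsfilter link failure is a caps negotiation mismatch:

gst-launch-1.0 videotestsrc is-live=true ! videoconvert ! x264enc tune=zerolatency ! video/x-h264,profile=constrained-baseline ! h264parse ! webrtcsink run-signalling-server=true video-caps=video/x-h264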

If I use nvv4l2h264enc instead of x264enc, I’m back to where I started and the producer isn’t listed under “remote streams” in the web interface anymore.

After investigating some more, I think this boils down to a simpler question: is the webrtcsink element designed to handle pre-encoded input such as video/x-h264? In that case, it should only need to handle payloading and networking for each connected consumer.

I found out what the issue was (for nvv4l2h264enc). Just another lesson in how invaluable the GST_DEBUG environment variable is.

TL;DR: set the encoder properties that make SPS/PPS headers repeat in the stream:

nvh264enc repeat-sequence-header=true
nvv4l2h264enc idrinterval=30 insert-sps-pps=true

I discovered that this is an issue with both the Jetson encoders (nvv4l2h264enc) and the discrete NVIDIA GPU encoders, e.g. nvh264enc.

Basically, the webrtcsink element runs some “fancy” discovery pipelines at runtime to determine which encoder and payloader to use for the input stream. When passing video/x-h264 to the sink pad of webrtcsink, it was logging warnings that the NAL units in the encoded buffers couldn’t be parsed.

I won’t claim to understand the low-level details of how these video/x-h264 streams are structured or parsed, but there are special NAL units (SPS/PPS) that the discovery pipeline needs in order to parse the stream correctly. By default these encoders emit them only once, at the start of the stream, so they never reach a discovery pipeline that taps in later. You can make the encoder repeat these units using the properties listed above.
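Putting it together, the pipeline from the top of this post should work once the encoder is told to repeat the headers; this is just the original command plus the two properties from the TL;DR:

GST_DEBUG=webrtc*:9 WEBRTCSINK_SIGNALLING_SERVER_LOG=debug gst-launch-1.0 videotestsrc is-live=true ! nvvidconv ! nvv4l2h264enc idrinterval=30 insert-sps-pps=true ! h264parse ! webrtcsink name=ws meta="meta,name=gst-stream" run-signalling-server=true video-caps=video/x-h264

A related knob: h264parse has a config-interval property (config-interval=-1 re-inserts SPS/PPS before every IDR frame), which can achieve the same effect downstream of an encoder you can’t configure.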

Hope this helps someone else!