I’m working on a project that requires streaming video output from a GStreamer pipeline to a web browser. I am looking into WebRTC as this seems to be the simplest solution to implement.
I want to build a GStreamer pipeline that passes the video output to the browser with WebRTC.
I’ve reviewed several GStreamer and WebRTC examples, but I’m still struggling to put together a complete, working end-to-end chain.
If you have an example that works, configuration hints, or any other tips on how to connect GStreamer WebRTC elements to a browser client, I would appreciate the help.
If all you want to do is stream H.264 to the web browser, you could use external software for the WebRTC part — for example, go2rtc.
It’s a standalone binary and handles all of the WebRTC signaling and transport for you. You can run GStreamer via go2rtc either directly, or by splitting your pipeline in two: run one half inside go2rtc, run the other half separately, and feed the latter into the former via something like tcpserversink and tcpclientsrc (or the new unixfd elements if you’re on GStreamer 1.26 — I haven’t used those personally yet). Running two pipelines performs better in my experience.
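A sketch of the two-pipeline split described above, assuming a V4L2 camera at /dev/video0 and a local TCP port of 8554 (both are placeholders — substitute your own source and port). The first pipeline captures and encodes; the second pulls the encoded stream over TCP and writes it to stdout for go2rtc to consume:

```shell
# Pipeline 1 (run as its own process): capture, encode to H.264,
# mux into MPEG-TS, and serve the stream on a local TCP socket.
gst-launch-1.0 -q v4l2src device=/dev/video0 ! videoconvert \
  ! x264enc tune=zerolatency key-int-max=30 \
  ! mpegtsmux ! tcpserversink host=127.0.0.1 port=8554

# Pipeline 2 (run inside go2rtc via its exec source, or standalone
# for testing): connect to pipeline 1 and write the bytes to stdout.
gst-launch-1.0 -q tcpclientsrc host=127.0.0.1 port=8554 ! fdsink fd=1
```

Element properties (bitrate, keyframe interval, host/port) are illustrative, not required values; tune them for your source and network.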
Example: put the pipeline in either go2rtc.yaml or in the go2rtc web UI. Note! The -q (--quiet) flag on gst-launch-1.0 is mandatory — without it, gst-launch-1.0 prints status messages to stdout, which corrupts the media stream that go2rtc reads from the pipe.
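A minimal go2rtc.yaml sketch of this setup. The stream name `mystream` and the test source are placeholders, and the exact exec-source syntax should be checked against the go2rtc documentation for your version:

```yaml
# go2rtc.yaml — hypothetical stream name "mystream"
streams:
  mystream:
    # go2rtc's exec source runs the command and reads the media from
    # its stdout. The -q flag is mandatory here: status text printed
    # by gst-launch-1.0 would otherwise be mixed into the stream.
    - exec:gst-launch-1.0 -q videotestsrc ! videoconvert
        ! x264enc tune=zerolatency ! mpegtsmux ! fdsink fd=1
```

With this in place, go2rtc exposes the stream to browsers over WebRTC from its web UI (by default on port 1984). Swap videotestsrc for your real source, or replace the whole pipeline with the tcpclientsrc half of a split two-pipeline setup.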