Where to start with whepsrc

I’m experimenting with the new webrtchttp package. Love it so far. Props to the developers who got this released, and the GStreamer Conference talk on it was excellent.

I’ve been able to successfully use whipsink to stream video to a Dolby.io server following their blog post instructions.

I’m wondering… Does anyone have good examples or resources to read for using the whepsrc element?

I wasn’t able to simply point whepsrc’s whep-endpoint at the Dolby.io server. I’m thinking I’ll need to run server code on localhost to get any further, but I haven’t found anything online. I’m hoping there is a minimal example somewhere and I won’t need to bring up a full Janus instance just to experiment with it.

FYI my whipsink pipeline is something like:

gst-launch-1.0 videotestsrc ! videoconvert ! vaapih264enc ! rtph264pay ! whip.sink_0 \
    audiotestsrc wave=5 ! audioconvert ! opusenc ! rtpopuspay ! whip.sink_1 \
    whipsink name=whip auth-token=$TOKEN whip-endpoint=$ENDPOINT

Thanks for any tips!

We have not been able to test H264 with Dolby so far. VP8 is known to work.

For WHEP, you can see a sample Python application here. Just change the whep-endpoint property to your Dolby WHEP endpoint, or to whatever server you want to test against.

Other gst-launch examples with mediamtx and live777 can be found in this issue.
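
If it helps, the read side with gst-launch looks roughly like the sketch below. Treat it as a starting point rather than a tested command: the endpoint URL (a local mediamtx-style default here) and the payload numbers are assumptions that have to match what your server actually offers.

# WHEP consumer sketch (VP8 + Opus); adjust endpoint path and payload types to your server
gst-launch-1.0 whepsrc whep-endpoint="http://localhost:8889/mystream/whep" \
    audio-caps="application/x-rtp,payload=96,encoding-name=OPUS,media=audio,clock-rate=48000" \
    video-caps="application/x-rtp,payload=97,encoding-name=VP8,media=video,clock-rate=90000" \
    ! rtpvp8depay ! vp8dec ! videoconvert ! autovideosink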

Thank you @Sanchayan

I was able to get VP8 and VP9 working with Dolby using gst-launch-1.0, thanks to the gst-plugins-rs issue #414 that you linked. This is perfect for my learning purposes.

FYI, I did test H264 and it is not working for me. Here is my progress in case it helps anyone. I’ll try to reply to this thread if I do get it functional, but I don’t plan to pursue this much longer since VP8/9 are working:

# H264 encode, visible in Dolby Live View web app
gst-launch-1.0 -v videotestsrc pattern=ball is-live=true \
    ! video/x-raw,format=NV12,width=640,height=480 \
    ! vaapih264enc \
    ! video/x-h264,stream-format=byte-stream,profile=constrained-baseline \
    ! h264parse ! rtph264pay pt=127 ! queue \
    ! "application/x-rtp,media=video,encoding-name=H264,payload=127,clock-rate=90000" \
    ! whipsink auth-token="$DOLBYIO_BEARER_TOKEN" whip-endpoint="$DOLBYIO_WHIP_ENDPOINT?codec=h264"

# H264 decode, fails to stream
DISPLAY=:0 gst-launch-1.0 -v whepsrc whep-endpoint="$DOLBYIO_WHEP_ENDPOINT" \
    audio-caps="application/x-rtp,payload=96,encoding-name=OPUS,media=audio,clock-rate=48000" \
    video-caps="application/x-rtp,payload=127,encoding-name=H264,media=video,clock-rate=90000" \
    ! rtph264depay ! h264parse \
    ! video/x-h264,stream-format=byte-stream,profile=constrained-baseline \
    ! vaapih264dec \
    ! video/x-raw,format=NV12,width=640,height=480 \
    ! vaapipostproc ! glimagesink sync=false

Finally, thanks for the links to mediamtx and live777. I am only using Dolby because it popped up in a Google search; I’d much prefer an open-source SFU that is compatible with GStreamer WHIP/WHEP. I will be testing with these other projects soon.
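
When I do, I expect the read side to look something like this sketch. It is untested; the endpoint path and the payload types are placeholders I will have to match against whatever the server negotiates:

# H264 WHEP consumer sketch for mediamtx/live777; endpoint and payload types are placeholders
gst-launch-1.0 whepsrc whep-endpoint="http://localhost:8889/mystream/whep" \
    audio-caps="application/x-rtp,payload=96,encoding-name=OPUS,media=audio,clock-rate=48000" \
    video-caps="application/x-rtp,payload=127,encoding-name=H264,media=video,clock-rate=90000" \
    ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink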

Good research @jcap!
Have you made any further progress on using H264 in your WHEP pipeline?

We’re currently extending a GstRtspServer to serve video (only) as H264 in our Python code, either through webrtcbin or, if possible, using this whepsrc.

No, I was not successful in getting H264 streaming from Dolby to whepsrc. That shouldn’t discourage you, though; mediamtx and live777 were confirmed to work with H264, so GstRtspServer probably will too. If you have a specific question, the regulars on this board can probably help you out.