Environment
- Ubuntu Server 22.04
- GStreamer 1.20
- Xvfb for virtual display
- PulseAudio for virtual audio
What I’m trying to achieve
I’m trying to stream a game running in a headless environment to a WHIP endpoint (Dolby.io). The game renders to an Xvfb virtual display and plays its sound through a PulseAudio null sink. I need help creating a proper GStreamer pipeline (rough sketch below the list) that:
- Captures video from Xvfb (display :99)
- Captures audio from PulseAudio virtual sink
- Encodes both streams appropriately
- Sends them to a WHIP endpoint
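For reference, here is a rough, untested sketch of the kind of pipeline I'm imagining, built around the whipsink element from gst-plugins-rs (I'm not sure it's available in a stock GStreamer 1.20 install). The endpoint URL, token, property names, and encoder settings below are placeholders and guesses, not something I've verified:
# Untested sketch. whipsink comes from the gst-plugins-rs webrtchttp plugin and may
# not ship with GStreamer 1.20; whip-endpoint / auth-token and the encoder settings
# are my best guesses. ENDPOINT_URL and TOKEN are placeholders.
gst-launch-1.0 \
  ximagesrc display-name=:99 use-damage=0 \
    ! videoconvert \
    ! x264enc tune=zerolatency bitrate=4000 key-int-max=60 \
    ! rtph264pay config-interval=-1 \
    ! 'application/x-rtp,media=video,encoding-name=H264,payload=97,clock-rate=90000' \
    ! whip.sink_0 \
  pulsesrc device=game_audio.monitor \
    ! audioconvert ! audioresample \
    ! opusenc bitrate=128000 \
    ! rtpopuspay \
    ! 'application/x-rtp,media=audio,encoding-name=OPUS,payload=96,clock-rate=48000,encoding-params=(string)2' \
    ! whip.sink_1 \
  whipsink name=whip whip-endpoint="ENDPOINT_URL" auth-token="TOKEN"
Is this roughly the right structure, or should the payloading/caps be handled differently for WHIP?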
Current Setup
# Display setup
Xvfb :99 -screen 0 1920x1080x24 &
export DISPLAY=:99
# Audio setup
pulseaudio --start
pactl load-module module-null-sink sink_name=game_audio
pactl set-default-sink game_audio
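After this setup I check that the null sink exposes a monitor source, since I assume that monitor (game_audio.monitor) is what pulsesrc should capture from:
# The null sink should expose a monitor source named game_audio.monitor;
# that is the device I plan to point pulsesrc at.
pactl list short sources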
Questions:
- What's the correct GStreamer pipeline structure for WHIP streaming?
- How should I handle the WHIP endpoint URL and authentication token in the pipeline?
- What are the recommended encoding parameters for video (H.264) and audio (Opus) in this context?
- How do I ensure proper synchronization between the audio and video streams?
- What's the best way to monitor streaming status and debug potential issues?
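For the last question, the only approach I know of so far is raising the GStreamer debug level and capturing the log (with ... standing for the pipeline above); I haven't confirmed which debug categories are actually relevant for whipsink/webrtcbin:
# Verbose logging; the category patterns after the comma are guesses at what
# matters for webrtcbin/whipsink. Keep stderr for later inspection.
GST_DEBUG=3,webrtc*:5,whip*:5 gst-launch-1.0 ... 2> gst-debug.log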