Hello,
I’m working on streaming an RTSP camera feed to a web browser using Media Source Extensions (MSE).
I use the following GStreamer pipeline to process the RTSP stream and send it over TCP:
gst-launch-1.0 rtspsrc location="rtspt://…" latency=100 ! rtph264depay ! h264parse ! h264timestamper ! isofmp4mux fragment-duration=2000000000 chunk-duration=500000000 interleave-time=100000000 offset-to-zero=true ! queue leaky=downstream max-size-buffers=10 max-size-time=500000000 ! tcpserversink host=0.0.0.0 port=35017 recover-policy=keyframe sync-method=latest-keyframe sync=false
Then I have a Python WebSocket server that reads data from the TCP stream and forwards it to connected clients.
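The fan-out part of that server is roughly equivalent to this sketch (the class and all names are illustrative, and the WebSocket framing layer itself is left out):

```python
import asyncio

class Fanout:
    """Duplicate one incoming byte stream to every connected client.

    Each WebSocket handler subscribes, drains its queue, and forwards
    the chunks over its own socket.
    """

    def __init__(self) -> None:
        self.subscribers: set[asyncio.Queue] = set()

    def subscribe(self) -> asyncio.Queue:
        # One unbounded queue per client; a bounded queue with a drop
        # policy would be safer against slow consumers.
        q: asyncio.Queue = asyncio.Queue()
        self.subscribers.add(q)
        return q

    def unsubscribe(self, q: asyncio.Queue) -> None:
        self.subscribers.discard(q)

    def feed(self, chunk: bytes) -> None:
        # Called with each read from the GStreamer TCP socket,
        # e.g. obtained via asyncio.open_connection("127.0.0.1", 35017).
        for q in self.subscribers:
            q.put_nowait(chunk)
```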
The stream only plays for the first client, who connects before the pipeline starts.
Clients that connect later (even 1-2 s after the pipeline starts) fail to play the stream.
I suspect the fMP4 data that late joiners receive is missing some information (most likely the initialization segment, i.e. the ftyp and moov boxes, which isofmp4mux emits only once at startup), so MSE can't decode it.
I tried setting sync=true on tcpserversink, but it doesn't help.
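If the problem really is that late joiners never receive the initialization segment, one workaround is to have the relay parse top-level MP4 boxes, cache the ftyp and moov boxes, and send them to every new client before any media fragments. A sketch under that assumption (all identifiers are my own; 64-bit box sizes are not handled):

```python
import struct

INIT_BOXES = {b"ftyp", b"moov"}  # the fMP4 initialization segment

def split_boxes(data: bytes):
    """Split a buffer into complete top-level MP4 boxes.

    Returns (boxes, remainder): each box is (type, raw_bytes), and
    remainder is a trailing partial box to prepend to the next read.
    """
    boxes, pos = [], 0
    while pos + 8 <= len(data):
        size = struct.unpack_from(">I", data, pos)[0]
        btype = data[pos + 4 : pos + 8]
        # size < 8 covers the size==0/size==1 (to-EOF / 64-bit) cases,
        # which this sketch does not handle.
        if size < 8 or pos + size > len(data):
            break
        boxes.append((btype, data[pos : pos + size]))
        pos += size
    return boxes, data[pos:]

class InitCache:
    """Remember ftyp+moov so late-joining clients can be primed."""

    def __init__(self) -> None:
        self.init_segment = b""  # send this first to every new client
        self._pending = b""      # partial box carried between reads

    def feed(self, chunk: bytes) -> None:
        boxes, self._pending = split_boxes(self._pending + chunk)
        for btype, raw in boxes:
            if btype in INIT_BOXES:
                self.init_segment += raw
```

With this in place, each new WebSocket connection would receive `init_segment` before being subscribed to the live fragment stream, which should let MSE initialize its SourceBuffer regardless of when the client joins.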