I have a source streaming raw JPEG over TCP, and I am trying to encode that stream as H.264 and serve it via RTSP. I set the do-timestamp property on the tcpclientsrc element to TRUE. However, tcpclientsrc is not a live source, so it produces buffers while the pipeline is still in the PAUSED state. Because there is no clock until the pipeline reaches PLAYING, those buffers carry no timestamps. This causes x264enc to complain about timestamps and to produce no output for a long time (about 60 frames). When it eventually does produce frames and the pipeline reaches PLAYING, only the first frame is rendered.
For simplicity while testing, I am running this pipeline, with the H.264 RTP payloader replaced by a decoder and renderer:
gst-launch-1.0.exe tcpclientsrc host=localhost port=10010 do-timestamp=true ! jpegparse ! jpegdec ! x264enc ! avdec_h264 ! autovideosink
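(A side note on the ~60-frame delay specifically: x264enc's default settings buffer a sizeable lookahead window before emitting anything, independent of the timestamp problem. A lower-latency variant of the same test pipeline, assuming the same host and port, would be:)

```shell
# Same test pipeline, but with x264enc tuned for low latency so it starts
# emitting frames almost immediately instead of filling its lookahead buffer.
# Host/port match the setup above; adjust as needed.
gst-launch-1.0.exe tcpclientsrc host=localhost port=10010 do-timestamp=true \
  ! jpegparse ! jpegdec \
  ! x264enc tune=zerolatency speed-preset=ultrafast \
  ! avdec_h264 ! autovideosink
```

This does not fix the missing timestamps by itself, but it separates the encoder-latency symptom from the clock/timestamp symptom.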
For comparison, this pipeline works as expected:
gst-launch-1.0.exe videotestsrc is-live=true ! x264enc ! avdec_h264 ! autovideosink
In fact, this pipeline works as expected even when is-live is not set to true, because videotestsrc always generates timestamps.
Does anyone have suggestions on how to approach this? I believe that if tcpclientsrc were a live source, this would work without issues.
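One direction I have been considering, sketched below: gst-launch cannot mark tcpclientsrc as live, but in application code the element (which derives from GstBaseSrc) can be flagged live with gst_base_src_set_live(), after which do-timestamp should stamp buffers with the pipeline running time. This is an untested sketch, not a confirmed fix; the host, port, and element chain just mirror the test pipeline above.

```c
/* Sketch: build the test pipeline programmatically and mark the TCP
 * source as live so that do-timestamp can assign running-time
 * timestamps. Assumes GStreamer 1.x; compile with
 * `pkg-config --cflags --libs gstreamer-1.0 gstreamer-base-1.0`. */
#include <gst/gst.h>
#include <gst/base/gstbasesrc.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *pipeline = gst_parse_launch (
      "tcpclientsrc name=src host=localhost port=10010 do-timestamp=true "
      "! jpegparse ! jpegdec ! x264enc ! avdec_h264 ! autovideosink",
      NULL);

  /* tcpclientsrc derives from GstBaseSrc, so it can be marked live here;
   * this is the step gst-launch offers no switch for. */
  GstElement *src = gst_bin_get_by_name (GST_BIN (pipeline), "src");
  gst_base_src_set_live (GST_BASE_SRC (src), TRUE);
  gst_object_unref (src);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Block until an error or end-of-stream, then tear down. */
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```

I have not verified that marking the source live this way resolves the stalled rendering, but it seems like the most direct way to test the "if tcpclientsrc were live" hypothesis.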