I found that the long start-up time of the GStreamer RTSP server is related to when the server sends the SDP. The following is the client's debug output; it shows that it takes about 3 s for the client to receive the SDP description from the server.
I would like to know why this happens and what it depends on.
From what I understand, the pipeline uses mpph264enc, a Rockchip hardware-encoder plugin based on MPP. Do you mean that the server needs to get the pipeline running before it can send the SDP to the client?
The rtsp-server generates the SDP based on the output caps from the RTP payloader(s).
The RTP payloader is probably waiting for the video encoder to send output caps or even an initial buffer in order to generate output caps with the sprop-parameter-sets configuration data (which contains the SPS/PPS).
I’m not familiar with this particular encoder element, so you’d have to check the implementation to see what it does exactly.
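One way to check this empirically, as a sketch mirroring your pipeline (you may or may not need an h264parse between encoder and payloader, depending on what the encoder outputs): run the elements with gst-launch-1.0 -v and watch at which timestamp the payloader's output caps, including sprop-parameter-sets, first appear:

gst-launch-1.0 -v v4l2src device=/dev/video11 ! video/x-raw,format=NV12,width=640,height=480 ! mpph264enc ! h264parse ! rtph264pay ! fakesink

If those caps only show up a few seconds in, the delay is in the capture/encode path rather than in rtsp-server itself.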
In case the encoder can output either stream-format=avc or stream-format=byte-stream, it might help to force it to output avc format (the SPS/PPS are then carried in the caps, in the codec_data field).
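A sketch of that, using the stock test-launch example binary (assuming the encoder, or h264parse converting on its behalf, can negotiate avc):

./test-launch "( v4l2src device=/dev/video11 ! video/x-raw,format=NV12,width=640,height=480 ! mpph264enc ! h264parse ! video/x-h264,stream-format=avc,alignment=au ! rtph264pay name=pay0 pt=96 )"

With stream-format=avc the SPS/PPS travel in the codec_data of the caps, so the payloader can generate its output caps as soon as the caps arrive, rather than waiting to find parameter sets in the byte-stream.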
I would like to ask further: at which step does the rtsp-server send the SDP to the client, and in which function specifically? For example, is it in one of these functions? If so, which one?
I didn't quite follow what you said. I want to trace the source code to see how the server sends the SDP; which library file should I look at? I traced the functions I mentioned before, but I don't understand the steps of sending the SDP.
I would recommend enabling some GST_DEBUG=rtsp*:6 debug logging for rtsp-server; that will give you the source files, function names and line numbers where things are happening.
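For example (test-launch is the stock example binary, and the pipeline and log file name here are just illustrations; GStreamer debug output goes to stderr):

GST_DEBUG=rtsp*:6 ./test-launch "( videotestsrc ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )" 2> rtsp-debug.log

Then search the log for the DESCRIBE handling (rtspclient category) and the prepare/SDP steps (rtspmedia category); the timestamps should show where the 3 s are spent.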
Now I'm using x264enc to encode, but unfortunately it still takes about 3 s before the video comes up. I ran the rtsp-server example code with the following command and got the following output. Can you help me see what's wrong? Thank you very much.
neardi@LPA3588:~/Downloads/gst-rtsp-server/examples$ ./test-192.168.3.40 --gst-debug-level=3 "( v4l2src device=/dev/video11 ! video/x-raw,format=NV12,width=640,height=480 ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"
0:00:14.998236691 107450 0x7fb4006360 FIXME default gstutils.c:4025:gst_pad_create_stream_id_internal:<appsrc0:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:15.006969567 107450 0x55a861b300 WARN v4l2src gstv4l2src.c:835:gst_v4l2src_query:<v4l2src0> Can't give latency since framerate isn't fixated !
0:00:15.008569613 107450 0x55a861b300 WARN v4l2src gstv4l2src.c:835:gst_v4l2src_query:<v4l2src0> Can't give latency since framerate isn't fixated !
0:00:15.008738192 107450 0x7fac01b5e0 WARN v4l2 gstv4l2object.c:4589:gst_v4l2_object_probe_caps:<v4l2src0:src> Failed to probe pixel aspect ratio with VIDIOC_CROPCAP: Invalid argument
0:00:15.009059893 107450 0x55a861b300 WARN v4l2src gstv4l2src.c:835:gst_v4l2src_query:<v4l2src0> Can't give latency since framerate isn't fixated !
0:00:15.009520132 107450 0x55a861b300 WARN v4l2src gstv4l2src.c:835:gst_v4l2src_query:<v4l2src0> Can't give latency since framerate isn't fixated !
0:00:18.164756065 107450 0x7fac01b5e0 WARN v4l2bufferpool gstv4l2bufferpool.c:1373:gst_v4l2_buffer_pool_dqbuf:<v4l2src0:pool0:src> Driver should never set v4l2_buffer.field to ANY
0:00:18.236626362 107450 0x55a861b000 FIXME rtspmedia rtsp-media.c:3425:gst_rtsp_media_suspend: suspend for dynamic pipelines needs fixing
0:00:18.264401996 107450 0x55a861b000 FIXME rtspmedia rtsp-media.c:3425:gst_rtsp_media_suspend: suspend for dynamic pipelines needs fixing
0:00:18.264466452 107450 0x55a861b000 WARN rtspmedia rtsp-media.c:3451:gst_rtsp_media_suspend: media 0x7fb40471a0 was not prepared
0:00:18.346979664 107450 0x7fac01b5e0 WARN v4l2src gstv4l2src.c:1164:gst_v4l2src_create:<v4l2src0> lost frames detected: count = 1 - ts: 0:00:03.300771464
0:00:18.404740546 107450 0x7fac01b5e0 WARN v4l2src gstv4l2src.c:1164:gst_v4l2src_create:<v4l2src0> lost frames detected: count = 1 - ts: 0:00:03.380834005
0:00:18.564615592 107450 0x7fac01b5e0 WARN v4l2src gstv4l2src.c:1164:gst_v4l2src_create:<v4l2src0> lost frames detected: count = 1 - ts: 0:00:03.540818172
0:00:18.764546021 107450 0x7fac01b5e0 WARN v4l2src gstv4l2src.c:1164:gst_v4l2src_create:<v4l2src0> lost frames detected: count = 1 - ts: 0:00:03.740775463
If the source really takes that long to start up, you can work around it in one of two ways (example pipelines below):
- always have a local client connected from the start (one that just throws away the data it receives); that way the pipeline is already up and running before external clients connect. This is only useful if it's a shared media.
- start a producer (source) pipeline outside of rtsp-server and feed the data into the rtsp-server media pipeline using elements like intervideosink/intervideosrc or intersink/intersrc (or manually via appsink/appsrc).
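A sketch of the second option (the channel name cam0 is an illustration, and intervideosrc may need a videoconvert in front of the encoder):

Producer, started ahead of time:
gst-launch-1.0 v4l2src device=/dev/video11 ! video/x-raw,format=NV12,width=640,height=480 ! intervideosink channel=cam0

rtsp-server media pipeline:
./test-launch "( intervideosrc channel=cam0 ! videoconvert ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"

Since the producer keeps the camera running permanently, a DESCRIBE should then only have to wait for the encoder to start, not for the camera.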
Using this pipeline, I was unable to receive data on the pull (client) side, and from the client's debug output it failed to fetch the SDP:
05-11 18:50:30.860 [00003218] <info> [SP] Run out of StartInternal--
05-11 18:50:34.221 [00007cc8] <info> [SP] describe s_c:503
05-11 18:50:34.221 [00007cc8] <error> [SP] RTSP Failed to get a SDP description:503 Service Unavailable
I tried this method when I read the rtsp-server sample application. As you said, with a shared pipeline, once one client is connected, another client can open the video in milliseconds. However, this does not fully fit my use case: I switch between different sources to observe different pictures, and to keep that switching random and real-time I cannot require another client to pull the stream in advance. This makes me very upset.
Well, I think so. I have a slight doubt: is the slow start of the source element related to the camera hardware, or to the GStreamer framework? Could GStreamer be incompatible with the hardware? Given the current situation, should I look for the cause first in the hardware device or in the CPU? I look forward to your reply.