Appsink and Appsrc with RTSP streams

Hi, I’m building a custom GStreamer pipeline to consume an RTSP H.264 stream. I want to access and process the encoded frames before decoding, then pass those frames into a separate decode pipeline.

This approach works as expected when I test with an offline source, but I am running into issues when using rtspsrc.

What I am trying to do

  • Pull out frames from an RTSP stream using appsink.
  • Perform some custom processing on the encoded frame.
  • Push the same frames into a second pipeline via appsrc for decoding.
GstElement *capture_pipeline = gst_parse_launch(
          "rtspsrc name=rtsp_src location=<rtsp_src> "
          "latency=50 protocols=tcp "
          "! rtph264depay name=depay "
          "! h264parse "
          "! appsink name=capture_sink sync=false drop=true max-buffers=5",
          NULL);

GstElement *decode_pipeline =
        gst_parse_launch("appsrc name=decode_src is-live=true "
                         "! h264parse "
                         "! avdec_h264 "
                         "! videoconvert "
                         "! video/x-raw,format=BGR "
                         "! appsink name=decode_sink sync=false",
                         NULL);

// Launch pipelines
gst_element_set_state(capture_pipeline, GST_STATE_PLAYING);
gst_element_set_state(decode_pipeline, GST_STATE_PLAYING);

// try to get frame from appsink
GstElement *capture_sink = gst_bin_get_by_name(GST_BIN(capture_pipeline), "capture_sink");

GstSample *sample = gst_app_sink_try_pull_sample(GST_APP_SINK(capture_sink), 10 * GST_MSECOND);

// but here I am not able to pull out the frames

if (!sample) {
  printf("Not able to pull out frames\n");
  continue;
}
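For completeness, this is roughly the hand-off I have in mind for the appsrc side (a sketch; it assumes `decode_src` was fetched from the decode pipeline with `gst_bin_get_by_name`, the same way as `capture_sink`):

```c
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/app/gstappsrc.h>

/* Pull one encoded sample from the capture pipeline and feed its buffer
 * into the decode pipeline. */
GstSample *sample = gst_app_sink_try_pull_sample(GST_APP_SINK(capture_sink),
                                                 100 * GST_MSECOND);
if (sample) {
  GstBuffer *buffer = gst_sample_get_buffer(sample);

  /* ... custom processing on the encoded frame would go here ... */

  /* gst_app_src_push_buffer() takes ownership of the buffer, so take an
   * extra ref before unreffing the sample that owns it. */
  gst_app_src_push_buffer(GST_APP_SRC(decode_src), gst_buffer_ref(buffer));
  gst_sample_unref(sample);
}
```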

Tell me if I am using the sink and src in the wrong way.

When you set the pipeline to the PLAYING state, that does not happen instantaneously; the state change proceeds asynchronously in the background.

GStreamer will spawn some threads, establish a TCP connection to the RTSP server, do some back and forth, request the server to start streaming, and then it will take some time for the data to actually arrive, etc. We call this process “prerolling” in GStreamer: filling the pipeline with data until each sink has a buffer.

Since you put a 10 ms timeout on your appsink pull, that’s probably not enough time for the pipeline to preroll and produce data, which is why you get no sample back there.
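If you don’t have anything else to do in that loop, you can also use the blocking pull instead of polling with a short timeout. A sketch, using the same names as your snippet:

```c
#include <gst/gst.h>
#include <gst/app/gstappsink.h>

/* Block until the first sample arrives. This returns NULL only on EOS or
 * when the appsink is shut down, so there is no preroll race with a short
 * timeout. */
GstSample *sample = gst_app_sink_pull_sample(GST_APP_SINK(capture_sink));
if (sample) {
  /* ... process the encoded frame ... */
  gst_sample_unref(sample);
} else {
  /* EOS or the pipeline was stopped. */
  g_print("No more samples\n");
}
```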

Also, I would not use appsink drop=true max-buffers=5, especially not on encoded data: if you drop an encoded frame, you basically corrupt the stream for the next couple of frames/seconds because there is now suddenly data missing.

Hi @vasucp1207 ,

1- I would suggest first making sure the buffers are reaching the appsink. You can check this by running the pipeline with gst-launch-1.0 on the command line at GST_DEBUG=2 level, replacing the appsink with a fakesink, which will print the buffers as they arrive at the element:
GST_DEBUG=2 gst-launch-1.0 rtspsrc name=rtsp_src location=<rtsp_src> latency=50 protocols=tcp ! rtph264depay name=depay ! h264parse ! fakesink silent=false -v

If there are buffers, you should get something like:
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = chain ******* (fakesink0:sink) (115200 bytes, dts: none, pts: 0:09:25.766666666, duration: 0:00:00.033333334, offset: 16973, offset_end: 16974, flags: 00000000 , meta: none) 0x561c53ee86c0

2-
In addition to the previous response, to be sure that the pipeline actually is in the PLAYING state, you could add the following right after the gst_element_set_state calls:

GstState state, pending;
GstStateChangeReturn ret = gst_element_get_state(capture_pipeline,
                                                 &state,
                                                 &pending,
                                                 5 * GST_SECOND);

More info on the return values:
gst_element_get_state

3- I agree on not dropping encoded frames.