Difficulties linking rtspsrc to the depayloader elements

Hi Team,

I’m working on recording video and audio from an RTSP stream. I had success setting up the GStreamer pipeline with gst-launch, but when I converted it to C++ code I ran into problems linking rtspsrc to rtph264depay, and similar issues linking to rtppcmudepay. I have attached the raw pipeline and the C++ code below. Could you please provide guidance on how to fix these issues in my code?

Raw Pipeline

gst-launch-1.0 -e rtspsrc location=rtsp://192.168.0.10/cam0_0 latency-time=0 ! rtph264depay ! avdec_h264 ! videoconvert ! queue ! x264enc tune=zerolatency ! mux_av. rtspsrc location=rtsp://192.168.0.10/cam0_0 latency-time=0 ! rtppcmudepay ! mulawdec ! audioconvert ! audioresample ! avenc_aac ! mp4mux name=mux_av fragment-duration=2000 ! queue ! filesink location=file.mp4

The raw pipeline works properly.

Code:

// Start IP camera recording.
int Recorder::record_IPCamera() {
    // Initialize GStreamer
    gst_init(NULL, NULL);

    // Create a new GStreamer Ip_Pipeline
    Ip_Pipeline = gst_pipeline_new("my-Ip_Pipeline");

    // Create elements
    GstElement *rtspsrc_video = gst_element_factory_make("rtspsrc", "rtspsrc-video");
    GstElement *rtph264depay = gst_element_factory_make("rtph264depay", "rtph264depay");
    GstElement *avdec_h264 = gst_element_factory_make("avdec_h264", "avdec-h264");
    GstElement *videoconvert = gst_element_factory_make("videoconvert", "videoconvert");
    GstElement *queue = gst_element_factory_make("queue", "queue-src-video");
    GstElement *queue_video = gst_element_factory_make("queue", "queue-video");
    GstElement *x264enc = gst_element_factory_make("x264enc", "x264enc");
    GstElement *rtspsrc_audio = gst_element_factory_make("rtspsrc", "rtspsrc-audio");
    GstElement *rtppcmudepay = gst_element_factory_make("rtppcmudepay", "rtppcmudepay");
    GstElement *mulawdec = gst_element_factory_make("mulawdec", "mulawdec");
    GstElement *audioconvert = gst_element_factory_make("audioconvert", "audioconvert");
    GstElement *audioresample = gst_element_factory_make("audioresample", "audioresample");
    GstElement *avenc_aac = gst_element_factory_make("avenc_aac", "avenc-aac");
    GstElement *mp4mux = gst_element_factory_make("mp4mux", "mp4mux");
    GstElement *queue_mux = gst_element_factory_make("queue", "queue-mux");
    GstElement *filesink = gst_element_factory_make("filesink", "filesink");

    if (!Ip_Pipeline || !rtspsrc_video || !rtph264depay || !avdec_h264 || !videoconvert || !queue || !queue_video || !x264enc ||
        !rtspsrc_audio || !rtppcmudepay || !mulawdec || !audioconvert || !audioresample || !avenc_aac || !mp4mux ||
        !queue_mux || !filesink) {
        g_print("Not all elements could be created.\n");
        return -1;
    }

    // Set element properties
    g_object_set(rtspsrc_video, "location", "rtsp://192.168.0.10/cam0_0", "buffer-mode", 0, NULL);
    g_object_set(rtspsrc_audio, "location", "rtsp://192.168.0.10/cam0_0", "buffer-mode", 0, NULL);
    g_object_set(x264enc, "tune", 4, NULL);
    g_object_set(mp4mux, "name", "mux_av", "fragment-duration", 2000, NULL);
    g_object_set(filesink, "location", "file.mp4", NULL);

    GstCaps *caps = gst_caps_from_string("application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264");
    g_object_set(rtspsrc_video, "location", "rtsp://192.168.0.10/cam0_0", "filter", caps, NULL);

    // Add elements to the Ip_Pipeline
    gst_bin_add_many(GST_BIN(Ip_Pipeline), rtspsrc_video, queue, rtph264depay, avdec_h264, videoconvert, queue_video, x264enc,
                     rtspsrc_audio, rtppcmudepay, mulawdec, audioconvert, audioresample, avenc_aac, mp4mux, queue_mux,
                     filesink, NULL);

    // Link elements
    if (!gst_element_link_many(rtspsrc_video, queue, rtph264depay, avdec_h264, videoconvert, queue_video, x264enc, mp4mux,
                               NULL)) {
        g_print("Video elements could not be linked.\n");
        gst_object_unref(Ip_Pipeline);
        return -1;
    }

    if (!gst_element_link_many(rtspsrc_audio, rtppcmudepay, mulawdec, audioconvert, audioresample, avenc_aac, mp4mux,
                               queue_mux, filesink, NULL)) {
        g_print("Audio elements could not be linked.\n");
        gst_object_unref(Ip_Pipeline);
        return -1;
    }

    // Set the Ip_Pipeline to playing state
    gst_element_set_state(Ip_Pipeline, GST_STATE_PLAYING);

    return 0;
}

Regards,
Sulthan

rtspsrc has so-called dynamic pads which are only created after the pipeline starts up and data is flowing.

That means you can’t link rtspsrc to any depayloaders before data flow starts and the pads have been created.

gst-launch-1.0 (or gst_parse_launch()) handles this automagically for you under the hood, but if you link manually you have to handle it yourself by connecting to the "pad-added" signal.
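For reference, here is a minimal sketch (not part of the original reply) of one way to do that: connect a "pad-added" callback to each rtspsrc and link the newly created pad to the corresponding depayloader once it appears. The variable names (rtspsrc_video, rtph264depay, rtppcmudepay, etc.) are assumed to match the code above, and the caps check is only illustrative.

// Sketch only: link a newly created rtspsrc pad to the depayloader passed as user_data.
static void on_pad_added(GstElement *src, GstPad *new_pad, gpointer user_data) {
    GstElement *depay = GST_ELEMENT(user_data);
    GstPad *sink_pad = gst_element_get_static_pad(depay, "sink");

    // Link only if the depayloader is still unlinked and the new pad's RTP caps
    // are compatible with the depayloader's sink caps (video vs. audio stream).
    if (!gst_pad_is_linked(sink_pad)) {
        GstCaps *new_caps = gst_pad_get_current_caps(new_pad);
        GstCaps *sink_caps = gst_pad_query_caps(sink_pad, NULL);

        if (new_caps && gst_caps_can_intersect(new_caps, sink_caps)) {
            if (GST_PAD_LINK_FAILED(gst_pad_link(new_pad, sink_pad)))
                g_print("Failed to link rtspsrc pad to %s.\n", GST_ELEMENT_NAME(depay));
        }

        if (new_caps)
            gst_caps_unref(new_caps);
        gst_caps_unref(sink_caps);
    }
    gst_object_unref(sink_pad);
}

Then, in record_IPCamera(), instead of starting the gst_element_link_many() calls at the rtspsrc elements, connect the callback and link each branch from the depayloader onwards:

g_signal_connect(rtspsrc_video, "pad-added", G_CALLBACK(on_pad_added), rtph264depay);
g_signal_connect(rtspsrc_audio, "pad-added", G_CALLBACK(on_pad_added), rtppcmudepay);

gst_element_link_many(rtph264depay, avdec_h264, videoconvert, queue_video, x264enc, mp4mux, NULL);
gst_element_link_many(rtppcmudepay, mulawdec, audioconvert, audioresample, avenc_aac, mp4mux, NULL);
gst_element_link_many(mp4mux, queue_mux, filesink, NULL);

The same callback can serve both branches because each depayloader's sink caps only intersect with the matching RTP stream, so the wrong pad is simply skipped.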

Hi Tim,

Thanks for your help, it’s working.

Regards,
Sulthan