Using intersink/intersrc with gst-rtsp-server

I am trying to create an RTSP restream server using Rust and intersink/intersrc. The idea is to have a base pipeline that connects to the source and potentially modifies the image, plus multiple client pipelines that are created once new RTSP clients connect.

I did this in Rust:

use anyhow::Error;
use derive_more::derive::{Display, Error};

use gstreamer::prelude::*;
use gstreamer as gst;

use gstreamer_rtsp_server::prelude::*;
use futures::prelude::*;

#[derive(Debug, Display, Error)]
#[display("Could not get mount points")]
struct NoMountPoints;

#[tokio::main]
async fn main() -> Result<(), Error> {
    gst::init()?;

    let main_loop = glib::MainLoop::new(None, false);
    
    let src_pipeline = gst::parse::launch("rtspsrc location=rtsp://localhost:8554/test latency=0 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! videorate ! video/x-raw,framerate=10/1,width=640,height=480 ! queue ! intersink producer-name=test")?;

    src_pipeline.set_state(gstreamer::State::Playing)?;
    
    // Wait until the source pipeline has actually reached Playing before
    // starting the RTSP server.
    let mut stream = src_pipeline.bus().unwrap().stream();
    while let Some(msg) = stream.next().await {
        use gstreamer::MessageView;
        if let MessageView::StateChanged(state_changed) = msg.view() {
            if msg.src().map(|s| s.name()) == Some(src_pipeline.name())
                && state_changed.current() == gstreamer::State::Playing
            {
                break;
            }
        }
    }

    let server = gstreamer_rtsp_server::RTSPServer::default();

    let mounts = gstreamer_rtsp_server::RTSPMountPoints::default();
    server.set_mount_points(Some(&mounts));
    server.set_service("8555");

    let mounts = server.mount_points().ok_or(NoMountPoints)?;

    let factory = gstreamer_rtsp_server::RTSPMediaFactory::default();
    factory.set_launch("intersrc producer-name=test ! queue ! videoconvert ! x264enc speed-preset=veryfast tune=zerolatency ! queue ! rtph264pay name=pay0 pt=96 config-interval=-1");
    factory.set_suspend_mode(gstreamer_rtsp_server::RTSPSuspendMode::None);

    mounts.add_factory("/test", factory);

    let id = server.attach(None)?;

    println!(
        "Stream ready at rtsp://127.0.0.1:{}/test",
        server.bound_port()
    );

    main_loop.run();
    id.remove();

    Ok(())
}

(this is a shortened version to show the problem)

When trying to connect to this server using ffmpeg:

ffmpeg -hide_banner -rtsp_transport tcp -i rtsp://localhost:8555/test -c:v copy "%Y-%m-%d-%H-%M-%S_Test.mp4"

I get a ton of “Dropping old item buffer” warnings from the underlying appsrc element of the intersrc.

The reason I am using the Rust inter elements (intersink/intersrc) instead of intervideosink/intervideosrc is, first, that I require potential metadata from the source, and second, that I want to move the encoding into the base pipeline as well.

I tried the following:

  • Setting the underlying appsrc of intersrc to is-live=true and do-timestamp=true
  • Increasing the buffer size
  • Setting the pipelines to identical clocks
  • Replacing the rtspsrc with videotestsrc is-live=true

But (as one may have noticed) I'm groping in the dark.

Is there something I have missed?

The problem seems to be that the intersrc element does not (as I expected) reset the frame timestamps. Everything works just fine if the two connected pipelines are started simultaneously, but that's not the case in my example: the client pipeline starts later, so the first buffers it receives already carry large timestamps from the long-running producer.
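To illustrate the behavior I expected: a consumer that joins late should see timestamps rebased so its first buffer starts at (roughly) zero in its own pipeline. This is a minimal sketch in plain Rust of that rebasing logic, not the actual intersrc implementation; the type and method names are made up:

```rust
/// Sketch of the timestamp handling I expected from intersrc:
/// remember the first PTS a late-joining consumer sees and
/// shift all subsequent PTS values relative to it.
struct Rebaser {
    first_pts: Option<u64>, // nanoseconds, like gst::ClockTime
}

impl Rebaser {
    fn new() -> Self {
        Rebaser { first_pts: None }
    }

    /// Returns the PTS relative to the first buffer seen.
    fn rebase(&mut self, pts: u64) -> u64 {
        // On the first call, record the incoming PTS as the base.
        let base = *self.first_pts.get_or_insert(pts);
        pts - base
    }
}

fn main() {
    let mut r = Rebaser::new();
    // Producer has already been running for 5 s when the client connects:
    assert_eq!(r.rebase(5_000_000_000), 0);
    assert_eq!(r.rebase(5_100_000_000), 100_000_000);
    println!("rebased ok");
}
```

Without something like this (or do-timestamp on the underlying appsrc taking effect), the client pipeline sees buffers that are far "in the past" relative to its own running time, which would explain the dropped-buffer warnings.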

I have switched to intervideosink/intervideosrc for now, which does the timestamping and (contrary to my expectations) transfers metadata just fine.
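For reference, the intervideosink/intervideosrc variant I switched to looks roughly like this (launch-string fragments only; the channel name is my own placeholder):

```
# producer pipeline (replaces the intersink branch)
... ! videorate ! video/x-raw,framerate=10/1,width=640,height=480 ! queue ! intervideosink channel=test

# per-client factory launch line (replaces the intersrc branch)
intervideosrc channel=test ! queue ! videoconvert ! x264enc speed-preset=veryfast tune=zerolatency ! queue ! rtph264pay name=pay0 pt=96 config-interval=-1
```

Note that the intervideo elements use a `channel` property rather than `producer-name`, and they only carry raw video, so the encoding stays in the client pipelines with this approach.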