Webrtcsink CLI to Rust code

I'm new to Rust and I'm struggling to convert a CLI GStreamer command to Rust code.
I have this working fine:

gst-launch-1.0 webrtcsink enable-data-channel-navigation=true name=ws meta="meta,name=gst-stream" videotestsrc ! video/x-raw,width=1280,height=720,format=RGBx,framerate=60/1,pt=96 ! queue ! videoconvert ! x264enc ! ws. pipewiresrc ! audioconvert ! ws.

But when I try to do the same in code, following the Rust example, I keep getting different errors.

Is there a way in Rust to add the pipeline as a string? Or is there a way to print the generated pipeline from code, so I can compare it with the CLI command?

thanks

Which errors do you get with which code?

You can use gst::parse::launch() for example.
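For instance, something like the following (a minimal sketch only, assuming `gst::init()` has already been called and the surrounding function returns a `Result`). It also answers the "print the generated pipeline" part: with the `GST_DEBUG_DUMP_DOT_DIR` environment variable pointing at a writable directory, `gst::debug_bin_to_dot_file()` writes a GraphViz .dot file you can compare against one dumped from the gst-launch-1.0 run.

```rust
// Build the whole pipeline from the same description string you would
// pass to gst-launch-1.0.
let pipeline = gst::parse::launch("videotestsrc ! videoconvert ! autovideosink")?
    .downcast::<gst::Pipeline>()
    .expect("top-level element should be a pipeline");

// Dump the pipeline topology as a GraphViz dot graph for comparison.
// Requires GST_DEBUG_DUMP_DOT_DIR to point at a writable directory.
gst::debug_bin_to_dot_file(&pipeline, gst::DebugGraphDetails::all(), "my-pipeline");
```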

Thanks for that, gst::parse::launch() works!

I would like to get that working CLI pipeline into Rust, but I'm getting errors when adding the audio part "pipewiresrc ! audioconvert ! ws."; the video part works fine.

My code (I tried several combinations of it):

// AUDIO WORKING //
let mypipe1 = gst::parse::bin_from_description(
    "pipewiresrc ! audioconvert ! audioresample",
    true,
);
let audiomypipe = mypipe1?.dynamic_cast::<gst::Element>().unwrap();

self.pipeline().add(&audiomypipe).context("Adding audiosrc")?;

audiomypipe
    .link_pads(None, &webrtcsink, Some("audio_%u"))
    .context("Linking pipewiresrc")?;
// AUDIO WORKING //

// AUDIO NOT WORKING //
/*
let pipewiresrc = gst::ElementFactory::make("pipewiresrc")
    .build()
    .context("Creating pipewiresrc")?;

let audioconvert = gst::ElementFactory::make("audioconvert")
    .build()
    .context("Creating audioconvert")?;

let audioresample = gst::ElementFactory::make("audioresample")
    .build()
    .context("Creating audioresample")?;

self.pipeline()
    .add_many([&pipewiresrc, &audioconvert, &audioresample])
    .expect("Adding audiosrc");

pipewiresrc
    .link_pads(None, &webrtcsink, Some("audio_%u"))
    .context("Linking pipewiresrc")?;
*/
// AUDIO NOT WORKING //

Errors:
0:00:00.366949421 57675 0x75b148002180 ERROR audio-info audio-info.c:282:gst_audio_info_from_caps: no format given
0:00:00.366964821 57675 0x75b148002180 ERROR audioresample gstaudioresample.c:548:gst_audio_resample_set_caps: invalid incaps
0:00:00.366978461 57675 0x75b148002180 WARN basetransform gstbasetransform.c:1379:gst_base_transform_setcaps: FAILED to configure incaps audio/x-raw, layout=(string)interleaved, rate=(int)48000, channels=(int)2 and outcaps audio/x-raw, rate=(int)48000, format=(string)S16LE, channels=(int)2, layout=(string)interleaved, channel-mask=(bitmask)0x0000000000000003

With link_filtered added I get:

2024-05-28T23:19:39.551109Z ERROR ThreadId(01) webrtc_send_v3: Shutting down due to application error: Preparing: Linking audiorc to timeoverlay: Failed to link elements 'pipewiresrc0' and 'pipewiresrc0' with filter 'Caps(audio/x-raw(memory:SystemMemory))'

You’re linking pipewiresrc there directly to webrtcsink and don’t do anything with audioconvert / audioresample. With the code at the top you put all those elements in a bin with a ghostpad, link those elements together, and link that bin to webrtcsink.
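Spelled out without a bin, that would look something like this (a sketch only, assuming `pipeline` and `webrtcsink` are in scope and the surrounding function returns an anyhow-style `Result`): add all three elements, link them to each other, and link only the last one to a requested webrtcsink audio pad.

```rust
let pipewiresrc = gst::ElementFactory::make("pipewiresrc").build()?;
let audioconvert = gst::ElementFactory::make("audioconvert").build()?;
let audioresample = gst::ElementFactory::make("audioresample").build()?;

// All elements must be in the pipeline before linking.
pipeline.add_many([&pipewiresrc, &audioconvert, &audioresample])?;

// pipewiresrc -> audioconvert -> audioresample, as with `!` in gst-launch
gst::Element::link_many([&pipewiresrc, &audioconvert, &audioresample])?;

// Only the end of the chain is linked to webrtcsink.
audioresample.link_pads(None, &webrtcsink, Some("audio_%u"))?;
```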

Thanks for your reply.
But what should be the correct setup for the //AUDIO NOT WORKING// chunk?
Also, how would you add queues in that scenario?

I tried to add a queue this way but it doesn't seem to get added:

let queue = gst::ElementFactory::make("queue")
    .property("max-size-buffers", 1u32)
    .property("max-size-bytes", 0u32)
    .property("max-size-time", 0u64)
    .build()
    .expect("Checked in prepare()");

let videoconvert = gst::ElementFactory::make("videoconvert")
    .build()
    .context("Creating videoconvert")?;

let video_overlay = gst::ElementFactory::make("timeoverlay")
    .property_from_str("time-mode", "running-time")
    .build()
    .context("Creating timeoverlay")?;

self.pipeline()
    .add_many([&videosrc, &queue, &videoconvert, &video_overlay])
    .expect("adding video elements");

Please help me extend the send example, based on the following code in webrtc-precise-sync-send.rs:

let videosrc = gst::ElementFactory::make("videotestsrc")
    .property("is-live", true)
    .property_from_str("pattern", VIDEO_PATTERNS[idx % VIDEO_PATTERNS.len()])
    .build()
    .context("Creating videotestsrc")?;

let video_overlay = gst::ElementFactory::make("timeoverlay")
    .property_from_str("time-mode", "running-time")
    .build()
    .context("Creating timeoverlay")?;

self.pipeline()
    .add_many([&videosrc, &video_overlay])
    .expect("adding video elements");

videosrc
    .link_filtered(
        &video_overlay,
        &gst::Caps::builder("video/x-raw")
            .field("width", 800i32)
            .field("height", 600i32)
            .build(),
    )
    .context("Linking videosrc to timeoverlay")?;

video_overlay
    .link_pads(None, &webrtcsink, Some("video_%u"))
    .context("Linking video overlay")?;

I would like to add the following elements:

let videosrc = gst::ElementFactory::make("videotestsrc")
    .build()
    .context("Creating videotestsrc")?;

let queue = gst::ElementFactory::make("queue")
    .build()
    .expect("Checked in prepare()");

let queue1 = gst::ElementFactory::make("queue")
    .property("max-size-buffers", 1u32)
    .property("max-size-bytes", 0u32)
    .property("max-size-time", 0u64)
    .build()
    .expect("Checked in prepare()");

let videoconvert = gst::ElementFactory::make("videoconvert")
    .build()
    .context("Creating videoconvert")?;

let x264enc = gst::ElementFactory::make("x264enc")
    .property("bitrate", DEFAULT_START_BITRATE / 1000)
    .property_from_str("tune", "zerolatency")
    .property_from_str("speed-preset", "ultrafast")
    .property("threads", 4u32)
    .property("key-int-max", 2560u32)
    .property("b-adapt", false)
    .property("vbv-buf-capacity", 120u32)
    .build()
    .context("Creating x264enc")?;

let video_overlay = gst::ElementFactory::make("timeoverlay")
    .property_from_str("time-mode", "running-time")
    .build()
    .context("Creating timeoverlay")?;

let cps: gst::Caps = gst::Caps::builder("video/x-raw")
    .field("width", 1280i32)
    .field("height", 720i32)
    .field("format", gst_video::VideoFormat::Rgbx.to_str())
    .field("framerate", gst::Fraction::new(30, 1))
    .field("pt", 96i32)
    .build();

self.pipeline()
    .add_many([&videosrc, &video_overlay])
    .expect("adding video elements");


videosrc
    .link_filtered(
        &video_overlay,
        &gst::Caps::builder("video/x-raw")
            .field("width", 1280i32)
            .field("height", 720i32)
            .field("format", gst_video::VideoFormat::Rgbx.to_str())
            .field("framerate", gst::Fraction::new(60, 1))
            .field("pt", 96i32)
            .build(),
    )
    .context("Linking videosrc to timeoverlay")?;

video_overlay
    .link_pads(None, &webrtcsink, Some("video_%u"))
    .context("Linking video overlay")?;

How can I add the queue, x264enc, and videoconvert elements to the pipeline and link them, following that example?

I tried the way below, and several other combinations, but I can't get it to work:

self.pipeline()
    .add_many([&videosrc, &video_overlay, &queue, &videoconvert, &x264enc])
    .expect("adding video elements");

I hope you can shed some light on this.
Thanks

@marcosbis, please clean up the code to show what you actually tested, and give symptoms as complete as possible (compiler errors, relevant GST_DEBUG output, Rust backtrace, …).

In the last snippet, you seem to add the elements as expected, though the queue might not be needed. But we don’t know how and which elements you linked eventually.


Thanks for looking into this.
This is my pipeline, which works perfectly via the CLI:

gst-launch-1.0 webrtcsink  signaller::uri="ws://127.0.0.1:8443" enable-data-channel-navigation=true \
 name=ws meta="meta,name=gst-stream" videotestsrc  ! video/x-raw,width=1280,height=720,format=RGBx,framerate=60/1,pt=96 \
 ! timeoverlay time-mode=running-time \
 ! queue ! videoconvert ! x264enc bitrate=2048 tune=zerolatency speed-preset=ultrafast threads=4 key-int-max=2560 b-adapt=false vbv-buf-capacity=120 \
 ! ws. pipewiresrc ! audioconvert ! audioresample ! queue ! ws.

Now I'm trying to create that pipeline in Rust, using the example file webrtc-precise-sync-send.rs.

let videosrc = gst::ElementFactory::make("videotestsrc")
    .property("is-live", true)
    .property_from_str("pattern", VIDEO_PATTERNS[idx % VIDEO_PATTERNS.len()])
    .build()
    .context("Creating videotestsrc")?;

let video_overlay = gst::ElementFactory::make("timeoverlay")
    .property_from_str("time-mode", "running-time")
    .build()
    .context("Creating timeoverlay")?;

let queue = gst::ElementFactory::make("queue")
    .build()
    .expect("Checked in prepare()");

let videoconvert = gst::ElementFactory::make("videoconvert")
    .build()
    .context("Creating videoconvert")?;

let x264enc = gst::ElementFactory::make("x264enc")
    .property("bitrate", DEFAULT_START_BITRATE / 1000)
    .property_from_str("tune", "zerolatency")
    .property_from_str("speed-preset", "ultrafast")
    .property("threads", 4u32)
    .property("key-int-max", 2560u32)
    .property("b-adapt", false)
    .property("vbv-buf-capacity", 120u32)
    .build()
    .context("Creating x264enc")?;

self.pipeline()
    .add_many([&videosrc, &video_overlay])
    .expect("adding video elements");

videosrc
    .link_filtered(
        &video_overlay,
        &gst::Caps::builder("video/x-raw")
            .field("width", 1280i32)
            .field("height", 720i32)
            .field("format", gst_video::VideoFormat::Rgbx.to_str())
            .field("framerate", gst::Fraction::new(30, 1))
            .field("pt", 96i32)
            .build(),
    )
    .context("Linking videosrc to timeoverlay")?;

video_overlay
    .link_pads(None, &webrtcsink, Some("video_%u"))
    .context("Linking video overlay")?;

I tried different combinations to add queue, videoconvert, and x264enc, but I couldn't work out the right way and I'm stuck. My question is really about how to alter the last few steps to include those elements in Rust so that they match the pipeline.
Thanks

You need to add all elements to the pipeline (in your snippet you only added videosrc & video_overlay):

self.pipeline()
    .add_many([&videosrc, &video_overlay, &queue, &videoconvert, &x264enc])
    .expect("adding video elements");

then link all elements as you do with ! in your gst-launch invocation (the first expression is unchanged from your code snippet, the second one is added, and the third one is fixed to link x264enc to a requested webrtcsink video pad):

videosrc
    .link_filtered(
        &video_overlay,
        &gst::Caps::builder("video/x-raw")
            .field("width", 1280i32)
            .field("height", 720i32)
            .field("format", gst_video::VideoFormat::Rgbx.to_str())
            .field("framerate", gst::Fraction::new(30, 1))
            .field("pt", 96i32)
            .build(),
    )
    .context("Linking videosrc to timeoverlay")?;

gst::Element::link_many([&video_overlay, &queue, &videoconvert, &x264enc]).unwrap();

x264enc
    .link_pads(None, &webrtcsink, Some("video_%u"))
    .context("Linking x264enc")?;

Works perfectly now! Thanks so much for taking the time to explain the logic behind it.