I’m currently experimenting with libwebrtc in Rust on macOS. I’ve been able to establish the peer connection, and the data channel is working too. However, I’m not able to play the video frames.
let on_track = |track_event: libwebrtc::peer_connection::TrackEvent| {
let mut pipeline_str = String::from("appsrc name=src is-live=true format=time ! ");
// hard coded the width and height for now
pipeline_str.push_str(
"rawvideoparse format=i420 width=2940 height=1836 framerate=15/1 ! ",
);
pipeline_str.push_str("videoconvert ! queue ! autovideosink");
println!("The pipeline is:\n{}", pipeline_str);
let pipeline = gstreamer::parse::launch(&pipeline_str).unwrap();
let pipeline = pipeline.dynamic_cast::<gstreamer::Pipeline>().unwrap();
let appsrc = pipeline.by_name("src").unwrap();
let appsrc = appsrc.downcast::<gstreamer_app::AppSrc>().unwrap();
appsrc.connect("need-data", false, |_args| {
println!("Needs data");
None
});
pipeline.set_state(gstreamer::State::Playing).unwrap();
println!("Set to playing");
if let libwebrtc::prelude::MediaStreamTrack::Video(track) =
track_event.track
{
println!("Its a video track");
thread::spawn(move || {
let mut stream = NativeVideoStream::new(track);
println!("Came here");
while let Some(frame) = block_on(stream.next()) {
println!("Got frame");
let i420_buffer = frame.buffer.to_i420();
let width = i420_buffer.width() as u32;
let height = i420_buffer.height() as u32;
let data_yuv = i420_buffer.data();
let strides_yuv = i420_buffer.strides();
let chroma_width = i420_buffer.chroma_width();
let chroma_height = i420_buffer.chroma_height();
let mut raw_data =
    Vec::with_capacity((width * height * 3 / 2) as usize);
// Copy Y plane row by row, skipping any stride padding
for row in 0..height {
    let start = (row * strides_yuv.0) as usize;
    let end = start + width as usize;
    raw_data.extend_from_slice(&data_yuv.0[start..end]);
}
// Copy U plane
for row in 0..chroma_height {
    let start = (row * strides_yuv.1) as usize;
    let end = start + chroma_width as usize;
    raw_data.extend_from_slice(&data_yuv.1[start..end]);
}
// Copy V plane
for row in 0..chroma_height {
    let start = (row * strides_yuv.2) as usize;
    let end = start + chroma_width as usize;
    raw_data.extend_from_slice(&data_yuv.2[start..end]);
}
let mut gst_buffer =
    gstreamer::Buffer::with_size(raw_data.len()).unwrap();
{
    let buffer_ref = gst_buffer.get_mut().unwrap();
    buffer_ref.set_pts(gstreamer::ClockTime::from_useconds(
        frame.timestamp_us as u64,
    ));
    println!("pts: {}", frame.timestamp_us);
    let mut map = buffer_ref
        .map_writable()
        .expect("Failed to map buffer writable");
    map.as_mut_slice().copy_from_slice(&raw_data);
}
// Push the buffer only after the writable map has been dropped
if let Err(e) = appsrc.push_buffer(gst_buffer) {
    println!("Error {:?}", e);
}
}
});
thread::spawn(move || {
let bus = pipeline.bus().unwrap();
for msg in bus.iter_timed(gstreamer::ClockTime::NONE) {
match msg.view() {
MessageView::Eos(_) => {
println!("End of stream");
}
MessageView::Error(e) => {
println!("stream error {}", e);
}
_ => (),
}
}
});
}
};
It opens the video sink, shows a single frame, and nothing else happens after that, even though the "Got frame" log keeps printing continuously. The "Needs data" log only gets printed a couple of times before it stops. I'm not getting an EOS message or anything else on the bus. Running with GST_DEBUG=3 gives me only a single error:
0:00:03.410209333 93409 0x600002c14030 ERROR glcaopengllayer gstglcaopengllayer.m:161:-[GstGLCAOpenGLLayer copyCGLContextForPixelFormat:]: failed to retrieve GStreamer GL context in CAOpenGLLayer
There’s obviously something I’m doing wrong. Could you please point it out?
Does it work if you replace the input side of your pipeline with a videotestsrc? Does it work if you use do-timestamp=true on the appsrc instead of setting your own timestamps (which are probably being set incorrectly here, depending on how timestamps work in libwebrtc)?
Apart from that, you might want to a) look at the examples for some useful patterns (e.g. you can use gst::Buffer::from_mut_slice(your_vec) to create the buffer without copying twice, or maybe even build it directly from the input data), and b) consider that GStreamer also has WebRTC support, so you might want to use that instead of working with libwebrtc, depending on your requirements.
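For illustration, a minimal sketch of that buffer suggestion, reusing the raw_data Vec and appsrc from the code above (just the relevant fragment, untested, not a drop-in replacement):

// Let the appsrc timestamp buffers on arrival instead of hand-crafting PTS values.
appsrc.set_property("do-timestamp", true);
// Hand the existing Vec to GStreamer instead of allocating a second buffer and copying into it.
let gst_buffer = gstreamer::Buffer::from_mut_slice(raw_data);
if let Err(e) = appsrc.push_buffer(gst_buffer) {
    println!("Error {:?}", e);
}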
I’m only trying to receive video, so I can’t test with videotestsrc. But you are right! The way I’m handling timestamps is incorrect. With do-timestamp=true, I’m seeing some progress…
I’m getting live data (the frames are sliding), but it comes out looking like this, and the need-data signal keeps getting triggered. Does that mean I’m pushing data too slowly?
let on_track = |track_event: libwebrtc::peer_connection::TrackEvent| {
let mut pipeline_str = String::from(
"appsrc name=src is-live=true format=time do-timestamp=true ! queue !",
);
pipeline_str.push_str(
"rawvideoparse name=rawvideoparse format=i420 framerate=30/1 ! ",
);
pipeline_str.push_str("videoconvert ! autovideosink");
println!("The pipeline is:\n{}", pipeline_str);
let pipeline = gstreamer::parse::launch(&pipeline_str).unwrap();
let pipeline = pipeline.dynamic_cast::<gstreamer::Pipeline>().unwrap();
let pipeline_clone = pipeline.clone();
let appsrc = pipeline.by_name("src").unwrap();
let appsrc = appsrc.downcast::<gstreamer_app::AppSrc>().unwrap();
appsrc.connect("need-data", false, |_args| {
println!("Needs data");
None
});
if let libwebrtc::prelude::MediaStreamTrack::Video(track) =
track_event.track
{
println!("Its a video track");
thread::spawn(move || {
let mut stream = NativeVideoStream::new(track);
println!("Came here");
let mut done = false;
while let Some(frame) = block_on(stream.next()) {
let i420_buffer = frame.buffer.to_i420();
let width = i420_buffer.width() as u32;
let height = i420_buffer.height() as u32;
// Configure the parser and start the pipeline once the frame size is known
if !done {
    let rawvideoparse =
        pipeline.by_name("rawvideoparse").unwrap();
    rawvideoparse.set_property("width", width as i32);
    rawvideoparse.set_property("height", height as i32);
    pipeline.set_state(gstreamer::State::Playing).unwrap();
    println!("Set to playing");
}
done = true;
let data_yuv = i420_buffer.data();
let strides_yuv = i420_buffer.strides();
let chroma_width = i420_buffer.chroma_width();
let chroma_height = i420_buffer.chroma_height();
let mut raw_data =
    Vec::with_capacity((width * height * 3 / 2) as usize);
// Copy Y plane
for row in 0..height {
    let start = (row * strides_yuv.0) as usize;
    let end = start + width as usize;
    raw_data.extend_from_slice(&data_yuv.0[start..end]);
}
// Copy U plane
for row in 0..chroma_height {
    let start = (row * strides_yuv.1) as usize;
    let end = start + chroma_width as usize;
    raw_data.extend_from_slice(&data_yuv.1[start..end]);
}
// Copy V plane
for row in 0..chroma_height {
    let start = (row * strides_yuv.2) as usize;
    let end = start + chroma_width as usize;
    raw_data.extend_from_slice(&data_yuv.2[start..end]);
}
let gst_buffer = gstreamer::Buffer::from_mut_slice(raw_data);
if let Err(e) = appsrc.push_buffer(gst_buffer) {
println!("Error {:?}", e);
}
println!("Pushed");
}
});
thread::spawn(move || {
let bus = pipeline_clone.bus().unwrap();
for msg in bus.iter_timed(gstreamer::ClockTime::NONE) {
match msg.view() {
MessageView::Eos(_) => {
println!("End of stream");
}
MessageView::Error(e) => {
println!("stream error {}", e);
}
_ => (),
}
}
});
}
};
I followed your advice and used gst::Buffer::from_mut_slice(). I’m not sure whether it’s possible to pass the input data in directly, as I’m not really knowledgeable about this.
let on_track = |track_event: libwebrtc::peer_connection::TrackEvent| {
let mut pipeline_str = String::from(
"appsrc name=src is-live=true format=time do-timestamp=true ! queue !",
);
pipeline_str.push_str(
"rawvideoparse name=rawvideoparse format=argb framerate=60/1 ! ",
);
pipeline_str.push_str("videoconvert ! autovideosink");
println!("The pipeline is:\n{}", pipeline_str);
let pipeline = gstreamer::parse::launch(&pipeline_str).unwrap();
let pipeline = pipeline.dynamic_cast::<gstreamer::Pipeline>().unwrap();
let pipeline_clone = pipeline.clone();
let appsrc = pipeline.by_name("src").unwrap();
let appsrc = appsrc.downcast::<gstreamer_app::AppSrc>().unwrap();
if let libwebrtc::prelude::MediaStreamTrack::Video(track) =
track_event.track
{
println!("Its a video track");
thread::spawn(move || {
let mut stream = NativeVideoStream::new(track);
let mut done = false;
while let Some(frame) = block_on(stream.next()) {
let i420_buffer = frame.buffer.to_i420();
let width = i420_buffer.width() as u32;
let height = i420_buffer.height() as u32;
let stride = width * 4;
let mut raw_data = vec![0u8; (stride * height) as usize];
frame.buffer.to_argb(
    VideoFormatType::BGRA,
    raw_data.as_mut_slice(),
    stride,
    width as i32,
    height as i32,
);
if !done {
let rawvideoparse =
pipeline.by_name("rawvideoparse").unwrap();
rawvideoparse.set_property("width", width as i32);
rawvideoparse.set_property("height", height as i32);
pipeline.set_state(gstreamer::State::Playing).unwrap();
println!("Set to playing");
}
done = true;
let gst_buffer = gstreamer::Buffer::from_mut_slice(raw_data);
if let Err(e) = appsrc.push_buffer(gst_buffer) {
println!("Error {:?}", e);
}
}
});
thread::spawn(move || {
let bus = pipeline_clone.bus().unwrap();
for msg in bus.iter_timed(gstreamer::ClockTime::NONE) {
match msg.view() {
MessageView::Eos(_) => {
println!("End of stream");
}
MessageView::Error(e) => {
println!("stream error {}", e);
}
_ => (),
}
}
});
}
};
I don’t know why the previous pipeline with I420 was failing, so I used the helper function to convert the frames to BGRA instead. The other thing I’m curious about is why I have to specify the GStreamer format as argb while the conversion helper needs VideoFormatType::BGRA.
That looks like you’re getting the raw video frame configuration wrong (stride, plane offsets, etc).
Related to that, you shouldn’t need rawvideoparse if you receive one video frame per buffer. All you need to do is to provide the correct caps from your appsrc, and to make sure strides/etc are correct (which is easy to ensure because you copy anyway).
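A rough sketch of what that could look like (untested; it assumes the gstreamer_video crate is available, reuses the width/height taken from the first decoded frame as in the code above, and simply assumes 30 fps). With the caps set this way, the pipeline could be just appsrc ! videoconvert ! autovideosink:

// Describe the raw I420 frames that will be pushed; width/height come from
// the first frame, the 30/1 framerate is an assumption.
let video_info = gstreamer_video::VideoInfo::builder(
    gstreamer_video::VideoFormat::I420,
    width,
    height,
)
.fps(gstreamer::Fraction::new(30, 1))
.build()
.unwrap();
appsrc.set_caps(Some(&video_info.to_caps().unwrap()));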
Please create issues in GitLab about that. For the second one, include a way to reproduce the problem.
There are two different ways of naming used by different software for RGBA component order: GStreamer (and e.g. ffmpeg) uses memory order, i.e. you’ll have an array of [r, g, b, a, r, g, b, a, ...] bytes for RGBA. Other software (apparently libwebrtc, but also Qt or GTK for example) uses MSB-to-LSB order in integers, i.e. you’ll have a 32-bit integer of 0xrrggbbaa for an RGBA pixel, which on little-endian systems maps to a byte array in the reverse order [a, b, g, r, ...].
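A tiny plain-Rust illustration of the two naming schemes (the byte values are made up, and that libwebrtc's BGRA follows the MSB-to-LSB convention is taken from the explanation above):

// MSB-to-LSB naming: "BGRA" means one pixel packed as 0xBBGGRRAA.
// Here B = 0x11, G = 0x22, R = 0x33, A = 0x44.
let pixel: u32 = 0x11223344;
// On a little-endian machine the bytes land in memory in reverse order:
assert_eq!(pixel.to_le_bytes(), [0x44, 0x33, 0x22, 0x11]); // A, R, G, B
// ...which is the layout that GStreamer (naming by memory order) calls ARGB,
// hence format=argb on the GStreamer side and VideoFormatType::BGRA on the libwebrtc side.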
Thanks for the tip! I’m now configuring the caps on the appsrc directly. Unfortunately, I haven’t been able to figure out the proper configuration for I420 yet. I’ll read up more about it and try again sometime later.