Hi,

I'm using GStreamer to write Emgu CV UMat frames to rtspclientsink in order to create an RTSP stream with a MediaMTX server. I want to know the best way to write frames into a Gst.Buffer: I want the best performance and to avoid temporary memory copies as much as possible. Here are my results using two different ways to create the Gst.Buffer.

This is my pipeline:
var command = new StringBuilder();
command.Append("appsrc name=appSource is-live=true do-timestamp=true");
command.Append(" ! videoconvert");
command.Append(" ! qsvh264enc bitrate=10000 low-latency=true target-usage=7");
command.Append(" ! h264parse");
command.AppendFormat(" ! rtspclientsink location={0}", _streamUrl);

_pipeline = Parse.Launch(command.ToString()) as Pipeline;
_pipeline.Bus.AddWatch(OnBusMessage);

// Wrap the appsrc element in the typed AppSrc binding and configure it.
var appSrcElement = _pipeline.GetByName("appSource");
_appSrc = new AppSrc(appSrcElement.Handle);
_appSrc.StreamType = AppStreamType.Stream;
_appSrc.Block = true;
_appSrc.Format = Format.Time;

var caps = Caps.FromString("video/x-raw, format=BGR, width=640, height=480, framerate=30/1");
_appSrc.Caps = caps;
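OnBusMessage is a regular bus-watch callback; a minimal version that only logs errors and end-of-stream messages would look something like this:

private bool OnBusMessage(Bus bus, Message msg)
{
    switch (msg.Type)
    {
        case MessageType.Error:
            // Log pipeline errors together with the debug string.
            msg.ParseError(out GLib.GException err, out string debug);
            Console.WriteLine($"GStreamer error: {err.Message} ({debug})");
            break;
        case MessageType.Eos:
            Console.WriteLine("End of stream");
            break;
    }
    // Returning true keeps the watch installed.
    return true;
}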
Creating the Gst.Buffer from a byte[] seems to work well:
public FlowReturn WriteFrame(UMat img)
{
    FlowReturn res;
    CvInvoke.Resize(img, img, new System.Drawing.Size(640, 480));
    using (var mat = img.GetMat(Emgu.CV.CvEnum.AccessType.Read))
    {
        // Copy the frame into a managed array and wrap it in a Gst.Buffer.
        var length = mat.Rows * mat.Cols * mat.NumberOfChannels;
        var data = new byte[length];
        Marshal.Copy(mat.DataPointer, data, 0, length);
        var buffer = new Gst.Buffer(data);
        res = _appSrc.PushBuffer(buffer);
    }
    return res;
}
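For context, WriteFrame is called from a capture loop; a simplified sketch (the VideoCapture camera source and the _running flag are just placeholders for my real frame source, and the pipeline has already been set to Playing at this point) would be:

// Illustrative only: grab frames and push them until something fails.
using (var capture = new VideoCapture(0))
{
    while (_running)
    {
        using (Mat grabbed = capture.QueryFrame())
        {
            if (grabbed == null)
                break;
            using (UMat frame = grabbed.GetUMat(Emgu.CV.CvEnum.AccessType.ReadWrite))
            {
                if (WriteFrame(frame) != FlowReturn.Ok)
                    break;
            }
        }
    }
}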
Before that, I had tried using Map. With that approach the stream works and there is no error, but it only shows a black image:
var buffer = new Gst.Buffer(null, (ulong)length, AllocationParams.Zero);
if (buffer.Map(out var mapInfo, MapFlags.Write))
{
    try
    {
        Marshal.Copy(mat.DataPointer, mapInfo.Data, 0, length);
    }
    finally
    {
        buffer.Unmap(mapInfo);
    }
}
With regards,