One of our custom plugins (a video filter) in our pipeline takes a video frame and produces additional data. Some of it is metadata that is used further down the pipeline and can easily be attached to the buffer as such; some of it is larger data (n×128 arrays) used only by the running application. The latter does not need to be passed on within the pipeline, but could be handed to an appsink element.
How would one establish the connection between the plugin and the appsink for this?
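For context, the consuming side would be a plain appsink with a new-sample callback, roughly like the sketch below (the function name is a placeholder; how the per-frame data actually gets there is exactly my question):

use gstreamer as gst;
use gstreamer_app as gst_app;

// Sketch of the consuming side: a plain appsink pulling each sample in a
// new-sample callback. How the per-frame data gets here is the open question.
fn hook_up_appsink(appsink: &gst_app::AppSink) {
    appsink.set_callbacks(
        gst_app::AppSinkCallbacks::builder()
            .new_sample(|appsink| {
                let sample = appsink.pull_sample().map_err(|_| gst::FlowError::Eos)?;
                let buffer = sample.buffer().ok_or(gst::FlowError::Error)?;
                // The n x 128 arrays (or a handle to them) would be consumed here.
                let _ = buffer;
                Ok(gst::FlowSuccess::Ok)
            })
            .build(),
    );
}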
Thanks for reacting to my post. I had actually started with suggestion 1, but I don’t see how to actually forward the produced data to the second source pad.
The function only allows for a single outgoing video frame (buffer).
Continuing on path 1, I am trying to get the source pad of the element from within the transform_frame function, create the data, and push it to that pad.
So I added this snippet to see whether it was possible:
if let Some(mut person_meta_pad) = self.static_pad("src_person_metadata") {
    let buffer = Buffer::new(); // empty for now
    person_meta_pad.push(buffer);
} else {
    event!(Level::WARN, "src_person_metadata pad not available");
}
But the compiler wouldn’t like it:
error[E0599]: the method `static_pad` exists for reference `&AdvTrtOdMtl`, but its trait bounds were not satisfied
--> protos/advtrtplugins/src/advtrtodmtl/imp.rs:531:49
Shouldn’t it be possible to get a reference to the pad and push data to it? As this seems to be gstreamer-rs specific, maybe @slomo could point me in the right direction?
Thanks for the hint. I did not know about the obj() function (a pointer to the docs would be highly appreciated). This compiles and runs now, revealing that objects extending BaseTransform only support a single src and a single sink pad; is that correct? What element would you recommend as a starting point for implementing a plugin that processes video frames AND produces additional data to be passed on to an appsink?
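For reference, the version that now compiles goes through the wrapper object returned by obj(); a sketch only, with the pad name and logging from my snippet above (at runtime it always ends up in the warning branch, which is how I noticed the second pad never exists):

use gst::prelude::*; // brings the element/pad extension traits into scope

// Inside the subclass impl, `self` is the private struct, so the pad lookup
// has to go through the wrapper element returned by self.obj().
if let Some(person_meta_pad) = self.obj().static_pad("src_person_metadata") {
    let buffer = gst::Buffer::new(); // still empty, just testing the plumbing
    let _ = person_meta_pad.push(buffer);
} else {
    event!(Level::WARN, "src_person_metadata pad not available");
}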
Should it really be a separate stream or could it just be a GstMeta on the video buffers?
The data resides in GPU memory. Would it be possible to wrap it in a buffer and then pass a pointer through a GstMeta to the appsink for further processing? The data is part of the stream rather than metadata, so this feels a bit hacky…?
You can put whatever you want into a GstMeta. If it’s part of the stream then it seems like that would be the conceptually correct approach. Otherwise you would have to somehow match buffers of both streams again later.
Okay, I will follow your advice and go with the GstMeta. After all, the data is tied to the frame and would have to be matched to it later anyway (as you mentioned). Now, let’s see if I can get the implementation details right…
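Here is roughly what I have in mind for attaching the data, as a sketch only: it assumes the gst::meta::CustomMeta bindings for GstCustomMeta (GStreamer 1.20+, behind the corresponding version feature in gstreamer-rs), the meta name and fields are placeholders, and the one-time registration of the meta at plugin init is not shown.

use gstreamer as gst;

// Rough sketch: attach the per-frame data to the outgoing buffer as a
// structure-backed custom meta. Assumes "person-metadata" has been registered
// once at plugin init; the field names and the u64 handle to the GPU
// allocation are placeholders.
fn attach_person_meta(buffer: &mut gst::Buffer, gpu_handle: u64, num_persons: u32) {
    let buffer = buffer.get_mut().expect("buffer must be writable");
    let mut meta = gst::meta::CustomMeta::add(buffer, "person-metadata")
        .expect("person-metadata must be registered before use");
    let s = meta.mut_structure();
    // Only a handle is stored on the buffer; the GPU allocation itself has to
    // stay alive at least as long as the buffer (e.g. via a pool or refcount).
    s.set("gpu-handle", gpu_handle);
    s.set("num-persons", num_persons);
}

On the appsink side I would then look the meta up by name on the pulled buffer and hand the handle over to the application.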