Buffer reidentification between elements

Hi all!

I’m trying to develop an application (on the NVIDIA Jetson platform) consisting of a server that embeds a GStreamer + DeepStream pipeline, and a client. The server receives video (protocols and codecs may vary: souphttpsrc/filesrc/rtspsrc, MJPEG/H.264/H.265), does image processing with DeepStream plugins (several nvinfer’s, tracking and so on) and sends the image processing results for each frame via some protocol (I use JSON over a TCP socket).

The client receives data like
{"infer_results": [...some inference stuff here...], "id": 30000}
and sometimes, based on the image processing results, it may want to query the exact image those results were computed on, using the id above. This can happen some time after the results were received, e.g. 10–20 seconds later.

The problem is to match, on the server side (more precisely, inside the GStreamer pipeline), the inference results with the image they were produced from.

I implemented the server as follows:

  • pipeline
filesrc location=./video.mp4 ! qtdemux ! h264parse ! nvv4l2decoder 
    ! tee name=t 
    t.src_0 ! queue ! mux.sink_0 nvstreammux name=mux batch-size=1 width=1280 height=720 ! nvvideoconvert 
    ! nvinfer name=pgie config-file-path=./config.txt 
    ! fakesink 
    t.src_1 ! queue ! nvvideoconvert ! video/x-raw,width=1920,height=1080,format=I420 ! nvjpegenc name=jpg ! fakesink
  • callbacks (attached as in the sketch after this list)
    • a buffer probe on the nvinfer (pgie) src pad gets the inference data and sends it out
    • a buffer probe on the nvjpegenc (jpg) src pad gets the JPEG data and keeps it in some buffer in case the client requests it
  • an HTTP server that responds with the JPEG data
  • a TCP socket that continuously sends out the nvinfer results
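
The wiring looks roughly like the following (a simplified sketch: the attach_src_probe helper and the probe function names are just illustrative, error handling and the GMainLoop are omitted, and the pipeline string and the element names pgie/jpg are the ones from the list above).

#include <gst/gst.h>

// Forward declarations of the two probe callbacks described above
// (the nvjpegenc one is shown further below).
GstPadProbeReturn nvinfer_src_buffer_probe(GstPad* pad, GstPadProbeInfo* info, gpointer u_data);
GstPadProbeReturn nvjpegenc_src_buffer_probe(GstPad* pad, GstPadProbeInfo* info, gpointer u_data);

// Attach a buffer probe to the src pad of a named element in the pipeline.
static void attach_src_probe(GstElement* pipeline, const gchar* name, GstPadProbeCallback cb)
{
    GstElement* elem = gst_bin_get_by_name(GST_BIN(pipeline), name);
    GstPad* src = gst_element_get_static_pad(elem, "src");
    gst_pad_add_probe(src, GST_PAD_PROBE_TYPE_BUFFER, cb, NULL, NULL);
    gst_object_unref(src);
    gst_object_unref(elem);
}

int main(int argc, char* argv[])
{
    gst_init(&argc, &argv);

    GError* error = NULL;
    GstElement* pipeline = gst_parse_launch(
        "filesrc location=./video.mp4 ! qtdemux ! h264parse ! nvv4l2decoder "
        "! tee name=t "
        "t.src_0 ! queue ! mux.sink_0 nvstreammux name=mux batch-size=1 width=1280 height=720 "
        "! nvvideoconvert ! nvinfer name=pgie config-file-path=./config.txt ! fakesink "
        "t.src_1 ! queue ! nvvideoconvert ! video/x-raw,width=1920,height=1080,format=I420 "
        "! nvjpegenc name=jpg ! fakesink",
        &error);

    attach_src_probe(pipeline, "pgie", nvinfer_src_buffer_probe);
    attach_src_probe(pipeline, "jpg",  nvjpegenc_src_buffer_probe);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    // ... run a GMainLoop here, plus the HTTP server and the TCP socket ...
    return 0;
}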

I’m using the buffer PTS as the ID:

#include <gst/gst.h>
#include <map>
#include <vector>

std::map<guint64, std::vector<guint8>> jpeg_datas;

GstPadProbeReturn nvjpegenc_src_buffer_probe(GstPad* pad, GstPadProbeInfo* info, gpointer u_data)
{
    GstBuffer* buf = GST_PAD_PROBE_INFO_BUFFER(info);
    const auto id = GST_BUFFER_PTS(buf);
    ...
    // map the buffer, copy the raw JPEG data, store it under the id
    jpeg_datas[id] = ...here comes the data from the buffer...
    return GST_PAD_PROBE_OK;
}

but I understand that the PTS is not an ID, i.e. the values never match between the two callbacks above (maybe because nvinfer runs later than nvjpegenc, or because of the nvvideoconvert in front of nvjpegenc).

I tried to attach metadata to the buffer before it enters the tee element, but the metadata gets lost when the buffer is completely replaced (i.e. after nvvideoconvert). As I understand it, that’s “by metadata design”, because the metadata is bound to a particular buffer.

There are some extra difficulties

  • I’m totally new to GStreamer/DeepStream and probably doing something weird
  • The GStreamer version is 1.14.5 and there is no simple way to upgrade it on the Jetson Nano without losing DeepStream compatibility

I would guess that I’m doing something wrong with the metadata (or nvvideoconvert erases it completely?) and that there is a straightforward way to make it persistent between different elements.

Any help will be appreciated!

I have found out that I was using GstMeta incorrectly. I had to register a meta type with a “copy constructor” (i.e. specify a GstMetaTransformFunction).
This code sample helped me understand the correct usage. Now I can see the meta flowing through the whole pipeline (at least in my use cases), even through tee, decoders, encoders and so on.
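
For anyone hitting the same problem, here is a minimal sketch of the pattern. The FrameIdMeta name, the frame counter and the probe name are just illustrative; the registration uses the gst_meta_api_type_register / gst_meta_register API, and the transform function is what lets the meta survive when elements like nvvideoconvert or the encoder create new buffers.

#include <gst/gst.h>

// Hypothetical custom meta that carries a per-frame ID through the pipeline.
typedef struct {
    GstMeta meta;
    guint64 frame_id;
} FrameIdMeta;

static GType frame_id_meta_api = 0;
static const GstMetaInfo* frame_id_meta_info = NULL;

static gboolean frame_id_meta_init(GstMeta* meta, gpointer params, GstBuffer* buffer)
{
    ((FrameIdMeta*)meta)->frame_id = 0;
    return TRUE;
}

// The "copy constructor" (GstMetaTransformFunction): whenever an element copies
// or transforms the buffer (nvvideoconvert, encoders, ...), this re-attaches the
// meta to the new buffer, so the ID survives.
static gboolean frame_id_meta_transform(GstBuffer* dest, GstMeta* meta,
                                        GstBuffer* buffer, GQuark type, gpointer data)
{
    FrameIdMeta* dst = (FrameIdMeta*)gst_buffer_add_meta(dest, frame_id_meta_info, NULL);
    if (dst == NULL)
        return FALSE;
    dst->frame_id = ((FrameIdMeta*)meta)->frame_id;
    return TRUE;
}

// Call once at startup (after gst_init), before the pipeline starts.
static void frame_id_meta_register(void)
{
    static const gchar* tags[] = { NULL };
    frame_id_meta_api = gst_meta_api_type_register("FrameIdMetaAPI", tags);
    frame_id_meta_info = gst_meta_register(frame_id_meta_api, "FrameIdMeta",
                                           sizeof(FrameIdMeta),
                                           frame_id_meta_init,
                                           NULL, /* nothing to free */
                                           frame_id_meta_transform);
}

// Probe on the decoder src pad (before the tee): tag every frame with an ID.
static guint64 next_frame_id = 0;

GstPadProbeReturn decoder_src_buffer_probe(GstPad* pad, GstPadProbeInfo* info, gpointer u_data)
{
    // The buffer has to be writable before metadata can be added to it.
    GstBuffer* buf = gst_buffer_make_writable(GST_PAD_PROBE_INFO_BUFFER(info));
    GST_PAD_PROBE_INFO_DATA(info) = buf;

    FrameIdMeta* m = (FrameIdMeta*)gst_buffer_add_meta(buf, frame_id_meta_info, NULL);
    m->frame_id = next_frame_id++;
    return GST_PAD_PROBE_OK;
}

// In the nvinfer and nvjpegenc probes the ID can then be read back and used as
// the key instead of the PTS:
//   FrameIdMeta* m = (FrameIdMeta*)gst_buffer_get_meta(buf, frame_id_meta_api);
//   if (m != NULL) { /* use m->frame_id */ }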