Running-time in appsrc callback

Hi there,

I would like to get the running-time in the need-data callback.

I can’t use gst_segment_to_running_time because I don’t know how to obtain the segment.

Maybe it’s better if I explain the problem I’m facing: my pipeline gets an H.264 video stream and sends it to a splitmuxsink with muxer-factory=mpegtsmux. splitmuxsink splits the file for me every X MB, with a progressive number in the filename. There is also an appsrc that pushes buffers as subtitle_0 to the muxer. Every second I have to write a timestamp and the progressive number of the video file to a file, so that in the future I can associate a timestamp with a certain video file.

So my idea is to get the running-time in the appsrc callback and compare it with the running-time in the “format-location-full” callback of splitmuxsink, because that is the place where I’m sure the current video file is being closed and the new one starts.

I know, my explanation is crap :slight_smile:

I’m not sure I really understand the problem you’re trying to solve or what you’d like to achieve - could you rephrase your explanation or explain what behaviour you get now that you don’t want?

There might be considerable buffering between the source and the output of splitmuxsink.

Yes I agree, my explanation is very confusing.

I have many questions, but it’s better to clarify just one at a time.

So I will start from the simplest (I think):

in my appsrc need-data callback I want to push buffers at 30 Hz, so I set GST_BUFFER_DURATION this way:

void needDataCallback(GstAppSrc* appsrc, guint length, gpointer user_data)
{
    GstBuffer *buffer;
    guint8 *ptr;
    gsize size;
    GstFlowReturn ret;

    ptr = <MY_DATA_BUFFER>;
    size = <MY_DATA_SIZE>;

    /* wrap the existing data without copying it */
    buffer = gst_buffer_new_wrapped_full((GstMemoryFlags)0, (gpointer)ptr, size, 0, size, NULL, NULL);

    /* one frame at 30 Hz */
    GST_BUFFER_DURATION(buffer) = gst_util_uint64_scale_int(1, GST_SECOND, 30);

    ret = gst_app_src_push_buffer(appsrc, buffer);
}

My appsrc has is-live=1, do-timestamp=0, format=3 (GST_FORMAT_TIME).

Is this correct and sufficient? Or do I have to set GST_BUFFER_PTS/DTS manually?

Just to be sure, there’s no obligation to use the “need-data” callback at all.

You can just push a buffer into appsrc whenever you have a new one and ignore the “need-data” stuff entirely, and if your input isn’t paced at 30Hz you can set up a timer I suppose. I’m not sure if GstBaseSrc/AppSrc will do the pacing for you even if you set the duration on the buffer. I think they will just emit “need-data” as soon as the buffer has been pushed and processed downstream?

Thank you Tim,

I choosed the “need-data“ stuff to get, let me say, a pull data mechanism instead of a push one, so instead of use a timer I prefer that gstreamer asks me for more data at the right time, according to te pipeline time. My appsrc generate a klv stream with a time inside (wallclock) and get muxed by the “subtitle” pad of splitmuxsink. So one of my needs is obtain the synchronization between the video e the klv.