Unable to Convert H264 Encoded Buffer to MP4 File

Dear Team,

I have encountered an issue while attempting to convert an H264 encoded buffer to an MP4 file using the appsrc element. Unfortunately, I am unable to successfully perform this conversion, and I have attached the relevant pipeline and code snippet for your reference:


gst-launch-1.0 -e v4l2src device="/dev/video0" ! videoconvert ! queue ! x264enc tune=zerolatency ! identity ! fakesink

I’m utilizing the identity element to capture the buffers and store them in a buffer list. Subsequently, I am using the following code snippet to convert the H264 encoded buffer to an MP4 file:

Code Snippet:

void convertToMP4(GstBufferList *buflist)
{
    // Create pipeline for MP4 conversion.
    GstElement *pipeline, *appsrc, *h264parse, *muxer, *file_sink;
    GstBus *bus;

    pipeline = gst_pipeline_new("MP4-pipeline");
    appsrc = gst_element_factory_make("appsrc", "source");
    h264parse = gst_element_factory_make("h264parse", "app-h264parse");
    muxer = gst_element_factory_make("mp4mux", "mux");
    file_sink = gst_element_factory_make("filesink", "filesink");

    g_object_set(G_OBJECT(appsrc),
                 "stream-type", 0,
                 "format", GST_FORMAT_TIME, NULL);

    g_object_set(muxer, "fragment-duration", 2000, NULL);
    // g_object_set(h264parse, "config-interval", -1, NULL);
    // g_object_set(x264enc, "tune", 4, NULL);

    // Set the resolution and framerate caps
    GstCaps *caps = gst_caps_new_simple("video/x-h264",
                                        "stream-format", G_TYPE_STRING, "byte-stream",
                                        "width", G_TYPE_INT, 1280,
                                        "height", G_TYPE_INT, 720,
                                        "framerate", GST_TYPE_FRACTION, 10, 1,
                                        NULL);
    gst_app_src_set_caps(GST_APP_SRC(appsrc), caps);
    gst_caps_unref(caps);

    g_object_set(file_sink, "location", "DIA_VIDEO_AUDIO.mp4", NULL);

    gst_bin_add_many(GST_BIN(pipeline), appsrc, h264parse, muxer, file_sink, NULL);
    if (gst_element_link_many(appsrc, h264parse, muxer, file_sink, NULL) != TRUE)
        g_printerr("Elements could not be linked in the pipeline.\n");

    copy_buflist = gst_buffer_list_copy_deep(buflist);
    g_print("isbuffered is filled and Buffer size is %u\n", gst_buffer_list_length(copy_buflist));
    // Set the pipeline to the PLAYING state.
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Push video and audio buffers into appsrc.
    GstFlowReturn retval = gst_app_src_push_buffer_list(GST_APP_SRC(appsrc), copy_buflist);
    if (retval != GST_FLOW_OK) {
        g_printerr("Error pushing buffers to appsrc: %s\n", gst_flow_get_name(retval));
        // Handle error and cleanup if needed.
    }

    bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
    g_signal_connect(bus, "message::error", G_CALLBACK(cb_message_error), NULL);
    g_signal_connect(bus, "message::eos", G_CALLBACK(cb_message_eos), NULL);
}


However, I am encountering the following error:


isbuffered is filled and Buffer size is 70
MP4 converted
0:00:08.391637077 18811 0x7fcd6c0040f0 FIXME default gstutils.c:3981:gst_pad_create_stream_id_internal:<source:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id

I kindly request your assistance in resolving this issue promptly. Your guidance in rectifying this problem will be greatly appreciated.

Why are you using a buffer list here? I wouldn’t do that, it complicates things and might mess up certain semantics like timestamps.

Have you considered using an appsink instead of a fakesink? You could push the samples or buffers into the appsrc directly that way (you might want to adjust timestamps, depending on what you need).

Don’t make up caps, grab the caps from fakesink’s sink pad or from the GstSample you get from appsink.

If you use appsink (which I would recommend), you may want to set caps=video/x-h264,alignment=au,stream-format=avc on the appsink, because that’s the format mp4mux will want; that way you avoid extra conversions in h264parse.

You’ll also want to call gst_app_src_end_of_stream when you want to finalise the mp4 file.

I don’t see the actual error. Please get it from the error message or from the debug log.

This is harmless.

One more: you’ll probably want to set appsrc to format=time (via property and/or GstBaseSrc API)

My use case requires capturing video buffers and storing them in a circular buffer (FIFO) covering the last 1 minute. When a record command is issued, I prepend that 1-minute pre-event backlog to the actual recording buffer to create an MP4 file.

To achieve this, I have been collecting video buffers and maintaining a buffer list. I periodically remove older buffers from the list based on buffer size constraints, effectively implementing a circular buffer. This circular buffer is then used to create an MP4 file when recording begins.

I have a H.264 backlog recording example here in case you find it useful, fwiw.

Hi tpm,

I appreciate your prompt response. However, it’s worth noting that my specific requirements differ from the code provided above.

  1. If the camera devices disconnect, I’ll insert dummy error messages into the buffer list. This is essential to account for any interruptions in the camera feed.

  2. My primary goal is to convert the H.264 encoded video buffers into a single MP4 file. This file will contain both video and audio data.

  3. To achieve this, I’ll work with two appsrc elements. One will handle the video buffers, and the other will manage the audio buffers. These elements ensure the proper handling and synchronization of audio and video data.

  4. In the event of any errors occurring during the recording process or prior to the recording, I will include error buffers. For instance, if the intended recording duration is 5 minutes and a camera disconnects after 3 minutes, the remaining two minutes will contain error buffers in the buffer list to account for this interruption.

  5. The final step involves taking the complete buffer list, which includes video, audio, and error-message buffers, and converting it into a single, coherent fragmented MP4 file. This consolidated file ensures that all data, including any error messages, is encapsulated in one MP4 for further analysis or playback.

Flow Diagram: (attached image)

Is it possible to buffer the data to external storage (i.e. non-RAM, such as disk) while continuously recording? Memory usage keeps increasing and can eventually slow things down, so I’d like to keep the backlog in external storage to address this.


Sure, there’s no problem with that in principle, you probably just have to set a pad offset before you push buffers into the appsrc to offset the timestamps, or offset them manually.

Hard to know why things don’t work for you as expected without looking at code.

Hello Tim,

I have attached the code files for your reference. Could you please take a look and help me resolve the issue in my code?

#include <gst/gst.h>
#include <stdio.h>
#include <gst/app/gstappsink.h>
#include <gst/app/gstappsrc.h>
#include <gst/gstbufferlist.h>

static GstBufferList *buflist, *copy_buflist;
static GstPad *identity_src_pad;
GstFlowReturn retval;

    GstElement *pipeline, *v4l2src, *identity,*videoconvert, *queue, *valve, *x264enc, *mp4mux, *filesink;
    GstBus *bus;
    GstMessage *msg;
    gboolean terminate = FALSE;

void convertToMP4(GstBufferList *buflist)
{
    // Create pipeline for MP4 conversion.
    GstElement *pipeline, *appsrc, *muxer, *file_sink;
    pipeline = gst_pipeline_new("MP4-pipeline");
    appsrc = gst_element_factory_make("appsrc", "source");
    muxer = gst_element_factory_make("mp4mux", "mp4-muxer");
    file_sink = gst_element_factory_make("filesink", "filesink");

    g_object_set(G_OBJECT(appsrc),
                 "stream-type", 0,
                 "format", GST_FORMAT_TIME, NULL);

    g_object_set(G_OBJECT(muxer), "fragment-duration", 2000, NULL);

    // Set the resolution and framerate caps
    // GstCaps *caps = gst_caps_new_simple("video/x-raw",
    //     "width", G_TYPE_INT, 1280,
    //     "height", G_TYPE_INT, 720,
    //     "framerate", GST_TYPE_FRACTION, 10, 1,
    //     NULL);
    // gst_app_src_set_caps(GST_APP_SRC(appsrc), caps);
    // gst_caps_unref(caps);

    g_object_set(file_sink, "location", "NEW_VIDEO.mp4", NULL);

    gst_bin_add_many(GST_BIN(pipeline), appsrc, muxer, file_sink, NULL);
    if (gst_element_link_many(appsrc, muxer, file_sink, NULL) != TRUE)
        g_printerr("Elements could not be linked in the pipeline.\n");

    copy_buflist = gst_buffer_list_copy_deep(buflist);
    g_print("isbuffered is filled and Buffer size is %u\n", gst_buffer_list_length(copy_buflist));
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    retval = gst_app_src_push_buffer_list(GST_APP_SRC(appsrc), copy_buflist);
    g_print("RETVAL %d\n", retval);
    g_print("Sending EOS!!!!!!!\n");
    g_signal_emit_by_name(appsrc, "end-of-stream", &retval);
}

static GstPadProbeReturn pad_probe_cb(GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
    GstBuffer *buff;
    GstBuffer *new_buffer;

    buff = gst_pad_probe_info_get_buffer(info);

    gsize buffer_size = gst_buffer_get_size(buff);
    double bufferSizeMB = static_cast<double>(buffer_size) / (1024 * 1024);

    new_buffer = gst_buffer_copy_deep(buff);
    // g_print(" timestamp : %ld\n", GST_BUFFER_PTS(new_buffer) / 1000000000);

    if (bufferSizeMB > 0)
        gst_buffer_list_add(buflist, new_buffer);

    g_print("Buffer length is %u\n", gst_buffer_list_length(buflist));

    if (gst_buffer_list_length(buflist) == 100) {
        gst_buffer_list_unref(buflist);  // drop old list to avoid leaking it
        buflist = gst_buffer_list_new();
    }

    return GST_PAD_PROBE_OK;
}

void setValvevalue()
{
    g_object_set(G_OBJECT(valve), "drop", false, NULL);
    // "max-size-time" is a guint64 property, so the value must be 64-bit
    g_object_set(G_OBJECT(queue), "max-size-time", (guint64) 0, "leaky", 0, NULL);
    printf("drop value set to false\n");
}

int stopStream()
{
    gst_element_set_state(pipeline, GST_STATE_NULL);
    return 0;
}

int start_recording_camera()
{
    // Create GStreamer elements
    pipeline = gst_pipeline_new("pipeline");
    v4l2src = gst_element_factory_make("v4l2src", "v4l2-source");
    videoconvert = gst_element_factory_make("videoconvert", "video-convert");
    queue = gst_element_factory_make("queue", "queue");
    valve = gst_element_factory_make("valve", "valve");
    x264enc = gst_element_factory_make("x264enc", "x264-encoder");
    identity = gst_element_factory_make("identity", "identity");
    mp4mux = gst_element_factory_make("mp4mux", "mp4-muxer");
    filesink = gst_element_factory_make("fakesink", "fake-sink");

    if (!pipeline || !v4l2src || !videoconvert || !queue || !valve || !x264enc || !mp4mux || !filesink) {
        g_printerr("One or more elements could not be created. Exiting.\n");
        return -1;
    }

    // Set element properties
    g_object_set(G_OBJECT(v4l2src), "device", "/dev/video0", NULL);
    g_object_set(G_OBJECT(mp4mux), "fragment-duration", 2000, NULL);
    g_object_set(G_OBJECT(valve), "drop", false, NULL);
    g_object_set(G_OBJECT(x264enc), "tune", 4, NULL);
    // g_object_set(G_OBJECT(filesink), "location", "NF_video.mp4", NULL);

    // Build the pipeline
    gst_bin_add_many(GST_BIN(pipeline), v4l2src, videoconvert, queue, valve, identity, x264enc, mp4mux, filesink, NULL);
    gst_element_link_many(v4l2src, videoconvert, queue, valve, x264enc, identity, mp4mux, filesink, NULL);

    identity_src_pad = gst_element_get_static_pad(identity, "src");

    // Create a buffer list to collect encoded buffers
    buflist = gst_buffer_list_new();

    // Add probe to the identity source pad
    gst_pad_add_probe(identity_src_pad, GST_PAD_PROBE_TYPE_BUFFER, pad_probe_cb, pipeline, NULL);

    // Set the pipeline to the "playing" state
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    return 0;
}

void testAPI(int apiNumber) {
     // std::thread startStreamThread;
     switch (apiNumber) {
        case 1:
             printf("Testing startStream API\n");
             start_recording_camera();
             // startStreamThread = std::thread(startStream);
             // startStreamThread.join(); // Wait for the thread to finish
             break;
        case 2:
             printf("Testing setValvevalue API\n");
             setValvevalue();
             break;
        case 3:
             printf("Testing Stopstream API\n");
             stopStream();
             break;
        default:
             printf("Invalid API number\n");
             break;
     }
}

int main(int argc, char *argv[]) {

    gst_init(&argc, &argv);

    int apiNumber;
    while (1) {
        // Display options to the user
        printf("Select an API to test:\n");
        printf("1. startStream\n");
        printf("2. SetValvevalue\n");
        printf("3. StopStream\n");
        // printf("3. ERROR  Buffer added\n");
        printf("0. Exit\n");
        printf("Enter API number: ");

        // Read user input
        scanf("%d", &apiNumber);

        // Exit the loop if the user chooses 0
        if (apiNumber == 0)
            break;

        // Call the selected API dynamically
        testAPI(apiNumber);

        // Clear the input buffer
        while (getchar() != '\n')
            ;
    }

    return 0;
}

FIXME default gstutils.c:3981:gst_pad_create_stream_id_internal:<source:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
0:00:11.341708588 15745 0x7fbe04004f20 FIXME               basesink gstbasesink.c:3145:gst_base_sink_default_event:<filesink> stream-start event without group-id. Consider implementing group-id handling in the upstream elements
0:00:11.342067623 15745 0x7fbe04004f20 WARN                   qtmux gstqtmux.c:4568:gst_qt_mux_add_buffer:<mp4-muxer> error: format wasn't negotiated before buffer flow on pad audio_0
0:00:11.342204438 15745 0x7fbe04004f20 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<source> error: Internal data stream error.
0:00:11.342216224 15745 0x7fbe04004f20 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<source> error: streaming stopped, reason not-negotiated (-4)

Compilation line

g++ Pre-event.cpp -o Pre-event $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-app-1.0)

Best regards,

Multiple issues:

  1. The log tells you what is wrong: you are not setting any caps on appsrc, so the muxer doesn’t know how the data is meant to be interpreted. Before you push any data into the appsrc, you must set the caps property to the relevant format, which is some variant of video/x-h264. You should get this from the original pipeline in some way.
  2. You are calling convertToMP4() from a streaming thread. I would not recommend doing that.
  3. In pad_probe_cb you add buffers to the buffer list conditionally, which may break the recorded stream. I don’t think this is a problem in this specific case, but it doesn’t make sense to do this at all without thinking about GOP sizes.