Encode / decode x264 stream problem decoding

Hello,

I’m trying to do a simple pipeline: JPEG images → x264-encoded video on the server → x264 decode and display on the client (as groundwork for a future server-to-client setup), but I can’t find a way to make the decode part work.

My project is on GitHub (gstreamer_example), but I will try to be as clear as possible here.

This is my server pipeline:

    loaded_images = Tools::getAndLoadFiles("images_test/");

    mdata.pipeline = gst_pipeline_new("image-to-video-pipeline");
    mdata.appsrc = gst_element_factory_make("appsrc", "source");
    mdata.decoder = gst_element_factory_make("jpegdec", "decoder");
    mdata.converter = gst_element_factory_make("videoconvert", "converter");
    mdata.encoder = gst_element_factory_make("x264enc", "encoder");
    mdata.muxer = gst_element_factory_make("mpegtsmux", "muxer");
    mdata.appsink = gst_element_factory_make("appsink", "sink");
...
    GstCaps* caps = gst_caps_new_simple("image/jpeg",
        "width", G_TYPE_INT, WIDTH,
        "height", G_TYPE_INT, HEIGHT,
        "framerate", GST_TYPE_FRACTION, 2, 1,
        NULL);
    g_object_set(mdata.appsrc, "caps", caps, "format", GST_FORMAT_TIME, NULL);
...
        gst_bin_add_many(GST_BIN(mdata.pipeline), mdata.appsrc, mdata.decoder, mdata.converter, mdata.encoder, mdata.muxer, mdata.appsink, NULL);

And I’m connecting to the “new-sample” signal to get each encoded “frame” as a GstMapInfo, and I simulate a “send” by appending it to the client’s vector of received GstMapInfos (I use an interface struct to avoid problems with unref and unmap).
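For context, a minimal sketch of what such a server-side “new-sample” handler can look like, assuming the encoded bytes are copied into a shared vector that stands in for the network (the names `on_new_sample`, `g_sent_chunks`, and the mutex are my assumptions, not the actual project code):

```cpp
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <mutex>
#include <vector>

static std::mutex g_queue_mutex;                       // guards the shared queue
static std::vector<std::vector<guint8>> g_sent_chunks; // simulated "wire"

// Callback for the appsink "new-sample" signal (requires emit-signals=true).
static GstFlowReturn on_new_sample(GstAppSink* appsink, gpointer /*user_data*/)
{
    // Pull the encoded MPEG-TS chunk produced by the server pipeline.
    GstSample* sample = gst_app_sink_pull_sample(appsink);
    if (!sample)
        return GST_FLOW_EOS;

    GstBuffer* buffer = gst_sample_get_buffer(sample);
    GstMapInfo map;
    if (gst_buffer_map(buffer, &map, GST_MAP_READ)) {
        // Copy the bytes out so the buffer can be unmapped/unreffed right
        // away, instead of keeping a GstMapInfo alive across threads.
        std::lock_guard<std::mutex> lock(g_queue_mutex);
        g_sent_chunks.emplace_back(map.data, map.data + map.size);
        gst_buffer_unmap(buffer, &map);
    }
    gst_sample_unref(sample);
    return GST_FLOW_OK;
}
```

Copying the bytes out of the mapped buffer also sidesteps the unref/unmap lifetime issue the interface struct is working around.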

On my client side this is my pipeline:

    _data.pipeline = gst_pipeline_new("video-display-pipeline");
    _data.appsrc = gst_element_factory_make("appsrc", "video-source");
    GstElement* demux = gst_element_factory_make("tsdemux", "ts-demuxer");
    _data.decoder = gst_element_factory_make("avdec_h264", "h264-decoder");
    _data.converter = gst_element_factory_make("videoconvert", "converter");
    _data.sink = gst_element_factory_make("autovideosink", "video-output");
...
gst_bin_add_many(GST_BIN(_data.pipeline), _data.appsrc, demux, _data.decoder, _data.converter, _data.sink, NULL);
...

    gst_app_src_set_stream_type(GST_APP_SRC(_data.appsrc), GST_APP_STREAM_TYPE_STREAM);

    GstCaps* caps = gst_caps_new_simple("video/mpegts",
        "systemstream", G_TYPE_BOOLEAN, TRUE,
        "packetsize", G_TYPE_INT, 188,
        NULL);

And I use an infinite loop to check whether any GstMapInfo has been “received” (i.e. added to my vector).
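For reference, feeding one of those “received” chunks into the client’s appsrc usually looks something like this sketch (the function name and the copy-based ownership handling are my assumptions):

```cpp
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <vector>

// Push one received MPEG-TS chunk into the client's appsrc.
// `appsrc` is the element created above; `chunk` is a copy of the
// bytes the server "sent" (names are illustrative).
static bool push_chunk(GstElement* appsrc, const std::vector<guint8>& chunk)
{
    // gst_buffer_new_allocate + gst_buffer_fill copy the data into a
    // buffer the pipeline owns, so the vector can be freed afterwards.
    GstBuffer* buffer = gst_buffer_new_allocate(nullptr, chunk.size(), nullptr);
    gst_buffer_fill(buffer, 0, chunk.data(), chunk.size());

    // appsrc takes ownership of the buffer.
    GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer);
    return ret == GST_FLOW_OK;
}
```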

But I can’t find a way to make my client display the “received” stream. I think I need help checking whether I set up the decode side correctly (maybe I’m missing a tsparse?).

In my repo, my images_to_displayed_video_example already turns JPEGs into a displayed video, but it does not use an x264 encode/decode. If you want to compile it, just set the GSTREAMER_MSI_INSTALL_FOLDER environment variable to your GStreamer install folder (I’m working with the MSI installer).

I don’t want to take up too much of your time (even more so after the recent Linux “xz” backdoor), so if you need anything I’ll provide it (I would attach the debug log, but since my configuration may not be correct, I’m not sure it’s relevant right now).

Thank you for your work.

You may try adding an h264parse element between tsdemux and avdec_h264 in the receiver pipeline, for example:

gst-launch-1.0 multifilesrc location=img_%03d.jpg ! image/jpeg,width=320,height=240,framerate=30/1 ! jpegdec ! videoconvert ! x264enc key-int-max=30 insert-vui=1 tune=zerolatency ! mpegtsmux ! queue ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! autovideosink -v

I can see two potential issues:

  1. try adding an h264parse between the demuxer and the decoder.
  2. are you handling the fact that tsdemux has dynamic pads (“sometimes pads”) correctly by connecting to the pad-added signal and linking tsdemux to the video decoder in there? (gst_element_link() before the pipeline has started will fail because tsdemux won’t have any source pads yet at that point).
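Point 2 can be sketched roughly like this (element and variable names are assumed):

```cpp
#include <gst/gst.h>

// Link tsdemux to the parser only once tsdemux actually exposes its
// video source pad, i.e. from the "pad-added" callback.
static void pad_added_cb(GstElement* /*demux*/, GstPad* new_pad, gpointer user_data)
{
    GstElement* parser = GST_ELEMENT(user_data); // h264parse, passed as user data
    GstPad* sink_pad = gst_element_get_static_pad(parser, "sink");
    if (!gst_pad_is_linked(sink_pad))
        gst_pad_link(new_pad, sink_pad); // succeeds once the video pad appears
    gst_object_unref(sink_pad);
}

// Before setting the pipeline to PLAYING:
// g_signal_connect(demux, "pad-added", G_CALLBACK(pad_added_cb), parser);
```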

Thank you both for your answers. I added h264parse and tried different configurations; this is my latest attempt:

void gstreamer_client_receiver::on_pad_added(GstElement* src, GstPad* new_pad, gstreamer_client_receiver::CustomDataStruct* _data)
{
	g_print("\n on_pad_added()\n\n");
	GstCaps* new_pad_caps = gst_pad_get_current_caps(new_pad);
	GstStructure* new_pad_struct = gst_caps_get_structure(new_pad_caps, 0);
	const gchar* new_pad_type = gst_structure_get_name(new_pad_struct);

	// Check if the pad is of the type video/x-h264
	if (g_str_has_prefix(new_pad_type, "video/x-h264"))
	{
		g_print("pad has prefix x-h264\n");

		GstPad* sink_pad = gst_element_get_static_pad(_data->h264parser, "sink");

		if (!gst_pad_is_linked(sink_pad)) {
			if (GST_PAD_LINK_FAILED(gst_pad_link(new_pad, sink_pad))) {
				g_print("Failed to link demuxer pad.\n");
			}
			else {
				g_print("Demuxer pad linked.\n");

				//if (!gst_element_link(_data->demux, _data->decoder)) {
				//	g_printerr("could not link demux to decoder.\n");
				//	gst_object_unref(_data->pipeline);
				//}
			}
		}
		gst_object_unref(sink_pad);
	}
	else {
		g_print("Pad '%s' is not video/x-h264. Not linking.\n", new_pad_type);
	}

	if (new_pad_caps != NULL) {
		gst_caps_unref(new_pad_caps);
	}
}

int gstreamer_client_receiver::Init(int ac, char* av[])
{
	g_print("Client init\n");
	gst_init(&ac, &av);

	// Create the elements
	_data.pipeline = gst_pipeline_new("video-display-pipeline");
	_data.appsrc = gst_element_factory_make("appsrc", "video-source");
	_data.demux = gst_element_factory_make("tsdemux", "ts-demuxer");
	_data.h264parser = gst_element_factory_make("h264parse", "h264-parser");
	_data.decoder = gst_element_factory_make("avdec_h264", "h264-decoder");
	_data.converter = gst_element_factory_make("videoconvert", "converter");
	_data.sink = gst_element_factory_make("autovideosink", "video-output");

	if (!_data.pipeline || !_data.appsrc || !_data.demux || !_data.h264parser || !_data.decoder || !_data.converter || !_data.sink) {
		std::cerr << "Not all elements could be created." << std::endl;
		return -1;
	}

	// Build the pipeline
	gst_bin_add_many(GST_BIN(_data.pipeline), _data.appsrc, _data.demux, _data.h264parser, _data.decoder, _data.converter, _data.sink, NULL);

	if (!gst_element_link(_data.appsrc, _data.demux)) {
		g_printerr("Failed to link appsrc to demux.\n");
	}

	if (!gst_element_link(_data.h264parser, _data.decoder)) {
		g_printerr("h264parser and decoder could not be linked.\n");
		gst_object_unref(_data.pipeline);
		return -1;
	}
	
	if (!gst_element_link_many(_data.decoder, _data.converter, _data.sink, NULL)) {
	    std::cerr << "Elements could not be linked." << std::endl;
	    gst_object_unref(_data.pipeline);
	    return -1;
	}
	g_signal_connect(_data.demux, "pad-added", G_CALLBACK(gstreamer_client_receiver::on_pad_added), &_data);

	//GstCaps* caps = gst_caps_new_simple("video/x-h264",
	//	"stream-format", G_TYPE_STRING, "byte-stream",
	//	"alignment", G_TYPE_STRING, "au",
	//	NULL);
	//g_object_set(G_OBJECT(_data.appsrc), "caps", caps, "format", GST_FORMAT_TIME, NULL);
	//gst_caps_unref(caps);
	//gst_app_src_set_stream_type(GST_APP_SRC(_data.appsrc), GST_APP_STREAM_TYPE_STREAM);
	//// Set appsrc to treat the stream as live data
	//g_object_set(G_OBJECT(_data.appsrc), "is-live", TRUE, NULL);

	g_print("Client inited\n");
	return 0;
}

I tried to follow your instructions @tpm but couldn’t figure out how to make it work (the currently commented-out gst_element_link results in:

(gstreamer_example.exe:31104): GStreamer-CRITICAL **: 11:44:21.590:
Trying to dispose element video-output, but it is in PAUSED instead of the NULL state.

And other configurations had no impact, or resulted in the same logs as enabling the commented-out appsrc setup in Init():

0:00:01.199171500 19812 0000026F373ECC40 WARN                 basesrc gstbasesrc.c:3175:gst_base_src_loop:<video-source> error: Internal data stream error.
0:00:01.199824500 19812 0000026F373ECC40 WARN                 basesrc gstbasesrc.c:3175:gst_base_src_loop:<video-source> error: streaming stopped, reason not-negotiated (-4)
0:00:01.200889200 Error received from element video-source: Internal data stream error.

In its current form, the stream seems “reactive” to the data I’m pushing into it, as it outputs:

WARN                 tsdemux tsdemux.c:2776:gst_ts_demux_queue_data:<ts-demuxer> warning: CONTINUITY: Mismatch packet 14, stream 9 (pid 0x0041)
0:00:04.248542300 27684 000002296CD52490 WARN        mpegtspacketizer mpegtspacketizer.c:1019:mpegts_packetizer_push_section: PID 0x0020 section discontinuity (1 vs 0)
0:00:04.249756400 27684 000002296CD52490 WARN        mpegtspacketizer mpegtspacketizer.c:1019:mpegts_packetizer_push_section: PID 0x0000 section discontinuity (1 vs 0)
0:00:04.250420300 27684 000002296CD52490 WARN        mpegtspacketizer mpegtspacketizer.c:1899:_set_current_group: GAP detected. diff 0:00:01.500000000
...

I tried to use the gstreamer_example_dynamic_pipeline and ChatGPT as references, but I think I need to do more testing with pads, as my problem may come from my misconfiguration of them.