Hello,
I’m trying to do a simple JPEG → x264-encoded video → client-side H.264 decode and display (as groundwork for future server-to-client communication), but I can’t find a way to make the decode part work.
My project is on GitHub (gstreamer_example), but I will try to be as clear as possible here.
This is my server pipeline:
loaded_images = Tools::getAndLoadFiles("images_test/");
mdata.pipeline = gst_pipeline_new("image-to-video-pipeline");
mdata.appsrc = gst_element_factory_make("appsrc", "source");
mdata.decoder = gst_element_factory_make("jpegdec", "decoder");
mdata.converter = gst_element_factory_make("videoconvert", "converter");
mdata.encoder = gst_element_factory_make("x264enc", "encoder");
mdata.muxer = gst_element_factory_make("mpegtsmux", "muxer");
mdata.appsink = gst_element_factory_make("appsink", "sink");
...
GstCaps* caps = gst_caps_new_simple("image/jpeg",
"width", G_TYPE_INT, WIDTH,
"height", G_TYPE_INT, HEIGHT,
"framerate", GST_TYPE_FRACTION, 2, 1,
NULL);
g_object_set(mdata.appsrc, "caps", caps, "format", GST_FORMAT_TIME, NULL);
..
gst_bin_add_many(GST_BIN(mdata.pipeline), mdata.appsrc, mdata.decoder, mdata.converter, mdata.encoder, mdata.muxer, mdata.appsink, NULL);
I connect to the “new-sample” signal to get each encoded frame’s GstMapInfo and simulate a “send” by appending it to my client’s vector of received GstMapInfo (I use an interface struct to avoid problems with unref and unmap).
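For context, my “new-sample” handler looks roughly like this (simplified; send_to_client is a stand-in for my “simulated send” that appends the bytes to the client’s vector):

```c
#include <gst/gst.h>
#include <gst/app/gstappsink.h>

/* Stand-in for the "simulated send" (appends data to the client's vector). */
extern void send_to_client(const guint8* data, gsize size);

/* Simplified "new-sample" callback: pull the encoded sample from the
 * server appsink, map it read-only, and hand the bytes to the client. */
static GstFlowReturn on_new_sample(GstAppSink* sink, gpointer user_data)
{
    GstSample* sample = gst_app_sink_pull_sample(sink);
    if (!sample)
        return GST_FLOW_ERROR;

    GstBuffer* buffer = gst_sample_get_buffer(sample);
    GstMapInfo map;
    if (gst_buffer_map(buffer, &map, GST_MAP_READ)) {
        send_to_client(map.data, map.size);
        gst_buffer_unmap(buffer, &map);
    }
    gst_sample_unref(sample);
    return GST_FLOW_OK;
}
```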
On the client side, this is my pipeline:
_data.pipeline = gst_pipeline_new("video-display-pipeline");
_data.appsrc = gst_element_factory_make("appsrc", "video-source");
GstElement* demux = gst_element_factory_make("tsdemux", "ts-demuxer");
_data.decoder = gst_element_factory_make("avdec_h264", "h264-decoder");
_data.converter = gst_element_factory_make("videoconvert", "converter");
_data.sink = gst_element_factory_make("autovideosink", "video-output");
...
gst_bin_add_many(GST_BIN(_data.pipeline), _data.appsrc, demux, _data.decoder, _data.converter, _data.sink, NULL);
...
gst_app_src_set_stream_type(GST_APP_SRC(_data.appsrc), GST_APP_STREAM_TYPE_STREAM);
GstCaps* caps = gst_caps_new_simple("video/mpegts",
"systemstream", G_TYPE_BOOLEAN, TRUE,
"packetsize", G_TYPE_INT, 188,
NULL);
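For completeness, I set these caps on the client appsrc and link the static elements; since tsdemux only creates its source pads at runtime, the demuxer-to-decoder link is done in a “pad-added” handler, roughly like this (simplified sketch):

```c
#include <gst/gst.h>

/* Called whenever tsdemux creates a new elementary-stream pad:
 * link it to the H.264 decoder's sink pad if not already linked. */
static void on_demux_pad_added(GstElement* demux, GstPad* new_pad, gpointer user_data)
{
    GstElement* decoder = GST_ELEMENT(user_data);
    GstPad* sink_pad = gst_element_get_static_pad(decoder, "sink");
    if (!gst_pad_is_linked(sink_pad))
        gst_pad_link(new_pad, sink_pad);
    gst_object_unref(sink_pad);
}

/* ... after creating the elements and the caps above: */
g_object_set(_data.appsrc, "caps", caps, "format", GST_FORMAT_TIME, NULL);
gst_caps_unref(caps);

gst_element_link(_data.appsrc, demux);
gst_element_link_many(_data.decoder, _data.converter, _data.sink, NULL);

g_signal_connect(demux, "pad-added", G_CALLBACK(on_demux_pad_added), _data.decoder);
```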
I use an infinite loop to check whether any GstMapInfo has been “received” (i.e., added to my vector).
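In that loop, each “received” chunk is wrapped in a fresh GstBuffer and pushed into the client appsrc, roughly like this (a sketch; gst_app_src_push_buffer takes ownership of the buffer):

```c
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Copy one "received" chunk into a new GstBuffer and push it
 * into the client pipeline's appsrc. */
static void push_received_chunk(GstElement* appsrc, const guint8* data, gsize size)
{
    GstBuffer* buffer = gst_buffer_new_allocate(NULL, size, NULL);
    gst_buffer_fill(buffer, 0, data, size);

    GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer);
    if (ret != GST_FLOW_OK)
        g_printerr("push-buffer failed: %s\n", gst_flow_get_name(ret));
}
```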
But I can’t find a way to make my client display the “received” stream. I think I need help understanding whether I have set up the decode side correctly (maybe I’m missing tsparse?).
In my repo I already convert JPEGs to video: my images_to_displayed_video_example can do it, but it does not use an x264 encode/decode. If you want to compile it, you just have to set the GSTREAMER_MSI_INSTALL_FOLDER environment variable to your GStreamer install folder (I’m working with the MSI installer).
I don’t want to take too much of your time (now more than ever, with the recent Linux “xz” backdoor), so if you need anything I’ll provide it (I would attach the debug log, but since my configuration may not be correct, I’m not sure it’s relevant right now).
Thank you for your work.