Concat two mjpeg files into one

Hi.

I have two mp4 files which are both collections of JPEG frames, so it is simply MJPEG in an MP4 container. Previously, when we used H.264 encoding, everything was fine (see the pipeline below), but with MJPEG files it obviously doesn’t work. Is it possible to adapt the provided pipeline to concatenate MJPEG files?
Current pipeline which worked with h264:

gst-launch-1.0 concat name=c ! queue ! m.video_0 mp4mux name=m ! filesink location=result.mp4 filesrc name=fsrc1 location=mjpeg1.mp4 ! qtdemux ! h264parse ! c. filesrc name=fsrc2 location=mjpeg2.mp4 ! qtdemux ! h264parse ! c.

Thanks in advance.

Try replacing h264parse with jpegparse, or just remove the h264parse entirely (assuming the file only has video, no audio or other streams).

Hi, thank you for the reply!

Here are the results:

with jpegparse

C:>gst-launch-1.0 concat name=c ! queue ! m.video_0 mp4mux name=m ! filesink location=c:result.mp4 filesrc name=fsrc1 location=mjpeg1.mp4 ! qtdemux ! jpegparse ! c. filesrc name=fsrc2 location=mjpeg2.mp4 ! qtdemux ! jpegparse ! c.
WARNING: erroneous pipeline: could not link jpegparse0 to c

without h264parse

C:>gst-launch-1.0 concat name=c ! queue ! m.video_0 mp4mux name=m ! filesink location=result.mp4 filesrc name=fsrc1 location=result.mp4 ! qtdemux ! c. filesrc name=fsrc2 location=result.mp4 ! qtdemux ! c.
Use Windows high-resolution clock, precision: 1 ms
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
WARNING: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux0: Delayed linking failed.
Additional debug info:
gst/parse/grammar.y(859): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstQTDemux:qtdemux0:
failed delayed linking some pad of GstQTDemux named qtdemux0 to some pad of GstConcat named c
ERROR: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux1: Internal data stream error.
ERROR: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux0: Internal data stream error.
WARNING: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux1: Delayed linking failed.
Additional debug info:
gst/parse/grammar.y(859): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstQTDemux:qtdemux1:
failed delayed linking some pad of GstQTDemux named qtdemux1 to some pad of GstConcat named c
Additional debug info:
…/gst/isomp4/qtdemux.c(6967): gst_qtdemux_loop (): /GstPipeline:pipeline0/GstQTDemux:qtdemux0:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn’t want to preroll.
Redistribute latency…
Additional debug info:
…/gst/isomp4/qtdemux.c(6967): gst_qtdemux_loop (): /GstPipeline:pipeline0/GstQTDemux:qtdemux1:
streaming stopped, reason not-linked (-1)
Setting pipeline to NULL …
ERROR: pipeline doesn’t want to preroll.
Freeing pipeline …

mp4mux doesn’t advertise MJPEG as allowed input format, try using qtmux instead.

C:>gst-launch-1.0 concat name=c ! queue ! m.video_0 qtmux name=m ! filesink location=result.mp4 filesrc name=fsrc1 location=mjpeg1.mp4 ! qtdemux ! jpegparse ! c. filesrc name=fsrc2 location=mjpeg2.mp4 ! qtdemux ! jpegparse ! c.
Use Windows high-resolution clock, precision: 1 ms
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
Redistribute latency…
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux1: Internal data stream error.
Additional debug info:
…/gst/isomp4/qtdemux.c(6967): gst_qtdemux_loop (): /GstPipeline:pipeline0/GstQTDemux:qtdemux1:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.016392100
Setting pipeline to NULL …
ERROR: from element /GstPipeline:pipeline0/GstQueue:queue0: Internal data stream error.
Additional debug info:
…/plugins/elements/gstqueue.c(992): gst_queue_handle_sink_event (): /GstPipeline:pipeline0/GstQueue:queue0:
streaming stopped, reason not-negotiated (-4)
Freeing pipeline …

This produces an mp4 file that resembles mjpeg2.mp4 in size but is not playable. If I remove the queue, it only produces (copies) mjpeg2.mp4 as the result, but that file is playable. The error still occurs, though:

Setting pipeline to PAUSED …
Pipeline is PREROLLING …
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
Redistribute latency…
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux1: Internal data stream error.
Additional debug info:
…/gst/isomp4/qtdemux.c(6967): gst_qtdemux_loop (): /GstPipeline:pipeline0/GstQTDemux:qtdemux1:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.016060900
Setting pipeline to NULL …
Freeing pipeline …

Let’s start simple perhaps:

Does this work?

gst-launch-1.0 filesrc location=mjpeg1.mp4 ! qtdemux ! jpegparse ! qtmux ! filesink location=result.mp4

Yes, this works OK:

Use Windows high-resolution clock, precision: 1 ms
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
Redistribute latency…
New clock: GstSystemClock
Got EOS from element “pipeline0”.
Execution ended after 0:00:00.009911700
Setting pipeline to NULL …
Freeing pipeline …

Can you run your version with concat again, but using gst-launch-1.0 -v so we can see the caps?

Are the MJPEGs of the same type/format?

Are the MJPEGs of the same type/format?

Yes, they are both chunks of an MJPEG stream from an IP camera.

C:>gst-launch-1.0 -v concat name=c ! m.video_0 qtmux name=m ! filesink location=result.mp4 filesrc name=fsrc1 location=mjpeg1.mp4 ! qtdemux ! jpegparse ! c. filesrc name=fsrc2 location=mjpeg2.mp4 ! qtdemux ! jpegparse ! c.

Use Windows high-resolution clock, precision: 1 ms
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
/GstPipeline:pipeline0/GstJpegParse:jpegparse0.GstPad:sink: caps = image/jpeg, parsed=(boolean)true, width=(int)1920, height=(int)1080, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstJpegParse:jpegparse0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, width=(int)1920, height=(int)1080, sof-marker=(int)0, colorspace=(string)sYUV, sampling=(string)YCbCr-4:2:0, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstConcat:c.GstPad:src: caps = image/jpeg, parsed=(boolean)true, width=(int)1920, height=(int)1080, sof-marker=(int)0, colorspace=(string)sYUV, sampling=(string)YCbCr-4:2:0, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstQTMux:m.GstQTMuxPad:video_0: caps = image/jpeg, parsed=(boolean)true, width=(int)1920, height=(int)1080, sof-marker=(int)0, colorspace=(string)sYUV, sampling=(string)YCbCr-4:2:0, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstConcat:c.GstConcatPad:sink_0: caps = image/jpeg, parsed=(boolean)true, width=(int)1920, height=(int)1080, sof-marker=(int)0, colorspace=(string)sYUV, sampling=(string)YCbCr-4:2:0, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstQTMux:m.GstAggregatorPad:src: caps = video/quicktime, variant=(string)apple
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/quicktime, variant=(string)apple
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
Redistribute latency…
New clock: GstSystemClock
/GstPipeline:pipeline0/GstConcat:c: active-pad = “(GstConcatPad)\ sink_1”
ERROR: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux1: Internal data stream error.
/GstPipeline:pipeline0/GstJpegParse:jpegparse1.GstPad:sink: caps = image/jpeg, parsed=(boolean)true, width=(int)1920, height=(int)1080, framerate=(fraction)15000/1001, pixel-aspect-ratio=(fraction)1/1
Additional debug info:
…/gst/isomp4/qtdemux.c(6967): gst_qtdemux_loop (): /GstPipeline:pipeline0/GstQTDemux:qtdemux1:
streaming stopped, reason not-negotiated (-4)
/GstPipeline:pipeline0/GstJpegParse:jpegparse1.GstPad:src: caps = image/jpeg, parsed=(boolean)true, width=(int)1920, height=(int)1080, sof-marker=(int)0, colorspace=(string)sYUV, sampling=(string)YCbCr-4:2:0, framerate=(fraction)15000/1001, pixel-aspect-ratio=(fraction)1/1
Execution ended after 0:00:00.023849500
Setting pipeline to NULL …
Freeing pipeline …

I think it’s because it ends up with slightly different framerates in the caps somehow: 15/1 vs. 15000/1001.

Not sure what the best fix is here.

You could try something like ... ! concat name=c ! videorate skip-to-first=true ! image/jpeg,framerate=1/15 ! qtmux ! ...

That might lead to the frames from the second input file being shifted slightly, though, to maintain equidistance on the output side.

Alternatively you could also put the videorate with the 15fps capsfilter after each jpegparse. This might also retimestamp things slightly or drop a frame every now and then, or maybe not. I’m not sure if this framerate difference is just heuristics in the demuxer or if there’s an actual difference. My guess would be it’s most likely the demuxer.
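Spelled out, that second variant would be something like this (untested sketch, assuming both files are the 1920x1080 15 fps streams from your caps output, and writing the framerate as 15/1):

gst-launch-1.0 concat name=c ! m.video_0 qtmux name=m ! filesink location=result.mp4 filesrc location=mjpeg1.mp4 ! qtdemux ! jpegparse ! videorate skip-to-first=true ! image/jpeg,framerate=15/1 ! c. filesrc location=mjpeg2.mp4 ! qtdemux ! jpegparse ! videorate skip-to-first=true ! image/jpeg,framerate=15/1 ! c.

The capsfilter forces both concat inputs to the same framerate, so the caps no longer change mid-stream at the muxer.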


Thank you, it seems to be working.

In your solution above you have a small typo: ... framerate=1/15 should be the inverse.
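With that corrected, the pipeline that works for me is roughly (sketch, with the framerate written as 15/1):

gst-launch-1.0 concat name=c ! videorate skip-to-first=true ! image/jpeg,framerate=15/1 ! m.video_0 qtmux name=m ! filesink location=result.mp4 filesrc location=mjpeg1.mp4 ! qtdemux ! jpegparse ! c. filesrc location=mjpeg2.mp4 ! qtdemux ! jpegparse ! c.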

Ah yes, well spotted, thanks.