As the title says, I’m looking for a way to integrate Closed Captions (CEA-608, generated with Adobe Premiere Pro) into a video track without burning the subtitles into the video, i.e. proper Closed Captions that could be toggled on a TV, for example.
I’m a total rookie in terms of programming and in terms of GStreamer; I’ve mostly been relying on AI to build pipelines and test them, but none of them have worked as intended. This kind of captioning seems to be restricted to TV broadcasting software, because apart from GStreamer there are very few options (none at all, to be honest) to do this.
So, as there’s no more trustworthy ally than a human brain, I’m here asking for your help to make it work… (Pretty please.)
Could someone who has succeeded at producing Closed Captions in the described way share their code, so I can see whether it works for me and, if not, try to modify it so it does?
I have an H.264 .ts file as the video input and a CEA-608 .txt file as the caption input (the file generated by Adobe Premiere). I don’t know if that helps, but I thought it wouldn’t hurt to specify the kind of files I’m working with, as they might be a source of errors.
Thank you !!
Note:
My last attempt was with this pipeline:
gst-launch-1.0 filesrc name=filesrc5 location=1.txt ! ccconverter name=ccconverter6 ! cc708overlay name=cc708overlay2 ! autovideoconvert name=autovideoconvert3 ! x264enc bitrate=2000 ! flvmux name=mux ! filesink location=1.flv filesrc location=1.ts name=videotestsrc1 ! decodebin ! videoconvert ! cc708overlay2.
Additional debug info:
../ext/closedcaption/gstceaccoverlay.c(988): gst_cea_cc_overlay_cc_event (): /GstPipeline:pipeline0/GstCeaCcOverlay:cc708overlay2:
received non-TIME newsegment event on text input
Redistribute latency...
ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTSDemux:tsdemux0: Internal data stream error.
Additional debug info:
../gst/mpegtsdemux/mpegtsbase.c(1778): mpegts_base_loop (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTSDemux:tsdemux0:
streaming stopped, reason not-negotiated (-4)
Now, this one “works”, in that it encodes the video, but there is still this issue:
WARNING: from element /GstPipeline:pipeline0/GstCeaCcOverlay:cc708overlay2: Could not multiplex stream.
Additional debug info:
../ext/closedcaption/gstceaccoverlay.c(988): gst_cea_cc_overlay_cc_event (): /GstPipeline:pipeline0/GstCeaCcOverlay:cc708overlay2:
received non-TIME newsegment event on text input
filesrc will not output a TIME segment, and it outputs data in chunks that are very large for captions. What format is the 1.txt file in? You likely need a parser that converts that specific format to something time-based (and with the correct caps).
The .txt file I’m using as CEA-608 captions looks like this:
00:00:00:00 - 00:00:02:29
This is not real.
00:00:03:15 - 00:00:06:14
Oh, really?
00:00:08:22 - 00:00:11:21
Yeah, this is just a test.
00:00:12:07 - 00:00:15:06
Damn, who would’ve known!
As for the pipeline, as I said, I don’t know a lot about GStreamer (or development in general); I’m mainly using AI to try to make it work.
Would you have an example of a working pipeline for that particular muxing that I could adapt to my needs, and/or an example of a captions format that could work if the one I provided isn’t right?
For the pipeline, I’ve seen an “fdsrc” here and there in Stack Overflow answers; would that work?
As for the captions, I used Adobe Premiere Pro to generate those CEA-608 subtitles, so I guess the format should be right (?), but yeah, I really need this to work…
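As an aside, the cue format shown above is simple enough that it can be converted mechanically to SubRip (.srt), one of the subtitle formats GStreamer can parse with subparse. Here is a sketch of such a conversion; it assumes 30 fps non-drop timecodes (pass a different rate if the sequence settings differ), and `txt_to_srt` is just an illustrative name, not an existing tool:

```shell
# Convert "HH:MM:SS:FF - HH:MM:SS:FF" cue lines plus text lines to SRT.
# Assumes 30 fps non-drop timecodes; the frame rate is the first argument.
txt_to_srt() {
    awk -v FPS="${1:-30}" '
    # HH:MM:SS:FF -> HH:MM:SS,mmm (frames converted to milliseconds)
    function srt_time(tc,  p) {
        split(tc, p, ":")
        return sprintf("%02d:%02d:%02d,%03d", p[1], p[2], p[3], int(p[4] * 1000 / FPS))
    }
    /^[0-9][0-9]:[0-9][0-9]:[0-9][0-9]:[0-9][0-9] - / {
        if (n++) print ""                       # blank line between cues
        split($0, t, " - ")
        print n                                 # cue index
        print srt_time(t[1]) " --> " srt_time(t[2])
        next
    }
    NF { print }                                # caption text passes through
    '
}
```

Usage would be something like `txt_to_srt 30 < 1.txt > 1.srt`.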
According to Subtitle file formats - General, this format is called Protoscript, which does not have a GStreamer parser. If you can save the captions in a different supported subtitle format (like SRT), then something like this should work:
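For instance, here is an untested sketch of such a pipeline. It assumes a recent GStreamer with the closed-caption elements from gst-plugins-bad (cccombiner, ccconverter) and gst-plugins-rs (tttocea608), and an x264enc new enough to write caption metadata into the bitstream as CEA-708 SEI messages; the file names are placeholders:

```
gst-launch-1.0 \
  filesrc location=1.ts ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! \
  cccombiner name=ccc ! x264enc bitrate=2000 ! mpegtsmux ! filesink location=out.ts \
  filesrc location=1.srt ! subparse ! tttocea608 ! ccconverter ! \
  "closedcaption/x-cea-708,format=cc_data" ! ccc.caption
```

Here subparse turns the SRT file into timed text buffers (which avoids the “received non-TIME newsegment event” error above), tttocea608 converts the text into CEA-608 caption buffers, ccconverter repackages them into the CEA-708 cc_data form, and cccombiner attaches them to the video frames as metadata instead of drawing them, so the captions stay toggleable on the receiver. If Premiere can export .scc (Scenarist Closed Caption) directly, a `filesrc ! sccparse` branch could replace the subparse/tttocea608 part.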