In an application with 2 pipelines…
pipeline #1 sends on an intervideosink.
pipeline #2 receives on an intervideosrc.
A) which pipeline should be started first
B) it doesn’t matter
C) they need to be on the same pipeline
The order shouldn’t matter in this particular case, since intervideosrc should be outputting black frames if there are no input frames.
However, if you know the input format and/or resolution, you may want to specify that upfront with a capsfilter after the intervideosrc, so that the black frames have the same format as the actual video later. Otherwise the video format might change once the producer starts producing, which may or may not cause problems for the rest of your pipeline.
They can be in separate pipelines, but need to be in the same application/process.
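For illustration, a minimal sketch of what that could look like in Python/PyGObject (untested; the channel name, caps and sinks are placeholders, adjust to your setup):

```python
#!/usr/bin/env python3
# Minimal sketch: two pipelines in the same process, connected via
# intervideosink / intervideosrc. The "channel" value and the caps below
# are placeholders.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# Pipeline #2 (consumer) — deliberately started first: intervideosrc will
# output black frames until the producer comes up. The capsfilter pins the
# format so those black frames match the real video later.
consumer = Gst.parse_launch(
    "intervideosrc channel=cam1 ! "
    "video/x-raw,format=I420,width=1280,height=720,framerate=30/1 ! "
    "videoconvert ! autovideosink"
)
consumer.set_state(Gst.State.PLAYING)

# Pipeline #1 (producer) — can be started later (or stopped and restarted)
# without breaking the consumer.
producer = Gst.parse_launch(
    "videotestsrc is-live=true ! "
    "video/x-raw,format=I420,width=1280,height=720,framerate=30/1 ! "
    "intervideosink channel=cam1"
)
producer.set_state(Gst.State.PLAYING)

GLib.MainLoop().run()
```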
For what it's worth, there is also a generic variant of this, intersink / intersrc, but intersrc can't produce filler frames when there's no input, since it's format-agnostic and filler frames can only be generated for raw audio/video.
I’ve spent a lot of time trying to get this working, even with the dreaded CoPilot, but I can’t get the timing right. CoPilot is now insisting that they must be in the same pipeline, or that I should use shmsrc/shmsink instead. Are you sure about this, or should I listen to CoPilot?
@tpm You are correct, of course. Even the GStreamer documentation agrees. CoPilot was hallucinating.