Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): dGPU
• DeepStream Version: 7.1
I have a basic pipeline.
uridecodebin -> streammux -> pgie -> sgie1 -> sgie2 -> sink
Let's say I have a 100-second-long video, 1080p at 30 FPS.
And I have two hardware setups I can run this pipeline on with this video:
hardware 1: super powerful — the entire 100-second video gets processed in 10 seconds.
hardware 2: super slow — the entire 100-second video gets processed in 100 seconds.
I'm trying to figure out a way to get the actual frame time. What I mean by that is hard to explain, so I will use an example.
Let's say that in the above video there is a vehicle at the 10th second and another vehicle at the 90th second.
But since the fast hardware processes the video in just 10 seconds, I get the detections/events at around the 1st and 9th seconds of wall-clock time.
What I want is to obtain the actual IN-VIDEO time of each detection. Is that possible?
What I have tried so far:
[1] Obtained the PTS values of the frames and the pipeline start time, but that didn't give me a good result.
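For reference, this is roughly the conversion I had in mind — a minimal sketch, assuming that for a file source the buffer PTS already represents the in-video timestamp (in nanoseconds), independent of how fast the hardware runs. In the actual pipeline the PTS would come from the frame metadata inside a pad-probe callback; the helper names below are just for illustration:

```python
# Sketch only: converting a frame's PTS (nanoseconds, as GStreamer reports
# timestamps) into in-video seconds, plus a frame-index fallback.
# In a real DeepStream probe the PTS would be read from the frame metadata;
# these free-standing helpers are illustrative, not the bindings' API.

GST_SECOND = 1_000_000_000  # GStreamer timestamps are in nanoseconds


def pts_to_video_seconds(pts_ns: int) -> float:
    """Convert a buffer PTS (nanoseconds) to in-video time in seconds."""
    return pts_ns / GST_SECOND


def frame_to_video_seconds(frame_num: int, fps: float) -> float:
    """Fallback: derive in-video time from the frame index and stream FPS."""
    return frame_num / fps


if __name__ == "__main__":
    # A vehicle detected in a frame whose PTS is 10e9 ns is at the 10th
    # in-video second, even if only ~1 s of wall-clock time has elapsed.
    print(pts_to_video_seconds(10 * GST_SECOND))  # 10.0
    print(frame_to_video_seconds(2700, 30.0))     # 90.0
```

My expectation was that this mapping alone (without involving the pipeline start time at all) should recover the in-video time, but I may be misunderstanding how the PTS is set in this pipeline.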
Any help would be appreciated.
If this is not the right forum to ask this question, please let me know.