I'm trying to use D3D11 GPU capabilities to decode and display H264/H265 from RTSP sources. With GStreamer version 1.22.6 I have a problem: the video display is very corrupted, as if there is a lot of packet loss. I checked with Wireshark and compared with VLC, and there is no problem with the network. When I try the launch line below with debug enabled, I see the following message:
Example launch line: gst-launch-1.0.exe uridecodebin3 uri=rtsp://somecameraIhave ! queue2 ! d3d11convert ! d3d11videosink
0:00:03.580960000 5292 000002122E8CC380 WARN basesink gstbasesink.c:3145:gst_base_sink_is_too_late: warning: There may be a timestamping problem, or this computer is too slow.
WARNING: from element /GstPipeline:pipeline0/GstD3D11VideoSink:d3d11videosink0: A lot of buffers are being dropped.
Additional debug info:
…/libs/gst/base/gstbasesink.c(3145): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstD3D11VideoSink:d3d11videosink0:
There may be a timestamping problem, or this computer is too slow.
To work around this I tried adding the sync=false property to the d3d11videosink element, but it didn't change anything. I also tried adjusting the buffering properties of uridecodebin3 (use-buffering=true buffer-duration=2), but that didn't make any difference either.
With GStreamer version 1.22.5 there is no such issue, and the computer is not slow.
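For reference, this is essentially how I run that test from code. It is only a sketch: the camera URI is the same placeholder as in the launch line, and the 2-second buffer-duration value (expressed in nanoseconds) is my guess at what the setting is meant to be.

```cpp
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  GError *err = NULL;
  /* Same pipeline as the launch line above, with the properties I tried.
   * buffer-duration appears to be in nanoseconds, so 2 seconds would be
   * 2000000000 rather than 2. */
  GstElement *pipeline = gst_parse_launch (
      "uridecodebin3 uri=rtsp://somecameraIhave use-buffering=true "
      "buffer-duration=2000000000 ! queue2 ! d3d11convert ! "
      "d3d11videosink sync=false",
      &err);
  if (!pipeline) {
    g_printerr ("parse error: %s\n", err->message);
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Run until error or EOS */
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      (GstMessageType) (GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
  if (msg)
    gst_message_unref (msg);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (pipeline);
  return 0;
}
```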
I have been trying to use the “draw-on-shared-texture” signal according to the d3d11videosink-shared-texture.cpp example in the GStreamer repo. When I tried going back to 1.22.5, I got the following errors:
D3D11 CORRUPTION: ID3D11DeviceContext::DecoderBeginFrame: Two threads were found to be executing functions associated with the same Device[Context] at the same time.
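For context, this is roughly how I hook into the sink. It is only a sketch based on my reading of d3d11videosink-shared-texture.cpp: the “begin-draw”/“draw” signal names and argument list are what I understood from that example and should be checked against the actual file, and the shared handle and misc flags belong to a texture my application creates itself.

```cpp
#include <gst/gst.h>
#include <windows.h>

struct AppData
{
  GstElement *sink;      /* d3d11videosink with draw-on-shared-texture=true */
  HANDLE shared_handle;  /* shared handle of the app-created D3D11 texture */
  guint misc_flags;      /* D3D11_RESOURCE_MISC_* flags used for that texture */
};

/* Called by the sink when it has a new frame ready to be drawn */
static void
on_begin_draw (GstElement * sink, AppData * app)
{
  gboolean ret = FALSE;
  guint64 acquire_key = 0;  /* only meaningful with a keyed-mutex texture */
  guint64 release_key = 0;

  /* Ask the sink to render the pending frame onto our shared texture */
  g_signal_emit_by_name (sink, "draw", (gpointer) app->shared_handle,
      app->misc_flags, acquire_key, release_key, &ret);
}

static void
setup_sink (AppData * app)
{
  g_object_set (app->sink, "draw-on-shared-texture", TRUE, NULL);
  g_signal_connect (app->sink, "begin-draw",
      G_CALLBACK (on_begin_draw), app);
}
```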
I've tested some test RTSP streams with gst-launch/gst-play, but I haven't seen corrupted output.
The problem was happening on an H264 stream. I switched the camera format to H265 and saw that there was no problem. When I switched back to H264 there was also no problem… Probably the camera stream was not OK and the problem disappeared after the switch… I missed the first step of debugging. I will reboot everything before posting.
I haven't been using gst_bin_recalculate_latency; I will look into that.
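A minimal sketch of what I am planning to try, assuming the usual pattern of recalculating whenever a LATENCY message appears on the bus ("pipeline" here is whatever gst_parse_launch returned, and the watch needs a running GMainLoop):

```cpp
#include <gst/gst.h>

/* Bus watch: redistribute latency when an element (e.g. the RTSP
 * jitterbuffer) reports that its latency changed. */
static gboolean
bus_cb (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);

  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_LATENCY)
    gst_bin_recalculate_latency (GST_BIN (pipeline));

  return TRUE;  /* keep the watch installed */
}

static void
install_latency_watch (GstElement * pipeline)
{
  GstBus *bus = gst_element_get_bus (pipeline);
  gst_bus_add_watch (bus, bus_cb, pipeline);
  gst_object_unref (bus);
}
```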
I used the “draw-on-shared-texture” signal to overlay on the image. For some reason I don't have d3d11overlay, even though I installed the full GStreamer package:
0:00:01.343796000 17008 00000246D444E780 ERROR GST_PIPELINE gst/parse/grammar.y:570:gst_parse_element_make: no element “d3d11overlay”
If you want to overlay on a d3d11 texture and render using d3d11videosink, d3d11overlay is my recommendation, but the d3d11overlay element does not exist in the 1.22 release.
Sorry for the newbie question, but where does the d3d11overlay element exist? Is it not released yet, or does it exist in older versions?
I checked the appsink example, but it uses about 30% GPU while the d3d11videosink-shared-texture.cpp example uses about 8% GPU with a 4K RTSP stream from a camera. Therefore I kept using draw-on-shared-texture…
I had been using the 1.22.6 release and started getting exceptions at random times:
D3D11 CORRUPTION: ID3D11DeviceContext::DecoderBeginFrame: Two threads were found to be executing functions associated with the same Device[Context] at the same time. This will cause corruption of memory. Appropriate thread synchronization needs to occur external to the Direct3D API (or through the ID3D10Multithread interface). 13044 and 2956 are the implicated thread ids. [ MISCELLANEOUS CORRUPTION #28: CORRUPTED_MULTITHREADING]
I upgraded to 1.22.8 and the problem continues, even with the d3d11videosink-shared-texture.cpp example code. I tried every combination of use_nt_handle and use_keyed_mutex.
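Since the corruption message points at ID3D10Multithread, one thing I am experimenting with (my own idea based on that message text, not something I took from the example) is enabling multithread protection on the D3D11 device that my application creates for the shared texture. I don't know whether it changes anything for the device the sink and decoder use internally.

```cpp
#include <d3d11.h>
#include <d3d10.h>

/* Serialize access to the immediate context of an application-created
 * D3D11 device, as suggested by the CORRUPTED_MULTITHREADING message. */
static void
enable_multithread_protection (ID3D11Device * device)
{
  ID3D11DeviceContext *ctx = nullptr;
  ID3D10Multithread *mt = nullptr;

  device->GetImmediateContext (&ctx);
  if (ctx && SUCCEEDED (ctx->QueryInterface (__uuidof (ID3D10Multithread),
              reinterpret_cast<void **> (&mt)))) {
    mt->SetMultithreadProtected (TRUE);
    mt->Release ();
  }
  if (ctx)
    ctx->Release ();
}
```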