Dynamic removal of tee path

Hello hackers!

We have an overly complicated application which deals with routing a bunch of RTP streams all over the place. It works sort of like a switchboard and defines the concept of incoming and outgoing jacks.

An incoming jack can have many outgoing jacks associated with it. An incoming jack is pretty much an rtpbin fed by a udpsrc, ending in a tee.

The outgoing jack is an rtpbin feeding a udpsink. Linking the two is a matter of requesting a new pad from the tee and hooking everything up.

What makes it all complicated is that we want all of this to be dynamic. An incoming jack should always be playing, even if there is no outgoing jack, since we want the statistics and metrics this brings us. To solve this we are trying out two approaches:

  1. Always have a fakesink attached (roughly sketched below)
  2. When we have no outgoing jack we set a blocking probe on the tee src pad
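Approach 1 looks roughly like the sketch below (simplified, not our actual code; pipeline and tee are placeholders for the application's own objects, and error handling is reduced to expect()):

use gstreamer as gst;
use gst::prelude::*;

// Keep a fakesink attached to one request pad of the tee, so the incoming
// jack always has somewhere to push to even with no outgoing jack present.
fn keep_tee_flowing(pipeline: &gst::Pipeline, tee: &gst::Element) {
    let fakesink = gst::ElementFactory::make("fakesink")
        .property("sync", false)
        .property("async", false)
        .build()
        .expect("failed to create fakesink");
    pipeline.add(&fakesink).expect("failed to add fakesink");
    // Element::link() requests a new src pad from the tee for us.
    tee.link(&fakesink).expect("failed to link tee to fakesink");
    fakesink.sync_state_with_parent().expect("failed to sync fakesink state");
}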

Both of these seem to work fine, and overall things are working fine. But we are seeing issues on removal: during fuzzing, and in rare cases otherwise, we get into a bad state.

We get no GST_ERROR or GST_WARN messages, but we will see:

basesrc gstbasesrc.c:3042:gst_base_src_loop:<udpsrc_rtp_jack-dcf0a890-53f7-4134-9404-bdaa93a182f4_0> pausing after gst_pad_push() = flushing

And after this our pipeline is not playable and our world falls apart.

The process we are following for removal looks sort of like this (a rough sketch in code follows the steps):

// For outgoing jacks:
//   1. Find the connected incoming jack (if any)
//   2. Block flow, using probes to be able to remove elements
//      dynamically.
//   3. Unlink the incoming jack from ourself
//   4. Remove ourself from the pipeline
//   5. Unblock the flow to make the spice flow again
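
In code, the flow is roughly the sketch below (placeholder handles, not our real implementation; step 1, looking up the connected incoming jack, is assumed to have happened already, and this version does the teardown right after installing the probe):

use gstreamer as gst;
use gst::prelude::*;

fn remove_outgoing_jack(
    pipeline: &gst::Pipeline,
    udpsrc: &gst::Element,
    tee: &gst::Element,
    tee_srcpad: &gst::Pad,
    outgoing_bin: &gst::Bin,
) -> Result<(), gst::glib::BoolError> {
    // 2. Block flow so elements can be removed while the pipeline keeps running.
    let blockpad = udpsrc.static_pad("src").expect("udpsrc has a src pad");
    let probe_id = blockpad
        .add_probe(
            gst::PadProbeType::BLOCK_DOWNSTREAM | gst::PadProbeType::IDLE,
            |_pad, _info| gst::PadProbeReturn::Ok,
        )
        .expect("failed to install blocking probe");

    // 3. Unlink the incoming jack (its tee request pad) from the outgoing jack.
    let outgoing_sinkpad = tee_srcpad.peer().expect("tee src pad is linked");
    tee_srcpad.unlink(&outgoing_sinkpad)?;
    tee.release_request_pad(tee_srcpad);

    // 4. Remove the outgoing jack from the pipeline.
    outgoing_bin
        .set_state(gst::State::Null)
        .expect("failed to shut down outgoing bin");
    pipeline.remove(outgoing_bin)?;

    // 5. Unblock the flow to make the spice flow again.
    blockpad.remove_probe(probe_id);
    Ok(())
}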

This is how the pipeline looks before we start removing anything:

This is after unlink:

And after remove:

At this point we unblock the flow. The probe that blocks is on the udpsrc.

Do you have any insight into what we are missing to avoid ending up in this situation?

Do you have more logs, specifically from above that line?

Generally this means that somewhere downstream of the source there is some pad that received a buffer, but the pad is not activated (e.g. because the element is in READY or NULL state). From the debug logs you should be able to see which pad is the original cause of the GST_FLOW_FLUSHING.

You have to prevent any buffers from arriving at a pad that is not active, for example via pad probes or by only linking after the pad in question is activated.
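
For example, attaching a new branch could look something like the sketch below (placeholder names, assuming the branch is wrapped in a bin with a ghost sink pad):

use gstreamer as gst;
use gst::prelude::*;

fn attach_branch(pipeline: &gst::Pipeline, tee: &gst::Element, branch_bin: &gst::Bin) {
    pipeline.add(branch_bin).expect("failed to add branch");
    // Bring the branch to the parent's state first, so its pads are active
    // before any buffer can reach them ...
    branch_bin.sync_state_with_parent().expect("failed to sync branch state");
    // ... and only then link it to the tee.
    let tee_srcpad = tee.request_pad_simple("src_%u").expect("tee request pad");
    let branch_sinkpad = branch_bin.static_pad("sink").expect("branch sink pad");
    tee_srcpad
        .link(&branch_sinkpad)
        .expect("failed to link tee to branch");
}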

Thank you!

Here are some more logs:

0:00:07.872704320 424007 0x7f8fdc003170 INFO                    task gsttask.c:383:gst_task_func:<incoming_tee_test-ingest-in-0:video_0_src_1:src> Task going to paused
0:00:07.872708900 424007 0x7f8fe0001230 INFO        GST_ELEMENT_PADS gstelement.c:875:gst_element_remove_pad:<test-ingest-in-0:video> removing pad 'pad_0_src_1'
0:00:07.872718273 424007 0x7f8fdc002370 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3976:pop_and_push_next: Locking from thread 0x7f8fdc002370
0:00:07.872772729 424007 0x7f8fdc002370 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3976:pop_and_push_next: Locked from thread 0x7f8fdc002370
0:00:07.872779101 424007 0x7f8fdc002370 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:4622:gst_rtp_jitter_buffer_loop: Unlocking from thread 0x7f8fdc002370
0:00:07.872725537 424007 0x7f8fdc002770 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:3262:gst_rtp_jitter_buffer_chain:<rtpjitterbuffer2> Received packet #25954 at time 0:00:04.618512988, discont 0, rtx 0, inband NTP time 99:99:99.999999999
0:00:07.872817740 424007 0x7f8fdc002770 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3268:gst_rtp_jitter_buffer_chain: Locking from thread 0x7f8fdc002770
0:00:07.872823284 424007 0x7f8fdc002770 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3268:gst_rtp_jitter_buffer_chain: Locked from thread 0x7f8fdc002770
0:00:07.872830827 424007 0x7f8fdc002770 LOG          rtpjitterbuffer gstrtpjitterbuffer.c:2930:calculate_jitter:<rtpjitterbuffer2> dtsdiff +0:00:00.031740832 rtptime +0:00:00.033333333, clock-rate 90000, diff +0:00:00.001592501, jitter: 0:00:00.000328322
0:00:07.872851314 424007 0x7f8fdc002770 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3345:gst_rtp_jitter_buffer_chain:<rtpjitterbuffer2> updated packet_rate: 313
0:00:07.872855453 424007 0x7f8fdc002770 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3353:gst_rtp_jitter_buffer_chain:<rtpjitterbuffer2> max_dropout: 18780, max_misorder: 626
0:00:07.872859199 424007 0x7f8fdc002770 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:3400:gst_rtp_jitter_buffer_chain:<rtpjitterbuffer2> expected #25954, got #25954, gap of 0
0:00:07.872862079 424007 0x7f8fdc002770 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:3432:gst_rtp_jitter_buffer_chain:<rtpjitterbuffer2> Clearing gap packets
0:00:07.872867791 424007 0x7f8fdc002770 DEBUG        rtpjitterbuffer rtpjitterbuffer.c:879:rtp_jitter_buffer_calculate_pts: extrtp 7027435145, gstrtp 21:41:22.612722222, base 21:41:21.546000000, send_diff 0:00:01.066722222
0:00:07.872872028 424007 0x7f8fdc002770 DEBUG        rtpjitterbuffer rtpjitterbuffer.c:733:calculate_skew: skew 0, out 0:00:04.618469653
0:00:07.872890505 424007 0x7f8fdc002770 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:2640:calculate_packet_spacing:<rtpjitterbuffer2> new packet spacing 0:00:00.033333334 old packet spacing 0:00:00.033338544 combined to 0:00:00.033337241
0:00:07.872894655 424007 0x7f8fdc002770 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:3655:gst_rtp_jitter_buffer_chain: signal event
0:00:07.872900663 424007 0x7f8fdc002770 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:3658:gst_rtp_jitter_buffer_chain:<rtpjitterbuffer2> Pushed packet #25954, now 1 packets, head: 1, percent -1
0:00:07.872904725 424007 0x7f8fdc002770 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:2447:update_current_timer:<rtpjitterbuffer2> no more timers
0:00:07.872910120 424007 0x7f8fdc002770 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3666:gst_rtp_jitter_buffer_chain: Unlocking from thread 0x7f8fdc002770
0:00:07.872754297 424007 0x7f8fe0001230 INFO        GST_ELEMENT_PADS gstpad.c:2158:gst_pad_unlink: unlinking incoming_tee_test-ingest-in-0:video_0_src_1:src(0x7f9018024c50) and pad_0_src_1:proxypad19(0x7f9018027290)
0:00:07.872785025 424007 0x7f8fdc002370 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:4624:gst_rtp_jitter_buffer_loop:<rtpjitterbuffer0> pausing task, reason flushing
0:00:07.872918691 424007 0x7f8fdc003770 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:4602:gst_rtp_jitter_buffer_loop: waiting event done
0:00:07.872944668 424007 0x7f8fdc002370 INFO                    task gsttask.c:383:gst_task_func:<rtpjitterbuffer0:src> Task going to paused
0:00:07.872954921 424007 0x7f8fdc003770 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3957:pop_and_push_next: Unlocking from thread 0x7f8fdc003770
0:00:07.872930359 424007 0x7f8fe0001230 INFO        GST_ELEMENT_PADS gstpad.c:2213:gst_pad_unlink: unlinked incoming_tee_test-ingest-in-0:video_0_src_1:src and pad_0_src_1:proxypad19
0:00:07.872965850 424007 0x7f8fdc003770 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:3968:pop_and_push_next:<rtpjitterbuffer2> Pushing buffer 25954, dts 0:00:04.618512988, pts 0:00:04.618469653
0:00:07.872993571 424007 0x7f8fe0001230 INFO           GST_PARENTAGE gstbin.c:4395:gst_bin_get_by_name: [test-ingest-in-0:video]: looking up child element tee_test-ingest-in-0:video_0
0:00:07.872999645 424007 0x7f8fdc003770 INFO        GST_ELEMENT_PADS gstelement.c:1016:gst_element_get_static_pad: found pad identity_test-in-hash-2_0:sink
0:00:07.873008847 424007 0x7f8fe0001230 INFO        GST_ELEMENT_PADS gstelement.c:1016:gst_element_get_static_pad: found pad tee_test-ingest-in-0:video_0:src_1
0:00:07.873016730 424007 0x7f8fe0001230 INFO        GST_ELEMENT_PADS gstpad.c:2158:gst_pad_unlink: unlinking tee_test-ingest-in-0:video_0:src_1(0x7f9018024410) and incoming_tee_test-ingest-in-0:video_0_src_1:sink(0x7f9018024a00)
0:00:07.873021745 424007 0x7f8fe0001230 INFO        GST_ELEMENT_PADS gstpad.c:2213:gst_pad_unlink: unlinked tee_test-ingest-in-0:video_0:src_1 and incoming_tee_test-ingest-in-0:video_0_src_1:sink
0:00:07.873036936 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2796:gst_element_continue_state:<incoming_tee_test-ingest-in-0:video_0_src_1> committing state from PLAYING to PAUSED, pending NULL, next READY
0:00:07.873050856 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<incoming_tee_test-ingest-in-0:video_0_src_1> notifying about state-changed PLAYING to PAUSED (NULL pending)
0:00:07.873038716 424007 0x7f8fdc003770 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3976:pop_and_push_next: Locking from thread 0x7f8fdc003770
0:00:07.873065968 424007 0x7f8fdc003770 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3976:pop_and_push_next: Locked from thread 0x7f8fdc003770
0:00:07.873072127 424007 0x7f8fdc003770 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:4109:handle_next_buffer:<rtpjitterbuffer2> no buffer, going to wait
0:00:07.873078815 424007 0x7f8fdc003770 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:4602:gst_rtp_jitter_buffer_loop: waiting event
0:00:07.873062522 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2804:gst_element_continue_state:<incoming_tee_test-ingest-in-0:video_0_src_1> continue state change PAUSED to READY, final NULL
0:00:07.873144381 424007 0x7f8fdc003170 INFO                    task gsttask.c:385:gst_task_func:<incoming_tee_test-ingest-in-0:video_0_src_1:src> Task resume from paused
0:00:07.873186061 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2796:gst_element_continue_state:<incoming_tee_test-ingest-in-0:video_0_src_1> committing state from PAUSED to READY, pending NULL, next NULL
0:00:07.873191766 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<incoming_tee_test-ingest-in-0:video_0_src_1> notifying about state-changed PAUSED to READY (NULL pending)
0:00:07.873197657 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2804:gst_element_continue_state:<incoming_tee_test-ingest-in-0:video_0_src_1> continue state change READY to NULL, final NULL
0:00:07.873201671 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<incoming_tee_test-ingest-in-0:video_0_src_1> completed state change to NULL
0:00:07.873205380 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<incoming_tee_test-ingest-in-0:video_0_src_1> notifying about state-changed READY to NULL (VOID_PENDING pending)
0:00:07.873213409 424007 0x7f8fe0001230 INFO           GST_PARENTAGE gstbin.c:1805:gst_bin_remove_func:<test-ingest-in-0:video> removed child "incoming_tee_test-ingest-in-0:video_0_src_1"
0:00:07.873230682 424007 0x7f8fe0001230 INFO        GST_ELEMENT_PADS gstelement.c:875:gst_element_remove_pad:<tee_test-ingest-in-0:video_0> removing pad 'src_1'
0:00:07.873279824 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<udpsink_out_rtcp_test-out-tok-0:0_0> current PLAYING pending VOID_PENDING, desired next PAUSED
0:00:07.873299122 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<udpsink_out_rtcp_test-out-tok-0:0_0> completed state change to PAUSED
0:00:07.873303470 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<udpsink_out_rtcp_test-out-tok-0:0_0> notifying about state-changed PLAYING to PAUSED (VOID_PENDING pending)
0:00:07.873310322 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'udpsink_out_rtcp_test-out-tok-0:0_0' changed state to 3(PAUSED) successfully
0:00:07.873314737 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<udpsink_out_rtp_test-out-tok-0:0_0> current PLAYING pending VOID_PENDING, desired next PAUSED
0:00:07.873321986 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2946:gst_bin_change_state_func:<test-out-tok-0:0> child 'udpsink_out_rtp_test-out-tok-0:0_0' is changing state asynchronously to PAUSED
0:00:07.873326164 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<test-out-tok-0:0> current PLAYING pending VOID_PENDING, desired next PAUSED
0:00:07.873336786 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<rtpstorage6> current PLAYING pending VOID_PENDING, desired next PAUSED
0:00:07.873340598 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<rtpstorage6> completed state change to PAUSED
0:00:07.873344698 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<rtpstorage6> notifying about state-changed PLAYING to PAUSED (VOID_PENDING pending)
0:00:07.873349687 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'rtpstorage6' changed state to 3(PAUSED) successfully
0:00:07.873353758 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<rtpssrcdemux6> current PLAYING pending VOID_PENDING, desired next PAUSED
0:00:07.873357795 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<rtpssrcdemux6> completed state change to PAUSED
0:00:07.873361258 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<rtpssrcdemux6> notifying about state-changed PLAYING to PAUSED (VOID_PENDING pending)
0:00:07.873365914 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'rtpssrcdemux6' changed state to 3(PAUSED) successfully
0:00:07.873370125 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<test-out-tok-0:0-line> current PLAYING pending VOID_PENDING, desired next PAUSED
0:00:07.873375613 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<test-out-tok-0:0-line> completed state change to PAUSED
0:00:07.873378765 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<test-out-tok-0:0-line> notifying about state-changed PLAYING to PAUSED (VOID_PENDING pending)
0:00:07.873384659 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'test-out-tok-0:0-line' changed state to 3(PAUSED) successfully
0:00:07.873388733 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<test-out-tok-0:0> completed state change to PAUSED
0:00:07.873391983 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<test-out-tok-0:0> notifying about state-changed PLAYING to PAUSED (VOID_PENDING pending)
0:00:07.873398083 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'test-out-tok-0:0' changed state to 3(PAUSED) successfully
0:00:07.873401996 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<identity_test-out-tok-0:0_0> current PLAYING pending VOID_PENDING, desired next PAUSED
0:00:07.873405648 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<identity_test-out-tok-0:0_0> completed state change to PAUSED
0:00:07.873409226 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<identity_test-out-tok-0:0_0> notifying about state-changed PLAYING to PAUSED (VOID_PENDING pending)
0:00:07.873414444 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'identity_test-out-tok-0:0_0' changed state to 3(PAUSED) successfully
0:00:07.873418407 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:3127:gst_element_change_state:<test-out-tok-0:0> forcing commit state NULL <= READY
0:00:07.873422303 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2796:gst_element_continue_state:<test-out-tok-0:0> committing state from PLAYING to PAUSED, pending NULL, next READY
0:00:07.873429915 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<test-out-tok-0:0> notifying about state-changed PLAYING to PAUSED (NULL pending)
0:00:07.873555628 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2804:gst_element_continue_state:<test-out-tok-0:0> continue state change PAUSED to READY, final NULL
0:00:07.873567728 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<udpsink_out_rtcp_test-out-tok-0:0_0> current PAUSED pending VOID_PENDING, desired next READY
0:00:07.873593376 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<udpsink_out_rtcp_test-out-tok-0:0_0> completed state change to READY
0:00:07.873599147 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<udpsink_out_rtcp_test-out-tok-0:0_0> notifying about state-changed PAUSED to READY (VOID_PENDING pending)
0:00:07.873605930 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'udpsink_out_rtcp_test-out-tok-0:0_0' changed state to 2(READY) successfully
0:00:07.873611748 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<udpsink_out_rtp_test-out-tok-0:0_0> current PLAYING pending PAUSED, desired next READY
0:00:07.873651199 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<udpsink_out_rtp_test-out-tok-0:0_0> completed state change to READY
0:00:07.873658129 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<udpsink_out_rtp_test-out-tok-0:0_0> notifying about state-changed PLAYING to READY (VOID_PENDING pending)
0:00:07.873666422 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'udpsink_out_rtp_test-out-tok-0:0_0' changed state to 2(READY) successfully
0:00:07.873672419 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<test-out-tok-0:0> current PAUSED pending VOID_PENDING, desired next READY
0:00:07.873677830 424007 0x7f8fe0001230 LOG                   rtpbin gstrtpbin.c:3810:gst_rtp_bin_change_state:<test-out-tok-0:0> setting shutdown flag
0:00:07.873682718 424007 0x7f8fe0001230 LOG                   rtpbin gstrtpbin.c:3815:gst_rtp_bin_change_state:<test-out-tok-0:0> dynamic lock taken, we can continue shutdown
0:00:07.873722216 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<rtpstorage6> current PAUSED pending VOID_PENDING, desired next READY
0:00:07.873741202 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<rtpstorage6> completed state change to READY
0:00:07.873746955 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<rtpstorage6> notifying about state-changed PAUSED to READY (VOID_PENDING pending)
0:00:07.873756443 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'rtpstorage6' changed state to 2(READY) successfully
0:00:07.873762518 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<rtpssrcdemux6> current PAUSED pending VOID_PENDING, desired next READY
0:00:07.873781399 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<rtpssrcdemux6> completed state change to READY
0:00:07.873787482 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<rtpssrcdemux6> notifying about state-changed PAUSED to READY (VOID_PENDING pending)
0:00:07.873794080 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'rtpssrcdemux6' changed state to 2(READY) successfully
0:00:07.873799899 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<test-out-tok-0:0-line> current PAUSED pending VOID_PENDING, desired next READY
0:00:07.873839122 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<test-out-tok-0:0-line> completed state change to READY
0:00:07.873844696 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<test-out-tok-0:0-line> notifying about state-changed PAUSED to READY (VOID_PENDING pending)
0:00:07.873852092 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'test-out-tok-0:0-line' changed state to 2(READY) successfully
0:00:07.873872845 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<test-out-tok-0:0> completed state change to READY
0:00:07.873878110 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<test-out-tok-0:0> notifying about state-changed PAUSED to READY (VOID_PENDING pending)
0:00:07.873886729 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'test-out-tok-0:0' changed state to 2(READY) successfully
0:00:07.873892942 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<identity_test-out-tok-0:0_0> current PAUSED pending VOID_PENDING, desired next READY
0:00:07.873916111 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<identity_test-out-tok-0:0_0> completed state change to READY
0:00:07.873921283 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<identity_test-out-tok-0:0_0> notifying about state-changed PAUSED to READY (VOID_PENDING pending)
0:00:07.873927021 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'identity_test-out-tok-0:0_0' changed state to 2(READY) successfully
0:00:07.873933300 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2796:gst_element_continue_state:<test-out-tok-0:0> committing state from PAUSED to READY, pending NULL, next NULL
0:00:07.873938027 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<test-out-tok-0:0> notifying about state-changed PAUSED to READY (NULL pending)
0:00:07.873943300 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2804:gst_element_continue_state:<test-out-tok-0:0> continue state change READY to NULL, final NULL
0:00:07.873950854 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<udpsink_out_rtcp_test-out-tok-0:0_0> current READY pending VOID_PENDING, desired next NULL
0:00:07.873969980 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<udpsink_out_rtcp_test-out-tok-0:0_0> completed state change to NULL
0:00:07.873975045 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<udpsink_out_rtcp_test-out-tok-0:0_0> notifying about state-changed READY to NULL (VOID_PENDING pending)
0:00:07.873980633 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'udpsink_out_rtcp_test-out-tok-0:0_0' changed state to 1(NULL) successfully
0:00:07.873985800 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<udpsink_out_rtp_test-out-tok-0:0_0> current READY pending VOID_PENDING, desired next NULL
0:00:07.873995736 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<udpsink_out_rtp_test-out-tok-0:0_0> completed state change to NULL
0:00:07.874000257 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<udpsink_out_rtp_test-out-tok-0:0_0> notifying about state-changed READY to NULL (VOID_PENDING pending)
0:00:07.874005531 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'udpsink_out_rtp_test-out-tok-0:0_0' changed state to 1(NULL) successfully
0:00:07.874015581 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<test-out-tok-0:0> current READY pending VOID_PENDING, desired next NULL
0:00:07.874024724 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<rtpstorage6> current READY pending VOID_PENDING, desired next NULL
0:00:07.874038061 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<rtpstorage6> completed state change to NULL
0:00:07.874042835 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<rtpstorage6> notifying about state-changed READY to NULL (VOID_PENDING pending)
0:00:07.874051889 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'rtpstorage6' changed state to 1(NULL) successfully
0:00:07.874057512 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<rtpssrcdemux6> current READY pending VOID_PENDING, desired next NULL
0:00:07.874063423 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<rtpssrcdemux6> completed state change to NULL
0:00:07.874067976 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<rtpssrcdemux6> notifying about state-changed READY to NULL (VOID_PENDING pending)
0:00:07.874075493 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'rtpssrcdemux6' changed state to 1(NULL) successfully
0:00:07.874080625 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<test-out-tok-0:0-line> current READY pending VOID_PENDING, desired next NULL
0:00:07.874086707 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<test-out-tok-0:0-line> completed state change to NULL
0:00:07.874091777 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<test-out-tok-0:0-line> notifying about state-changed READY to NULL (VOID_PENDING pending)
0:00:07.874097445 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'test-out-tok-0:0-line' changed state to 1(NULL) successfully
0:00:07.874103374 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<test-out-tok-0:0> completed state change to NULL
0:00:07.874108219 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<test-out-tok-0:0> notifying about state-changed READY to NULL (VOID_PENDING pending)
0:00:07.874113786 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'test-out-tok-0:0' changed state to 1(NULL) successfully
0:00:07.874118652 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2480:gst_bin_element_set_state:<identity_test-out-tok-0:0_0> current READY pending VOID_PENDING, desired next NULL
0:00:07.874123938 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<identity_test-out-tok-0:0_0> completed state change to NULL
0:00:07.874128457 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<identity_test-out-tok-0:0_0> notifying about state-changed READY to NULL (VOID_PENDING pending)
0:00:07.874133681 424007 0x7f8fe0001230 INFO              GST_STATES gstbin.c:2939:gst_bin_change_state_func:<test-out-tok-0:0> child 'identity_test-out-tok-0:0_0' changed state to 1(NULL) successfully
0:00:07.874138761 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2824:gst_element_continue_state:<test-out-tok-0:0> completed state change to NULL
0:00:07.874147020 424007 0x7f8fe0001230 INFO              GST_STATES gstelement.c:2724:_priv_gst_element_state_changed:<test-out-tok-0:0> notifying about state-changed READY to NULL (VOID_PENDING pending)
0:00:07.874160019 424007 0x7f8fe0001230 INFO           GST_PARENTAGE gstbin.c:1805:gst_bin_remove_func:<Switchboard> removed child "test-out-tok-0:0"
0:00:07.874201066 424007 0x7f8fe0001230 INFO           GST_PARENTAGE gstbin.c:4395:gst_bin_get_by_name: [test-ingest-in-0:video]: looking up child element udpsrc_rtp_test-ingest-in-0:video_0
0:00:07.874211699 424007 0x7f8fe0001230 INFO        GST_ELEMENT_PADS gstelement.c:1016:gst_element_get_static_pad: found pad udpsrc_rtp_test-ingest-in-0:video_0:src
0:00:07.874276133 424007 0x7f8fe0001230 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:5059:gst_rtp_jitter_buffer_src_query:<rtpjitterbuffer0> Peer latency: min 0:00:00.000000000 max 0:00:00.000000000
0:00:07.874278956 424007 0x7f8fdc000f50 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:3262:gst_rtp_jitter_buffer_chain:<rtpjitterbuffer0> Received packet #26214 at time 0:00:07.663529084, discont 0, rtx 0, inband NTP time 99:99:99.999999999
0:00:07.874310148 424007 0x7f8fdc000f50 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3268:gst_rtp_jitter_buffer_chain: Locking from thread 0x7f8fdc000f50
0:00:07.874314212 424007 0x7f8fdc000f50 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3268:gst_rtp_jitter_buffer_chain: Locked from thread 0x7f8fdc000f50
0:00:07.874319772 424007 0x7f8fdc000f50 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:3694:gst_rtp_jitter_buffer_chain:<rtpjitterbuffer0> flushing flushing
0:00:07.874283387 424007 0x7f8fe0001230 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:5064:gst_rtp_jitter_buffer_src_query: Locking from thread 0x7f8fe0001230
0:00:07.874325176 424007 0x7f8fdc000f50 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:2447:update_current_timer:<rtpjitterbuffer0> no more timers
0:00:07.874337685 424007 0x7f8fdc000f50 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:3666:gst_rtp_jitter_buffer_chain: Unlocking from thread 0x7f8fdc000f50
0:00:07.874343070 424007 0x7f8fe0001230 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:5064:gst_rtp_jitter_buffer_src_query: Locked from thread 0x7f8fe0001230
0:00:07.874347494 424007 0x7f8fe0001230 TRACE        rtpjitterbuffer gstrtpjitterbuffer.c:5067:gst_rtp_jitter_buffer_src_query: Unlocking from thread 0x7f8fe0001230
0:00:07.874351990 424007 0x7f8fe0001230 DEBUG        rtpjitterbuffer gstrtpjitterbuffer.c:5069:gst_rtp_jitter_buffer_src_query:<rtpjitterbuffer0> Our latency: 0:00:01.300000000
0:00:07.874344164 424007 0x7f8fdc000f50 INFO                 basesrc gstbasesrc.c:3083:gst_base_src_loop:<udpsrc_rtp_test-ingest-in-0:video_0> pausing after gst_pad_push() = flushing

We do block flow from the udpsrc, using:

gst::PadProbeType::BLOCK_DOWNSTREAM | gst::PadProbeType::IDLE,

while we remove the tee branch, and then we unblock afterwards. I guess some more care is needed?

You need to do that on a pad directly upstream of where you want to remove elements. If there are e.g. queues or other elements in between with different streaming threads, then they will continue running with whatever data is left in them even if upstream of them is blocked.

hoo humm!

Thank you @slomo, but I seem to get the same error when I block at the pad closest to the one I remove …

What would be the easiest way for me to find which downstream element / pad is flushing?

It feels like I am following the guidelines on dynamic pipeline manipulation, except for the step where the guide sends EOS to the sink => src of the element to remove.

I am not sure how to extrapolate that to my use case. Any debugging strategies are welcome; this is affecting my sanity.

If we cannot solve this here … is anyone going to be at FOSDEM who perhaps accepts beer as payment for debugging help?

I managed to avoid the error when I pushed a FlushStop event on the udpsrc after removing … but that felt silly and generated warnings about segment->format

Get GST_DEBUG=6 debug logs and check backwards from the log line that you found up to where the first flushing comes from.

In your case, this is basically the same as the second example I wrote in GStreamer Dynamic Pipelines (coaxion.net, slomo's blog). Maybe going through that helps?

The easiest for you would probably be to add an IDLE probe to the tee sinkpad, then from there get rid of everything downstream of the tee, including the request srcpad of the tee, and then return from the probe (roughly sketched below). If you don’t mind blocking the thread that removes everything, you could also directly release the tee srcpad as a first step and then get rid of everything downstream of it, without any pad probes.

That’s not needed in your case.
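
Roughly, the IDLE-probe variant could be sketched like this (placeholder names, not a drop-in implementation):

use gstreamer as gst;
use gst::prelude::*;

fn remove_branch_when_idle(
    pipeline: gst::Pipeline,
    tee: gst::Element,
    tee_srcpad: gst::Pad, // the request pad feeding the branch
    branch_bin: gst::Bin,
) {
    let tee_sinkpad = tee.static_pad("sink").expect("tee has a sink pad");
    // The probe may fire immediately (in which case add_probe() returns None),
    // so the returned id is intentionally ignored.
    let _ = tee_sinkpad.add_probe(gst::PadProbeType::IDLE, move |_pad, _info| {
        // Nothing flows through the tee while this runs, so the branch can be
        // unlinked, shut down and removed safely.
        if let Some(peer) = tee_srcpad.peer() {
            let _ = tee_srcpad.unlink(&peer);
        }
        let _ = branch_bin.set_state(gst::State::Null);
        let _ = pipeline.remove(&branch_bin);
        tee.release_request_pad(&tee_srcpad);
        // Returning Remove drops the probe and lets data flow again.
        gst::PadProbeReturn::Remove
    });
}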

Thank you!

A combination of using a pure IDLE probe and setting it closer to home seems to have done the trick.

You have my deep gratitude.


Ok, so some more fuzzing showed me that this did not solve it :frowning:

I still end up with a push onto a flushing pad …

Using the log tracer with GST_BUFFER (filtered with grep "res=-2") gives us:

0:00:05.789799121 98810 0x7f4160002af0 TRACE             GST_BUFFER :0:do_push_buffer_post:<test-out-tok-0:0:send_rtp_src_0> 0:00:05.789794629, pad=<test-out-tok-0:0:send_rtp_src_0>, res=-2
0:00:05.789811414 98810 0x7f4160002af0 TRACE             GST_BUFFER :0:do_push_buffer_post:<test-out-tok-0:0-line:send_rtp_src> 0:00:05.789807671, pad=<test-out-tok-0:0-line:send_rtp_src>, res=-2
0:00:05.789832605 98810 0x7f4160002af0 TRACE             GST_BUFFER :0:do_push_buffer_post:<send_rtp_sink_0:proxypad18> 0:00:05.789828372, pad=<send_rtp_sink_0:proxypad18>, res=-2
0:00:05.789874358 98810 0x7f4160002af0 TRACE             GST_BUFFER :0:do_push_buffer_post:<identity_test-out-tok-0:0_0:src> 0:00:05.789870224, pad=<identity_test-out-tok-0:0_0:src>, res=-2
0:00:05.789940339 98810 0x7f4160002af0 TRACE             GST_BUFFER :0:do_push_buffer_post:<sink_0:proxypad20> 0:00:05.789936204, pad=<sink_0:proxypad20>, res=-2
0:00:05.789973902 98810 0x7f4160002af0 TRACE             GST_BUFFER :0:do_push_buffer_post:<test-ingest-in-0:video:pad_0_src_1> 0:00:05.789969666, pad=<test-ingest-in-0:video:pad_0_src_1>, res=-2
0:00:05.790021332 98810 0x7f4160002af0 TRACE             GST_BUFFER :0:do_push_buffer_post:<incoming_tee_test-ingest-in-0:video_0_src_1:src> 0:00:05.790017407, pad=<incoming_tee_test-ingest-in-0:video_0_src_1:src>, res=-2
0:00:05.798272269 98810 0x7f4160002050 TRACE             GST_BUFFER :0:do_push_buffer_post:<tee_test-ingest-in-0:video_0:src_1> 0:00:05.798268801, pad=<tee_test-ingest-in-0:video_0:src_1>, res=-2
0:00:05.798279563 98810 0x7f4160002050 TRACE             GST_BUFFER :0:do_push_buffer_post:<identity_test-ingest-in-0:video_0:src> 0:00:05.798277186, pad=<identity_test-ingest-in-0:video_0:src>, res=-2
0:00:05.798286268 98810 0x7f4160002050 TRACE             GST_BUFFER :0:do_push_buffer_post:<test-ingest-in-0:video:recv_rtp_src_0_1570260884_96> 0:00:05.798283901, pad=<test-ingest-in-0:video:recv_rtp_src_0_1570260884_96>, res=-2
0:00:05.798293030 98810 0x7f4160002050 TRACE             GST_BUFFER :0:do_push_buffer_post:<rtpptdemux0:src_96> 0:00:05.798290571, pad=<rtpptdemux0:src_96>, res=-2
0:00:05.798299664 98810 0x7f4160002050 TRACE             GST_BUFFER :0:do_push_buffer_post:<rtpjitterbuffer0:src> 0:00:05.798297401, pad=<rtpjitterbuffer0:src>, res=-2
0:00:05.814421892 98810 0x7f4160000d60 TRACE             GST_BUFFER :0:do_push_buffer_post:<rtpssrcdemux0:src_1570260884> 0:00:05.814417672, pad=<rtpssrcdemux0:src_1570260884>, res=-2
0:00:05.814480385 98810 0x7f4160000d60 TRACE             GST_BUFFER :0:do_push_buffer_post:<rtpstorage0:src> 0:00:05.814475530, pad=<rtpstorage0:src>, res=-2
0:00:05.814525052 98810 0x7f4160000d60 TRACE             GST_BUFFER :0:do_push_buffer_post:<test-ingest-tok-0-line:recv_rtp_src> 0:00:05.814522312, pad=<test-ingest-tok-0-line:recv_rtp_src>, res=-2
0:00:05.814532699 98810 0x7f4160000d60 TRACE             GST_BUFFER :0:do_push_buffer_post:<recv_rtp_sink_0:proxypad0> 0:00:05.814531308, pad=<recv_rtp_sink_0:proxypad0>, res=-2
0:00:05.814549772 98810 0x7f4160000d60 TRACE             GST_BUFFER :0:do_push_buffer_post:<udpsrc_rtp_test-ingest-in-0:video_0:src> 0:00:05.814547075, pad=<udpsrc_rtp_test-ingest-in-0:video_0:src>, res=-2

And, Sebastian, I might be crazy, but it feels connected to the jitterbuffer. If I set the latency to 0 I do not see this either … but it is all very racy, so I do not know.

Any way I could debug this further?

So! After making really, really sure I follow your dynamic-tee-vsink example, removal of the outgoing path works fine!

Now our stress test runs further, but it never makes it past ~100 to ~200 iterations before it fails when removing the src part of the pipeline.

I dug in and read the documentation and thought I got everything right, but it hangs when setting the bin containing the src to the NULL state.

This happened in the open media room at FOSDEM and @ndufresne was there and helped me find the relevant stack trace in GDB:

Thread 13 (Thread 0x7f649f5fa6c0 (LWP 61797) "rocket-worker-t"):
#0  0x00007f64a52893c0 in __lll_lock_wait () at /lib64/libc.so.6
#1  0x00007f64a528fe97 in pthread_mutex_lock@@GLIBC_2.2.5 () at /lib64/libc.so.6
#2  0x00007f64a51b5948 in gst_base_transform_activate () at /lib64/libgstbase-1.0.so.0
#3  0x00007f64a51b5a07 in gst_base_transform_sink_activate_mode () at /lib64/libgstbase-1.0.so.0
#4  0x00007f64a58a6807 in activate_mode_internal () at /lib64/libgstreamer-1.0.so.0
#5  0x00007f64a58a733b in gst_pad_set_active () at /lib64/libgstreamer-1.0.so.0
#6  0x00007f64a588c69b in activate_pads.lto_priv.1.lto_priv () at /lib64/libgstreamer-1.0.so.0
#7  0x00007f64a5898ae5 in gst_iterator_fold () at /lib64/libgstreamer-1.0.so.0
#8  0x00007f64a590c27e in iterator_activate_fold_with_resync.constprop () at /lib64/libgstreamer-1.0.so.0
#9  0x00007f64a588c7fe in gst_element_pads_activate () at /lib64/libgstreamer-1.0.so.0
#10 0x00007f64a588ca21 in gst_element_change_state_func.lto_priv () at /lib64/libgstreamer-1.0.so.0
#11 0x00007f649c6d7960 in gst_identity_change_state () at /lib64/gstreamer-1.0/libgstcoreelements.so
#12 0x00007f64a588ba94 in gst_element_change_state () at /lib64/libgstreamer-1.0.so.0
#13 0x00007f64a588bad8 in gst_element_change_state () at /lib64/libgstreamer-1.0.so.0
#14 0x00007f64a588c349 in gst_element_set_state_func.lto_priv () at /lib64/libgstreamer-1.0.so.0
#15 0x0000559d33c62139 in gstreamer::auto::element::{impl#3}::set_state<gstreamer::auto::element::Element> (self=0x7f649f5f2e58, state=gstreamer::auto::enums::State::Null) at /home/jonasdn/.cargo/registry/src/index.crates.io-6f17d22bba15001f/gstreamer-0.20.7/src/auto/element.rs:547
#16 0x0000559d33c2f8bc in switchboard::jack::SwitchboardJack::remove_line (bin=0x7f649f5f3938, index=0) at spiideo-switchboard/src/jack.rs:825
#17 0x0000559d33c30151 in switchboard::jack::SwitchboardJack::remove_incoming_jack_line_thread_fn (bin=0x7f649f5f3938, index=0, pipeline=0x7f649f5f3940) at spiideo-switchboard/src/jack.rs:868
#18 0x0000559d33c31835 in switchboard::jack::SwitchboardJack::remove_incoming (self=0x7f6468324ae0, pipeline=0x7f64a00c9510) at spiideo-switchboard/src/jack.rs:947
#19 0x0000559d33c34332 in switchboard::jack::SwitchboardJack::remove (self=0x7f6468324ae0, pipeline=0x7f64a00c9510) at spiideo-switchboard/src/jack.rs:1072
#20 0x0000559d33bb2dd6 in switchboard::Switchboard::remove_jack (self=0x7f64a00c9510, name=...) at spiideo-switchboard/src/lib.rs:499
#21 0x0000559d331b7b08 in shibuya::switchboard_actor::SwitchboardActor::remove_incoming_jack (self=0x559d36e7d8b0) at shibuya/src/switchboard_actor.rs:372
#22 0x0000559d331b7f54 in shibuya::switchboard_actor::SwitchboardActor::handle_message (self=0x559d36e7d8b0, msg=...) at shibuya/src/switchboard_actor.rs:448
#23 0x0000559d331d001c in shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn#0} () at shibuya/src/switchboard_actor.rs:228
#24 0x0000559d330cbd40 in tokio::runtime::task::core::{impl#6}::poll::{closure#0}<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>> (ptr=0x559d36e7d8b0) at /home/jonasdn/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/core.rs:334
#25 0x0000559d330c8efb in tokio::loom::std::unsafe_cell::UnsafeCell<tokio::runtime::task::core::Stage<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}>>::with_mut<tokio::runtime::task::core::Stage<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}>, core::task::poll::Poll<()>, tokio::runtime::task::core::{impl#6}::poll::{closure_env#0}<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>>> (self=0x559d36e7d8b0, f=...) at /home/jonasdn/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/loom/std/unsafe_cell.rs:16
#26 tokio::runtime::task::core::Core<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>>::poll<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>> (self=0x559d36e7d8a0, cx=...) at /home/jonasdn/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/core.rs:323
#27 0x0000559d33183b11 in tokio::runtime::task::harness::poll_future::{closure#0}<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>> () at /home/jonasdn/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:485
#28 0x0000559d3323bae3 in core::panic::unwind_safe::{impl#23}::call_once<core::task::poll::Poll<()>, tokio::runtime::task::harness::poll_future::{closure_env#0}<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>>> (self=...) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/core/src/panic/unwind_safe.rs:271
#29 0x0000559d3308690b in std::panicking::try::do_call<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future::{closure_env#0}<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>>>, core::task::poll::Poll<()>> (data=0x7f649f5f83a8) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/panicking.rs:526
#30 0x0000559d3308d70b in __rust_try ()
#31 0x0000559d33082ba8 in std::panicking::try<core::task::poll::Poll<()>, core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future::{closure_env#0}<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>>>> (f=...) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/panicking.rs:490
#32 0x0000559d3303bbca in std::panic::catch_unwind<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future::{closure_env#0}<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>>>, core::task::poll::Poll<()>> (f=...) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/panic.rs:142
#33 0x0000559d3318212e in tokio::runtime::task::harness::poll_future<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>> (core=0x559d36e7d8a0, cx=...) at /home/jonasdn/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:473
#34 0x0000559d3318775f in tokio::runtime::task::harness::Harness<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>>::poll_inner<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>> (self=0x7f649f5f85c0) at /home/jonasdn/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:208
#35 0x0000559d3318bea3 in tokio::runtime::task::harness::Harness<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>>::poll<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>> (self=...) at /home/jonasdn/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/harness.rs:153
#36 0x0000559d3335740b in tokio::runtime::task::raw::poll<shibuya::switchboard_actor::{impl#1}::run_actor::{async_fn_env#0}, alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>> (ptr=...) at /home/jonasdn/.cargo/registry/src/index.crates.io-6f17d22bba15001f/tokio-1.32.0/src/runtime/task/raw.rs:276
#37 0x0000559d35524d77 in tokio::runtime::task::raw::RawTask::poll (self=...) at src/runtime/task/raw.rs:200
#38 0x0000559d35563502 in tokio::runtime::task::LocalNotified<alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>>::run<alloc::sync::Arc<tokio::runtime::scheduler::multi_thread::handle::Handle, alloc::alloc::Global>> (self=...) at src/runtime/task/mod.rs:400
#39 0x0000559d3555a0f0 in tokio::runtime::scheduler::multi_thread::worker::{impl#1}::run_task::{closure#0} () at src/runtime/scheduler/multi_thread/worker.rs:639
#40 0x0000559d35559754 in tokio::runtime::coop::with_budget<core::result::Result<alloc::boxed::Box<tokio::runtime::scheduler::multi_thread::worker::Core, alloc::alloc::Global>, ()>, tokio::runtime::scheduler::multi_thread::worker::{impl#1}::run_task::{closure_env#0}> (budget=..., f=...) at src/runtime/coop.rs:107
#41 tokio::runtime::coop::budget<core::result::Result<alloc::boxed::Box<tokio::runtime::scheduler::multi_thread::worker::Core, alloc::alloc::Global>, ()>, tokio::runtime::scheduler::multi_thread::worker::{impl#1}::run_task::{closure_env#0}> (f=...) at src/runtime/coop.rs:73
#42 tokio::runtime::scheduler::multi_thread::worker::Context::run_task (self=0x7f649f5f8cb8, task=..., core=0x559d36e86b60) at src/runtime/scheduler/multi_thread/worker.rs:575
#43 0x0000559d35558c7a in tokio::runtime::scheduler::multi_thread::worker::Context::run (self=0x7f649f5f8cb8, core=0x559d36e86b60) at src/runtime/scheduler/multi_thread/worker.rs:526
#44 0x0000559d35558789 in tokio::runtime::scheduler::multi_thread::worker::run::{closure#0}::{closure#0} () at src/runtime/scheduler/multi_thread/worker.rs:491
#45 0x0000559d35543d90 in tokio::runtime::context::scoped::Scoped<tokio::runtime::scheduler::Context>::set<tokio::runtime::scheduler::Context, tokio::runtime::scheduler::multi_thread::worker::run::{closure#0}::{closure_env#0}, ()> (self=0x7f649f5fa3b0, t=0x7f649f5f8cb0, f=...) at src/runtime/context/scoped.rs:40
#46 0x0000559d355132cb in tokio::runtime::context::set_scheduler::{closure#0}<(), tokio::runtime::scheduler::multi_thread::worker::run::{closure#0}::{closure_env#0}> (c=0x7f649f5fa378) at src/runtime/context.rs:176
#47 0x0000559d355653fe in std::thread::local::LocalKey<tokio::runtime::context::Context>::try_with<tokio::runtime::context::Context, tokio::runtime::context::set_scheduler::{closure_env#0}<(), tokio::runtime::scheduler::multi_thread::worker::run::{closure#0}::{closure_env#0}>, ()> (self=0x559d362d6188, f=...) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/thread/local.rs:270
#48 0x0000559d3556433b in std::thread::local::LocalKey<tokio::runtime::context::Context>::with<tokio::runtime::context::Context, tokio::runtime::context::set_scheduler::{closure_env#0}<(), tokio::runtime::scheduler::multi_thread::worker::run::{closure#0}::{closure_env#0}>, ()> (self=0x559d362d6188, f=Python Exception <class 'gdb.MemoryError'>: Cannot access memory at address 0x80
#49 0x0000559d35513244 in tokio::runtime::context::set_scheduler<(), tokio::runtime::scheduler::multi_thread::worker::run::{closure#0}::{closure_env#0}> (v=0x7f649f5f8cb0, f=...) at src/runtime/context.rs:176
#50 0x0000559d35558691 in tokio::runtime::scheduler::multi_thread::worker::run::{closure#0} () at src/runtime/scheduler/multi_thread/worker.rs:486
#51 0x0000559d35521888 in tokio::runtime::context::runtime::enter_runtime<tokio::runtime::scheduler::multi_thread::worker::run::{closure_env#0}, ()> (handle=0x7f649f5f8eb8, allow_block_in_place=true, f=...) at src/runtime/context/runtime.rs:65
#52 0x0000559d3555841c in tokio::runtime::scheduler::multi_thread::worker::run (worker=...) at src/runtime/scheduler/multi_thread/worker.rs:478
#53 0x0000559d3555828b in tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure#0} () at src/runtime/scheduler/multi_thread/worker.rs:447
#54 0x0000559d3555ca4e in tokio::runtime::blocking::task::{impl#2}::poll<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}, ()> (self=..., _cx=0x7f649f5f9040) at src/runtime/blocking/task.rs:42
#55 0x0000559d3552ebc3 in tokio::runtime::task::core::{impl#6}::poll::{closure#0}<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule> (ptr=0x559d36e9c028) at src/runtime/task/core.rs:334
#56 0x0000559d3552ea3b in tokio::loom::std::unsafe_cell::UnsafeCell<tokio::runtime::task::core::Stage<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>>>::with_mut<tokio::runtime::task::core::Stage<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>>, core::task::poll::Poll<()>, tokio::runtime::task::core::{impl#6}::poll::{closure_env#0}<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule>> (self=0x559d36e9c028, f=...) at src/loom/std/unsafe_cell.rs:16
#57 tokio::runtime::task::core::Core<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule>::poll<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule> (self=0x559d36e9c020, cx=...) at src/runtime/task/core.rs:323
#58 0x0000559d3554bf75 in tokio::runtime::task::harness::poll_future::{closure#0}<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule> () at src/runtime/task/harness.rs:485
#59 0x0000559d3552b743 in core::panic::unwind_safe::{impl#23}::call_once<core::task::poll::Poll<()>, tokio::runtime::task::harness::poll_future::{closure_env#0}<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule>> (self=...) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/core/src/panic/unwind_safe.rs:271
#60 0x0000559d3552950c in std::panicking::try::do_call<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future::{closure_env#0}<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule>>, core::task::poll::Poll<()>> (data=0x7f649f5f91a8) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/panicking.rs:526
#61 0x0000559d3552b54b in __rust_try ()
#62 0x0000559d35528e08 in std::panicking::try<core::task::poll::Poll<()>, core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future::{closure_env#0}<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule>>> (f=...) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/panicking.rs:490
#63 0x0000559d355776eb in std::panic::catch_unwind<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future::{closure_env#0}<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule>>, core::task::poll::Poll<()>> (f=...) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/panic.rs:142
#64 0x0000559d3554bd4f in tokio::runtime::task::harness::poll_future<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule> (core=0x559d36e9c020, cx=...) at src/runtime/task/harness.rs:473
#65 0x0000559d3554aa99 in tokio::runtime::task::harness::Harness<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule>::poll_inner<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule> (self=0x7f649f5f93c0) at src/runtime/task/harness.rs:208
#66 0x0000559d3554a817 in tokio::runtime::task::harness::Harness<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule>::poll<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule> (self=...) at src/runtime/task/harness.rs:153
#67 0x0000559d3552502d in tokio::runtime::task::raw::poll<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::scheduler::multi_thread::worker::{impl#0}::launch::{closure_env#0}>, tokio::runtime::blocking::schedule::BlockingSchedule> (ptr=...) at src/runtime/task/raw.rs:276
#68 0x0000559d35524d77 in tokio::runtime::task::raw::RawTask::poll (self=...) at src/runtime/task/raw.rs:200
#69 0x0000559d355635c7 in tokio::runtime::task::UnownedTask<tokio::runtime::blocking::schedule::BlockingSchedule>::run<tokio::runtime::blocking::schedule::BlockingSchedule> (self=...) at src/runtime/task/mod.rs:437
#70 0x0000559d35525687 in tokio::runtime::blocking::pool::Task::run (self=...) at src/runtime/blocking/pool.rs:159
#71 0x0000559d35527ce9 in tokio::runtime::blocking::pool::Inner::run (self=0x559d36e81b50, worker_thread_id=11) at src/runtime/blocking/pool.rs:513
#72 0x0000559d35527a14 in tokio::runtime::blocking::pool::{impl#6}::spawn_thread::{closure#0} () at src/runtime/blocking/pool.rs:471
#73 0x0000559d3553ac76 in std::sys_common::backtrace::__rust_begin_short_backtrace<tokio::runtime::blocking::pool::{impl#6}::spawn_thread::{closure_env#0}, ()> (f=...) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/sys_common/backtrace.rs:154
#74 0x0000559d3553c232 in std::thread::{impl#0}::spawn_unchecked_::{closure#1}::{closure#0}<tokio::runtime::blocking::pool::{impl#6}::spawn_thread::{closure_env#0}, ()> () at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/thread/mod.rs:529
#75 0x0000559d3552b5c2 in core::panic::unwind_safe::{impl#23}::call_once<(), std::thread::{impl#0}::spawn_unchecked_::{closure#1}::{closure_env#0}<tokio::runtime::blocking::pool::{impl#6}::spawn_thread::{closure_env#0}, ()>> (self=...) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/core/src/panic/unwind_safe.rs:271
#76 0x0000559d35529429 in std::panicking::try::do_call<core::panic::unwind_safe::AssertUnwindSafe<std::thread::{impl#0}::spawn_unchecked_::{closure#1}::{closure_env#0}<tokio::runtime::blocking::pool::{impl#6}::spawn_thread::{closure_env#0}, ()>>, ()> (data=0x7f649f5f9830) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/panicking.rs:526
#77 0x0000559d3552b54b in __rust_try ()
#78 0x0000559d35529081 in std::panicking::try<(), core::panic::unwind_safe::AssertUnwindSafe<std::thread::{impl#0}::spawn_unchecked_::{closure#1}::{closure_env#0}<tokio::runtime::blocking::pool::{impl#6}::spawn_thread::{closure_env#0}, ()>>> (f=...) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/panicking.rs:490
#79 0x0000559d3553c05a in std::panic::catch_unwind<core::panic::unwind_safe::AssertUnwindSafe<std::thread::{impl#0}::spawn_unchecked_::{closure#1}::{closure_env#0}<tokio::runtime::blocking::pool::{impl#6}::spawn_thread::{closure_env#0}, ()>>, ()> (f=...) at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/panic.rs:142
#80 std::thread::{impl#0}::spawn_unchecked_::{closure#1}<tokio::runtime::blocking::pool::{impl#6}::spawn_thread::{closure_env#0}, ()> () at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/std/src/thread/mod.rs:528
#81 0x0000559d3553c3df in core::ops::function::FnOnce::call_once<std::thread::{impl#0}::spawn_unchecked_::{closure_env#1}<tokio::runtime::blocking::pool::{impl#6}::spawn_thread::{closure_env#0}, ()>, ()> () at /rustc/58e967a9cc3bd39122e8cb728e8cec6e3a4eeef2/library/core/src/ops/function.rs:250
#82 0x0000559d35658295 in alloc::boxed::{impl#47}::call_once<(), dyn core::ops::function::FnOnce<(), Output=()>, alloc::alloc::Global> () at library/alloc/src/boxed.rs:2007
#83 alloc::boxed::{impl#47}::call_once<(), alloc::boxed::Box<dyn core::ops::function::FnOnce<(), Output=()>, alloc::alloc::Global>, alloc::alloc::Global> () at library/alloc/src/boxed.rs:2007
#84 std::sys::unix::thread::{impl#2}::new::thread_start () at library/std/src/sys/unix/thread.rs:108
#85 0x00007f64a528c897 in start_thread () at /lib64/libc.so.6
#86 0x00007f64a53136fc in clone3 () at /lib64/libc.so.6

It looked like it could perhaps be some interaction between tokio (thread sharing?) and the streaming thread?

When I did the actual removal (set to NULL, remove from pipeline) from a separate thread, pretty much like this:

let bin_weak = bin.downgrade();
let pipeline_weak = pipeline.downgrade();
std::thread::spawn(move || {
    let bin = bin_weak.upgrade().unwrap();
    let pipeline = pipeline_weak.upgrade().unwrap();
    // The closure returns (), so the error has to be handled here rather
    // than propagated with `?`.
    if let Err(err) = Self::remove_incoming_jack_line_thread_fn(&bin, index, &pipeline) {
        eprintln!("removing incoming jack line failed: {err}");
    }
});

I did not see the hang anymore. It is a bit hard to describe our setup … it is Rust with a Rocket web server that gets requests to add / remove sinks / srcs to a pipeline.
It is an actor pattern where we send messages to the thread that runs the pipeline main loop. But here it looks like the streaming thread and the tokio threads are intermingled? It is difficult for me to follow.

Do you have any insight here?

There is an rtpjitterbuffer in the mix, and our feeling is that this is more prevalent when we set a higher latency (~1000 ms and up).

Setting an element’s state has the potential to block. You should not perform blocking operations from async threads (like tokio’s). Your move of doing the set_state in a separate thread is the correct one.


Thank you! Then we will feel better about doing it!

You can also use pipeline.call_async() to do that from a thread pool, or use tokio::task::spawn_blocking() for doing that from another thread pool.
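
The plain call_async() variant is fire-and-forget and could look like this (bin stands in for whatever element or bin is being shut down; assumes the same gst imports as the snippets below):

// The closure runs on a GStreamer thread pool, so the calling (tokio) thread
// never blocks on the state change.
bin.call_async(|bin| {
    let _ = bin.set_state(gst::State::Null);
});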


Thanks!

Should I be using this in all the places where I change element states when I might be coming from a Tokio async thread? Or is setting to NULL more susceptible?

I would do that everywhere. Easy enough to do with Rust.

You could even do something like

let join_handle = tokio::task::spawn_blocking(move || {
    pipeline.set_state(gst::State::Null)
});
let state_change_res = join_handle.await;

or similarly

let join_handle = pipeline.call_async_future(|pipeline| {
    pipeline.set_state(gst::State::Null)
});
let state_change_res = join_handle.await;