Audio/Video synchronization for two appsrc elements

Hello,
I am trying to stream from CEF to RTMP and I am having trouble synchronizing audio and video. I have two appsrc elements, one for audio and one for video. Both streams carry PTS and both start at 0 s, but the audio buffers arrive delayed by 100+ ms. That is a problem for flvmux, which needs strictly monotonic timestamps. I need something before flvmux that will align the buffers of both streams by PTS. I tried multiqueue with sync-by-running-time, but I don't think I understand it correctly (a rough sketch of what I tried is below, after my current pipeline).
Is there an element that will take multiple queues and simply sort both streams by PTS?
This is my pipeline now:

 appsrc is-live=true name=source format=time  caps=video/x-raw,format=I420,width=1920,height=1080,framerate=30/1
 ! x264enc name=h264enc bitrate=3000 speed-preset=superfast tune=zerolatency
 ! mux.
 appsrc is-live=true name=audiosource format=time
 ! audio/x-raw,rate=44100,channels=2,format=F32LE,layout=interleaved
 ! audioconvert
 ! avenc_aac name=audioenc bitrate=128000
 ! mux.
 flvmux name=mux streamable=true
 ! rtmp2sink name=muxsink location=xxxx
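
For reference, this is roughly how I tried to wire multiqueue with sync-by-running-time between the encoders and flvmux. It is an untested sketch built with gst_parse_launch; the pad names (mq.sink_0, mq.src_0, mux.video, mux.audio) are just how I understood the request pads, and feeding the two appsrc elements is left out, so please treat it as an illustration of the attempt rather than a working setup:

 #include <gst/gst.h>

 int
 main (int argc, char *argv[])
 {
   GError *error = NULL;

   gst_init (&argc, &argv);

   /* Same elements as the pipeline above, but both encoded streams go through
    * one multiqueue with sync-by-running-time=true before flvmux. Feeding the
    * two appsrc elements from the application is omitted here. */
   GstElement *pipeline = gst_parse_launch (
       "appsrc is-live=true name=source format=time "
       "caps=video/x-raw,format=I420,width=1920,height=1080,framerate=30/1 "
       "! x264enc name=h264enc bitrate=3000 speed-preset=superfast tune=zerolatency "
       "! mq.sink_0 "
       "appsrc is-live=true name=audiosource format=time "
       "! audio/x-raw,rate=44100,channels=2,format=F32LE,layout=interleaved "
       "! audioconvert ! avenc_aac name=audioenc bitrate=128000 "
       "! mq.sink_1 "
       "multiqueue name=mq sync-by-running-time=true "
       "mq.src_0 ! mux.video "
       "mq.src_1 ! mux.audio "
       "flvmux name=mux streamable=true ! rtmp2sink name=muxsink location=xxxx",
       &error);

   if (pipeline == NULL) {
     g_printerr ("Failed to build pipeline: %s\n", error->message);
     g_clear_error (&error);
     return 1;
   }

   GMainLoop *loop = g_main_loop_new (NULL, FALSE);
   gst_element_set_state (pipeline, GST_STATE_PLAYING);
   g_main_loop_run (loop);
   return 0;
 }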

Can you try to start the audio first and then the video? Then the gap might be gone.

I have implemented a double buffer in front of GStreamer that feeds both streams into their appsrc elements ordered by PTS, and that works quite well. (There is still a little desync in flvmux because avenc_aac holds back one buffer, but I can also compensate for that in my double buffer.) However, my target pipeline will be more complicated, so my question is whether there is a way, using existing GStreamer components, to synchronize both streams by PTS before flvmux.
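
For context, here is a minimal sketch in C of what my double buffer does; the PendingBuffer struct and the function names are simplified placeholders for what I actually have around CEF:

 #include <gst/gst.h>
 #include <gst/app/gstappsrc.h>

 /* Simplified pending buffer: raw bytes plus the PTS I assigned to them. */
 typedef struct {
   guint8      *data;
   gsize        size;
   GstClockTime pts;
 } PendingBuffer;

 /* Wrap the bytes in a GstBuffer, stamp the PTS and push it into appsrc,
  * which takes ownership of the GstBuffer. */
 static void
 push_and_free (GstAppSrc *appsrc, PendingBuffer *pb)
 {
   GstBuffer *buf = gst_buffer_new_allocate (NULL, pb->size, NULL);

   gst_buffer_fill (buf, 0, pb->data, pb->size);
   GST_BUFFER_PTS (buf) = pb->pts;
   gst_app_src_push_buffer (appsrc, buf);

   g_free (pb->data);
   g_free (pb);
 }

 /* Core of the double buffer: while both queues hold data, always push the
  * buffer with the smaller PTS first, so the muxer sees interleaved,
  * monotonically increasing timestamps on its two pads. */
 static void
 feed_in_pts_order (GQueue *video_q, GQueue *audio_q,
                    GstAppSrc *video_src, GstAppSrc *audio_src)
 {
   while (!g_queue_is_empty (video_q) && !g_queue_is_empty (audio_q)) {
     PendingBuffer *v = g_queue_peek_head (video_q);
     PendingBuffer *a = g_queue_peek_head (audio_q);

     if (v->pts <= a->pts)
       push_and_free (video_src, g_queue_pop_head (video_q));
     else
       push_and_free (audio_src, g_queue_pop_head (audio_q));
   }
 }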