Displaying livestream video and saving individual frames at the same time on Android

To display the livestream coming from an ESP32, this is the current code that initialises the video player on screen:

    private fun initPlayer() {
        Log.i(TAG, "initPlayer")

        val streamURL = viewModel.setGstPipeline()

        Log.i(TAG, "Pipeline sent to C code: $streamURL")
        // pass the pipeline description to the C side
        nativeSetPipeline(streamURL)

        // initialise the GStreamer player and warn if it fails
        try {
            GStreamer.init(requireContext())
        } catch (e: Exception) {
            Toast.makeText(requireContext(), e.message, Toast.LENGTH_LONG).show()
        }

        val sv = binding.surfaceVideo
        val sh = sv.holder
        sh.addCallback(this)

        // initialise the native code so it is ready to accept commands
        nativeInit()
    }

I am sending the following pipeline description to GStreamer. I have read that a `tee` element lets me both display the stream and make each frame available for further processing:

    souphttpsrc location=http://192.168.0.100 ! multipartdemux ! jpegdec ! videoconvert ! tee name=splitter
        splitter. ! queue ! autovideosink
        splitter. ! queue ! jpegenc snapshot=TRUE ! appsink
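From reading the `jpegenc` docs, `snapshot=TRUE` appears to make that branch emit a single frame and then send EOS, so for continuous capture I would presumably have to drop it. Also, since `multipartdemux` already outputs JPEG frames, the `tee` could sit before `jpegdec`, so the `appsink` branch receives the JPEGs without a decode/re-encode round trip. This is the variant I am considering (untested; the sink name `framesink` and the `queue`/`appsink` properties are my own guesses):

    souphttpsrc location=http://192.168.0.100 ! multipartdemux ! tee name=splitter
        splitter. ! queue ! jpegdec ! videoconvert ! autovideosink
        splitter. ! queue leaky=downstream ! appsink name=framesink max-buffers=1 drop=true

My understanding is that `max-buffers=1 drop=true` keeps the appsink from stalling the display branch if my processing cannot keep up with the frame rate.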

The goal is to:

  1. Save the frame to the Android device locally.
  2. Pass the resulting bitmap to a TFLite model for on-device object detection.
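For goal 1, once the native side hands a complete JPEG frame across JNI as a `ByteArray` (the callback name `onFrameReceived` below is my own placeholder, not an existing API), writing it to local storage is plain Kotlin; the same bytes could then presumably be decoded with `BitmapFactory.decodeByteArray` for the TFLite step:

```kotlin
import java.io.File

// Placeholder for a JNI callback invoked from the native appsink handler.
// jpegBytes holds one complete JPEG frame; outputDir would be e.g. the
// app's filesDir or cacheDir.
fun onFrameReceived(jpegBytes: ByteArray, outputDir: File): File {
    val frameFile = File(outputDir, "frame_${System.currentTimeMillis()}.jpg")
    frameFile.writeBytes(jpegBytes) // save the frame locally (goal 1)
    return frameFile
}
```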

How should I modify my code to access the appsink so that I can retrieve the individual frames?
And is appsink the right approach for this in the first place?