When using 'ahcsrc' streaming camera on Android, can it stream output to both 'glimagesink' and 'rtph264pay' simultaneously?

I cannot see anything on the device's surface view, but I can stream to and play the feed in VLC on a PC or another Android device.

Waiting for your help. Thank you.

Nancy

Perhaps there is something to improve in your usage of tee? Maybe you want to tell a bit more about your pipeline?

Can you help check where the problem is?

My usage is as follows:

str = "( ahcsrc ! queue ! timeoverlay ! x264enc tune=zerolatency ! rtph264pay pt=96 name=pay0 "
      "openslessrc ! queue ! audioconvert ! rtpL16pay pt=96 name=pay1 ! openslessink )";
factory = gst_rtsp_media_factory_new ();
ahc->ahcsrc = gst_rtsp_media_factory_set_launch (factory, str);
gst_rtsp_media_factory_set_shared (factory, TRUE);
gst_rtsp_mount_points_add_factory (mounts, "/test", factory);
gst_rtsp_server_attach (server, context);
ahc->vsink = gst_element_factory_make ("glimagesink", "vsink");
ahc->pipeline = gst_pipeline_new ("camera-pipeline");

gst_bin_add_many (GST_BIN (ahc->pipeline), ahc->ahcsrc,
    ahc->vsink,
    NULL);

g_object_set (ahc->vsink, "sync", TRUE, NULL);
gst_element_link_many (ahc->ahcsrc, ahc->vsink, NULL);

ahc->vsink should display the video, but nothing is shown.

Nancy

Hi @nancywang, can you get the source from gst_element_factory_make ("ahcsrc", "camera"); ? I always get NULL when I try to create the ahcsrc element.

Yes, I can.
Just like:
GstElement *ahcsrc = gst_element_factory_make ("ahcsrc", "source");
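
If the call returns NULL for you, the androidmedia plugin (which provides ahcsrc) is probably not registered. A minimal check, assuming GStreamer has already been initialized for Android (normally via GStreamer.init() in the Java layer):

```c
#include <gst/gst.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  /* ahcsrc lives in the androidmedia plugin; if the factory lookup
     returns NULL, the plugin was not registered at init time. */
  GstElement *ahcsrc = gst_element_factory_make ("ahcsrc", "source");
  if (ahcsrc == NULL) {
    g_printerr ("ahcsrc not available: is the androidmedia plugin linked in?\n");
    return 1;
  }

  gst_object_unref (ahcsrc);
  return 0;
}
```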

BRs,
Nancy

It's not possible to place the same element in two pipelines. As GstRTSPServer manages its own pipeline, the only way to multiplex the camera is to bridge the two pipelines with a tee, using appsrc/appsink, or, more easily, intervideosrc/intervideosink:

// Preview pipeline
ahcsrc ! tee name=t
  t. ! queue ! intervideosink
  t. ! queue ! glimagesink

// RTSP Server pipeline
intervideosrc ! timeoverlay ! x264enc tune=zerolatency ! rtph264pay pt=96 name=pay0
openslessrc ! queue ! audioconvert ! rtpL16pay pt=96 name=pay1

Notice you had a spurious openslessink in this pipeline; it can't handle RTP packets, so that can't work. You should follow the RTSP server examples for further details on how to handle that, since it's not done correctly in your sample code.

Ok, I got it.
Thank you for your help.

BRs,
Nancy