Hi everyone,
I have a webpage that captures the camera and microphone of a connected client and streams them over a websocket to my server, where a PHP script (behind an nginx websocket proxy) running a Ratchet websocket server accepts the stream and lets me forward the data as messages arrive.
I was able to get this working fine with GStreamer's tcpserversrc on a specific port, and it worked great. But I would prefer a more 'local' solution, like a unix socket, to make that transfer instead. I've tried working through it with ChatGPT, but it keeps sending me down rabbit holes without a proper solution.
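For context, the working TCP version looks roughly like this on the PHP side (heavily simplified to a single stream; the exact pipeline behind tcpserversrc is just an example, assuming webm chunks coming from MediaRecorder in the browser):

```php
<?php
// Simplified sketch of the Ratchet handler behind the working TCP setup
// (single stream only; error handling stripped). The GStreamer side runs
// separately, e.g. something like:
//   gst-launch-1.0 tcpserversrc host=127.0.0.1 port=5000 ! matroskademux ! ...
use Ratchet\ConnectionInterface;
use Ratchet\MessageComponentInterface;

class StreamRelay implements MessageComponentInterface
{
    /** @var resource|null client connection to the pipeline's tcpserversrc */
    private $gst = null;

    public function onOpen(ConnectionInterface $conn)
    {
        // Connect to tcpserversrc when the browser starts streaming.
        $this->gst = stream_socket_client('tcp://127.0.0.1:5000', $errno, $errstr, 5);
    }

    public function onMessage(ConnectionInterface $from, $msg)
    {
        // Each websocket message is a binary media chunk; forward it as-is.
        if (is_resource($this->gst)) {
            fwrite($this->gst, (string) $msg);
        }
    }

    public function onClose(ConnectionInterface $conn)
    {
        if (is_resource($this->gst)) {
            fclose($this->gst);
        }
        $this->gst = null;
    }

    public function onError(ConnectionInterface $conn, \Exception $e)
    {
        $conn->close();
    }
}
```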
So I thought I'd ask here: how should I approach this?
I want to avoid TCP ports because I want the script to scale and handle multiple streams, so I'd rather designate a local socket or FIFO pipe per stream for GStreamer to read from.
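To make that concrete, here's the kind of per-stream FIFO setup I'm picturing. This is just a sketch, not something I have working; the paths, the example pipeline after fdsrc, and the fdsrc-via-stdin trick are all guesses on my part:

```php
<?php
// Per-stream sketch: create a named FIFO, start gst-launch-1.0 with its
// stdin redirected from that FIFO, then write the websocket chunks into it.
$streamId = uniqid('stream_', true);
$fifo = "/tmp/{$streamId}.fifo";

if (!posix_mkfifo($fifo, 0600)) {
    throw new RuntimeException("Could not create FIFO at {$fifo}");
}

// fdsrc fd=0 reads the process's stdin, which the shell points at the FIFO.
$cmd = 'gst-launch-1.0 fdsrc fd=0 ! matroskademux ! decodebin ! autovideosink'
     . ' < ' . escapeshellarg($fifo);

$process = proc_open($cmd, [
    1 => ['file', '/dev/null', 'w'],
    2 => ['file', '/dev/null', 'w'],
], $pipes);

// Opening the write end blocks until the reader (the launched pipeline)
// has the FIFO open, so do this after proc_open().
$writer = fopen($fifo, 'w');

// In the real handler this would happen per incoming websocket message:
// fwrite($writer, $binaryChunk);
```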
So I'm curious what I should be using to implement this, assuming it's even possible. I've dabbled a bit with fdsrc, unixfdsrc, and socketsrc, but I'm not sure whether any of those is what I should be using, and I don't see many examples of how to use them out there.
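For example, here's the kind of thing I've been poking at with fdsrc: handing one end of a unix socket pair to gst-launch as its stdin and writing to the other end from PHP. I have no idea if this is the intended way to use fdsrc, or whether socketsrc or unixfdsrc would be more appropriate, which is really the heart of my question:

```php
<?php
// Variation without a filesystem FIFO: a connected unix socket pair, one end
// handed to gst-launch as its stdin so fdsrc fd=0 can read from it, the
// other end kept in PHP for writing the incoming chunks.
$pair = stream_socket_pair(STREAM_PF_UNIX, STREAM_SOCK_STREAM, STREAM_IPPROTO_IP);
if ($pair === false) {
    throw new RuntimeException('stream_socket_pair() failed');
}
[$phpEnd, $gstEnd] = $pair;

$process = proc_open(
    'gst-launch-1.0 fdsrc fd=0 ! matroskademux ! decodebin ! autovideosink',
    [
        0 => $gstEnd,                      // becomes the child's stdin
        1 => ['file', '/dev/null', 'w'],
        2 => ['file', '/dev/null', 'w'],
    ],
    $pipes
);

fclose($gstEnd); // the child process holds its own copy now

// Later, from the websocket handler:
// fwrite($phpEnd, $binaryChunk);
```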
Can someone point me in the right direction?
Thanks!