WebSocket stream as GStreamer source via local (Unix) sockets (instead of tcpserversrc)

Hi everyone,
I have a webpage that captures a connected client’s camera and microphone and streams them to my server via WebSocket. On the server side, the stream passes through an nginx WebSocket proxy to a PHP script running a Ratchet WebSocket server, which accepts it and lets me forward the data as messages are received.

I was able to get this working fine using GStreamer’s tcpserversrc on a specific port. But I would prefer a more ‘local’ solution, such as a Unix socket, for that transfer instead. I’ve tried working through this with ChatGPT, but it keeps sending me down rabbit holes without ever reaching a proper solution.
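For reference, a rough sketch (in Python, via Gst.parse_launch) of the kind of receiving pipeline I mean; the port and everything after tcpserversrc are placeholders, since the real elements depend on the capture format:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# Listen on a TCP port and dump whatever arrives to a file;
# the real pipeline would demux/decode instead of using filesink.
pipeline = Gst.parse_launch(
    "tcpserversrc host=127.0.0.1 port=9000 ! queue ! filesink location=dump.bin"
)
pipeline.set_state(Gst.State.PLAYING)

GLib.MainLoop().run()
```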

So I thought I’d ask here: how should I approach this?
I want to avoid using TCP ports because I want my script to scale and handle multiple streams, and I’d rather designate a local socket/FIFO pipe for GStreamer to read from.

So I’m curious what I should be using to implement this, assuming I can at all. I’ve dabbled a bit with fdsrc, unixfdsrc and socketsrc, but I’m not sure whether those are what I should be using, and I don’t see many examples of how to use them out there.

Can someone point me in the right direction?
Thanks!

What format do you capture the audio/video in (and send to tcpserversrc)?

fdsrc reads data from a file descriptor, which can be pretty much anything, including a pipe or a Unix domain socket, but you’d have to open that in your application yourself and then pass the descriptor to the element. socketsrc is just a variation of that which uses GLib/GIO abstractions (a GSocket) instead of a ‘raw’ Unix file descriptor.
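To make that concrete, here’s a minimal sketch, assuming a stream-oriented Unix domain socket at a placeholder path, with a placeholder filesink standing in for whatever demuxing/decoding your actual format needs:

```python
import os
import socket

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

SOCK_PATH = "/tmp/stream.sock"  # placeholder path
if os.path.exists(SOCK_PATH):
    os.unlink(SOCK_PATH)

server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
server.bind(SOCK_PATH)
server.listen(1)
conn, _ = server.accept()  # blocks until the writer (e.g. the PHP script) connects

# filesink is a stand-in; put the demuxer/decoder for your format here.
pipeline = Gst.parse_launch("fdsrc name=src ! queue ! filesink location=dump.bin")
pipeline.get_by_name("src").set_property("fd", conn.fileno())
pipeline.set_state(Gst.State.PLAYING)

# Keep `conn` referenced so Python doesn't close the fd while the pipeline runs.
GLib.MainLoop().run()
```

The socketsrc variant is the same idea, except you wrap the accepted connection in a GSocket and set the element’s socket property instead:

```python
from gi.repository import Gio

gsock = Gio.Socket.new_from_fd(conn.fileno())
pipeline = Gst.parse_launch("socketsrc name=src ! queue ! filesink location=dump.bin")
pipeline.get_by_name("src").set_property("socket", gsock)
```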

The main problem is that fdsrc currently doesn’t have a live mode where it timestamps the incoming data.
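One possible workaround (my own suggestion, not something fdsrc offers): read from the socket yourself and push the chunks into an appsrc configured as a live source, so the buffers get timestamped as they arrive. Again a sketch with a placeholder socket path and sink:

```python
import os
import socket

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

SOCK_PATH = "/tmp/stream.sock"  # placeholder path
if os.path.exists(SOCK_PATH):
    os.unlink(SOCK_PATH)
server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
server.bind(SOCK_PATH)
server.listen(1)
conn, _ = server.accept()

# is-live + do-timestamp makes appsrc stamp each buffer with the
# running time at which it is pushed.
pipeline = Gst.parse_launch(
    "appsrc name=src is-live=true do-timestamp=true format=time "
    "! queue ! filesink location=dump.bin"
)
appsrc = pipeline.get_by_name("src")
pipeline.set_state(Gst.State.PLAYING)

while True:
    data = conn.recv(4096)
    if not data:
        appsrc.emit("end-of-stream")
        break
    appsrc.emit("push-buffer", Gst.Buffer.new_wrapped(data))
```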

unixfdsrc is, I believe, meant to be used in combination with unixfdsink. It kind of does what you want, but there’s a protocol involved, so you can’t just shove arbitrary data at it.
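For completeness, a sketch of how that pair is meant to be used: both ends have to be GStreamer pipelines (the elements live in gst-plugins-rs, and the socket path below is a placeholder):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# Producer; would normally run in a separate process.
sender = Gst.parse_launch(
    "videotestsrc is-live=true ! unixfdsink socket-path=/tmp/gst-demo.sock"
)
sender.set_state(Gst.State.PLAYING)

# Consumer; speaks unixfdsink's fd-passing protocol, so it can't read
# arbitrary bytes written to that socket by some other program.
receiver = Gst.parse_launch(
    "unixfdsrc socket-path=/tmp/gst-demo.sock ! queue ! autovideosink"
)
receiver.set_state(Gst.State.PLAYING)

GLib.MainLoop().run()
```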