Need to capture from video device, process buffer in my app, and write buffer to a GST rendering window

I have an application that processes video buffers in real time. I have that part working. I'm trying to make the following enhancement:

I need to capture video buffers from a video input device (my first GST video pipeline), process the buffers (in YUV or RGB color space) in my app, and write the buffers to a GST rendering window (my second GST video pipeline). I need to be able to do this on both Windows and Linux (which is why I chose to use GStreamer). I realize that the GStreamer pipelines will need to be different on each platform (mostly because video capture on Windows works differently than on Linux). Thus far, I have not been able to get this to work properly on either Windows or Linux.

My current video input (capture) pipeline in Windows is:
dshowvideosrc device-index=0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=YUV ! appsink

My current video output (render) pipeline in Windows is:
appsrc ! video/x-raw, format=YUV, width=640, height=480, framerate=30/1 ! videoconvert ! autovideosink

I have verified that video input device 0 can capture YUY2 at 640x480@30fps.
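In case it helps, here is a simplified sketch of the kind of appsink-to-appsrc bridge I mean (the names and callback wiring are placeholders, not my actual app code):

#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/app/gstappsrc.h>

/* Sketch of the bridge: registered on the capture pipeline's appsink
 * via gst_app_sink_set_callbacks(); user_data is the render appsrc. */
static GstFlowReturn
on_new_sample (GstAppSink *sink, gpointer user_data)
{
  GstAppSrc *render_src = GST_APP_SRC (user_data);
  GstSample *sample = gst_app_sink_pull_sample (sink);

  if (sample == NULL)
    return GST_FLOW_EOS;

  GstBuffer *buffer = gst_sample_get_buffer (sample);

  /* ... map 'buffer' and process the YUV/RGB data here ... */

  /* push_buffer takes ownership, so add a ref while the sample holds one. */
  gst_app_src_push_buffer (render_src, gst_buffer_ref (buffer));

  gst_sample_unref (sample);
  return GST_FLOW_OK;
}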

Any suggestions as to what my correct video pipelines should look like?

Welcome @htartisan!

What problem do you run into? Does anything show up if you set the env var GST_DEBUG=2?
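If exporting the variable is awkward (e.g. on Windows), you can raise the same threshold from inside the app; a minimal fragment, to be called after gst_init():

#include <gst/gst.h>

/* Equivalent of GST_DEBUG=2: print errors and warnings from all categories. */
gst_debug_set_default_threshold (GST_LEVEL_WARNING);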

When you pass buffers between two pipelines, you usually need to set both to the same time reference (the same clock and base time). Take a look at this code as an example.
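Roughly, the idea is something like this (a minimal sketch, assuming capture and render are your two pipeline handles; the linked code has the full details):

#include <gst/gst.h>

/* Minimal sketch: put two pipelines on the same clock and base time.
 * 'capture' and 'render' stand for your two GstPipeline handles. */
static void
share_time_reference (GstElement *capture, GstElement *render)
{
  GstClock *clock = gst_system_clock_obtain ();

  /* Make both pipelines use the same clock instead of electing their own. */
  gst_pipeline_use_clock (GST_PIPELINE (capture), clock);
  gst_pipeline_use_clock (GST_PIPELINE (render), clock);

  /* Give both the same base time, and stop them from recomputing it on
   * state changes, so buffer timestamps stay comparable across pipelines. */
  GstClockTime base_time = gst_clock_get_time (clock);
  gst_element_set_start_time (capture, GST_CLOCK_TIME_NONE);
  gst_element_set_start_time (render, GST_CLOCK_TIME_NONE);
  gst_element_set_base_time (capture, base_time);
  gst_element_set_base_time (render, base_time);

  gst_object_unref (clock);
}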