My application needs touchscreen input to trigger actions in a pipeline that uses a GTK window as the sink, on Linux under X11 (not Wayland). Mouse and keyboard input work fine, but I haven't been able to get touchscreen input recognized.

I'm thinking I should read the C code for gst_navigation_event_new_touch_down to see how it expects the input data, because I'm having to write my own driver: the hardware is a 35-year-old serial Elographics touchscreen controller, and as far as I can tell the available elo drivers don't work with 6.x kernels. Or can anyone suggest an easier/smarter route?

All I really need is to parse the x,y coordinates and map them onto an SVG file my app uses, so it can identify touches within certain areas. Thanks for any guidance.
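For context, the parsing half of what I'm describing would look roughly like this. This is only a sketch based on my reading of the kernel's old serial elo driver: it assumes the 10-byte Elo SmartSet framing ('U' lead byte, 'T' touch packet, little-endian X/Y/Z, checksum = 0xAA plus bytes 0..8 mod 256). The status-bit meanings (`0x03` as touch/stream bits) are my assumption, and `elo_parse_packet`/`struct elo_touch` are names I made up.

```c
#include <stddef.h>

/* Parsed result for one SmartSet touch packet (hypothetical helper). */
struct elo_touch {
    int x, y, pressure;
    int touching;   /* non-zero while the screen is touched (assumed: status bits 0-1) */
};

/*
 * Parse one 10-byte Elo SmartSet packet:
 *   [0] 'U' lead byte, [1] 'T' touch packet type, [2] status byte,
 *   [3..4] X lo/hi,    [5..6] Y lo/hi,            [7..8] Z lo/hi,
 *   [9] checksum = (0xAA + sum of bytes 0..8) & 0xFF.
 * Returns 0 on success, -1 if the packet is not a valid touch packet.
 */
int elo_parse_packet(const unsigned char pkt[10], struct elo_touch *out)
{
    unsigned char csum = 0xAA;          /* wraps modulo 256 automatically */
    for (size_t i = 0; i < 9; i++)
        csum += pkt[i];
    if (pkt[0] != 'U' || pkt[1] != 'T' || csum != pkt[9])
        return -1;
    out->x        = pkt[3] | (pkt[4] << 8);
    out->y        = pkt[5] | (pkt[6] << 8);
    out->pressure = pkt[7] | (pkt[8] << 8);
    out->touching = pkt[2] & 0x03;      /* assumption: press/stream bits */
    return 0;
}
```

My understanding is that once I have raw x,y, I shouldn't need the internals of gst_navigation_event_new_touch_down at all: since GStreamer 1.22 it just takes an identifier plus x, y and pressure, and the resulting event can be pushed into the pipeline with gst_element_send_event() after scaling the raw coordinates into the sink's surface coordinates. Then the SVG hit-testing could work the same way it already does for mouse clicks. Does that sound right?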