I’m working on a desktop Windows project. The TouchPanel isn’t picking up any gestures even though I’ve enabled them like below. Am I missing something?
TouchPanel.EnabledGestures = GestureType.FreeDrag | GestureType.Pinch | GestureType.PinchComplete | GestureType.DragComplete;
Then I am trying to read gestures like this:
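Something like the standard polling loop in Update, using the stock TouchPanel/GestureSample API:

```csharp
// Inside Game.Update: drain the gesture queue each frame.
while (TouchPanel.IsGestureAvailable)
{
    GestureSample gesture = TouchPanel.ReadGesture();

    switch (gesture.GestureType)
    {
        case GestureType.FreeDrag:
            // gesture.Position is the touch point, gesture.Delta the movement.
            break;
        case GestureType.Pinch:
            // gesture.Position/Position2 are the two fingers;
            // gesture.Delta/Delta2 their movement since the last sample.
            break;
        case GestureType.PinchComplete:
        case GestureType.DragComplete:
            // End-of-gesture markers; they carry no position data.
            break;
    }
}
```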
Like XNA, we don’t support touch input on Windows desktop. This blog post from Shawn Hargreaves about touch input on Windows for XNA 4.0 explains it all a lot better than I can.
That solution doesn’t seem all that ‘easy’. :-/ Out of the box there are compatibility issues between the window message loop and OpenTK … but it at least points me in a direction.
Thanks for the pointer!
That’s strange: I am getting touch events in my desktop Windows game, the whole nine yards: Tap, Hold, Double Tap, Pinch, Drag, Flick, everything. The problem is that Double Tap (not Tap!) and the start of a drag generate LMB mouse events, which is not exactly what Shawn wrote: I don’t get RMB events for Hold, for example. It looks like something has changed since 2010.
IE11, the VS2015 Preview and Notepad++ all manage to filter out these mouse events, so I don’t know if this is still an unsolved problem for MonoGame.
I think I found the problem. Pointer events, if left unhandled, turn into gesture events, and those in turn become mouse events. WinForms doesn’t handle pointer events. This doesn’t explain why I am not getting mouse duplicates for all events, but it looks like you could add the following line to WinFormsGameForm.cs after line 108:
m.Result = IntPtr.Zero; return;
to avoid unnecessary mouse events. I’ll test this and post a PR if I’m successful.
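For anyone reading along, here is a rough sketch of the kind of WndProc change I mean. The message constant and placement are assumptions on my part, not the actual MonoGame source:

```csharp
// Sketch only: WinFormsGameForm already overrides WndProc in MonoGame;
// this just illustrates where the early-out would go.
protected override void WndProc(ref Message m)
{
    const int WM_TOUCH = 0x0240; // touch input message (Windows 7+)

    if (m.Msg == WM_TOUCH)
    {
        // ...existing handling forwards the touch data to TouchPanel...

        // Marking the message handled should stop Windows from
        // synthesizing the duplicate left-button mouse events:
        m.Result = IntPtr.Zero;
        return;
    }

    base.WndProc(ref m);
}
```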