I have some code written long ago (many years back) that depends on the Mouse.GetState logic on Android. The code expects mouseState.LeftButton to be set to ButtonState.Pressed when the user touches the touch screen. Unfortunately, this code is not working.
To see if it had to do with a bug in an old version of MonoGame, I created a new project targeting MonoGame 3.8. This is a near-empty project, and the relevant code is here:
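(Roughly the following; this is a minimal sketch of the check rather than the exact snippet:)

protected override void Update(GameTime gameTime)
{
    // Poll the mouse every frame; the expectation is that a touch on Android
    // shows up here as a pressed left button.
    MouseState mouseState = Mouse.GetState();

    if (mouseState.LeftButton == ButtonState.Pressed)
    {
        // Breakpoint goes here.
    }

    base.Update(gameTime);
}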
I place a breakpoint inside the if-check, but when I run the project on an emulator and on my own phone, the breakpoint is never hit, even when I touch the screen.
I believe this used to work at one time, and if I remember correctly, the Mouse would return touches on Windows Phone 7, which I suspect is the model that MonoGame Android is following.
So my questions are:
Does anyone know if GetState used to work on Android but is no longer working?
Is this intentional behavior? Should MonoGame projects no longer rely on Mouse for touch screen interaction?
OK, I think I misled you.
EnableMouseTouchPoint is the reverse of what you want; it emulates touch points with the mouse.
I don't think MonoGame ever implemented it the other way around (touches showing up through Mouse).
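Just to illustrate the direction it works in (a quick sketch, not tied to any particular template):

protected override void Initialize()
{
    // Desktop only: left-clicks will now appear as touch points in TouchPanel.GetState().
    TouchPanel.EnableMouseTouchPoint = true;
    base.Initialize();
}

protected override void Update(GameTime gameTime)
{
    // Touch input (real, or mouse-emulated on desktop) is read from the TouchPanel, not from Mouse.
    TouchCollection touches = TouchPanel.GetState();
    foreach (TouchLocation touch in touches)
    {
        // touch.Position and touch.State (Pressed / Moved / Released) are what you work with here.
    }

    base.Update(gameTime);
}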
I dunno how helpful this will be since this sounds like it’s more of a bug report… you had a thing that worked, now it doesn’t work, and you don’t know why. Bummer!
However, I can maybe offer an alternative approach that I’ve used in the past that can maybe help you get past this. I’m assuming you’re using Mouse.GetState because you want your program to work on both Windows and Android?
If that’s the case, it helps to write yourself a simple touch input interface… something like this:
public struct SurfaceState
{
    public SurfaceState(Vector2 touchLocation, bool isTouched)
    {
        TouchLocation = touchLocation;
        IsTouched = isTouched;
    }
    public Vector2 TouchLocation { get; }
    public bool IsTouched { get; }
}
public interface ISurfaceInputManager
{
    void Update(GameTime gameTime);
    SurfaceState GetSurfaceState();
}
public class AndroidSurfaceInputManager : ISurfaceInputManager
{
    // Implement the interface using TouchPanel/TouchCollection to determine whether
    // the screen is touched and where that touch is located (a sketch follows just below).
}
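// A sketch of what that TouchPanel-based implementation could look like
// (single-touch only; needs using Microsoft.Xna.Framework.Input.Touch):
public class AndroidSurfaceInputManager : ISurfaceInputManager
{
    private SurfaceState _currentState;

    public void Update(GameTime gameTime)
    {
        TouchCollection touches = TouchPanel.GetState();

        if (touches.Count > 0)
        {
            // Single-touch simplification: only the first touch point is considered.
            TouchLocation touch = touches[0];
            bool isDown = touch.State == TouchLocationState.Pressed
                          || touch.State == TouchLocationState.Moved;
            _currentState = new SurfaceState(touch.Position, isDown);
        }
        else
        {
            _currentState = new SurfaceState(Vector2.Zero, false);
        }
    }

    public SurfaceState GetSurfaceState() => _currentState;
}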
public class WindowsSurfaceInputManager : ISurfaceInputManager
{
    // Implement the interface using the mouse: left click (or any click) determines whether
    // the screen is "touched", and the mouse position sets TouchLocation (sketched below).
}
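// The mouse-backed desktop version, sketched the same way:
public class WindowsSurfaceInputManager : ISurfaceInputManager
{
    private SurfaceState _currentState;

    public void Update(GameTime gameTime)
    {
        // A held left button counts as "the surface is touched" at the cursor position.
        MouseState mouseState = Mouse.GetState();
        bool isDown = mouseState.LeftButton == ButtonState.Pressed;
        _currentState = new SurfaceState(mouseState.Position.ToVector2(), isDown);
    }

    public SurfaceState GetSurfaceState() => _currentState;
}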
public class SomeGameEngine : Game
{
    private ISurfaceInputManager _inputManager;

    public SomeGameEngine(ISurfaceInputManager inputManager)
    {
        _inputManager = inputManager ?? throw new ArgumentNullException(nameof(inputManager));
    }

    protected override void Update(GameTime gameTime)
    {
        _inputManager.Update(gameTime);
        var inputState = _inputManager.GetSurfaceState();

        if (inputState.IsTouched)
        {
            Debug.WriteLine($"Surface is touched at {inputState.TouchLocation}");
        }

        base.Update(gameTime);
    }
}
// ...
void SomeAndroidEntryPoint() // i.e., the OnCreate method in your GameActivity class
{
    ISurfaceInputManager inputManager = new AndroidSurfaceInputManager();
    SomeGameEngine engine = new SomeGameEngine(inputManager);
    engine.Run();
}
// ...
void SomeWindowsEntryPoint() // i.e., the Main method in your Program class
{
    ISurfaceInputManager inputManager = new WindowsSurfaceInputManager();
    SomeGameEngine engine = new SomeGameEngine(inputManager);
    engine.Run();
}
This is a simplified approach that allows only a single touch location and a single input button, but it gives you a common input mechanism for both platforms. Feel free to extend it however you like, or define your input in terms of the actions your game is expected to perform and use the same technique to implement those actions on each platform (sketched briefly below).
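For example, an action-based version of the same idea might look something like this (the names here are just placeholders, not anything from MonoGame):

public enum GameAction
{
    Tap,    // primary "press", i.e. a touch or a left click
    Pause
}

public interface IActionInputManager
{
    void Update(GameTime gameTime);
    bool IsActionTriggered(GameAction action);
    Vector2 PointerLocation { get; }
}

// Each platform maps its raw input (TouchPanel, Mouse, GamePad, ...) onto these
// game-level actions, and the Game class only ever asks about actions.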