KNI Engine / Blazor: How does touch screen on mobile platforms work?

Hello all! First time on these discussions!

I’ve been using MonoGame for a while and recently I’ve decided to write a game using the KNI fork with its Blazor platform. I posted my game on itch.io:

The game appears to work fine on regular computers. However, the touch screen doesn’t appear to work when running the game on mobile. For context, I’m using an iPhone and I can see the game open, but touching the screen doesn’t do anything. From the developer console, I can see “Failed to load resource: the server responded with a status of 403 ().” for “…_framework/dotnet.js.map” and “…_framework/dotnet.runtime.js.map”, but I don’t see any problematic files when examining the resources being downloaded (i.e. under the Network tab of the browser’s developer tools).

I’ve tried using TouchPanel, but no connection is ever established. I had assumed MouseState would be enough to detect when and where the user taps the screen.

If more information is needed from me, I can provide.

(this question is mostly directed at nkast lol)

Thanks in advance for any help!

You cannot use the Mouse state - you need to use the TouchPanel functionality.
Remember, the screen could be touched in multiple places, not just one, which means it needs different handling from the mouse.

TouchCollection touches = TouchPanel.GetState();

You need to keep track of all the touches returned and see what’s changed since the last iteration, similar to what you would do for the mouse, except for multiple points.
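
For example, something roughly like this in Update; each TouchLocation carries a state that tells you what changed since the last frame:

    // inside your Game subclass; needs using Microsoft.Xna.Framework.Input.Touch;
    protected override void Update(GameTime gameTime)
    {
        TouchCollection touches = TouchPanel.GetState();

        foreach (TouchLocation touch in touches)
        {
            if (touch.State == TouchLocationState.Pressed)
            {
                // finger went down this frame - e.g. treat touch.Position like a click
            }
            else if (touch.State == TouchLocationState.Moved)
            {
                // finger is still down; touch.TryGetPreviousLocation(out TouchLocation prev)
                // gives where it was last frame, so you can compute a drag delta
            }
            else if (touch.State == TouchLocationState.Released)
            {
                // finger was lifted this frame
            }
        }

        base.Update(gameTime);
    }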

Hello! Thanks for your reply.

That’s what I tried using, but TouchPanel reported there was no connection. Maybe I do need to use it after all. Is there some kind of setup required to get a connection?

Hi @Andrew_Powell, Welcome to the Community!

Not sure if you missed it but:

The link for GitHub Discussions is on the home page under Community.

I don’t think TouchPanel is exposed in a web browser? @nkast might know. Will ping him on Discord.

Happy Coding!

Oops.

I’ll move my post to the GitHub Discussions then. Thanks!


Are you checking for GetCapabilities().IsConnected?
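
i.e. something like:

    TouchPanelCapabilities caps = TouchPanel.GetCapabilities();
    if (caps.IsConnected)
    {
        // caps.MaximumTouchCount reports how many simultaneous touches are supported
        TouchCollection touches = TouchPanel.GetState();
        // handle touches...
    }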

I think that IsConnected and the MaxTouchCount are not currently wired to anything.


Hello!

So, a few things:

  1. Through my hacky way of getting information out of the mobile browser (writes to the console straight up don’t work), “TouchPanel.GetCapabilities().IsConnected” always returns false, regardless of whether the “touchstart” event fires. Is this a possible bug?
  2. So, I’ve managed to get something working. If I add a “touchstart” event listener to the document object in JavaScript that sends the touch points back to the C# app through an invoke async call, IT WORKS (rough sketch below this list). I previously tried adding the event listener to the window object, but then it only worked after reloading the browser in Chrome, and it didn’t work at all in Safari.
  3. I’ve run into a new problem: the sound doesn’t work on mobile browsers at all! I’ll create a new thread for this problem on GitHub Discussions.
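
For completeness, here is roughly the C# side of the workaround from point 2. The TouchBridge class, the ForwardTouchStart method, and the “MyGame” assembly name are just placeholders of mine, not anything KNI provides; the JS side is simply a document-level “touchstart” listener that calls DotNet.invokeMethodAsync with the touch coordinates:

    // JS side (e.g. in index.html):
    //   document.addEventListener("touchstart", e =>
    //       DotNet.invokeMethodAsync("MyGame", "ForwardTouchStart",
    //           e.touches[0].clientX, e.touches[0].clientY));

    using System.Collections.Concurrent;
    using Microsoft.JSInterop;

    public static class TouchBridge
    {
        private static readonly ConcurrentQueue<(float X, float Y)> _pending = new();

        // Invoked from the JS "touchstart" listener above.
        [JSInvokable]
        public static void ForwardTouchStart(float x, float y)
        {
            _pending.Enqueue((x, y));
        }

        // Drain this from the game's Update loop to react to taps.
        public static bool TryGetTouch(out (float X, float Y) point)
            => _pending.TryDequeue(out point);
    }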

I apologize for the double post, but I thought it was worth mentioning that I was not calling “TouchPanel.GetCapabilities().IsConnected” from the Update method, in case it matters. I was instead calling it as a JSInvokable from a JavaScript event listener.

  1. A possible solution is to request the maxTouchPoints count from Window.navigator and return true if maxTouchPoints > 0 (see the sketch below this list). Navigator is not yet implemented in the wasm library; we are going to need it for other stuff anyway (gamepads, …).

  2. I guess you were the one who left a message on the KNI issues. So the problem is that on some browsers Window doesn’t get the event. Right?
    And we have to move the event listener to Document…

  3. Check first whether this is a limitation of web browsers. You need to get an interaction from the user before playing any sound.
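
Roughly, for point 1, a minimal sketch from the Blazor side (the getMaxTouchPoints helper is only an illustration; nothing like that exists in the library yet):

    // JS helper (illustration only), e.g. registered in index.html:
    //   window.getMaxTouchPoints = () => navigator.maxTouchPoints || 0;

    // needs: using Microsoft.JSInterop; using System.Threading.Tasks;
    public static async Task<bool> HasTouchScreenAsync(IJSRuntime js)
    {
        int maxTouchPoints = await js.InvokeAsync<int>("getMaxTouchPoints");
        return maxTouchPoints > 0;   // what IsConnected could report
    }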

It’s been a bit, but I managed to get the audio issues resolved… somehow, over the weekend (not gonna bother with creating a separate thread at this point). I’ve reached an acceptable point; in other words, I will just archive the other issues for later.

  1. At least for now I’m just going to stick with my solution of directly adding a listener to the “touchstart” event in JS.
  2. That’s what it seems.
  3. I’m assuming you’re referring to the “The AudioContext was not allowed to start. It must be resumed (or created) after a user gesture on the page.” warning message. I resolved this error on regular computer browsers (i.e. browsers not running on mobile phones) by calling “soundEffectInstance.Stop()” prior to playing any sounds (see the snippet below). I couldn’t find an effective way of acquiring the AudioContext of the SoundEffectInstance separately in my own JS script, but from experimentation, it seemed like the problem resolved itself whenever the “nkAudioScheduledSourceNode.Start” function was called on the AudioContext. From looking at the KNI and nkast.Wasm.Audio code base, “nkAudioScheduledSourceNode.Start” is only called when “PlatformStart” is called on the SoundEffectInstance, and that only happens when the SoundEffectInstance is in its stopped state. This didn’t seem to fix the problem on mobile browsers, though. What’s crazy is that after I fixed a performance issue with MonoGame.Extended (I’m using its SpriteFactory, and I needed to modify its code so that the JsonContentLoader cached the content), the audio then fixed itself!

There’s probably some other unrelated reason why what I changed suddenly worked. Unfortunately, I don’t quite understand everything well enough to know what : /
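
For reference, the desktop-browser workaround from point 3 boiled down to this (the _tapSound name is just a placeholder for any SoundEffectInstance):

    // Stop() forces the instance into its stopped state, so the following Play()
    // goes through PlatformStart, which is where nkAudioScheduledSourceNode.Start
    // ends up being called.
    _tapSound.Stop();
    _tapSound.Play();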