Why does my spritesheet animation jitter on iOS but not Windows?

My spritesheet animation is rock solid on Windows but unacceptably jittery on iOS devices.

By “jittery”, I mean that the frames appear to be drawn at slightly different positions or scales, even though they’re all exactly the same size. The only difference between the frames is that they’re at different source rectangles in the texture.

I’ve tried everything I can think of:

  • MonoGame 3.6, 3.7.1, and 3.8 development versions
  • Textures as a power of two
  • Every sprite frame the same size in a grid (instead of optimally packed)
  • Various sampler states
  • Ensuring nice integer position and scale values for SpriteBatch.Draw that are identical for every frame
  • Loading texture directly from a stream and via the content pipeline
  • With and without generating mipmaps

How are you all successfully animating from a spritesheet without jitter? What am I missing?
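For reference, here's a minimal sketch of the kind of draw loop I'm using (frame size, grid layout, and position values are illustrative, but the structure matches my real code):

```csharp
// Sketch of the animation setup described above: every source
// rectangle is the same size, laid out in a uniform grid, and the
// destination position/scale are identical for every frame.
int frameWidth = 64, frameHeight = 64, columns = 8;   // illustrative sizes
Rectangle source = new Rectangle(
    (frameIndex % columns) * frameWidth,
    (frameIndex / columns) * frameHeight,
    frameWidth, frameHeight);

spriteBatch.Begin(samplerState: SamplerState.PointClamp);
spriteBatch.Draw(
    spriteSheet,
    new Vector2(100f, 100f),   // integer-valued position, same every frame
    source,
    Color.White);
spriteBatch.End();
```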


Is there a difference in screen resolution between the two platforms? Stuff like this happens when scaling is involved and the sprite pixels aren’t aligned to the screen pixel grid.

If the screen resolution on iOS is the exact same as it is on Windows, then disregard :slight_smile:
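If scaling does turn out to be involved, snapping the final draw position to whole device pixels after applying the scale usually clears it up. Something along these lines (the scale-factor calculation is illustrative):

```csharp
// Snap the post-scale position to the device pixel grid so the
// sprite always lands on whole screen pixels, no matter the scale.
float scale = (float)viewportWidth / virtualWidth;   // illustrative scale factor
Vector2 screenPos = worldPos * scale;
screenPos = new Vector2(
    (float)Math.Round(screenPos.X),
    (float)Math.Round(screenPos.Y));

spriteBatch.Draw(texture, screenPos, sourceRect, Color.White,
    0f, Vector2.Zero, scale, SpriteEffects.None, 0f);
```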

Yes, there is a difference in the screen resolution. And on Windows, I can resize the window as much as I like and have a rock solid spritesheet animation at all kinds of resolutions. It’s only on iOS that it jitters regardless of the resolution. I’ve tried on a few different screen sizes, including iPhone and iPad resolutions.

MonoGame must be able to display a simple 2D sprite sheet animation on iOS without it being all jittery, or it would be unusable for 2D games, right? I must be doing something wrong, but so far haven’t been able to figure out what.

Regardless of the resolution, on iOS, it’s always weird? How does the aspect ratio of the iOS resolutions compare to the Windows resolutions?

I don’t have an answer for you, I’m just giving you some stuff to look for that has caused me similar issues in the past. I haven’t ever done a build on iOS though, only Windows and Android.

When displaying only one frame, a still image, does it already look blurry? Some platforms used to need a half-pixel offset on either the position or the texture coordinates.
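If a still frame does look blurry, a half-pixel nudge on the draw position is a cheap thing to test, e.g. (purely illustrative):

```csharp
// Half-pixel offset test: shift the draw position by half a pixel so
// texel centers line up with pixel centers on platforms that need it.
Vector2 halfPixel = new Vector2(0.5f, 0.5f);
spriteBatch.Draw(texture, position + halfPixel, sourceRect, Color.White);
```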