Oooh! Alright, I’ll have a look… So it turns out, I was using:
Global.deltaTime = ElapsedGameTime.TotalSeconds.
I’ve changed it to:
Global.deltaTime = ((float)gameTime.ElapsedGameTime.Ticks / TimeSpan.TicksPerSecond);
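For context, here's where that line sits (a sketch: the `Update` override signature is standard MonoGame, and `Global.deltaTime` is just my own static float field):

```csharp
protected override void Update(GameTime gameTime)
{
    // Compute the frame delta in float directly from the raw tick count,
    // instead of going through ElapsedGameTime.TotalSeconds (a double).
    Global.deltaTime = (float)gameTime.ElapsedGameTime.Ticks / TimeSpan.TicksPerSecond;

    base.Update(gameTime);
}
```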
After which, I did several trials. Here were my results (IsFixedTimeStep and SynchronizeWithVerticalRetrace are false for all of these):
- TargetElapsedTime unset. Game stuttered at around frame 1700.
- I set TargetElapsedTime to 60 FPS (in ticks). Game ran once without ever stuttering, then began stuttering again when I closed and re-opened it (this happens all the time; I'll randomly "luck out" and get a stutter-free run).
- I set TargetElapsedTime to 120 FPS (in ticks). Game ran at a smooth 120 FPS, but then stuttered at frame 3400, which is double the frame 1700 it usually stutters on at 60 FPS. I guess from this we can tell that the stutter is based on time, not frames.
- I set TargetElapsedTime to 20 FPS (in ticks), just to see if maybe giving each frame more time to execute would solve things. I could not get the stutter to occur.
- I set TargetElapsedTime to 30 FPS (in ticks). Couldn't get the stutter to occur.
- I set TargetElapsedTime to 45 FPS (in ticks). Stutter occurred.
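For reference, a sketch of how I'm setting these in the Game constructor ("in ticks" means converting via TimeSpan.TicksPerSecond; `graphics` is my GraphicsDeviceManager field, so the name is an assumption on your end):

```csharp
// Both of these were off for every trial above:
IsFixedTimeStep = false;
graphics.SynchronizeWithVerticalRetrace = false;

// TargetElapsedTime for a given frame rate, expressed in ticks.
// TimeSpan.TicksPerSecond is 10,000,000, so 60 FPS works out to
// 166,666 ticks (integer division truncates the remainder).
TargetElapsedTime = TimeSpan.FromTicks(TimeSpan.TicksPerSecond / 60);
```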
So… what do I make of this? Well, whenever the stutter occurs while I'm running the game at 30 FPS or higher, the FPS dips below 30; yet when I cap the game at 30 FPS or lower, I guess it's got more time in between frames to catch up…? Also, the actual FPS doesn't seem to matter, because the stutter always occurs at around 30 seconds in, regardless of how many frames have been processed. Any ideas?