FPS drop at regular intervals

Oooh! Alright, I’ll have a look… So it turns out, I was using:

Global.deltaTime = (float)gameTime.ElapsedGameTime.TotalSeconds;

I’ve changed it to:

Global.deltaTime = ((float)gameTime.ElapsedGameTime.Ticks / TimeSpan.TicksPerSecond);
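For reference, here's the before/after as a minimal standalone sketch (Global.deltaTime is my own static field, not part of the framework, and I'm emulating the elapsed time MonoGame passes to Update() with a plain TimeSpan):

```csharp
using System;

static class Global { public static float deltaTime; }

class DeltaDemo
{
    static void Main()
    {
        // Emulate the elapsed time the framework reports for one frame at 60 FPS.
        TimeSpan elapsed = TimeSpan.FromTicks(TimeSpan.TicksPerSecond / 60);

        // Old version: TotalSeconds is a double, cast down to float.
        float oldDelta = (float)elapsed.TotalSeconds;

        // New version: raw ticks (1 tick = 100 ns) divided by ticks per second.
        Global.deltaTime = (float)elapsed.Ticks / TimeSpan.TicksPerSecond;

        Console.WriteLine($"old: {oldDelta}  new: {Global.deltaTime}");
    }
}
```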

After that, I ran several trials. Here are my results (IsFixedTimeStep and SynchronizeWithVerticalRetrace were false for all of them):

  • TargetElapsedTime unset: the game stuttered at around frame 1700.

  • I set TargetElapsedTime to 60 FPS (in ticks): the game ran once without ever stuttering, then began stuttering again when I closed and re-opened it. (This happens all the time; I'll randomly luck out and get a stutter-free run.)

  • I set TargetElapsedTime to 120 FPS (in ticks): the game ran at a smooth 120 FPS, but then stuttered at frame 3400, double the frame 1700 it usually stutters on at 60 FPS. I guess from this we can tell the stutter is based on time, not frames.

  • I set TargetElapsedTime to 20 FPS (in ticks), just to see whether giving each frame more time to execute would solve things: I could not get the stutter to occur.

  • I set TargetElapsedTime to 30 FPS (in ticks): couldn't get the stutter to occur.

  • I set TargetElapsedTime to 45 FPS (in ticks): the stutter occurred.
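For the trials above, I computed each cap in ticks the same way (a small sketch; TargetForFps is my own helper name, and note that rates like 45 FPS don't divide TicksPerSecond evenly, so the result truncates by a fraction of a tick):

```csharp
using System;

class Config
{
    // TargetElapsedTime for a given FPS, expressed in ticks (1 tick = 100 ns).
    static TimeSpan TargetForFps(long fps) =>
        TimeSpan.FromTicks(TimeSpan.TicksPerSecond / fps);

    static void Main()
    {
        foreach (long fps in new long[] { 20, 30, 45, 60, 120 })
            Console.WriteLine($"{fps} FPS -> {TargetForFps(fps).Ticks} ticks");

        // In the game itself the value is assigned in the Game constructor, e.g.:
        // TargetElapsedTime = TimeSpan.FromTicks(TimeSpan.TicksPerSecond / 60);
    }
}
```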

So… what do I make of this? Well, whenever the stutter occurs at 30 FPS and above, the FPS dips below 30, yet when I cap the game at 30 FPS, I guess it has more time in between frames to catch up…? Also, the actual FPS doesn't seem to matter, because the stutter always occurs at around 30 seconds in, regardless of how many frames have been processed. Any ideas?