gameTime.ElapsedGameTime.TotalMilliseconds returns max

If I set IsFixedTimeStep to false and turn off vsync, my game runs at 120 fps. However, the reported ElapsedGameTime each frame is 500 milliseconds. Why? I understand that 500 ms is the maximum frame-time limit, but I don't understand why GameTime is reporting that half a second has passed since the last time Update was called. Any insight?

Well, I looked through the source code and answered my own question. This is by design. It's not possible to get the time since the last Update from GameTime.ElapsedGameTime when running a variable time step; it always returns the max, which defaults to 500 ms. Wouldn't knowing how much time has passed since the last Update still be useful, even in a variable time step setup?

Edit: I take that back. Looking at the source again, it should NOT be reporting the default max time as the ElapsedGameTime, especially when the game is actually running at 120 fps. So… why am I getting 500 ms back from ElapsedGameTime?

Meh, I just do this.

timelast = timenow;
timenow = (float)gameTime.TotalGameTime.TotalSeconds;
elapsedFrameTimeInSeconds = timenow - timelast;

That's in my timer class. In my frame rate class I do pretty much the same thing:

        // sample the total game time each update; the difference between the
        // two samples is the measured elapsed frame time
        timeRecordedLast = timeRecordedNow;
        timeRecordedNow = gameTime.TotalGameTime.TotalSeconds;
        elapsedTotalGameTime = timeRecordedNow;
        elapsedFrameTimeMeasuredSeconds = (timeRecordedNow - timeRecordedLast);
        // counted frames divided by the measured elapsed seconds
        fps = frames * (1d / elapsedFrameTimeMeasuredSeconds);
        // how many draws happen per update
        frameToUpdateRatio = (float)frames / (float)updates;

where updates and frames are counters that are incremented in Update and Draw respectively. When I print out the values every second or so, I reset the counters.
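A minimal sketch of that counter approach, assuming everything lives directly on the Game class (lastSampleTime and fps are placeholder names of mine, not the poster's actual classes):

using Microsoft.Xna.Framework;

public class Game1 : Game
{
    GraphicsDeviceManager graphics;

    // counters incremented in Update and Draw, reset once per sample window
    int updates = 0;
    int frames = 0;
    int fps = 0;
    double lastSampleTime = 0;

    public Game1()
    {
        graphics = new GraphicsDeviceManager(this);
    }

    protected override void Update(GameTime gameTime)
    {
        updates++;
        double now = gameTime.TotalGameTime.TotalSeconds;
        if (now - lastSampleTime >= 1.0)
        {
            fps = frames;                     // frames drawn during the last ~1 second
            Window.Title = "FPS: " + fps;     // or print it however you like
            frames = 0;
            updates = 0;
            lastSampleTime = now;
        }
        base.Update(gameTime);
    }

    protected override void Draw(GameTime gameTime)
    {
        frames++;
        GraphicsDevice.Clear(Color.CornflowerBlue);
        base.Draw(gameTime);
    }
}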

Thanks for the reply. I'm seeing some really weird timing issues/changes depending on the combination of IsFixedTimeStep and vsync that I run. My animations run off accumulators that are filled from gameTime.ElapsedGameTime.TotalMilliseconds, and the speed of the animations actually changes depending on whether the fixed time step is enabled and whether vsync is turned on. I just ran a few more tests and I'm still getting 500 ms (the default max) as ElapsedGameTime in code that is running at 120 fps. Beyond bizarre to me, but maybe I just don't understand.
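For context, the accumulator pattern I mean looks roughly like this (an illustrative sketch inside the Game subclass; frameDurationMs, frameCount, and currentFrame are made-up names, not my actual animation code). If ElapsedGameTime is reported correctly, the animation should advance at the same real-world rate regardless of the time step and vsync settings:

double accumulatorMs = 0;
int currentFrame = 0;
const double frameDurationMs = 100;   // advance one animation frame every 100 ms
const int frameCount = 8;

protected override void Update(GameTime gameTime)
{
    // accumulate real elapsed time and step the animation whenever a full
    // frame's worth of time has built up
    accumulatorMs += gameTime.ElapsedGameTime.TotalMilliseconds;
    while (accumulatorMs >= frameDurationMs)
    {
        accumulatorMs -= frameDurationMs;
        currentFrame = (currentFrame + 1) % frameCount;
    }
    base.Update(gameTime);
}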

I never understood why anyone would prefer to use milliseconds at all. I don't even understand why they added it to XNA and didn't encapsulate it to just use seconds and helper functions that relate to it… Using fractions of seconds for any game is far more intuitive. All the values above relate to seconds because A) I only ever used that call and B) I only intend to think in terms of seconds. All that said, it's probably you doing something wrong.
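For instance, thinking in seconds lets speed values read naturally as units per second; a tiny illustrative sketch inside a Game subclass (velocity and position are made-up names, not from either of our projects):

// move at `velocity` units per second, independent of frame rate
Vector2 velocity = new Vector2(120f, 0f);
Vector2 position = Vector2.Zero;

protected override void Update(GameTime gameTime)
{
    float dt = (float)gameTime.ElapsedGameTime.TotalSeconds;
    position += velocity * dt;
    base.Update(gameTime);
}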

If you want, I'll upload the newest version of my frame rate and timer classes. Combined with my no-garbage string builder, it forms a powerful trifecta of classes lolz for this sort of stuff and for watching your garbage as well.

I'll post a link to it if you like, but the first three lines I posted are how I do it.

http://i936.photobucket.com/albums/ad207/xlightwavex/programing%20and%20concepts/FrameRate05_zpscmfpwdlu.gif

https://drive.google.com/open?id=0B1zD887frY04flRBaXhUU21zZnZHRUtKQzFKZGpmVXhuc1lxdS1VZDBRZzg2WmQ0bGlHUm8

Well, milliseconds ARE fractions of seconds, so it's really just about perspective. Anyway… I'm not doing anything wrong. I just created a new project and added the following two lines to the Game1 constructor:

this.IsFixedTimeStep = false;
graphics.SynchronizeWithVerticalRetrace = false;

I then added the following line to the Update method:

double timer = gameTime.ElapsedGameTime.TotalMilliseconds;

Those are the only additions I made to a brand-new project. If I set a breakpoint on the timer line, gameTime.ElapsedGameTime.TotalMilliseconds returns 500 on each hit. Something is off/wrong. Perhaps it's my hardware configuration, I don't know. I'm running on a powerful desktop that can run an empty project at hundreds or even thousands of fps. Maybe one of the devs has some insight.

Using a breakpoint to get the time is not a very good idea: other than the very first time it hits the breakpoint, the elapsed time will be massive, since it takes into account all the time the game was paused at the breakpoint.

You need to output the time to the screen or the output window.

Or, even simpler, set the window title to the time.
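For example, a quick sketch of the window-title idea (just drop this into Update):

protected override void Update(GameTime gameTime)
{
    // show the per-frame elapsed time without pausing execution in the debugger
    Window.Title = gameTime.ElapsedGameTime.TotalMilliseconds.ToString("0.000") + " ms";
    base.Update(gameTime);
}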

And there it is. When running a fixed time step it always returns 16.6667… I know that's not necessarily the measured time, and that it's always going to return that value. It just "tricked" me into thinking it should give an accurate time with a variable time step even while stopped at a breakpoint. Output the value to the screen and everything is as it should be. Thanks!