[Solved] Jittery, Choppy Frame Movement for 2D game

This isn’t a question, I just found it quite difficult to locate my particular issue online, so it’s here for google.

I’m building a platformer, and even though my render loop completed in about 0.05 ms per frame (well over a thousand fps), I was seeing constant frame losses that made the game look choppy. It looked absolutely horrendous. Google kept pointing to issues like IsFixedTimeStep, or people who had used texture atlases poorly, etc. None of those problems applied to me.

It was a VSYNC issue. This single line solved it completely:

graphics.SynchronizeWithVerticalRetrace = false;
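
For anyone copy-pasting: that line goes wherever you configure the GraphicsDeviceManager, typically the Game constructor. A minimal sketch, using the stock template names Game1 and graphics:

    public class Game1 : Game
    {
        private readonly GraphicsDeviceManager graphics;

        public Game1()
        {
            graphics = new GraphicsDeviceManager(this);
            // Present frames as soon as they are ready instead of waiting
            // for the monitor's vertical retrace.
            graphics.SynchronizeWithVerticalRetrace = false;
            // If you flip this at runtime instead of in the constructor,
            // call graphics.ApplyChanges() for it to take effect.
        }
    }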

Despite doing some research on vsync, I still don’t quite understand the full ramifications of why this happened, but this did solve everything. So if anyone else is having laggy rendering or janky movement caused by vsync, hopefully you stumble across this.

I’m interested in this subject.

I’ll admit, ‘just turn vsync off’ isn’t a very appealing solution. There are plenty of games out there that look smooth with or without vsync, so there must be a deeper issue at play. There are a variety of things that might cause it… and now I’m curious what the actual reason was.

This is really unsatisfying. xP

Yes, this is caused by the timer resolution varying across hardware, OS, or even different builds of the same OS. Vertical retrace timing just exacerbates the problem on some systems.

The MonoGame timing and/or timer-resolution code is not as robust as it perhaps should be, but this is an ongoing thing that people keep coming back to from time to time and taking a shot at.

Here is a huge discussion that occurred on the forum over some time, with numerous people digging into it.

Physically, vsync represents a hardware interrupt deep down (conceptually a two-way read/write signal between the monitor and the video card), translated through the OS (on Windows this is an IRQ in software and a pin physically). These calls typically pause threads, including the game thread, but this is pretty fast, usually 1 ms or less.

So without vsync, your video card dumps the backbuffer to the screen as fast as it can (when you get to base.Draw). The monitor itself has a set limit on how fast it can physically update the full image on screen, top to bottom, left to right.
When it completes a full draw, the monitor sends a refresh signal to the OS, followed by a very short pause (a few milliseconds, give or take) while the monitor preps for the next draw of its own video buffer. This is the vsync signal, which a card can wait for before bursting out another frame of data from its own backbuffer to the monitor’s buffer, ahead of the monitor’s next draw.
What this implicitly means is that the video card, and in turn base.Draw’s render call, will try to time themselves to wait for or line up with this signal, which can take a little extra time in a game’s draw loop.

However, if a game timer misses this signal (say, because it is oversleeping), it can throw off the timing.

Over time this can cause you to lose a frame or get a double-drawn frame, etc., thus triggering IsRunningSlowly or forcing extra wait time, which can look like a stutter. But as I said, this is made harder to deal with because different systems can use, or fall back to, timers of different resolutions.
It’s pretty hairy stuff, basically controlled mostly by the OS between hardware and software.

So, for now:

Turning off vsync speeds up the draws to the maximum, meaning more CPU load and a low probability of screen tearing (not sure that’s really a thing at such high monitor refresh rates anymore). At the same time, it stabilizes the MonoGame timer a bit, which is slightly off, and that can be more important.

You can test what I mean (by ‘a little off’) by turning fixed time step on and setting your fixed target elapsed time to a very high number (try a lower number too), then just count the updates and draws. They should match, but they probably won’t, by a considerable amount.
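
Here is roughly what that test looks like, as a sketch (counter fields added to a stock Game subclass; the field names are just illustrative):

    private int updateCount, drawCount;
    private double elapsedSeconds;

    protected override void Initialize()
    {
        IsFixedTimeStep = true;
        // Try a high target rate first, then a lower one, and compare.
        TargetElapsedTime = TimeSpan.FromSeconds(1.0 / 120.0);
        base.Initialize();
    }

    protected override void Update(GameTime gameTime)
    {
        updateCount++;
        elapsedSeconds += gameTime.ElapsedGameTime.TotalSeconds;
        if (elapsedSeconds >= 1.0)
        {
            // If the timer were exact, both would match the target rate.
            Console.WriteLine($"updates: {updateCount}, draws: {drawCount}");
            updateCount = drawCount = 0;
            elapsedSeconds = 0;
        }
        base.Update(gameTime);
    }

    protected override void Draw(GameTime gameTime)
    {
        drawCount++;
        base.Draw(gameTime);
    }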

One other takeaway is that if your updates are set for 120 per second, you may not actually get 120 in a second, which could, I suppose, be a concern for network syncing.

So the TL;DR, I guess, is that when the OS/hardware wants to render a frame and when the game is trying to update aren’t always synced, and that can cause stuttering.

So if simply rendering more frames (vsync off) mitigates that issue… what, then, are some games doing that allows them to look smooth with vsync on and a minimal number of frames rendered? Is there some magical voodoo being done on the game timer which more intelligently reads the intent of the OS/hardware and adjusts its Thread.Sleep timing more accurately?

what, then, are some games doing that allows them to look smooth with vsync on and a minimal number of frames rendered?

Ah, you know what, my bad. I think at some point they got vsync working right, and the timer seems to be improved as well.

I just tested it at 300 fps and the frame rate locked at 60 fps, which is my video card’s default refresh rate for overriding apps, which it’s obviously doing here: updates were 300, draws were 60, with IsRunningSlowly constantly firing.

When I set the frame rate to 60 with vsync and pushed the card, I got no frame skips or running-slowlies, and it looked steady. Though I’m not going to test it at higher speeds or try to unlock the default on the card for my random apps.

So I guess instead my advice should have been… try setting your frame rate to the monitor’s refresh rate first.
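
Something like this, assuming a 60 Hz display (these are the standard Game and GraphicsDeviceManager properties):

    graphics.SynchronizeWithVerticalRetrace = true;        // wait for vsync
    IsFixedTimeStep = true;                                // tick on a fixed schedule
    TargetElapsedTime = TimeSpan.FromSeconds(1.0 / 60.0);  // match the monitor's refresh rate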

You should also be aware that many LCD monitors can take on the order of 16 to 30 milliseconds for a pixel to fully dim; it’s not immediate, so that can actually affect what you’re seeing, as a form of motion blur.

Good to know. It’s probably worth emphasizing that anyone suffering this issue should still take a look at their code, especially the way they increment translational/vector-related data every frame, as I suspect a great many of the issues surrounding this are related to that.

That, and just generally keeping an eye on each major process you finish coding to get a feel for its performance. I always run some tests on each major code path I connect to Update() or Draw() to get a feel for how much CPU crunching is going on. Processes I wouldn’t expect to be costly can be, and things I imagined would need a lot of cycles ended up being trivial.

I suppose a good long-term solution to this would be to implement some debug performance tracking classes which you can insert here and there in your major code paths to keep track of all that.
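
Even something as simple as a Stopwatch wrapper goes a long way. A minimal sketch (PerfProbe is a made-up name):

    using System;
    using System.Diagnostics;

    // Minimal timing probe: wrap a code path in Begin/End and log its cost.
    public sealed class PerfProbe
    {
        private readonly Stopwatch stopwatch = new Stopwatch();
        private readonly string name;

        public PerfProbe(string name) { this.name = name; }

        public void Begin() => stopwatch.Restart();

        public void End()
        {
            stopwatch.Stop();
            Console.WriteLine($"{name}: {stopwatch.Elapsed.TotalMilliseconds:F3} ms");
        }
    }

Usage is just probe.Begin() at the top of a suspect code path and probe.End() at the bottom.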

Correct me if I’m wrong, but MonoGame would have to be responsible here. There were no issues with garbage collection, CPU, memory, rendering performance, update performance, etc. The code is lightning fast. I benchmarked everything and can conclusively say that all of the code I wrote runs in about 0.06 ms per frame, there are zero GCs happening, and the CPU isn’t even remotely close to its limits.

I even ran a test where I forced GC to run on every single frame, just for the lulz (and because I was so confused as to what could be happening), and it had absolutely zero effect on the stuttering.

My guess is that it has something to do with fixed timesteps, because MonoGame will only do certain things after the 16.67 ms has passed. I haven’t dug into the depths, but I don’t want to mislead anyone into thinking it’s their code at fault when it’s linked to a deeper issue.

What I mean is how movement of objects is handled in the game. There are many pitfalls related to positioning and movement which newer programmers (and I’m not suggesting anyone in this thread is new) may fall into. It’d be equally unhelpful if we led a newer programmer into thinking MonoGame is at fault when there might be deeper issues in their own code.

For instance, if fixed timestep is on, and someone ties all their movement to a fixed value per frame… this works fine, unless frames get dropped, resulting in slowdown/jitters. Or indeed, speed-up, if there were somehow extra frames.

Updating all translation/position data based on the GameTime’s delta time then makes sense.

But even there, there could be issues. The obvious one being that someone uses delta time incorrectly, or makes a mistake in their maths somewhere. Even something as simple as a missing (cast) to/from a float could cause unintended behavior in the way objects are moved around frame-to-frame. I’d say most of the problems I’ve read about on this issue, related to stuttering/jitters, ended up being some kind of error in the handling of delta time, or in the maths for calculating movement. All those sorts of things can cause janky-looking movements even if the game is running at a very high frame rate.
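
For anyone landing here from a search, the usual delta-time pattern looks something like this (position and velocity are hypothetical Vector2 fields):

    private Vector2 position;
    private Vector2 velocity = new Vector2(120f, 0f); // pixels per second

    protected override void Update(GameTime gameTime)
    {
        // ElapsedGameTime is a TimeSpan and TotalSeconds is a double,
        // so cast to float once rather than truncating mid-calculation.
        float dt = (float)gameTime.ElapsedGameTime.TotalSeconds;

        // Frame-rate independent movement: units per second, not per frame.
        position += velocity * dt;

        base.Update(gameTime);
    }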

It’s also possible to get janky movements depending on how the actual rendering is being done. If the game is tied to a virtual resolution, for instance, and that resolution is low to get a pixelated effect… then depending on how resolution and scaling are handled, you can get weird effects in the way things are rendered, or even in the way sub-pixel movements are handled.
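
One common mitigation for the sub-pixel case, sketched under the assumption of a low-resolution render target (spriteBatch, texture, and position are hypothetical names): simulate in floats, but snap to whole pixels at draw time.

    // Keep the simulation in floats, but draw on whole-pixel boundaries so
    // the upscaled low-res image doesn't shimmer as objects cross pixels.
    Vector2 drawPosition = new Vector2(
        (float)Math.Floor(position.X),
        (float)Math.Floor(position.Y));
    spriteBatch.Draw(texture, drawPosition, Color.White);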

Point being, ‘jittery framerate’ is such a nebulous and encompassing problem, covering a massive range of potential causes, that it’s good to make sure people consider all the possibilities. Hence my first comment that ‘this is an unsatisfying resolution’, because really, we still don’t know what happened or why it happened. And more to the point, someone visiting this page from a Google search months from now might have a totally unrelated issue to yours, but with similar symptoms, and they could walk away from this thread convinced the problem is caused by X or Y even if it isn’t.

Completely agree with Rei here. It needs to be clear that this solved this specific issue and was not (or possibly not) an underlying fault. And I am pretty certain one should never really base movement on unpredictable constants, where ‘constants’ is a clue of sorts. :smirk:

Well, I have to strongly disagree here. Fixed Timesteps are a cornerstone of deterministic multiplayer. It’s literally one of the best things to have for certain game types, and countless multiplayer games rely on them.

Naturally then, the server is the constant provider, no?

Yes. And I’m not sure why that’s posed as a question?

If we want to dig into this subject, then let’s address the issues of concern.

For instance, if fixed timestep is on, and someone ties all their movement to a fixed value per frame… this works fine, unless frames get dropped, resulting in slowdown/jitters. Or indeed, speed-up, if there were somehow extra frames.

The key here being it’s fine “unless frames get dropped.” Okay, right. So why are the frames getting dropped? That is the question. The updates are hitting every frame correctly and register the movements correctly. However, the rendering may encounter lost frames for hardware reasons that you are more familiar with than I am. This is where vsync comes in. Rendering faster than it needs to = it doesn’t drop my frames any more.

Point being, ‘jittery framerate’ is such a nebulous and encompassing problem […]

Which is why I thoroughly investigated the problem and narrowed it down exclusively to rendering. If there was a problem with my actual game code, the janky framerate would occur regardless of the speed of the rendering.

I am not suggesting that my code is godlike and untouchable. I can write flawed code. But given the inputs and repeated experimentation of this code, I can verify that rendering is the only source that is causing frames to disappear.

I am not trying to insult MonoGame. I am not questioning your understanding of vsync. But the idea that I’m potentially doing something wrong for having used a fixed timestep is frankly absurd, and it is completely irrelevant to the cause of any jittery, choppy frame movement.

Quite frankly, this thread will only descend into non-constructive debate from what I can see, so that’s me out on this one.

Good luck though :pray:

Though I might add, I know a lot about network gameplay coding, and frankly, if jittering is a problem, then it is 100% coding related, because I know it can be smoothed out. I think the expression is Predictive Networking or something.

EDIT

Oh, you know what, just realised this was for a 2D game, ignore everything I said :joy:

For the record, I would like a constructive conversation, and a deeper understanding of how I could go about resolving this. And yes, I was perhaps a bit frustrated by my interpretation of some of the responses. I apologize for that. But if there is light to be shed on this, I would like to hear it.

As of right now, network issues can be ruled out as well. I can toggle between my localhost server and an internal fake server that single player uses. In either case, latency would not be an issue, and in singleplayer there’s no traffic even being generated.

So, single player, 2D. And as a recap, the entire scene currently updates in something like 0.05ms, or about 340 times per frame. There is no GC occurring; all objects are correctly pooled, and in the test runs I’m working with there is no purging of any objects (or at least none that I’m aware of, and certainly none that have incurred a GC penalty).

All rendering is separated logically within the appropriate methods, and the render loop is equally fast.

I am using MonoGame’s fixed timestep, rather than implementing my own. This is because of an assumption I’m making, and clarity on it may help resolve an underlying issue. My assumption is that MonoGame will run the game logic (the Update method) and, immediately upon completion of the game logic, trigger the render update. Then it will simply wait until the next frame is ready to trigger (e.g. if (time > nextElapseTime) { do game logic }).
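
My mental model of that loop, in rough pseudocode (this is my assumption about the behavior, not MonoGame’s actual source):

    // Simplified sketch of how I assume the fixed-timestep tick behaves.
    TimeSpan accumulated = TimeSpan.Zero;
    while (gameIsRunning)
    {
        accumulated += timeSinceLastTick;
        while (accumulated >= TargetElapsedTime)   // 16.67 ms by default
        {
            Update(gameTime);                      // game logic
            accumulated -= TargetElapsedTime;
        }
        Draw(gameTime);                            // render right after the logic
        // ...then wait until the next tick is due
    }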

Now, assuming all of those things are true, I don’t understand why vsync would matter. I could equally understand why someone else might be inclined to think the matter is due to my game code.

But if it had to do with my code, then the movements would also have to be janky when rendering is keeping up, and that is not happening. Unless anyone here can refute that particular statement, I am left to conclude that something is happening at a deeper level (maybe MonoGame, maybe hardware, whatever).

In which case, I am here, looking to the experts to help me understand why.

Look up, Client Side Prediction, and look into how V-Sync works, like, a deep dive…

Here are some resources I put together for you…

Buy this book and read it, like, all of it! Click the print edition for the full preview experience [or just click the Kindle edition page preview image and then click Print at the top; sadly, the topic I think you may benefit from is not in the preview, but the Latency section has some gems in the preview].

I coughed up the print edition price :see_no_evil: totally forgot how much I paid for it so… yay

And for VSYNC, here:

https://www.youtube.com/results?search_query=How+VSYNC+affects+game+code

This one explains it quite well…

If possible, could you record a video demonstrating the actual jittering you are experiencing, so it can be dug into a bit more…

I want to add: frame issues should be fixed in code, and some users hate not being able to uncap from refresh rates, but honestly, I prefer using vsync a lot more than allowing my hardware to overheat producing useless extra frames… this will have to be a decision you make based on your development process.

During this searching, I came across this:

Can someone tell me, is that a tea mug sip at the end? Really annoying to hear that cut off that way [#BritishThing]

Anyway, at this time, I am still coming back to MonoGame and coding with a fresh pair of eyes [literally], and as such, I cannot offer more detail at present. However, networking is in my planned project timeline, so perhaps a year from now I may look into it more deeply.

I hope this gets deep-dived further; it could make for a good read for future networking-enthusiast coders.

Well, as I said, this is happening on single player, so that’s a very different topic you’re addressing.

If I find something relevant in my process, I’ll be sure to update.

Necro’ing this thread because I have been trying to understand this issue for some time, and I’m not alone. I believe this is entirely to do with the synchronisation of MonoGame’s render frames in a window and the screen output.

For context, my test game was just a basic platformer with tile-based collision that runs with a low resolution backbuffer (that gets upscaled into the window).
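
(For clarity, “low resolution backbuffer” here means the usual render-target trick, roughly like this; the names and sizes are illustrative:)

    // Draw the scene into a small render target, then upscale it to the window.
    RenderTarget2D lowResTarget = new RenderTarget2D(GraphicsDevice, 320, 180);

    GraphicsDevice.SetRenderTarget(lowResTarget);
    // ... draw the scene at 320x180 here ...
    GraphicsDevice.SetRenderTarget(null);

    spriteBatch.Begin(samplerState: SamplerState.PointClamp); // crisp pixel scaling
    spriteBatch.Draw(lowResTarget, GraphicsDevice.Viewport.Bounds, Color.White);
    spriteBatch.End();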

As I was encountering this stuttering issue in the game window, I proceeded to try to capture video with OBS. However, the video preview shown in REAL TIME on my other monitor through OBS showed absolutely no stuttering at all. The recorded video showed no stuttering either. This occurred with the vsync flag turned on or off.

So this would seem to suggest it’s an issue at the video output level?

Oh my god, you revived something I was looking for lol