Accuracy of game.TargetElapsedTime on laptops (affects Beam Racing)

(Founder of Blur Busters here)

Short Question:

How do I make TargetElapsedTime as microsecond-accurate as possible?
…i.e., have the next Update()/Draw() cycle called exactly on time, not too early, not too late, with better-than-millisecond granularity.

Long Form Explanation:

I have a special beam-racing application that I created in MonoGame (see the YouTube video of a mouse arrow dragging the exact position of a VSYNC OFF tearline), and I’m trying to create the world’s first cross-platform beam racing demo for Blur Busters using the MonoGame engine.

On the PC, game.TargetElapsedTime is microsecond-accurate. Once I set it, the next Update/Draw cycles are called right on the dot, almost to the exact microsecond. Fantastic. Beam racing success.

…Look at that! Mouse dragging the exact position of a VSYNC OFF tearline! Old skool beam racing… (using 100% pure MonoGame APIs to simulate raster interrupts, with no raster register – just precision clock math as an offset from VSYNC timestamps).
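The "precision clock math as an offset from VSYNC timestamps" is language-agnostic, so here is a minimal Python sketch of the raster-position guess. The function name and the 1125-scanline total are my own assumptions for illustration, not code from the demo:

```python
# Sketch: estimate which scanline the GPU is currently scanning out,
# using only a high-resolution clock and the last VSYNC timestamp.
# No raster register needed; this is the core of software beam racing.

def estimate_raster(now_s, last_vsync_s, refresh_hz, total_scanlines):
    """Estimated scanline (0..total_scanlines-1) being scanned out now."""
    refresh_period = 1.0 / refresh_hz
    elapsed = (now_s - last_vsync_s) % refresh_period  # wrap into this refresh
    return int(elapsed / refresh_period * total_scanlines)

# Example: 60 Hz display, 1125 total scanlines (1080 visible + blanking),
# one quarter of the way through a refresh cycle:
line = estimate_raster(now_s=0.25 / 60, last_vsync_s=0.0,
                       refresh_hz=60, total_scanlines=1125)
# line == 281 (about a quarter of 1125)
```

The better the clock and the VSYNC timestamp, the tighter the raster guess, which is why timer precision is the whole ballgame here.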

MonoGame retweeted this recently, see

However, on my laptop, game.TargetElapsedTime is only millisecond-accurate. If I do huge hacks and directly Present()/Flush() timed off QueryPerformanceCounter(), it works. But I’d rather MonoGame offer an optional ultra-precision, microsecond-accurate frame pacing mode (even at higher battery consumption), where TargetElapsedTime precisely schedules the next Update/Draw to within approximately 10 microseconds, like I can already achieve via other laptop hacks (direct polls of QueryPerformanceCounter() instead). Those hacks are not very MonoGamey.

How can I make MonoGame trigger the next Update() call with better-than-millisecond granularity? I need this for beam racing on more platforms (laptops, Linux, Mac with beamsync OFF / VSYNC OFF, etc.).

This is current research toward the lagless VSYNC ON modes recently added to some emulators (synchronizing the emulator raster to the real-world raster; now successfully implemented in WinUAE and GroovyMAME).


Thanks so much,
Mark Rejhon
Founder, Blur Busters

TargetElapsedTime precisely schedules the next Update/Draw to within approximately a 10-microsecond accuracy

I think TargetElapsedTime has something like 1 to 2 milliseconds of error (.002 seconds, i.e. 2000 microseconds).
I’m not sure you can get much better than that without problems.

I’m not sure what the point is, though; vertical retraces typically occur 120 to 200+ times a second.
If your game can run at those speeds, what’s the big deal about missing a frame now and then?

I mean, I sort of think I get it: you’re trying to sleep perfectly so you don’t have to wait, for efficiency. But VSYNC is a monitor signal as well.

Anyways, as far as I know, QueryPerformanceCounter is as fast as it gets without manually getting into calling system interrupts; I’m not sure the performance counter is thread-safe either. There is Environment.TickCount or something; I have no idea how accurate that is.
If you want to do your own timing entirely, just turn off IsFixedTimeStep.

It already is if I use VSYNC OFF.

Video proof of microsecond MonoGame 3.6 accuracy:

This is a MonoGame app! With pixel-exact VSYNC OFF tearlines (with NO raster register)

Every single pixel on that screen in that YouTube has only 3 ms between the spriteBatch draw call and photons hitting my eyeballs (remember: Blur Busters tests input lag with photodiode oscilloscopes and high-speed cameras, e.g. G-SYNC 101 | Blur Busters). It’s a recipe that makes a lagless VSYNC ON mode possible via some cleverness (links below)

I simply manipulate .TargetElapsedTime in real time inside every single Update(), to microsecond accuracy, to achieve pixel-exact VSYNC OFF tearlines. It dynamically increases/decreases to compensate for processing loads; it’s rather simple mathematics for beam chasing professionals (e.g. people who’ve programmed raster interrupts Back In The Day).
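That per-Update() adjustment boils down to a small feedback computation. A hedged Python sketch of the idea (the real demo assigns Game.TargetElapsedTime in C#; the function name and clamp bounds here are my own assumptions):

```python
def next_interval(now_s, next_deadline_s, min_s=0.0001, max_s=0.004):
    """Pick the next fixed-timestep interval so that the following
    Update()/Draw() lands on next_deadline_s, compensating for however
    long this frame's processing actually took. Clamped for safety."""
    interval = next_deadline_s - now_s
    return max(min_s, min(max_s, interval))

# If this frame ran ~300 microseconds long, the next interval shrinks
# from the nominal 2 ms to about 1.7 ms to stay on schedule:
iv = next_interval(now_s=0.0103, next_deadline_s=0.0120)
```

The feedback is what keeps the tearline pinned: any scheduling error in one cycle is subtracted out of the next cycle's interval instead of accumulating.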

That’s my code. It’s part of an upcoming open-source beam-racing sandbox demo called “Tearline Jedi” that will teach beam-racing newbies.

I’m just asking for it to work on laptops too. I can already emulate a microsecond-accurate TargetElapsedTime if I hack outside of MonoGame to force its accuracy, via creatively-placed busywaits plus a forced GraphicsDevice.Present() inside my own Draw(). So it works on laptops. But I want to do it in unmodified MonoGame if possible…
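For anyone curious, the "creatively-placed busywait" pattern is roughly the classic hybrid wait: coarse OS sleep until close to the deadline, then spin on the high-resolution clock for the final stretch. A portable Python sketch (not the actual C# hack; the 2 ms spin margin is an assumption):

```python
import time

def precision_wait_until(deadline_s, spin_margin_s=0.002):
    """Sleep coarsely until ~spin_margin_s before the deadline (cheap on
    battery), then busy-spin on the high-resolution clock for the final
    stretch to get microsecond-class wakeup accuracy."""
    while True:
        remaining = deadline_s - time.perf_counter()
        if remaining <= 0:
            return
        if remaining > spin_margin_s:
            time.sleep(remaining - spin_margin_s)  # coarse, imprecise sleep
        # otherwise fall through and spin (pure clock polling)

start = time.perf_counter()
precision_wait_until(start + 0.005)
elapsed = time.perf_counter() - start  # >= 5 ms, overshoot typically tiny
```

The spin margin trades battery for precision, which is exactly the trade-off an opt-in high-precision mode would make explicit.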

(Maybe even via a “.HighPerformanceClockMode = true” attribute, using 10% more battery power when this suggested hypothetical flag is enabled. Right now MonoGame is probably automatically using battery-saving low-precision timer events.)

Do I have to get my hands dirty and modify the MonoGame code instead? Before I do, I’ll offer a $200 BountySource reward so I can delete my hack. My laptop already has a microsecond timer, but MonoGame isn’t using it, you see?

But before doing this, I want to be sure that there’s no way to already do it (since it’s already possible to do via outside-of-MonoGame hacks).

I want this to be the world’s first cross-platform beam chasing demo, so I need to make sure all platforms optionally use the already-existing higher-precision clocks that actually fixed my laptop (by bypassing MonoGame code).

WinUAE Amiga Emulator reduced 40ms of input lag to less than 5ms using my own algorithm: Input latency measurements (and D3D11) - Page 8 - English Amiga Board

Android already does sub-millisecond beam racing for virtual reality:
That’s a device MUCH more underpowered than my laptop…

Here’s my Blur Busters article:

Speaking as a beam racing expert, please listen: MonoGame can do it. It’s simply ignoring the already-existing high-precision timers in my laptop that I can already access via alternative means.

I want to see a cross-platform-compatible beam-racing library. Wouldn’t MonoGame love to conquer more of the VR market, reducing virtual reality and emulator input lag via beam racing techniques?

In reality, I only need approximately 1/67,000th-second accuracy for 1080p60, so “microsecond accuracy” is simply “best effort”.
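That 1/67,000 figure is just the horizontal scanrate of 1080p60; a quick sanity check (1125 total scanlines per frame is the typical 1080p timing, an assumption of mine):

```python
# Horizontal scanrate = total scanlines per frame x refresh rate.
# The beam-racing error budget is about one scanline's worth of time.
total_scanlines = 1125            # 1080 visible + 45 blanking (typical 1080p)
refresh_hz = 60
scanrate_hz = total_scanlines * refresh_hz   # 67,500 scanlines per second
scanline_time_s = 1.0 / scanrate_hz          # ~14.8 microseconds per line
```

So single-scanline accuracy at 1080p60 means roughly 15 µs of timing budget, comfortably above true microsecond precision.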

But as the YouTube video proof clearly shows, accuracy is even better than that: it’s running at a 160 KHz scanrate (1080p144), and such stable rasters in my YouTube required 1/160,000-sec accuracy, which MonoGame 3.6 is actually successfully delivering with no modifications! Just standard real-time manipulation of .TargetElapsedTime on the fly, plus simple System.Diagnostics.Stopwatch (0.1 µs) mathematics to guess the raster position as an offset from VBI. That’s all; VSYNC OFF tearlines are simply rasters.

To make MonoGame ready for VR beam-racing applications and emulator beam-racing applications, it needs an optional “always-use-high-performance-clocks-only” mode (at the expense of a slight amount of battery power on laptops).

Well, I’m nearly 100% sure MonoGame doesn’t have microsecond-accurate timing already.

I’m 100% sure this would also be useful for code-block timing in certain cases for MonoGame, especially where Stopwatch won’t work right, even though it uses the performance counter under the hood.

You can directly download the source and compile it from GitHub; make a demo version that replaces, or is additional to, what MonoGame currently uses for GameTime.


Maybe better to open an issue and submit your idea for discussion: what steps you’d need to take for it to comply with what’s required to get it added in, or to replace the current GameTime, etc., and get help on how to proceed.

The good news is that it is already good enough when running on most of my machines.

It’s on laptops that timer accuracy is automatically, randomly downgraded (against my will) to save battery power. In reality, I only need approximately 1/67,000th-second accuracy for 1080p60Hz, so “microsecond accuracy” is simply “best effort”.

Occasionally, out of necessity for a specific kind of lag-critical application, 10% more power consumption can reduce lag from 40 milliseconds down to less than 5 milliseconds. We plug the laptop in anyway, but timer accuracy doesn’t automatically upgrade itself back to its original good accuracy. So if one could set an optional flag, like “.HighPerformanceClockMode = true”, to kick the code into high gear, that would be perfect for me and other users who need beam racing!

Often, things will work fine if the timing jitters by, say, 6 µs there, 11 µs here, 5 µs now, etc. That’s what’s happening. What’s important is that beam racing (to do a lagless VSYNC ON mode) is a function of a display’s horizontal scanrate, and that’s the error margin.

And when it’s momentarily worse, there are simply brief reappearances of tearing artifacts for only that one particular refresh cycle, then back to normal the next refresh cycle (lagless VSYNC ON via tearingless VSYNC OFF via the beam racing technique). There’s a jitter safety margin technique that I successfully developed for beam-raced rendering.
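The jitter safety margin can be sketched as a simple guard condition (Python illustration; names and numbers are hypothetical). Since the new frame differs from the previous one only in the freshly rendered frameslice, a tearline stays invisible whenever it lands in the region where both frames hold identical pixels, padded by more scanlines than the worst observed jitter:

```python
def safe_to_present(estimated_raster, new_slice_top, jitter_margin_lines):
    """The new frame differs from the old one only below new_slice_top.
    The tearline appears wherever the raster is at Present() time, so it
    stays invisible as long as it lands above new_slice_top (identical
    pixels in both frames), minus a jitter margin of extra scanlines."""
    return estimated_raster <= new_slice_top - jitter_margin_lines

assert safe_to_present(240, new_slice_top=270, jitter_margin_lines=20)
assert not safe_to_present(260, new_slice_top=270, jitter_margin_lines=20)
```

With ~10 µs of timer jitter and ~15 µs per scanline, a margin of even a handful of scanlines absorbs the error; when jitter momentarily exceeds the margin, you get exactly the one-refresh tearing artifact described above.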

Real-world implementation of the “Lagless VSYNC ON” technique via beam racing: the WinUAE emulator (which I helped!). It actually successfully synchronizes the real-world raster with the emulator raster, and yes, it works on all GPUs (including laptops and Intel embedded GPUs). Chips have had the RDTSC (microsecond counter) instruction for roughly a couple of decades now…

So nothing inherently stops a modern system (made in the last ~8-10 years) from doing this; it’s simply an arbitrary limitation somewhere that is preventing me.

Possibly I will open a bug-tracking entry and put a bounty on it. But first, I’m going to wait and see if someone answers with a workaround (one less hacky than mine), and then…

If none, I’ll either investigate doing the code change myself and submit a pull request to MonoGame, or open/pay a $200 BountySource for someone else to do the relatively simple MonoGame engine change. [Readers, ping me at mark[at], if you know a quick answer and I’m not paying attention…]

Another reason why this may be important to future MonoGame implementations is beam-raced virtual reality.

Some VR apps on Android use beam racing and thus demand automatically using the highest-precision mechanisms available on a platform (even if that uses slightly more battery power, in order to remove headache-inducing latency).

There are ways to use MonoGame for VR, and mechanisms for maximizing synchronization accuracy should be kept where possible, at least as options.

I upgraded from MonoGame 3.6 and have noticed a (very) slight degradation of accuracy (might be due to the develop branch), but it merits some attention, and maybe future frame-pacing accuracy verification.

I have a new YouTube video of the “Lagless VSYNC” beamracing techniques I’m doing. Works on both PC and Mac.

Thanks to beam racing, this is a “lagless VSYNC” mode: the photons of each row of text are hitting my eyeballs only 2-3 milliseconds after the spriteBatch.DrawString calls on my BenQ XL2720 gaming monitor (which has panel scanout synchronous with cable scanout). That’s mostly LCD GtG, so nothing gets lower latency than beam racing!

One problem I am having is that I can’t get a full 1000 Hz from my computer mouse. I’m trying to call Mouse.GetState() multiple times per refresh cycle so I can get lower-lag mouse reads, since I need to read input multiple times during a single Draw() frame.

Formerly, I relied on Update()/Draw() alternating ~1000+ times a second via a precision-granularity .TargetElapsedTime, which works fine on PC but intermittently fails on my laptop.

So to fix the laptop problem I’m now doing multiple GraphicsDevice.Present() calls in one Draw() loop, as a workaround for the MonoGame laptop-precision bug. With this, I lost access to 1000 Hz polls: Mouse.GetState() returns the same coordinates for the full duration of the Draw() call, even if I’m doing a beam-race loop lasting 16 milliseconds.
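The multiple-Present() workaround is essentially a frameslice schedule within one refresh cycle, and each slice deadline is also where a fresh mouse poll would ideally happen. A minimal Python sketch of the schedule (hypothetical names; the real loop calls GraphicsDevice.Present() in C#):

```python
def frameslice_deadlines(vsync_s, refresh_hz, num_slices):
    """Timestamps, one per frameslice, at which to render + Present
    during a single refresh cycle, each slice racing just ahead of the
    raster. Input would ideally be re-polled at every deadline."""
    period = 1.0 / refresh_hz
    return [vsync_s + period * i / num_slices for i in range(num_slices)]

# 4 frameslices at 60 Hz: one Present() roughly every 4.17 ms.
d = frameslice_deadlines(vsync_s=0.0, refresh_hz=60, num_slices=4)
# d ~= [0.0, 0.00417, 0.00833, 0.0125] seconds
```

The mouse problem above is that all four of those deadlines currently see the same stale Mouse.GetState() snapshot, because the state only refreshes between Draw() calls.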

Yes, I know MonoGame was never designed for beam racing. But look at that YouTube video. Works on PC and Mac. And virtual reality programmers are adding beam racing to reduce input lag, so this is a good way to make MonoGame get ready for VR.

So in other words:

– How to improve .TargetElapsedTime precision on laptops, so I can cycle Update/Draw frequently enough for high-Hz polling of high-Hz gaming mice? Even improving MonoGame precision itself ($200 BountySource offered for anyone for a MonoGame engine commit).


– I continue my workaround workflow of doing multiple precision-timed GraphicsDevice.Present() calls during one Draw() call (one beam-raced refresh cycle). I would like a separate mouse poll per frameslice, though. How can I get multiple different refreshed Mouse.GetState() values during one Draw() call? It doesn’t refresh itself. (Is there a Mouse.ForceRefreshState(), or could one be added?)

Due to the problem in this bugtracking ticket, I can no longer rely on OPTION A for laptops, and the OPTION B workaround now forces a new problem into my hands…

I’m essentially debating which pick-poison approach is preferred. :wink: