It already is if I use VSYNC OFF.
Video proof of microsecond MonoGame 3.6 accuracy:
This is a MonoGame app! With pixel-exact VSYNC OFF tearlines (with NO raster register).
Every single pixel on that screen in that YouTube video has only 3 ms between the spriteBatch draw call and photons hitting my eyeballs (remember: Blur Busters tests input lag with a photodiode oscilloscope and a high-speed camera -- e.g. www.blurbusters.com/gsync/gsync101 etc). It's a recipe that makes a lagless VSYNC ON mode possible via some cleverness (links below).
I simply manipulate .TargetElapsedTime in real time inside every single Update(), to microsecond accuracy, to achieve pixel-exact VSYNC OFF tearlines. It dynamically increases/decreases to compensate for processing loads as well -- rather simple mathematics for beam-chasing professionals (e.g. people who programmed raster interrupts Back In The Day).
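The idea can be sketched roughly like this, assuming a standard MonoGame `Game` subclass. Note this is a hedged sketch of my approach, not my actual code: `TargetElapsedTime` and `Stopwatch` are real APIs, but `SecondsUntilTearline()`, `stopwatch`, and `lastVbiTicks` are hypothetical placeholders for my scheduling math and VBI bookkeeping:

```csharp
// Sketch: retune MonoGame's fixed timestep every frame so the next
// Draw() lands at a chosen raster position (a VSYNC OFF tearline).
protected override void Update(GameTime gameTime)
{
    // Estimate the current raster position as a time offset from the
    // last VBI, using the ~0.1us-granularity Stopwatch.
    // (stopwatch and lastVbiTicks are hypothetical fields.)
    double secondsSinceVbi =
        (double)(stopwatch.ElapsedTicks - lastVbiTicks) / Stopwatch.Frequency;

    // Hypothetical helper: time remaining until the beam reaches the
    // scanline where the next tearline should appear, with compensation
    // for processing load measured on previous frames.
    double secondsUntilTarget = SecondsUntilTearline(secondsSinceVbi);

    // This IS a real MonoGame property: dynamically retuning it on the
    // fly is what makes the beam chasing work.
    TargetElapsedTime =
        TimeSpan.FromTicks((long)(secondsUntilTarget * TimeSpan.TicksPerSecond));

    base.Update(gameTime);
}
```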
I wrote this myself. It's part of an upcoming open-source beam-racing sandbox demo called "Tearline Jedi" that will teach beam-racing newbies.
I'm just asking for it to work on laptops too. I can already emulate a microsecond-accurate TargetElapsedTime if I hack outside of MonoGame, forcing its accuracy via creatively-placed busywaits plus a forced GraphicsDevice.Present() inside my own Draw() -- then it works on the laptop. But I want to do it in unmodified MonoGame if possible...
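For reference, that outside-of-MonoGame workaround looks roughly like this (a hedged sketch, not my literal code: `GraphicsDevice.Present()`, `SpriteBatch`, and `Stopwatch` are real APIs, while `stopwatch` and `busyWaitTargetTicks` are placeholder fields standing in for my actual raster-deadline math):

```csharp
// Sketch of the hack I want to delete: force presentation timing myself
// with a spin-wait on the high-precision Stopwatch, because MonoGame's
// own timer path is too coarse on this laptop.
protected override void Draw(GameTime gameTime)
{
    spriteBatch.Begin();
    // ... draw the frame slice for the upcoming tearline ...
    spriteBatch.End();

    // Busywait until the precomputed raster deadline (placeholder field).
    // A spin loop is used because Thread.Sleep() is far too coarse for
    // sub-scanline timing.
    while (stopwatch.ElapsedTicks < busyWaitTargetTicks)
    {
        // spin
    }

    // Force the buffer flip exactly now, bypassing MonoGame's scheduling.
    GraphicsDevice.Present();
}
```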
(Maybe even via a hypothetical ".HighPerformanceClockMode = true" attribute -- using 10% more battery power when this suggested flag is enabled. Right now, MonoGame is probably automatically using battery-saving low-precision timer events.)
Do I have to get my hands dirty and modify the MonoGame code instead? Before I do, I'll offer a $200 BountySource reward so I can delete my hack. My laptop already has a microsecond timer -- MonoGame just isn't using it, you see?
But before doing this, I want to be sure there's no existing way to do it (since it's already possible via outside-of-MonoGame hacks).
I want this to be the world's first cross-platform beam-chasing demo, so I need all platforms to optionally use the already-existing higher-precision clocks -- the same clocks that fixed my laptop when I bypassed MonoGame's code.
WinUAE Amiga Emulator reduced 40ms of input lag to less than 5ms using my own algorithm: http://eab.abime.net/showthread.php?t=88777&page=8
Android already does sub-millisecond beam racing for virtual reality: https://www.imgtec.com/blog/reducing-latency-in-vr-by-using-single-buffered-strip-rendering/
That's a device MUCH more underpowered than my laptop....
Here's my Blur Busters article:
As a beam-racing expert, please listen: MonoGame can do it. It's simply ignoring the already-existing high-precision timers in my laptop that I can already access via alternative means.
I want to see a cross-platform-compatible beam-racing library. Wouldn't MonoGame love to conquer more of the VR market by reducing virtual-reality and emulator input lag via beam-racing techniques?
In reality, I only need approximately 1/67,000 sec accuracy for 1080p60 (one scanline at a ~67 KHz horizontal scan rate), so "microsecond accuracy" is simply "best effort".
But the YouTube video proof clearly shows accuracy even better than that -- it's running at a 160 KHz horizontal scan rate (1080p144), and such stable rasters in my YouTube video required 1/160,000 sec accuracy, which MonoGame 3.6 is actually achieving with no modifications! Just standard real-time manipulation of .TargetElapsedTime on the fly, plus simple System.Diagnostics.Stopwatch (0.1 us granularity) mathematics to guess the raster position as a time offset from VBI -- that's all. VSYNC OFF tearlines are simply rasters.
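The raster-guess arithmetic itself is trivial. Here is a self-contained sketch of the kind of Stopwatch math involved; the 160,000 scanlines/sec figure is the illustrative 1080p144 assumption from above, not a measured value, and the class/method names are mine, not MonoGame's:

```csharp
using System;
using System.Diagnostics;

public static class RasterGuess
{
    // Illustrative assumption for a 1080p144 signal: ~160,000 scanlines
    // per second in total (visible scanlines plus the VBI).
    public const double ScanlinesPerSecond = 160000.0;

    // Guess the current scanline number as a time offset from the last
    // VBI timestamp; both timestamps are in Stopwatch ticks (~0.1 us
    // granularity on typical PCs).
    public static int GuessScanline(long nowTicks, long lastVbiTicks)
    {
        double secondsSinceVbi =
            (double)(nowTicks - lastVbiTicks) / Stopwatch.Frequency;
        return (int)(secondsSinceVbi * ScanlinesPerSecond);
    }

    public static void Main()
    {
        // Example: 1 ms after VBI, the beam is roughly 160 scanlines down.
        Console.WriteLine(GuessScanline(Stopwatch.Frequency / 1000, 0));
    }
}
```

The tearline's vertical position is then just this scanline number, which is why a plain high-precision clock is all the "raster register" a modern beam chaser needs.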
To make MonoGame ready for VR beam-racing and emulator beam-racing applications, it needs an optional "always-use-high-performance-clocks-only" mode (at the expense of a slight amount of battery power on laptops).