For the record, I would like a constructive conversation, and a deeper understanding of how I could go about resolving this. And yes, I was perhaps a bit frustrated by my interpretation of some of the responses. I apologize for that. But if there is light to be shed on this, I would like to hear it.
As of right now, network issues can be ruled out as well. I can toggle between my localhost server and an internal fake server that single player uses. In either case latency is not a factor, and in single player no network traffic is generated at all.
So, single player, 2D. As a recap, the entire scene currently updates in something like 0.05 ms, meaning it could run roughly 340 times within a single frame's budget. There is no GC occurring; all objects are correctly pooled, and in the test runs I'm working with there is no purging of objects (or at least none that I'm aware of, and certainly none that incurs a GC penalty).
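To be concrete about the "no GC" claim, this is the kind of check that backs it up (just a sketch; the helper names are illustrative, but GC.CollectionCount is the standard .NET call):

```csharp
using System;

// Sketch of the kind of check I mean; the helper names are mine,
// but GC.CollectionCount is the standard .NET call.
static class GcCheck
{
    static int _gen0, _gen1, _gen2;

    public static void Begin()
    {
        _gen0 = GC.CollectionCount(0);
        _gen1 = GC.CollectionCount(1);
        _gen2 = GC.CollectionCount(2);
    }

    public static void End()
    {
        // Any non-zero delta here means a collection ran during the test window.
        Console.WriteLine(
            $"GC during run: gen0={GC.CollectionCount(0) - _gen0}, " +
            $"gen1={GC.CollectionCount(1) - _gen1}, " +
            $"gen2={GC.CollectionCount(2) - _gen2}");
    }
}
```

If any of those three deltas is non-zero over a test run, a collection happened somewhere in that window.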
All rendering is logically separated into the appropriate draw methods, and the render loop is equally fast.
I am using MonoGame’s fixed timestep rather than implementing my own. This rests on an assumption I’m making, and getting clarity on that assumption may help resolve the underlying issue. My assumption is that MonoGame runs the game logic (the update method), triggers the render pass immediately after the game logic completes, and then simply waits until the next frame is due (e.g. if(time > nextElapseTime) { do game logic }).
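To make that assumption explicit, the behavior I picture is roughly this (a simplified sketch, not MonoGame's actual implementation; the names here are mine):

```csharp
using System;

// Simplified sketch of the fixed-timestep behavior I'm assuming,
// not MonoGame's actual source; the names are illustrative.
class FixedStepLoop
{
    readonly TimeSpan _targetElapsed = TimeSpan.FromTicks(166667); // ~1/60 s
    TimeSpan _accumulated = TimeSpan.Zero;

    public void Tick(TimeSpan sinceLastTick)
    {
        _accumulated += sinceLastTick;

        // Run game logic only once a full step has elapsed,
        // catching up with extra steps if we fell behind.
        while (_accumulated >= _targetElapsed)
        {
            Update(_targetElapsed);        // game logic
            _accumulated -= _targetElapsed;
        }

        Draw();                            // render immediately after updating
    }

    void Update(TimeSpan elapsed) { /* game logic */ }
    void Draw() { /* rendering */ }
}
```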
Now, assuming all of those things are true, I don’t understand why vsync would matter (the settings I mean are sketched at the end of this post). I can equally understand why someone else might be inclined to think the problem is in my game code.
But if it were my code, the movements would also have to be janky when rendering is keeping up, and that is not happening. Unless anyone here can refute that particular point, I am left to conclude that something is happening at a deeper level (maybe MonoGame, maybe hardware, whatever).
In which case, I am here, looking to the experts to help me understand why.
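For reference, the vsync and fixed-timestep settings I keep referring to are the standard MonoGame ones, along these lines (a sketch using the default project template names; I'm assuming nothing unusual in my setup):

```csharp
using System;
using Microsoft.Xna.Framework;

// Sketch of the settings in play, using the default project template names;
// IsFixedTimeStep, TargetElapsedTime and SynchronizeWithVerticalRetrace
// are the standard MonoGame properties.
public class Game1 : Game
{
    private readonly GraphicsDeviceManager _graphics;

    public Game1()
    {
        _graphics = new GraphicsDeviceManager(this);

        IsFixedTimeStep = true;                               // MonoGame's built-in fixed timestep
        TargetElapsedTime = TimeSpan.FromSeconds(1.0 / 60.0); // 60 logic updates per second

        _graphics.SynchronizeWithVerticalRetrace = true;      // flip to false to test with vsync off
    }
}
```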