Forcing high-performance mode for laptop GPUs?

Hey guys,

There’s a problem that can happen in MonoGame on laptop systems that have both an on-board Intel HD chip and a dedicated mobile GPU. Nvidia’s Optimus technology renders the desktop with the Intel GPU and only switches on the Nvidia GPU for high-load applications like games, with the Nvidia GPU passing its output back through the Intel GPU for drawing to the screen. AMD has its own version of this called PowerXpress.

The problem is that some games don’t use the dedicated GPU unless the user manually creates a driver profile for the game and sets it to run in high-performance mode. That works, but we can’t exactly tell every affected player to follow a complicated set of instructions, since some players won’t be computer literate. Nvidia and AMD have both published simple fixes for this that involve exporting a global symbol from the executable, which the driver picks up on, and the C++ code for that is very simple:

extern "C"
{
  __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
​  __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

The problem is, there doesn’t appear to be any way to do this in C#, or at least I can’t for the life of me figure it out. The globals reportedly have to be exported directly from the game executable rather than from a DLL, so wrapping them in a native helper library won’t work either. The one approach that does seem to work is this one (Exporting data symbols in C# for NVIDIA Optimus), but it relies on a dirty post-compile hack that takes several minutes to run and stops you from using breakpoints and other debug tools.
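
For completeness, the only other idea I’ve come across (it’s not the exported-symbol fix, just a heuristic some people report working, and apparently hit-and-miss across driver versions) is to load the vendors’ driver DLLs at startup, on the theory that the driver promotes any process that touches NVAPI/ADL to the dedicated GPU. A rough, untested sketch of that idea in C#:

using System;
using System.Runtime.InteropServices;

internal static class DedicatedGpuHint
{
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    private static extern IntPtr LoadLibrary(string fileName);

    // Call as early as possible in Main(), before the GraphicsDevice is
    // created. Some driver versions reportedly treat loading these DLLs as
    // a request for the dedicated GPU; it is not guaranteed to do anything.
    public static void TryEnable()
    {
        LoadLibrary("nvapi64.dll");   // Nvidia, 64-bit ("nvapi.dll" on 32-bit)
        LoadLibrary("atiadlxx.dll");  // AMD, 64-bit ("atiadlxy.dll" on 32-bit)
    }
}

I haven’t been able to verify that on Optimus hardware myself, so treat it as a guess rather than a fix.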

Optimus has been mentioned a few times over the years online (including on this forum), but there doesn’t seem to be a solution posted for it yet. Has anyone managed to find a fix? I wouldn’t be so concerned if this were just a matter of performance, but it’s actually causing frequent DXGI_ERROR_DEVICE_REMOVED crashes for at least one player, in parts of the code that send data to the GPU such as creating vertex buffers or calling DrawUserIndexedPrimitives.
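
In case it helps anyone reproduce this: a quick way to confirm which adapter the game actually ends up on (assuming GraphicsDevice.Adapter.Description behaves the same in MonoGame as it did in XNA) is to log it once the device exists, e.g.:

using Microsoft.Xna.Framework;

public class Game1 : Game
{
    private readonly GraphicsDeviceManager _graphics;

    public Game1()
    {
        _graphics = new GraphicsDeviceManager(this);
    }

    protected override void LoadContent()
    {
        // Log the adapter the game is actually rendering on. A player on an
        // Optimus laptop seeing the Intel chip here means the dedicated GPU
        // was not picked up.
        System.Diagnostics.Debug.WriteLine(
            "Rendering on: " + GraphicsDevice.Adapter.Description);
    }
}

That at least makes it obvious whether the crashes line up with the game running on the Intel adapter.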

I’ve been struggling to figure out something very similar in my game, with the same problems: the game crashing on the GPU side.

Can you use the post-compile hack only for RELEASE builds? Then your player gets what they need and you can still debug during development.
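
Something along these lines in the .csproj might do it (rough sketch; "OptimusExportHack.exe" is just a placeholder for whatever tool that post-compile hack uses):

<!-- Run the post-compile export hack only for Release builds.
     "OptimusExportHack.exe" is a placeholder, not a real tool name. -->
<Target Name="ApplyGpuExportHack" AfterTargets="Build"
        Condition=" '$(Configuration)' == 'Release' ">
  <Exec Command="OptimusExportHack.exe &quot;$(TargetPath)&quot;" />
</Target>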

Yeah, this is probably what I’ll end up doing in the absence of other solutions. Right now the only difference between my debug and release builds is code optimisation and one event handler (for a crash reporter), so I shouldn’t end up with any weird heisenbugs that only show up in release. I’m still not sure whether it’ll fix my user’s crashes, but I’ll report back if it does.
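
For context, that release-only event handler is just the crash-reporter hook sitting behind a conditional compile, roughly along these lines (simplified, names illustrative):

using System;

internal static class Program
{
    [STAThread]
    private static void Main()
    {
#if !DEBUG
        // Only hook the crash reporter in release builds; debug builds keep
        // breaking into the debugger instead of uploading a report.
        AppDomain.CurrentDomain.UnhandledException += OnUnhandledException;
#endif
        using (var game = new Game1())
        {
            game.Run();
        }
    }

#if !DEBUG
    private static void OnUnhandledException(object sender, UnhandledExceptionEventArgs e)
    {
        // Gather details from e.ExceptionObject and hand them to the reporter.
    }
#endif
}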

Did you roll your own crash reporter, or use a third-party lib? What’s the setup?