DirectX game not using Nvidia GPU

I’m using a laptop that has 2 GPUs: an Intel HD Graphics and an Nvidia graphics card. I am also running a two-monitor setup (I’m only running the game on one of them, of course). When I request the GraphicsAdapter descriptions, I get 2 names (one for each monitor), and they are both the Intel HD Graphics: no Nvidia. When I call ToggleFullScreen() with the default HardwareModeSwitch (true) and look at my GPU usage, the Intel HD Graphics is maxed out and the Nvidia is at 0%. How do I get the game to use the Nvidia GPU?
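
For context, a minimal sketch of the adapter query mentioned above, using MonoGame’s GraphicsAdapter API (my actual code differs a bit, but this is the idea):

// List every adapter MonoGame can see and print its name;
// on this laptop it prints the Intel HD Graphics twice and never the Nvidia card.
foreach (var adapter in Microsoft.Xna.Framework.Graphics.GraphicsAdapter.Adapters)
{
    System.Console.WriteLine(adapter.Description);
}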

[System.Runtime.InteropServices.DllImport("nvapi64.dll", EntryPoint = "fake")]
private static extern int LoadNvApi64();

[System.Runtime.InteropServices.DllImport("nvapi.dll", EntryPoint = "fake")]
private static extern int LoadNvApi32();

void TryForceHighPerformanceGpu()
{
    try
    {
        if (System.Environment.Is64BitProcess)
            LoadNvApi64();
        else
            LoadNvApi32();
    }
    catch { } // the fake entry point means this always throws; loading the DLL is all we need, so swallow it :P
}

Put this in Program.cs (or whatever you’ve renamed it to) and call TryForceHighPerformanceGpu() at the start of Main(), before the graphics device is created. The import points at a fake entry point, so the call always throws, but by then nvapi(64).dll has been loaded into the process, which is what nudges the Optimus driver into running the game on the Nvidia GPU.
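
For anyone landing here later, a rough sketch of where the call goes in a stock MonoGame Program.cs (Game1 is just the template’s default game class name, so rename to match your project):

[System.STAThread]
static void Main()
{
    TryForceHighPerformanceGpu(); // do this before the GraphicsDevice is created

    using (var game = new Game1())
        game.Run();
}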


It worked! Thank you kind sir, and for the quick answer :^)


No worries, I ran into the same problem a while ago. If you find a way to do something similar for AMD GPUs I’d love to hear about it. The only way I’ve found thus far is via DllExport, which is neither straightforward in C# nor particularly elegant.
