GPU memory leak

I have a GPU memory leak somewhere. Visual Studio works well to detect memory leaks, but I haven’t found any tool to detect GPU memory leaks in Visual Studio.

Does somebody have some experience in debugging those GPU memory leaks? I’ve been trying to use DirectX debug output ( https://blog.rthand.com/post/2010/10/25/Capture-DirectX-1011-debug-output-to-Visual-Studio.aspx ) but I’m getting no DirectX information (although the log outputs some C++ exceptions not related to DX). Has somebody been able to get the debug output working?

The game is quite complex and bypasses Content.Load most of the time, and I'd rather avoid checking all GPU allocations without having at least a clue of where to look.

Any idea on how to tackle this issue would be welcome. (Worst case, I'll modify MonoGame to log all allocations and disposals of textures, vertex buffers, …)

Thanks!

If you are loading textures (or just about anything else) without the content manager, you absolutely must call Dispose on any texture you want to unload, and before you reassign a streamed texture to that reference. Also, before re-using that same reference, even after disposing it, check IsDisposed first; if it returns false, you should not use the reference until it returns true.
You should also never set an undisposed texture or resource to null before disposing it: that is an automatic GPU/CPU memory leak.
If an app crashes without disposing its resources, all those resources will leak.
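The rules above can be sketched in plain C#. This is only an illustration, not MonoGame code: FakeTexture is a hypothetical stand-in for a real GPU resource like Texture2D, used to show the dispose-before-reassign pattern.

```csharp
using System;

// Hypothetical stand-in for a GPU resource such as Texture2D.
class FakeTexture : IDisposable
{
    public bool IsDisposed { get; private set; }
    public void Dispose() { IsDisposed = true; }
}

class Program
{
    static FakeTexture texture;

    // Dispose the old resource before overwriting the reference;
    // otherwise the old GPU memory is orphaned.
    static void ReplaceTexture(FakeTexture fresh)
    {
        if (texture != null && !texture.IsDisposed)
            texture.Dispose();
        texture = fresh;
    }

    static void Main()
    {
        var first = new FakeTexture();
        ReplaceTexture(first);
        ReplaceTexture(new FakeTexture());
        // The old texture was disposed rather than silently dropped.
        Console.WriteLine(first.IsDisposed);
    }
}
```

The key point is that overwriting the reference without the Dispose call would leave the old resource alive on the GPU with nothing managing it.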

You have very little actual control over the GPU or the device driver. Even after you have told it a resource is disposed, the GPU may keep things in memory until it decides it needs the room; in other words, it may take some time before it actually unloads the resource's memory.
The tricky thing here is the question of who owns the resource: the GPU, the GC, the driver, or you.

The way I did this sort of testing was to fill up the GPU memory on purpose and find the point where my GPU would throw an out-of-memory error. Then, with the GPU just short of that point, I would start loading, unloading, and overwriting textures, etc., to make sure there were no memory leaks and that the application wouldn't crash because I was doing something improperly. Once you know how to do it right, the rest is straightforward.

In doing this you will quickly realize that GPU memory leaks are sneakily persistent. They carry over to other applications, and to a second run of the same application; indeed, the entire OS can be bogged down by this until the computer is restarted. In fact, another program can do this to your program.

There are quite a few GPU tools around. I used MSI Afterburner to look at basic GPU values, but I'm pretty sure there are dedicated debugging tools if you search around for them. With the tactic above, though, you don't need them.


Hi will, thanks for the answer.

I'm aware of the IDisposable pattern; in fact I use it extensively. It's just that sometimes a bug appears, and tracking those down is hard without proper tools.

I found no tool to check the problem, so in the end I went with my plan and added some Debug.Write calls to MonoGame. I printed the IntPtr of the texture's native pointer after every SharpDX texture creation (plus a stack trace), and added another print in the Dispose function.
Once done, it was just a matter of finding the IntPtrs that were never freed. It was far easier than I expected. (Jftr, the culprit was a RenderTarget for the shadow of a flickering point light, which was disabled from the ECS; depending on when you exited the room, it was not disposed.)
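The IntPtr-matching idea can be sketched as a small tracker class. This is not the actual MonoGame patch; GpuLeakTracker and its method names are hypothetical, and in a real patch the Track/Untrack calls would sit next to the SharpDX texture creation and inside Dispose.

```csharp
using System;
using System.Collections.Concurrent;
using System.Diagnostics;

// Hypothetical tracker: maps a native pointer to the stack trace
// captured when the resource was created.
static class GpuLeakTracker
{
    static readonly ConcurrentDictionary<IntPtr, string> live =
        new ConcurrentDictionary<IntPtr, string>();

    public static int LiveCount => live.Count;

    // Call right after the native resource is created.
    public static void Track(IntPtr nativePtr) =>
        live[nativePtr] = Environment.StackTrace;

    // Call from Dispose.
    public static void Untrack(IntPtr nativePtr) =>
        live.TryRemove(nativePtr, out _);

    // Anything still tracked at shutdown is a leak; the stored
    // stack trace points at where the leaked resource was created.
    public static void DumpLeaks()
    {
        foreach (var kv in live)
            Debug.WriteLine($"LEAK {kv.Key}:\n{kv.Value}");
    }
}
```

Pairing every creation with a Track call and every Dispose with an Untrack call means the leftover entries at exit are exactly the leaks, each with the stack trace of its allocation site.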

Just curious, what kind of information does the MSI tool give? From screenshots I've only seen the GPU memory usage, but I haven't been able to find a screenshot with more information.

thanks,

Ya, it just gives basic info on memory usage and the state of the GPU: power use, temperatures, clock speed, shader speed, etc. It's not a debugging tool; it's more of a GPU tweaking tool.
I only suggested it because it would let you see the memory use as you load or unload textures and resources. Like I said, there are some real GPU shader debugging tools out there, but I don't know them by name.

I was adding code to MonoGame to print debug information when I realized that, if I'm not wrong, there is a single point where you can detect all the memory leaks. I also added a stack trace to know where the leaked GPU memory was originally allocated.

I'm posting the modified code just in case somebody finds it useful (or for when my HDD crashes and I have to create the patch again ^^):