[SOLVED]RenderTarget2D.GetData slow at certain resolutions

I have a screenshot function in my game that works by drawing the screen to a RenderTarget2D and then extracting the pixel data with the GetData method so it can be saved. I can configure my game to run at any resolution in a window, and I found today that certain resolutions cause GetData to take a significant amount of time, producing a very noticeable hitch in the game.
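
Roughly, the capture path looks like this (a minimal sketch; my real code uses different names, DrawScene is a placeholder for the normal draw code, and the usual Microsoft.Xna.Framework usings are assumed):

// Draw the frame into a render target, then pull the pixels back
// with GetData so they can be written out to a file.
var screenshotTarget = new RenderTarget2D(GraphicsDevice, width, height);

GraphicsDevice.SetRenderTarget(screenshotTarget);
DrawScene();                        // placeholder for the normal draw code
GraphicsDevice.SetRenderTarget(null);

var pixels = new Color[width * height];
screenshotTarget.GetData(pixels);   // this is the call that hitches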

What confuses me is that it isn't a case of higher resolutions taking longer; only certain resolutions are affected. I can run my game at 1920x1080 or even 3840x1080 (my game can support dual monitors) without any hitch at all, but if I run at 1680x1050 or 1366x768 I get a hitch every single time I take a screenshot.

Is there anything special about those resolutions that would cause this kind of behavior with RenderTarget2D.GetData?

I haven’t heard anyone mention this before, nor have I run into problems like it myself (not that I’ve seen everything).
Could it have to do with hardware or OS settings on the client machine? Or maybe even some code that depends on the resolution values?

I just found the issue after thinking about it a bit longer: all the resolutions that were having the problem had widths that were not evenly divisible by 32. If I pad the render target with extra pixels so that its width is divisible by 32, it works great.
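
Concretely, the fix is just rounding the width up to the next multiple of 32 when creating the render target, along the lines of this sketch (variable names are illustrative):

// Round the width up to the next multiple of 32 so GetData avoids the slow path.
int paddedWidth = (screenWidth + 31) / 32 * 32;

var screenshotTarget = new RenderTarget2D(GraphicsDevice, paddedWidth, screenHeight);

// When saving, only the leftmost screenWidth pixels of each row are kept.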

Good find! Don’t forget the [SOLVED] :slight_smile:

Why do you need GetData?

// Requires using System.IO; alongside the usual MonoGame/XNA usings.
using (Stream stream = File.Create("Screenshot.png"))
{
    // Saves the render target contents straight to a PNG, no GetData call needed.
    _renderTarget1.SaveAsPng(stream, _renderTarget1.Width, _renderTarget1.Height);
}

I have a PR up that implements GraphicsDevice.GetBackBufferData which can be used for taking screenshots when you render to the backbuffer directly. Tested and working at least for DirectX and DesktopGL. PR is here: https://github.com/MonoGame/MonoGame/pull/5114
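
Usage is along these lines (a sketch based on the PR; check the PR itself for the exact overloads):

// Read the current backbuffer into a Color array, then save it via a temporary texture.
int w = GraphicsDevice.PresentationParameters.BackBufferWidth;
int h = GraphicsDevice.PresentationParameters.BackBufferHeight;

var pixels = new Color[w * h];
GraphicsDevice.GetBackBufferData(pixels);

using (var tex = new Texture2D(GraphicsDevice, w, h))
using (var stream = File.Create("Screenshot.png"))
{
    tex.SetData(pixels);
    tex.SaveAsPng(stream, w, h);
}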
