High-quality upscale with SpriteBatch possible?

If I provide SpriteBatch with a scale matrix, it resizes the output using the method indicated by its SamplerState - by default LinearClamp. The values supported by SamplerState include Point (nearest neighbor) and Anisotropic (useful for 3D projections).
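For context, this is the kind of draw call I mean (texture name and scale factor are just placeholders):

```csharp
// Upscale with a scale matrix; the SamplerState picks the filtering method.
var scale = Matrix.CreateScale(4f);
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend,
    SamplerState.PointClamp,   // nearest neighbor; default is LinearClamp
    null, null, null, scale);
spriteBatch.Draw(lowResTexture, Vector2.Zero, Color.White);
spriteBatch.End();
```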

I would like to use a custom scaling algorithm like Lanczos, to achieve a sharper resize than what the available SamplerState values provide. There’s shader code for this readily available, e.g. Shader - Shadertoy BETA.

Is this possible using SpriteBatch, e.g. providing a custom resizing shader? How would I go about doing that? I only have very basic experience with graphics programming, so a code example would be appreciated. If not what should I look into for 2D rendering with custom scaling?

For the shader, it doesn’t really matter whether you provide the data via SpriteBatch or by rendering a fullscreen quad. The VertexShader may differ … for performance reasons, one would normally just plot a fullscreen quad (or triangle) using the least data possible, but SpriteBatch should do the trick, especially when it’s only done once in a while and not every frame.

SpriteBatch.Begin has a parameter where you can supply your own Effect; that would be the way to do it, but I don’t know off the top of my head whether the VertexShader has to do anything special when doing it that way (someone else can surely answer that).
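A minimal sketch of that Begin overload, assuming the effect was already built via the content pipeline ("LanczosEffect" is a hypothetical asset name; SpriteBatch supplies its own vertex shader when the effect only defines a pixel shader):

```csharp
// Load a custom pixel shader and hand it to SpriteBatch via Begin.
Effect lanczos = Content.Load<Effect>("LanczosEffect"); // hypothetical asset
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Opaque,
    SamplerState.PointClamp, null, null, lanczos);
spriteBatch.Draw(sourceTexture, destinationRectangle, Color.White);
spriteBatch.End();
```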

The shader you’re aiming for is meant to run as a single pass over the full rendertarget - so it’s not advisable to use it as an in-place alternative for regular SpriteBatch renderings (it would work, but performance may be an issue). You may instead want to pre-process your textures with it. But yeah, technically you can also do it in realtime.

Unfortunately, I can’t give you much info about SpriteBatch itself, but the regular approach is to just set up a rendertarget of the desired size, supply the texture you want to scale up and call that shader - the output will be the scaled-up texture (which is already on the GPU and ready to use; you don’t necessarily need to copy it, as long as you don’t repurpose that rendertarget).
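The render-to-target flow described above looks roughly like this (sizes, variable names and the effect are assumptions, not tied to any specific project):

```csharp
// Render the low-res texture into a larger rendertarget using the shader.
var upscaled = new RenderTarget2D(GraphicsDevice, targetWidth, targetHeight);
GraphicsDevice.SetRenderTarget(upscaled);
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Opaque,
    SamplerState.PointClamp, null, null, lanczosEffect);
spriteBatch.Draw(lowResTexture, upscaled.Bounds, Color.White);
spriteBatch.End();
GraphicsDevice.SetRenderTarget(null); // back to the backbuffer

// `upscaled` now holds the scaled texture on the GPU and can be drawn
// like any Texture2D, as long as the rendertarget isn’t repurposed.
```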

Ideally, though, you don’t even need to scale anything up, but rather provide the textures at a bigger scale already and supply nice MipMap levels (MG does this with the ContentManager on its own when configured to) - nothing else is needed on your side, the GPU will handle it.


A custom pixel shader can definitely do that. You can create a new SpriteEffect in the MGCB editor, which gives you a basic pixel shader template. You then just have to fill in the code - basically, convert that upscaling code to HLSL.
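The generated SpriteEffect has roughly this shape once you fill in your own sampling logic - a sketch only; the actual Lanczos weights are omitted, and `TexelSize` is an assumed parameter you would set from C#:

```hlsl
#if OPENGL
    #define PS_SHADERMODEL ps_3_0
#else
    #define PS_SHADERMODEL ps_4_0_level_9_1
#endif

Texture2D SpriteTexture;
sampler2D SpriteTextureSampler = sampler_state { Texture = <SpriteTexture>; };

float2 TexelSize; // assumed: set from C# as 1 / source texture size

float4 MainPS(float4 color : COLOR0, float2 uv : TEXCOORD0) : COLOR
{
    // Replace this with the ported upscaling code, e.g. a weighted sum
    // of neighboring texels:
    float4 sum = tex2D(SpriteTextureSampler, uv);
    // sum += weight * tex2D(SpriteTextureSampler, uv + offset * TexelSize);
    return sum * color;
}

technique SpriteDrawing
{
    pass P0 { PixelShader = compile PS_SHADERMODEL MainPS(); }
};
```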

He didn’t really say anything about that, did he? It could be for a fullscreen effect, but it could also be for individual sprites.

You would be worried about performance when using SpriteBatch for a single fullscreen quad? I’m sure there is some overhead compared to doing it yourself, but is that anything substantial? Isn’t it rather when drawing thousands of sprites that a custom system pays off?


The shader I was referring to was the linked one - there is no info about its intended usage, but doing 4+ samplings per pixel is not something I would consider doing for a multitude of sprites in realtime conditions just for scaling :slight_smile: (except when it’s needed and deliberately decided). I thought it was worth mentioning so one doesn’t simply put that in, permanently render 1000 sprites with it, and then ask why the game is slow :slight_smile:

All very useful comments, thanks to both of you! I do plan on rendering the entire scene at relatively low resolution to a render target and scaling up the result at the end.

I see, I thought you were talking about his use case.

4 samples per pixel are not such a big deal nowadays though, especially 4 samples to neighbouring pixels on the same texture.

Nah, it isn’t a big deal in normal cases, and we do it all the time for different effects like blurring shadows, SSAO and such.

But think of it: you have a game with, say, 400 sprites on the screen, and you manually quad-sample them (you may double-quad-sample overlapping pixels) … and then you add a particle system, and now it’s 10,000 more with even more overlapping pixels. It will still work, but it’s a waste. Doing the upsampling afterwards as a post-fx is just more efficient (if applicable) and may sometimes even yield better results (blending etc.), especially if you need to upscale the background anyway.

In the end, the more devices that can maintain a steady framerate, the more potential players you’ll have. But yeah, I tend to over-optimize quite regularly.