Pixel Art Resolution Scaling


I'm rendering pixel art at a native resolution of 480x270, aspect ratio 16:9. We can easily make this resolution-independent and scale it to any size using black bars to preserve the aspect, as explained in this classic article:


However, this creates problems at certain resolutions: my tile map displays with vertical lines between the tiles, flickering, and pixel decimation. This usually indicates a floating-point precision/rounding problem, but I haven't been able to solve it.

There is a potential solution but it seems rather silly:
-Render native x4 to 1920x1080 using scaling
-Render the upscaled image to the screen by downsampling to the viewport resolution with black bars

Now this does work perfectly in most cases, but if a user doesn't have a 16:9 1920x1080 display it becomes blurry. I could upscale to the highest integer multiple available on their display and downsample that to their screen, but it seems wasteful.

The other benefit of upscaling is sub-pixel camera precision movement, as the original article's approach suffers from jittery camera movement, at least in my case.

If anyone has ideas for solving this in a more elegant way please let me know.



Draw first to a render target at your native resolution, 480x270, then scale to fit your screen resolution. If you want everything to stay the same aspect ratio, you’ll first have to calculate the correct integer scaling factor. For example, if your screen resolution is 1280x720, the scale factor is going to be 2.67. You might want to round this down to 2 and pad using black bars, or you might want to round this up to 3 and have a decreased map view.
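The integer scale and black-bar padding described above can be sketched like this (a rough sketch in Python; `NATIVE_W`, `integer_fit`, and the tuple layout are illustrative names, not any engine's API):

```python
import math

NATIVE_W, NATIVE_H = 480, 270

def integer_fit(screen_w, screen_h, round_up=False):
    """Pick an integer scale factor, then center the scaled image,
    padding the remainder with black bars (or cropping if rounded up)."""
    raw = min(screen_w / NATIVE_W, screen_h / NATIVE_H)   # e.g. ~2.67 at 1280x720
    scale = max(1, math.ceil(raw) if round_up else math.floor(raw))
    dst_w, dst_h = NATIVE_W * scale, NATIVE_H * scale
    x = (screen_w - dst_w) // 2    # pillarbox offset (negative => cropped view)
    y = (screen_h - dst_h) // 2    # letterbox offset
    return scale, (x, y, dst_w, dst_h)

print(integer_fit(1280, 720))                  # scale 2, bars around 960x540
print(integer_fit(1280, 720, round_up=True))   # scale 3, decreased map view
print(integer_fit(1920, 1080))                 # scale 4, exact fit
```

Rounding down wastes screen space on bars; rounding up shows less of the map, exactly the trade-off described above.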

This type of solution works great if everything you want to render will be in that native resolution. However, if you want some elements of your game to be higher resolution (UI, particles), or you want elements in your scene to rotate but have anti-aliasing applied at the screen resolution, you might have to get a bit more creative and scale everything individually at render time.

Hopefully that helps :slight_smile:

Your suggestion is the classic route, however it leads to jittery camera movement due to the lack of sub-pixel precision. If you use the resolution-scaling method you can scale the sprite batch to fit the screen at any resolution with the correct aspect, but that causes decimation and the lines.

So scaling 480x270 to Full HD, then scaling Full HD to the viewport with black bars, is the creative solution you're hinting at, but it still seems like there should be something a bit better than that since, like I said, if someone has a 4K display you end up with problems.

The only workaround I can think of is to always find the highest supported resolution that is a multiple of the native target and scale that to fit… it just seems wrong though.

It shouldn’t be jittery if your character in the native resolution is moving one pixel at a time. As long as you’re moving one world unit, this will translate to a consistent screen amount. In the case of 1280x720, you’d move 2 screen pixels at a time, or 3 screen pixels at a time, depending on your scaling choice. At 1920x1080, you’d move 4 screen pixels at a time.

For the most part this should be fine… but yea, if you wanna go further, like I said, you’ll have to scale up at render time. Here’s an example where I did that for a prototype I was developing a few years back…

I scaled up each individual sprite to the desired screen resolution (maintaining aspect ratio) and rendered with anti-aliasing on. I liked the look I achieved, but it was also a pain :wink:

Hi, thanks for the reply, that's essentially what I'm doing, yeah.

Method 1: Classic Upscale

  1. Render Target at 480x270
  2. Upscale to screen

Easy to implement


Method 2 (yours): Independent Resolution Scale:

  1. Native 480x270
  2. Find viewport that fits the aspect at screen res, with pillar or letterbox bars
  3. Find the scaling ratio and create a matrix to map 480x270 content to screen res
  4. Render all SpriteBatch content upscaled

Smooth camera no jitter

Lines appear on tiles (see first post); pixel decimation also occurs, skipping every other pixel
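The reason Method 2 produces those artifacts can be sketched numerically: fitting the aspect exactly gives a non-integer scale on most displays, so tile edges land at fractional screen coordinates (a sketch, with an assumed 16px tile size purely for illustration):

```python
NATIVE_W, NATIVE_H = 480, 270
TILE = 16   # hypothetical tile size, just for illustration

def aspect_fit_scale(screen_w, screen_h):
    # Exact aspect fit fills the screen (bars on at most one axis),
    # but the resulting scale is rarely a whole number.
    return min(screen_w / NATIVE_W, screen_h / NATIVE_H)

s = aspect_fit_scale(1366, 768)             # common laptop panel
print(s)                                     # not an integer
print([i * TILE * s for i in range(4)])      # tile edges at fractional coords
```

Those fractional edges are what the rasterizer has to round, producing the seams and decimation.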

Method 3: Compromised Scale

  1. Render target at the closest integer multiple of native above the screen resolution
  2. Scale native sprites to the render target created above.
  3. Downsample the render target to the screen, using a scaled viewport with pillar or letterbox bars

This method has no swimming, no artifact lines, and a smooth camera (dependent on the integer multiple).

Slightly over renders
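The three steps above can be sketched as arithmetic (a rough sketch; `method3` and the tuple layout are illustrative, not any engine's API):

```python
import math

NATIVE_W, NATIVE_H = 480, 270

def method3(screen_w, screen_h):
    # 1. Smallest integer multiple of native that covers the screen.
    k = max(math.ceil(screen_w / NATIVE_W), math.ceil(screen_h / NATIVE_H))
    rt = (NATIVE_W * k, NATIVE_H * k)              # intermediate render target
    # 3. Aspect-correct viewport for the final downsample.
    fit = min(screen_w / NATIVE_W, screen_h / NATIVE_H)
    viewport = (round(NATIVE_W * fit), round(NATIVE_H * fit))
    bars = ((screen_w - viewport[0]) // 2, (screen_h - viewport[1]) // 2)
    return rt, viewport, bars

print(method3(1280, 720))   # 3x render target (1440x810), no bars on 16:9
print(method3(1280, 800))   # same target, letterboxed into a 16:10 screen
```

The "slightly over renders" cost is visible here: the intermediate target is always at least as large as the screen.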

I feel like we’re not communicating effectively here haha. Sorry, I’m trying my best!

I would not expect Method 1 to have jitter, provided you are resolving your final positions to the nearest world (native) pixel.

I would not expect Method 2 to have lines, provided you are resolving your final positions to the nearest world (native) pixel.

All your scale factors should be integers. An integer multiplied by an integer should be another integer. You get gaps like that when you have sub-pixel values in your world, or when you are scaling your world by a non-integer.
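The point about sub-pixel values and non-integer scales can be demonstrated with a tiny sketch (assumed 16px tiles, purely illustrative):

```python
def tile_edges(origin_x, scale, tile=16, n=4):
    # Screen-space x coordinates of consecutive tile edges.
    return [(origin_x + i * tile) * scale for i in range(n)]

print(tile_edges(10,   3))     # integer position, integer scale: edges meet exactly
print(tile_edges(10.4, 3))     # sub-pixel position: fractional edges, seams possible
print(tile_edges(10,   2.67))  # non-integer scale: same problem
```

Only the first case guarantees every tile edge lands on a whole screen pixel.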

If I have some time tonight or tomorrow I will try to do up an example.

Ha, yeah, maybe I'm just not getting it, but the breakdown is this:

Method 1:
Jitter comes from snapping from pixel to pixel at a low resolution and scaling to screen res. TBH it might be okay? It's just sudden because it's snapping by whole pixels rather than sub-pixels, so it feels juddery.

Method 2:
No Jitter as it has sub pixel movement
World has lines, however, even if the camera is locked to an int

Method 3:
No Jitter as it has sub pixel movement
Everything works fine but seems inelegant, though it works.

Ah yes, I see what you mean. I don’t think I’d call that jittery. You’ve got a bit of acceleration there and so, as the camera begins to decelerate, it only moves one pixel at a time.

I actually think this looks fine. If you were playing this game on a NES, it would look like this. However, if you want to increase your overall resolution and just have the native look so you can achieve special effects (such as moving less than a native pixel) then yea, those other solutions are the way to go.

Right, it’s because your scale factor is 1.5. Since your scale is non-integer, you run the risk of your tile coordinates landing at mid-pixel boundaries, which forces the rasterizer to guess which pixel they belong in. This is why I mentioned using only integer scale factors… 1, 2, 3, 4, whatever fits best in the resolution you have available. Either draw more/less, or bound the remaining area with black boxes. As soon as you scale by a non-integer value, you’ll run into this type of thing.

I wouldn’t worry too much about “inelegant” haha. If it works, great! Refactor later if you need to.

I’ll suggest one more option… you could combine methods 1 and 2 if you wanted. I can’t quite tell if your character and camera are independent or not… I think they are. It looks like the character finishes moving and the camera lags behind a tiny bit… yea?

If I’m understanding this correctly, you could render your world to a 1x scale render target at one tile (in all directions) larger. Then scale it up to the screen resolution, positioning it at the correct offset.

So let's say your current native offset is x=2.4 pixels and your scale factor is 1.5x, as you are using in the example here. Keep that floating-point offset, but render to a render target at the floor of your offset (i.e., 2). The render target should be able to contain an extra tile, so it should be 480 + tileWidth wide.

Finally, render your tile layer to the screen but subtract the offset you omitted before, so instead of rendering at x = 0, render at x = -1 * (2.4 - 2) * 1.5.
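Using the numbers from the post, the offset bookkeeping can be sketched like this (a rough sketch; `subpixel_offsets` is an illustrative name):

```python
import math

def subpixel_offsets(camera_x, scale):
    whole = math.floor(camera_x)      # world is rendered at this integer offset
    frac = camera_x - whole           # leftover sub-pixel amount
    screen_shift = -frac * scale      # shift applied when blitting the target
    return whole, screen_shift

w, shift = subpixel_offsets(2.4, 1.5)
print(w, shift)   # 2 and roughly -0.6, i.e. -1 * (2.4 - 2) * 1.5
```

The world itself only ever draws at whole native pixels (no seams), while the final blit carries the fractional remainder (smooth camera).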

I dunno if that makes sense. It’s a bit involved and if you’ve already got something working, probably no point in monkeying with it further. Work on something else and come back to it :smiley: Also, I do think that Method 1 looks good… and is representative of what you’d see if you ran this on an old console.

(I like your sprites by the way!)

Yeah, I agree Method 1 seems authentic; I can just do the same as Method 3 and retain the aspect ratio with the black bars, rendering the target into the corrected viewport.

So basically the only difference between Method 1 and 3 is that 3 is upscaled to HD with sprite scaling, then rendered to the viewport. Method 1 is rendered at native 480x270 and scaled to the corrected viewport.

Movement is box-based:

And all credit for the sprites goes to Derek Yu, as I have basically remade the Spelunky Classic sprites for debug purposes. Shh, don't tell anyone.


Ah, I didn’t recognize them from Spelunky. I haven’t played it very much :slight_smile:

In Jetboard Joust I do the following:

  1. Find the appropriate integer scale for your display.
  2. Scale everything at draw time (don’t render at a smaller size first).

This gives you sub-pixel movement and also allows for smaller-than-‘art pixel’ style rendering for things like particles. You do have to watch a bit for jitter caused by rounding errors, but I didn’t find this a big issue.
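The "scale at draw time" idea can be sketched as follows (a sketch only; `draw_position` and `SCALE` are illustrative, not from any engine):

```python
SCALE = 4   # e.g. integer scale picked for a 1920x1080 display

def draw_position(world_x, world_y, scale=SCALE):
    # Keep world positions in floating-point native units and round only
    # at screen resolution, so a sprite can land on any of the `scale`
    # screen pixels inside one native pixel.
    return round(world_x * scale), round(world_y * scale)

print(draw_position(10.25, 5.0))   # quarter-native-pixel steps become visible
```

Because the scale is still an integer, tile edges remain aligned; only moving objects use the extra sub-native precision.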

I also found I needed to render UI elements at higher resolutions when translating into non-Roman character sets (Chinese, Japanese, etc). Basically this method gives you more flexibility, but rendering smaller and scaling everything up is the more bona fide ‘retro’ approach.


Thanks for the message. What I'm doing right now is finding the closest integer scale to the monitor and creating a render target at that scale, scaling all the sprites up, rendering to that, then rendering that target into an aspect-corrected viewport in the back buffer.

The full code is available here:

Let me know if that's close to what you're doing? It seems to work nicely, although at certain resolutions it can become slightly blurry because I linearly interpolate. I think your method is to find the closest int scale, scale everything to that, and NOT correct the aspect, thus rendering either slightly more or less of the scene? That's probably a good solution too; I might just add a toggle for that.