Tiled rendering issue when scaling up low resolution tilemap

Using Tiled, I want to make a game in the same graphical style as the Game Boy, which has a viewport resolution of 160 x 144. With 8 x 8 tiles, that works out to a grid of 20 x 18 tiles.
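For reference, the tile-grid arithmetic works out like this (a minimal sketch; the constant names are just illustrative, not from my project):

```csharp
// Game Boy-style viewport: 160 x 144 pixels, 8 x 8 tiles.
const int ViewportWidth = 160;
const int ViewportHeight = 144;
const int TileSize = 8;

int tilesWide = ViewportWidth / TileSize;  // 160 / 8 = 20
int tilesHigh = ViewportHeight / TileSize; // 144 / 8 = 18
```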

These are the 8 x 8 tiles I made in Aseprite to be used as Tilesets in Tiled:
[Screenshot 2021-05-01 161604]

Arranging the tiles a certain way in Tiled I would like the tilemap to look like this:


(it’s supposed to be a room with an entrance, walls, a floor, obstructions on the inside and the white space at the bottom is intended to be the HUD. I’m no artist, I know)

Now, if I make the screen size 160 x 144 in MonoGame, the pixels look consistent, but it's too small to be playable:
[Screenshot 2021-05-01 154949]

[Screenshot 2021-05-01 154546]

So I figured I would increase the window size by a factor of 4 (the BoxingViewportAdapter in MonoGame.Extended should then scale up the pixel size accordingly):
[Screenshot 2021-05-01 155041]
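The setup looks roughly like this (a sketch; the field names are mine, but the `BoxingViewportAdapter` constructor is from the MonoGame.Extended docs):

```csharp
// In Initialize(): window scaled up 4x, virtual resolution stays 160 x 144.
graphics.PreferredBackBufferWidth = 160 * 4;  // 640
graphics.PreferredBackBufferHeight = 144 * 4; // 576
graphics.ApplyChanges();

viewportAdapter = new BoxingViewportAdapter(Window, GraphicsDevice, 160, 144);
var scale = viewportAdapter.GetScaleMatrix();
```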

But here are the results:

See how it looks inconsistent and wavy, where some of the pixels stretch and some are cut off? I do not like how that looks!

Is there any way to scale up this low-resolution tilemap so that each pixel is rendered consistently, or is it not advisable to do it this way? If not, what are the alternatives?

If this has any importance here’s what the Draw method looks like:
[screenshot of the Draw method]

Hopefully the screenshots are helpful, but let me know if you need to see code to get to the bottom of this and I'll get to uploading it on GitHub real soon. Thanks!

Hmm, that looks right. You're scaling by a whole number and you're using PointClamp. I wonder if the resulting scale factors of BoxingViewportAdapter.GetScaleMatrix aren't what you expect them to be. That's the only explanation I can offer at the moment, sorry!

I don’t know if this has any significance, but when I go to the tileset .tsx file I get these errors:

However, when I close that file in Visual Studio and run the application, it works fine, so I usually ignore it. I’m throwing that out there in case it has anything to do with this, but I’m not sure.

Visual Studio thinks your Tiled .tsx files are TypeScript files and is attempting to parse the file using TypeScript syntax.

Thanks, I’ll sift through this information you’ve given me as soon as I’m able. In the meantime I put the code that this concerns on GitHub, in case you see anything amiss.

I agree that a render target is probably a better way to do it, but what he’s doing should still work. If you look at his resulting image, the aspect ratio is off. The only thing that would cause this is the scale matrix he’s passing into his SpriteBatch.Begin call.

That comes from the BoxingViewportAdapter, which is a part of MonoGame.Extended and I don’t really have much knowledge about it.

@ReadrX - Try something for me… stop using BoxingViewportAdapter for a bit. Manually set your screen resolution to something like 1024x768 and then replace your scale assignment with the following…

var scale = Matrix.CreateScale(4f);

You’ll have excess space on the right and bottom, but I suspect the pixels will be square.
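In other words, something like this (a sketch; I’m guessing at your variable names):

```csharp
// Hard-coded 4x integer scale instead of the adapter's computed matrix.
var scale = Matrix.CreateScale(4f);

_spriteBatch.Begin(SpriteSortMode.Deferred, null, SamplerState.PointClamp,
    null, null, null, scale);
// ... draw calls here ...
_spriteBatch.End();
```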

*Edit: Oh! I just noticed something. Not only are you passing scale into your SpriteBatch.Begin call, you’re passing it to the tiledMapRenderer.Draw call. Again, I don’t know what this does, but I would only expect scale to be required once. Either on the SpriteBatch.Begin call, which will scale everything that SpriteBatch draws, or on the map draw, which would presumably scale the things that it draws.

Thanks, having a way to reproduce the problem allows for better bug squashing.

May I ask what operating system you are using? On what hardware are you testing? Are you using OpenGL or DirectX? What version of MonoGame?

The Tiled map is not rendered using SpriteBatch. However, yes, it is still an orthographic projection, and yes the view matrix may have a scale.

The problem is due to how texels are mapped onto pixels during magnification, which happens whenever texels and pixels are not 1:1. As mentioned earlier, a workaround is to render at 1:1 texel-to-pixel into a framebuffer (RenderTarget2D) and then scale that up.
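The workaround looks roughly like this (a sketch, not tested against your project; `renderTarget` and the 160 x 144 virtual resolution are assumptions):

```csharp
// 1. Draw the scene at 1:1 texel-to-pixel into an off-screen target.
GraphicsDevice.SetRenderTarget(renderTarget);
GraphicsDevice.Clear(Color.Black);
// ... draw the map and sprites here at native 160 x 144, no scale matrix ...
GraphicsDevice.SetRenderTarget(null);

// 2. Draw the whole target scaled up by an integer factor with point sampling.
_spriteBatch.Begin(samplerState: SamplerState.PointClamp);
_spriteBatch.Draw(renderTarget, new Rectangle(0, 0, 160 * 4, 144 * 4), Color.White);
_spriteBatch.End();
```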

Dell, Windows 10, 64-bit operating system, x64-based processor, Intel® Core™ i5-3470 CPU @ 3.20GHz, 16 GB RAM, OpenGL, MonoGame 3.8.

Here are the MonoGame.Extended files concerning the viewport adapters, which show what it does with scaling. BoxingViewportAdapter inherits from ScalingViewportAdapter, which contains the code for how it scales.

I also tried implementing the scaling code manually (taken from the project files for the book “Learning C# by Programming Games: Second Edition”, https://csharpprogramminggames.com/), but the same problem occurred.
csharpgames/ExtendedGame.cs at master · egges/csharpgames · GitHub

So I’ll be thrilled to try out the method you recommended and will ask questions as help is needed. :+1:

In case you were curious, here are the results of your suggestion:

[Screenshot 2021-05-01 195537]

[Screenshot 2021-05-01 195457]

And if I take the scaling matrix out of the Draw method of the tiledMapRenderer, it looks like this:

Hmmm, interesting. As @LithiumToast pointed out, the tiledMapRenderer call doesn’t even take the SpriteBatch. I looked at the code, and it appears to draw a textured quad with a cached texture that it scales using the matrix you pass in. It renders this with a default effect, or one that you can override. I’m not sure whether that effect has its sampler state set to point clamp by default.

If the quad that gets drawn is the same size as the scaled viewport coordinates, I wouldn’t expect to see this… but the bottom line is that I don’t know as I don’t have experience with MonoGame.Extended.

I know that if you render the tiles yourself via the spritebatch, or if you draw 1x scale to a render target and then draw that scaled up to the screen with a sprite batch as Lithium suggested, it will work because I’ve done both of these things :slight_smile:

There is a reason why SpriteBatch is not used. Rendering tiles with SpriteBatch is not very efficient in the case where that geometry doesn’t change on a frame-by-frame basis. You can think of it as:

  1. Building the geometry to render at specific points in time (not frame-by-frame). The geometry is stored in video memory as a polygon mesh in the format of vertices and indices.
  2. Rendering that geometry stored in video memory every iteration of the game loop. Note that the model (world), view, and projection matrices can transform the polygon mesh on a frame-by-frame basis.
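A minimal sketch of that idea in plain MonoGame (illustrative only, not MonoGame.Extended's actual implementation; it assumes `vertices` (a `VertexPositionTexture[]`) and `indices` (a `short[]`) were built from the tile map at load time, and `effect` is a configured `BasicEffect`):

```csharp
// Build once: upload the tile quads to GPU memory.
var vertexBuffer = new VertexBuffer(GraphicsDevice,
    typeof(VertexPositionTexture), vertices.Length, BufferUsage.WriteOnly);
vertexBuffer.SetData(vertices);

var indexBuffer = new IndexBuffer(GraphicsDevice,
    IndexElementSize.SixteenBits, indices.Length, BufferUsage.WriteOnly);
indexBuffer.SetData(indices);

// Every frame: just bind and draw; only the matrices change.
GraphicsDevice.SetVertexBuffer(vertexBuffer);
GraphicsDevice.Indices = indexBuffer;
effect.World = Matrix.Identity; // or a per-frame transform
foreach (var pass in effect.CurrentTechnique.Passes)
{
    pass.Apply();
    GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList,
        0, 0, indices.Length / 3);
}
```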

I got it to work! Here’s what I did:

[Screenshot 2021-05-02 121533]

In LoadContent

and in Draw

Result:

Click on the GitHub link I provided previously if you want to see the code in full.

Now will someone explain to me what happened behind the scenes that got it to work this way as opposed to the previous way that didn’t work?


Yea I figured it was for optimization; however, there would have to be a way to control the sampler state on the effect used to render the quad, wouldn’t there?

I dug a bit through the code and saw where the effect was created, but didn’t see where any sampler state was set. I’m not even sure if that’s why his rendering was incorrect. A quad mapped to the screen space should yield the same aspect ratio. That looked off and so I think something’s up.

He’s solved his issue with a render target. I’m more curious than anything else, really :wink:

It has to do with the half-pixel offset that is added to the UV coordinates. It was originally added to correct for texture bleeding. The value used for the half-pixel is most likely not 100% correct; it is likely that instead of adding a half-pixel offset to the UV coordinates, the position coordinates should be subtracted by half a pixel. It is also likely that instead of applying the offset only to the top/left positions of the quad, it should also apply to the bottom/right positions. When mapping samples of a texture to fragments (pixels), the problem gets worse when scaling up, because the UV mapping is not exactly right, so anything other than 1:1 is imperfect. Anyway, the half-pixel offset was removed in the master branch of MonoGame.Extended because people no longer seem to have issues with texture bleeding; it is likely that MonoGame fixed this issue internally.
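To make the half-pixel issue concrete, here is the arithmetic (illustrative numbers only, not MonoGame.Extended's actual code):

```csharp
// Sampling a tile from a texture 128 texels wide.
const float textureWidth = 128f;
const float halfTexel = 0.5f / textureWidth;

float u0 = 0f / textureWidth;    // exact left edge of the tile
float u0Nudged = u0 + halfTexel; // left edge nudged inward by half a texel

// At 1:1 scale the nudge still lands inside the same texel, so nothing
// visibly changes. At 4x magnification each texel covers 4 screen pixels,
// so the half-texel error shifts the sample positions enough that some
// columns of pixels repeat (stretch) while others are skipped (cut off).
```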


Just a correction. I don’t know why the way I tried it above worked for that project but didn’t work for another project I tried. Here is the corrected version I changed it to.

In LoadContent:
    renderTarget = new RenderTarget2D(GraphicsDevice, worldSize.X, worldSize.Y);
    GraphicsDevice.SamplerStates[0] = SamplerState.PointClamp;

In the Draw method:

    protected override void Draw(GameTime gameTime)
    {
        // Draw the scene at 1:1 scale into the off-screen render target.
        // (Note: set the render target before clearing, so the target
        // itself is cleared rather than the back buffer.)
        GraphicsDevice.SetRenderTarget(renderTarget);
        GraphicsDevice.Clear(Color.Black);

        _spriteBatch.Begin(SpriteSortMode.Deferred, null, SamplerState.PointClamp, null, null, null, null);

        tiledMapRenderer.Draw();

        _spriteBatch.Draw(playerSprite, playerPosition);
        _spriteBatch.End();

        // Switch back to the back buffer and draw the render target scaled up 4x.
        GraphicsDevice.SetRenderTarget(null);
        GraphicsDevice.Clear(Color.Black);

        _spriteBatch.Begin(samplerState: SamplerState.PointClamp);
        _spriteBatch.Draw(renderTarget, new Rectangle(0, 0, worldSize.X * 4, worldSize.Y * 4), Color.White);
        _spriteBatch.End();

        base.Draw(gameTime);
    }