Texture bleeding when using non-whole-number scaling


I am drawing a tilemap using a texture atlas, where I pick the tile I want to draw via the SpriteBatch source-rectangle argument. Pretty standard stuff. The issue comes when I use a non-whole-number value to scale the transform matrix: the texture atlas source rects start to bleed, like so:
[screenshot: tile edges bleeding colours from neighbouring atlas cells]

I am using PointClamp in the SpriteBatch, and in my custom shader as well:

Texture2D SpriteTexture;
sampler2D SpriteTextureSampler = sampler_state
{
	Texture = <SpriteTexture>;
};

Texture2D MaskTexture;
sampler2D MaskTextureSampler = sampler_state
{
	Texture = <MaskTexture>;
	MagFilter = POINT;
	MinFilter = POINT;
	MipFilter = POINT;
	AddressU = CLAMP;
	AddressV = CLAMP;
};

struct VertexShaderOutput
{
	float4 Position : SV_POSITION;
	float4 Color : COLOR0;
	float2 TextureCoordinates : TEXCOORD0;
};

float4 MainPS(VertexShaderOutput input) : COLOR
{
	float4 color = tex2D(SpriteTextureSampler, input.TextureCoordinates) * input.Color;
	float4 mask = tex2D(MaskTextureSampler, input.TextureCoordinates) * input.Color;

	return color * mask;
}

technique SpriteDrawing
{
	pass P0
	{
		PixelShader = compile PS_SHADERMODEL MainPS();
	}
};

I’ve also tried adding padding to the texture atlas, but the bleeding is still there. Could I get some help with this issue?

The sprite is the tree and the mask is the texture effect? What happens if you put the point clamp on the sprite instead of on the mask?

You’ll probably only need the padding if you are downscaling the texture when using a point filter.

Yeah, I forgot to mention: I’m using a texture atlas for the masks only, not for the base textures/sprites. So this is what the masks alone look like:

I am not downscaling any textures, only upscaling, and the padding didn’t work when I tried it.

Ah OK, I can see better what you’re doing now. Looks like your source rect isn’t quite aligned to pixels here, somehow? It looks like it’s maybe half a pixel out horizontally in some places, which could cause the sampler to land between pixels and unpredictably pick the wrong colour as you scale, as a result of small floating point errors. I don’t think this is a shader problem; maybe post the relevant part of the C# so we can see what’s happening.
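To illustrate what I mean by aligning to pixels: a common fix is to round the camera translation to whole pixels before the zoom is applied, so every tile edge starts on an integer pixel in pre-scale space. This is just a sketch with placeholder names (cameraX, cameraY, zoom), not your actual code:

```csharp
// Placeholder camera fields; the key detail is the order of operations:
// round the translation first, THEN apply the zoom. Rounding after the
// zoom would re-introduce fractional tile edges.
Matrix view =
    Matrix.CreateTranslation((float)Math.Round(-cameraX),
                             (float)Math.Round(-cameraY), 0f)
    * Matrix.CreateScale(zoom, zoom, 1f);

spriteBatch.Begin(samplerState: SamplerState.PointClamp,
                  transformMatrix: view);
// ... draw tiles with integer source rects ...
spriteBatch.End();
```

The source rectangles themselves should also stay on integer texel coordinates; any fractional offset there will bleed regardless of the matrix.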

So I’ve solved this by changing my zoom increment from 0.25 to 0.2. To be honest I’m really unsure why that fixes it, but oh well, at least the problem is gone.


Another fix that worked for me is setting the GraphicsDeviceManager.PreferHalfPixelOffset to true.
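For anyone finding this later: that flag has to be set before the graphics device is created, typically in the Game constructor. A minimal sketch (the class name is illustrative):

```csharp
public class MyGame : Game  // illustrative name
{
    private readonly GraphicsDeviceManager _graphics;

    public MyGame()
    {
        _graphics = new GraphicsDeviceManager(this);
        // Restores XNA-style half-pixel offset behaviour so texel centers
        // line up with pixel centers; must be set before device creation.
        _graphics.PreferHalfPixelOffset = true;
    }
}
```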

The best strategy I’ve found for dealing with these sorts of issues is to render the tiles to a RenderTarget2D at 1:1 scale, and then draw the render target texture with the scaling that you want.
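A rough sketch of that approach, with placeholder names (mapPixelWidth, mapPixelHeight, DrawTilesAtOneToOne, zoom) standing in for your own code:

```csharp
// Created once, at the map's native pixel size (placeholder dimensions).
RenderTarget2D mapTarget =
    new RenderTarget2D(GraphicsDevice, mapPixelWidth, mapPixelHeight);

// 1) Draw the whole map unscaled, so every source rect maps 1:1 to texels.
GraphicsDevice.SetRenderTarget(mapTarget);
GraphicsDevice.Clear(Color.Transparent);
spriteBatch.Begin(samplerState: SamplerState.PointClamp);
DrawTilesAtOneToOne(spriteBatch);   // hypothetical helper: no scaling here
spriteBatch.End();
GraphicsDevice.SetRenderTarget(null);

// 2) Scale the finished image in one draw; atlas cells can no longer
// bleed into each other because sampling now happens on a single texture.
spriteBatch.Begin(samplerState: SamplerState.PointClamp);
spriteBatch.Draw(mapTarget, Vector2.Zero, null, Color.White,
                 0f, Vector2.Zero, zoom, SpriteEffects.None, 0f);
spriteBatch.End();
```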

So, in case anyone is puzzled about this: keep in mind that floating-point values are stored in binary, so some decimal fractions are exact and others are not. It’s actually the opposite of what you might expect here: 0.25 is exactly representable (it’s 2^-2), while 0.2 is a repeating binary fraction that gets rounded. One plausible reading is that with an exact 0.25 zoom, texel boundaries land precisely on sample boundaries, where the point sampler has to break a tie and tiny rounding errors can flip it into the neighbouring atlas cell; with 0.2 the coordinates end up slightly off those boundaries, so the problem hides. Either way, it’s not an error within MonoGame, C# or .NET; it’s just how floating point works under the IEEE 754 standard.

This is more of a heads-up in case someone tries to fix their problem by changing constants whilst keeping their fingers crossed, hoping it works. It may look like “luck” or “bad luck”, but it’s really this floating-point behaviour that sometimes becomes more evident depending on the figures used.
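You can see which decimal fractions survive the binary round trip by printing them at full precision. A quick C# check:

```csharp
// 0.25 is a power of two, so it is stored exactly in a float;
// 0.2 has no finite binary representation and gets rounded.
float quarter = 0.25f;
float fifth = 0.2f;

Console.WriteLine(quarter.ToString("G9"));  // 0.25
Console.WriteLine(fifth.ToString("G9"));    // 0.200000003
Console.WriteLine(4 * quarter == 1f);       // True: no accumulated error
```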