Artifacts on sides of textures, background is blue except when I debug pause

I've begun updating my project to the latest MonoGame NuGet package, and I've run into some new trouble with my drawing. For one, and only when the character is on the P1 (left, facing right) side, I get banding artifacts on the left edge of the texture. All of my SpriteBatch calls use SamplerState.PointClamp, so I have no idea where these random bands come from, or why they only appear when the Entity is on certain parts of the screen.
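For reference, every batch is begun roughly like this (the sort mode, blend state, and rasterizer state here are just representative; the sampler state is always PointClamp):

```csharp
// Roughly how each batch is begun; the specific sort mode and blend state
// vary per pass, but the sampler state is always PointClamp.
spriteBatch.Begin(
    SpriteSortMode.Deferred,
    BlendState.AlphaBlend,
    SamplerState.PointClamp,
    DepthStencilState.DepthRead,
    RasterizerState.CullCounterClockwise);
```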

The second problem: except when my engine is in a debug-paused state (that is, a state where no Update logic is processing… but all drawing happens just the same), my game looks like this:

But as you can see in the screenshot above (taken while my game was debug paused), the drawing is mostly OK except that the background color is wrong. That is certainly not the color I clear with; it should be #290600FF, which is way off from what's shown. If I wait a few moments after debug pausing, the screen eventually appears mostly correct, but it takes time. Why is it taking so long to render the elements properly, so long that the engine updates before it even gets a chance to display the correct colors? You can also see the artifacts more clearly in this screenshot, on the left side of P1. What is causing these artifacts?

For that color thing, are you sure nothing you draw is dependent on update logic? … Shader timings, etc… Anything that depends on progression, timing, fading over time, blinking, etc… Whatever you have that inverts the colors?

I see the pixel-thin vertical line on the edge of PART of the character… Are you sure your sprite sheet is clean? Maybe your stencil rect is off for your spritesheet?

The sheet (which is read into memory from another file type that basically contains an array of images with group, index, and axis information) should be fine, because the artifacts don't display when the character is on other areas of the screen. The sprites are not actually at the resolution you see in the screenshot; they're being upscaled by 2.0 on both X and Y, so there should be absolutely no elements that are 1px wide at that resolution, meaning whatever is getting in there is garbage from somewhere further down the pipeline. Furthermore, it doesn't happen on the P2 side, it's only a problem with P1, and the texture region for that sprite is the same in either case.
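For reference, each entry in that container file boils down to something like this (field names are illustrative, not the real ones):

```csharp
// Illustrative sketch of one entry in the container file; the real field
// names differ, but each sprite carries a group/index key, an axis
// (origin) offset, and its raw pixel data.
public struct SpriteEntry
{
    public int Group;        // logical group the sprite belongs to
    public int Index;        // sprite's index within that group
    public int AxisX;        // drawing origin/anchor for the sprite
    public int AxisY;
    public int Width;
    public int Height;
    public byte[] PixelData; // 8-bit palette indices for character sprites
}
```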

As for the update… you were definitely onto something, as I do have an invertall function in my shader! Doing a search led me to find the culprit: an invertall PalFX I had applied to the BG in my character script file! Thank you! I’m so happy! That solves that!


… The other artifacting you have is strange, because there are separate, unconnected artifacts, only on some sprite coordinates, under certain conditions… Do you use any shader on your sprites? Maybe you have an area of alpha > 0 on your sprite sheet, and it is being shaded, or outlined; that would explain its dependence on location and facing direction… You would not be able to see very faint alpha > 0 pixels; you need your art tool for that.

I don't see why it would do that, though, as there are no alpha pixels there to get shaded (the textures for character sprites are 8-bit indexed images that use only the R component; each pixel is shaded with a color from a palette, with the palette index taken from that red component). At 320x240 the artifacts don't even show up, so why do they only appear when I'm scaling my sprites? Should I not be packing the sprites so closely together in the texture I create, and leave 1px blank borders around all edges to prevent this?
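To be clear about the format, conceptually the per-pixel lookup is just this (a CPU-side illustration of what the shader does; in reality it happens in the pixel shader):

```csharp
// CPU-side illustration of what the palette shader does: the texture's
// R channel is just an index into a 256-entry palette, so there is no
// alpha stored with the sprite itself.
Color ResolvePixel(byte redIndex, Color[] palette)
{
    // Index 0 is conventionally the transparent entry in these palettes.
    return redIndex == 0 ? Color.Transparent : palette[redIndex];
}
```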

I figured maybe you had a shader that would darken a pixel if it has color and the pixel to the left or right of it did NOT, depending on the relative position of the light… That would create dark spots if there were any pixels with alpha not equal to 0… This can happen unintentionally through the use of art programs, from anti-aliasing, smoothing effects, eraser settings, etc… It would be fast to check using the selection tool on your spritesheet, that's why I suggested it…

But have you also made sure each frame of your sprite sheet stays within its grid? … You can also verify this quickly by just setting your grid settings in GIMP to match your sprite's stencil rect.
BE SURE to select the empty space, so that a near-invisible protrusion from a frame can actually be spotted. You need the border from the selection tool to be aware of any such intrusions, which may trigger a shader, or become highlighted somehow, perhaps against a contrasting background.

Are you manipulating your stencil rect in some manner that might cause a number to get rounded into the wrong x coordinate? … Do you use the same sprite mirrored, or separate sheets for left and right? ---- Try swapping them :) lol, I am getting creative.

I just use one sheet (well, several that are procedurally generated), and it’s not even loaded as a PNG. They’re separate files that get read in from one big file to create a texture of images, mixed in bit-depth (8-bit indexed sprites that use palettes, 24-bit and 32-bit PNG for everything else). It would not read in alpha components for these sprites at all because they simply do not exist in the format they reside in. I use as little data as possible, so the same data for the right side is the same data for the left side unless the user makes separate sides themselves.

I use a rectangle packer to pack all these sprites into 2048x2048 textures, and that algorithm does make sure that each image stays within its grid, since it just packs rectangular regions together. Again, this doesn't happen for P2, nor does it even happen consistently to the same pixels in P1, so this is clearly a problem with MonoGame's scaling. Even modifying it to give 1px boundaries still results in these glitches in certain areas of the screen, as seen here:

Here’s this same frame, but on another part of the screen, without the glitch:

Also, I don’t think I’m modifying the stencil rect in any way. I didn’t even know that was possible.

EDIT: Modifying the rectangle packer to give a 2px gap between rectangles (1px was not enough) seems to resolve the issue. I won't be making quite as efficient use of texture space this way, but it does at least mask the problem.

So you load separate files for each frame of animation, instead of a spritesheet… For the same reasons as before, have you verified that the frame of animation does not have some compromised pixels?

Or temporarily replace all those textures you load for the guy, with dummy ones with just a picture of a ball or something, and see if it duplicates the problem…

I mean, the scaling math can't know which sprite happens to be player one, so the problem must be something specific to him… right? Realistically speaking… I know you have a workaround, but I feel like you could get to the bottom of this.

Dude, if you zoom in on the pixels, they have the SAME colors as his hair… What the?

I load one file that is basically an array of images, and then I load in all the images, sort all sprites by dimensions so that the rectangle packer can do its job more efficiently, and then load that data into a free region on a 2048x2048 texture (it starts from blank data, all initialized to 0). When I run out of space on one texture, I create another. I keep track of which texture each sprite uses along with the rectangular region it resides in so that it can reference the correct region when drawing later.
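In rough outline it's this (heavily simplified; RectanglePacker, sprite.Key, and sprite.Pixels are stand-ins for my actual types, not real MonoGame APIs):

```csharp
// Rough outline of the packing pass. RectanglePacker and the sprite fields
// are placeholders; each atlas is a blank 2048x2048 texture.
const int AtlasSize = 2048;
var atlases = new List<Texture2D> { new Texture2D(graphicsDevice, AtlasSize, AtlasSize) };
var packer  = new RectanglePacker(AtlasSize, AtlasSize);
var regions = new Dictionary<string, (Texture2D Atlas, Rectangle Rect)>();

// Sorting by size first lets the packer fill each atlas more tightly.
foreach (var sprite in sprites.OrderByDescending(s => s.Width * s.Height))
{
    if (!packer.TryPack(sprite.Width, sprite.Height, out Rectangle rect))
    {
        // The current atlas is full: start a fresh texture and a fresh packer.
        atlases.Add(new Texture2D(graphicsDevice, AtlasSize, AtlasSize));
        packer = new RectanglePacker(AtlasSize, AtlasSize);
        packer.TryPack(sprite.Width, sprite.Height, out rect);
    }

    // Copy the sprite's pixels into its packed region and remember where it went.
    atlases[^1].SetData(0, rect, sprite.Pixels, 0, sprite.Pixels.Length);
    regions[sprite.Key] = (atlases[^1], rect);
}
```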

I think what it ultimately comes down to is you can’t have sprites too close together in a texture or you risk sampling neighboring sprites in MonoGame, and giving a 2px gap after each image ensures that the next won’t sample the previous image in the texture accidentally.
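Concretely, the change just amounts to something like this (same stand-ins as above; `atlas` is whichever texture the packer is currently filling):

```csharp
// One way to get the 2px gap: ask the packer for a rectangle 2px wider and
// taller than the sprite, keep the sprite in its top-left corner, and leave
// the extra strip blank (the atlas starts zeroed). The stored source rect
// excludes the gap, so scaled sampling can't reach a neighbouring sprite.
const int Gap = 2;

if (packer.TryPack(sprite.Width + Gap, sprite.Height + Gap, out Rectangle padded))
{
    var inner = new Rectangle(padded.X, padded.Y, sprite.Width, sprite.Height);
    atlas.SetData(0, inner, sprite.Pixels, 0, sprite.Pixels.Length);
    regions[sprite.Key] = (atlas, inner);
}
```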

I don’t think you need a gap… I mean I don’t.

I got it…

How do you set a stencil rect for your sprite animations? It’s off by one, which is why adding buffer-space makes it work.

But WAIT… WHY are you getting a sample of texture that DOESN’T EXIST in any frames? Right?
Nowhere do you have a frame of animation that has a band of white and gray pixels on the very edge… It's something else…

I don’t set a stencil rect, to my knowledge. I have zero clue what you’re talking about and wish to know more, in fact, because I’d like to know what I need to do to get this to work more properly. All I have is DepthStencilState.DepthRead, that is the extent of anything stencil in my code.

So a stencil rect is: when you draw a texture, which SEGMENT (rectangle) of it do you draw… This is so you can use one spritesheet and just move a little frame around inside it; that frame is what gets drawn on screen, at the position or into the rectangle you draw to…

I'm going too fast… A stencil rect is a normal rectangle that represents what segment of pixels you want to draw from a texture…

You apply it in your DRAW code; there are like 7 or so overload variants, some of which take the stencil rect, and it can be set to NULL to draw the WHOLE texture.

So in cases like a run animation, you change the position (x,y) of the rectangle, to move it between frames of animation…
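Something like this, just a sketch assuming a horizontal strip of 32x32 frames; the longer Draw overloads also take scale and SpriteEffects for flipping:

```csharp
// Sketch: a horizontal strip of 32x32 run frames. Passing null as the
// source rectangle would draw the whole texture instead.
int frameWidth = 32, frameHeight = 32;
var source = new Rectangle(currentFrame * frameWidth, 0, frameWidth, frameHeight);

spriteBatch.Draw(
    runSheet,      // the spritesheet texture
    position,      // where on screen to draw it
    source,        // which segment of the sheet to draw (the "stencil rect")
    Color.White);
```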

Oh, the rect is just the same rectangle I get from the rectangle packer when I create the texture (I told you, I keep track of that). It corresponds to a region of that texture. All values in that are just integers, so I don’t know why I’m getting junk only on certain areas of the screen.

Yeah, it's a strange one. Are the artifacts THE SAME each time? Are they in the same frames? Is there a pattern? … I would replace the affected textures/frames to help troubleshoot. It's like everything we can think of is proven wrong in some other way; nothing makes sense.

Like if it were the spritesheet, both characters would have it, etc…

Or double the draw scale, see if the pixel error doubles too or stays the same… I experiment like that when I’m stuck.

Disable shaders. Disable draw on the other character. Try every knob!