Textures look blurry.

A few days ago I started making a renderer for the NuklearDotnet library that uses MonoGame. But I have one problem… the fonts look like they’re losing their color. Here are two images that show what I’m talking about.

This is what I want to achieve.

And here is what I got.

Here is the source code, if anyone would like to see it. :slight_smile:

Okay, a quick look at the code shows a lot of problems.

  1. Bitmap loading
     Your code is obviously from an old XNA sample and is dangerous; it definitely won’t work in an x64 build.
     Instead use Texture2D.FromStream().
     A lot less code, a lot safer, and it doesn’t require you to use “unsafe” code.

  2. GraphicsDevice
     You are just using the default settings.
     When you create it, set up things like MultiSampling and GraphicsProfile to match your needs.

  3. You are setting BasicEffect.World to a projection matrix.
     Orthographic projections are just that, a projection, not a world matrix.
     I cannot see you setting the projection or view matrix, so I assume they default to Matrix.Identity, which would be why anything is displayed at all, but this is still very bad and will bite you in the future.

  4. Rendering
     From this code I cannot see where you are rendering the text. Is it using SpriteBatch or your BasicEffect?
     Be aware that your coordinates for rendering the text may not be integers; if so, the quads could end up looking like your images as they get rendered over extra pixels.
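For the bitmap-loading and projection points above, a minimal sketch of what that would look like (names like `graphicsDevice`, `basicEffect`, and the file path are assumptions, not from the original code):

```csharp
using System.IO;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

// Load a texture without unsafe code: let MonoGame decode the image.
Texture2D texture;
using (FileStream stream = File.OpenRead("fontAtlas.png")) // assumed path
{
    texture = Texture2D.FromStream(graphicsDevice, stream);
}

// Put the orthographic projection where it belongs: Projection, not World.
basicEffect.World = Matrix.Identity;
basicEffect.View = Matrix.Identity;
basicEffect.Projection = Matrix.CreateOrthographicOffCenter(
    0, graphicsDevice.Viewport.Width,   // left, right
    graphicsDevice.Viewport.Height, 0,  // bottom, top (y grows downward, like UI coordinates)
    0, 1);                              // near, far
```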

Hope this gives you things to look at

Hi. Thanks for your answer.

I modified the source code as you said and now the bug is gone.

Except for one thing…

It seems the texts and windows got a white transparent outline. Any idea what causes the problem?

Nuklear handles everything for me. I just draw a bunch of textured triangles. The fonts are parts of the output texture.

The Render method only does two things:

  1. Update the VertexBuffer.
  2. Draw the VertexBuffer.
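Those two steps might look something like this (a sketch; `vertexBuffer`, `vertices`, `vertexCount`, and `basicEffect` are assumed names, and a DynamicVertexBuffer is assumed since the data changes every frame):

```csharp
using Microsoft.Xna.Framework.Graphics;

// 1. Update the vertex buffer with this frame's Nuklear output.
vertexBuffer.SetData(vertices, 0, vertexCount, SetDataOptions.Discard);

// 2. Draw the vertex buffer as textured triangles.
graphicsDevice.SetVertexBuffer(vertexBuffer);
foreach (EffectPass pass in basicEffect.CurrentTechnique.Passes)
{
    pass.Apply();
    graphicsDevice.DrawPrimitives(PrimitiveType.TriangleList, 0, vertexCount / 3);
}
```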

When I have seen this before it was because the texture had a transparent white background.

Have a look at it and change it to transparent black.

Note a lot of apps like Paint.Net use transparent white instead of transparent black so it is easy to miss
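If the texture’s pixel data is accessible after loading, the fix can be sketched as rewriting transparent-white pixels to transparent black (a sketch, not code from the thread):

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

Color[] pixels = new Color[texture.Width * texture.Height];
texture.GetData(pixels);
for (int i = 0; i < pixels.Length; i++)
{
    if (pixels[i].A == 0)
        pixels[i] = Color.Transparent; // transparent black (0,0,0,0), not white (255,255,255,0)
}
texture.SetData(pixels);
```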

I can’t. :( The fonts are generated at runtime.

I need a custom BlendState maybe?

First question: what type of font are you using, is this a bitmap or a hinted font?

Your blurring may be due to down-sampling or up-sampling, hard to say.

I.e. the source texture width or height is larger than the destination width/height. In that case you could try switching to an Additive BlendState and using a Linear UV address mode; if it is up-sampling, a Point UV address mode might work to some degree.

But if it is down- or up-sampling, then “just don’t do it” is the real answer. That’s what it looks like to me: stretched textures on the destination, possibly with some aliasing as well due to the min/mag filters.

I checked in the debugger which BlendState it uses, and it said it is using Opaque.
This is the result.

However, if I set the BlendState to Opaque in the SpriteBatch.Begin, I get a different result.

Opaque? Yeah, it looks like the background of the image is white with transparency, so Opaque is going to be useless.

  1. What did you get with Additive, the top one?
  2. Can you tell Nuklear to make the font background color transparent black?
  3. Is the scaling altered at all for the drawing? If it is, make it 1,1.
  4. Pretty sure the real problem is it’s screwing up the source/destination rectangles.

Ah wait, is this the begin call that draws the text too…

spriteBatch.Begin(samplerState: SamplerState.AnisotropicClamp, depthStencilState: DepthStencilState.None);

Anisotropic can have a heavy antialiasing effect on 2D quads.
If this is it, get rid of it and replace it with Linear; if that doesn’t give better results, replace it with PointClamp and see how it looks.
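So the suggested replacement would be along these lines (same Begin call with the sampler state swapped; BlendState.AlphaBlend is an assumption about the rest of the call):

```csharp
spriteBatch.Begin(
    blendState: BlendState.AlphaBlend,
    samplerState: SamplerState.PointClamp,      // no filtering that bleeds across glyph edges
    depthStencilState: DepthStencilState.None);
```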

Also, System.Drawing is not cross-platform; it will only work on Windows. Are you doing this because it only gives you the IntPtr for loading the font texture?

public override Texture2D CreateTexture(int W, int H, IntPtr Data)
{
    Drawing.Bitmap Bmp = new Drawing.Bitmap(W, H);
    MemoryStream memoryStream = new MemoryStream(W * H);
    Imaging.BitmapData Dta = Bmp.LockBits(new System.Drawing.Rectangle(0, 0, W, H), Imaging.ImageLockMode.WriteOnly, Imaging.PixelFormat.Format32bppArgb);
    for (int i = 0; i < W * H; i++)
        Marshal.WriteInt32(Dta.Scan0, i * sizeof(int), Marshal.ReadInt32(Data, i * sizeof(int)));
    Bmp.UnlockBits(Dta); // unlock before saving so the pixel data is flushed
    Bmp.Save(memoryStream, Imaging.ImageFormat.Png);
    memoryStream.Seek(0, SeekOrigin.Begin); // FromStream reads from the current position
    return Texture2D.FromStream(_graphics, memoryStream);
}

If you really can’t get around this then you’re going to be limited to Windows anyway, at which point you might as well just process the pixels as you load them: pick an alpha threshold, fully brighten anything over said threshold to fully white opaque, and zero out the RGBA on any pixels below it, right in the loading loop above.
Or you could get the data and then redo it afterwards, blah, double work, triple really.
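That threshold pass could be sketched like this, applied to the pixel array as it is loaded (the threshold value and the `pixels` array name are assumptions for illustration):

```csharp
using Microsoft.Xna.Framework;

const byte alphaThreshold = 32; // assumption: tune to taste
for (int i = 0; i < pixels.Length; i++)  // pixels: the Color[] being loaded
{
    pixels[i] = pixels[i].A > alphaThreshold
        ? Color.White        // fully brighten: white, fully opaque
        : Color.Transparent; // zero out RGBA below the threshold
}
```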

God, I hate to say this, but if you’re not bound to Windows and nothing else works, the weapon of last resort is to write your own shader that clips low-alpha pixels and boosts anything above the threshold to fully white, but that is the hackiest of hacks for this type of thing.

It might just be that Nuklear sucks at text. Alpha blend should work in the first place.
I think this is because its destination rectangle doesn’t match the source rectangle’s width and height per character. But try getting rid of Anisotropic. Ecckk, this just might be that library.

Thank you for your help guys.

Ok, over the past days I worked on the code, made a lot of changes, and finally found the two bugs that were causing the problems.

The first bug that I fixed was with the texture creation and the white background (as you said).
Yeah, using the System.Drawing library I lose the cross-platform option, so I wrote a new way to create a texture from a pointer. It is cross-platform code and works with no bugs. :smile:
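The post doesn’t show the new code, but a cross-platform texture-from-pointer can be sketched by copying the RGBA bytes straight into a Texture2D with SetData, skipping System.Drawing entirely (a sketch of the idea, not the author’s actual code):

```csharp
using System;
using System.Runtime.InteropServices;
using Microsoft.Xna.Framework.Graphics;

public Texture2D CreateTexture(int width, int height, IntPtr data)
{
    var texture = new Texture2D(_graphics, width, height, false, SurfaceFormat.Color);
    byte[] pixels = new byte[width * height * 4]; // 4 bytes per RGBA pixel
    Marshal.Copy(data, pixels, 0, pixels.Length); // copy out of the native buffer
    texture.SetData(pixels);
    return texture;
}
```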

Setting the sampler to PointClamp and the white background to black resolved the font texture problems. :slight_smile: