Indexed Texture2D?

Does anybody know how to index a Texture2D and then use those indices to look up a color in a list of colors?
I am making a game where all graphics will be indexed lists of Texture2Ds, with one color being transparent.

I am wondering how to do this, and whether I need to write my own texture class to achieve something like this?

Can you specify a bit more what you mean?

Here is a color grading filter, maybe that helps a bit https://github.com/Kosmonaut3d/ColorGradingFilter-Sample

I am thinking more of a list of colors

I need to index the images when imported and lookup the colors when drawing
for example a 3x3 image would look like this
{ 1,0,1, 0,0,0, 1,0,1 }

and the palette would look like this
{
0x0000FF, // Color 0 is always drawn with full transparency
0xFF0000  // Colors above 0 are always drawn with full opaqueness
}

which would result in this image…
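For concreteness, here is that example decoded in a quick Python sketch (the names are made up; this is just the lookup being described, not MonoGame code):

```python
# The 3x3 image as palette indices, row by row.
indices = [1, 0, 1,
           0, 0, 0,
           1, 0, 1]

# Palette: index 0 is always fully transparent, indices above 0 fully opaque.
palette = [0x0000FF,   # color 0
           0xFF0000]   # color 1

def decode(indices, palette):
    """Resolve each index to an (rgb, alpha) pair; index 0 gets alpha 0."""
    return [(palette[i], 0 if i == 0 else 255) for i in indices]

pixels = decode(indices, palette)
# Corner pixels come out opaque red; the plus shape is transparent.
```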

Bump, I would still like to know this.

Is your problem about handling transparency from textures? (i.e. alpha testing?)

Indexed (paletted) textures were used a lot back in the old days, but graphics APIs these days don’t support paletted textures. You would be working against the graphics API to implement paletted textures yourself.

You would need to write your own shader that takes an indexed texture (probably SurfaceFormat.Alpha8 for 256 values per pixel) and a 256x1 “palette” texture of SurfaceFormat.Color that represents the palette. Use the value sampled from the indexed texture to generate a texture coordinate to read from the palette texture. This is then your colour to return from the pixel shader.
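To make the lookup concrete, here is what that sampling amounts to, simulated on the CPU (a Python sketch with made-up names; the real version would live in the pixel shader, and point sampling of the palette texture is assumed):

```python
PALETTE_SIZE = 256

def palette_lookup(index_value, palette):
    """Simulate the shader: an Alpha8 sample (0..255) becomes a u coordinate
    into a 256x1 palette texture, which a point-sampled fetch resolves
    back to the matching palette entry."""
    u = (index_value + 0.5) / PALETTE_SIZE   # texel-center coordinate
    texel = int(u * PALETTE_SIZE)            # point sampling picks this texel
    return palette[texel]

# A dummy 256-entry palette where entry i is (i, 0, 0, 255).
palette = [(i, 0, 0, 255) for i in range(PALETTE_SIZE)]
```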

1 Like

Why not just make your texture transparent? Why go through the trouble of making one Color in the pixel array be handled in a way that could be done upfront?

@KonajuGames
That's too bad. I don't really know much about shaders, but I don't want to just apply the colors;
I need them to be mapped when imported and then able to be swapped at runtime depending on that map.

@Spool
It's not just the transparency that I want, but the indexed colors as well.

There’s Texture2D.GetData to get a texture's color information.

namespace ZGDK
{
public static class Helper
{

    /// <summary>
    /// Get colors from texture and save to another texture.
    /// </summary>
    /// <param name="pTargetT2D">Target texture</param>
    /// <param name="pSourceT2D">Source texture</param>
    public static void GetTextureColors(Texture2D pTargetT2D, Texture2D pSourceT2D)
    {

        //*>> Get the size of the texture (source and target are assumed to be the same size)
        //
        int m_Width = pTargetT2D.Width;
        int m_Height = pTargetT2D.Height;
        int m_TotalPixels = m_Width * m_Height;


        //*>> Set the array that holds pixels data
        //
        Color[] m_TargetColors = new Color[m_TotalPixels];
        Color[] m_SourceColors = new Color[m_TotalPixels];


        //*>> Initialize target texture with transparent black.
        //
        for (int i = 0; i < m_TotalPixels; i++) m_TargetColors[i] = Color.TransparentBlack;


        //*>> Get data from source texture.
        //
        pSourceT2D.GetData<Color>(m_SourceColors);

        //*>> Get colors only.
        //
        for (int i = 0; i < m_SourceColors.Length; i++)
        {
            if (m_SourceColors[i] == Color.TransparentBlack ||
                 m_SourceColors[i] == Color.Transparent ||
                 m_SourceColors[i] == Color.White ||
                 m_SourceColors[i] == Color.Black) continue;

            m_TargetColors[i] = m_SourceColors[i];
        }

        //--> Save the pixels to the target texture.
        //
        pTargetT2D.SetData(m_TargetColors);


    }//EndMethod


}

}

this is what I want to do as a graphics system…

By the way, this is made in Allegro 4 and uses indexed color mode, which is now outdated.

I am thinking more of a list of colors

I need to index the images when imported and lookup the colors when drawing
for example a 3x3 image would look like this
{
1,0,1,
0,0,0,
1,0,1
}

Call GetData on the incoming image; the pixels will end up in a color_array[].

Create a convert method to pack the r, g, b, a components into an integer value.

public int convert(Color c)
{
    return c.R + c.G * 256 + c.B * (256 * 256) + c.A * (256 * 256 * 256);
}

Create a ‘new color array’ of the same size as the original image [W*H]

If your palette is hard-coded, just read from it using the converted value as the index.
You can read in a palette the same way, or generate one.

for (int i = 0; i < img.Width * img.Height; i++)
{
    newcolorarray[i] = mypalettecolors[ convert(color_array[i]) ];
}

Use SetData to put the newcolorarray into a texture and display it.

It's really about encoding and decoding.
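The decode loop above, sketched end to end in Python (the data and names are made up; in the real code color_array comes from GetData and newcolorarray goes to SetData):

```python
def convert(r, g, b, a):
    """Pack r, g, b, a (0..255 each) into one integer, as in the post above."""
    return r + g * 256 + b * 256**2 + a * 256**3

# A tiny 2x1 "image" whose palette index is stored in the red channel.
color_array = [(1, 0, 0, 0), (2, 0, 0, 0)]

# A palette keyed by the packed value (hypothetical colors).
mypalettecolors = {convert(1, 0, 0, 0): (255, 0, 0, 255),
                   convert(2, 0, 0, 0): (0, 255, 0, 255)}

# The decode loop: each pixel's packed value selects a palette entry.
newcolorarray = [mypalettecolors[convert(*c)] for c in color_array]
```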

This could be done directly in a pixel shader with multitexturing, if one texture held the values and the other was used as the palette. But…

They are right to tell you this is a huge waste; recoloring can be achieved far more easily in a shader.
Hex code letters are just representations of 4 bits of data, a nibble,
so 0xFF is just 11111111 in binary.
Having 24-bit color tables is pointless when the GPU uses 24-bit colors directly from the textures it keeps in its own memory. There is plenty of memory nowadays.

@willmotil thankyou so much however regarding the shader method

  • Can it take in an array of colors instead of a texture for the palette?
  • Can textures be encoded and decoded as binary data?
  • How would the shader method be achieved specifically for this case?

This is pretty much all elaborating on what Konaju said before.

Can it take in an array of colors instead of a texture for the palette?

I have never tried to directly pass an array to a shader. Regardless, you can use multi-texturing; remember, a Texture2D is basically just a 2D array.

Mathematically indexing a color palette from a 1D array[] into a Texture2D, or vice versa, is very simple; you basically use the following indexing formulas, so it doesn't matter whether it's in an array [ ] or a Texture2D [ , ].

You can use this same method later to instead encode a color index into a u,v lookup in the palette.

// 1D index to 2D index
// where i is an index into the array in a for loop
y = i / img.Width;
x = i - y * img.Width;

// 2D index to 1D index
i = y * img.Width + x;
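A quick sanity check of those two formulas as a Python sketch (the width is arbitrary; integer division stands in for C#'s int division):

```python
width = 7  # an arbitrary image width for the sketch

def to_2d(i, width):
    """1D index to 2D index."""
    y = i // width
    x = i - y * width
    return x, y

def to_1d(x, y, width):
    """2D index back to 1D index."""
    return y * width + x

# The two formulas invert each other for every pixel index.
for i in range(width * 5):
    x, y = to_2d(i, width)
    assert to_1d(x, y, width) == i
```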

Can textures be encoded and decoded as binary data?

Yes, of course, it's all binary; however, colors are stored in a texture as normalized floats (a vector4).
This means you convert a color element's value back and forth like so.

int myblue = 24;
float normalizedblue = myblue / 255f; // note the f: integer division would give 0
int blue = (int)(normalizedblue * 255);

// a color in the cpu
new Color( , , blue, );

// what the gpu reads off of a texture in a shader

float4 c = tex2D(TextureSampler, uv);
// c's blue element is basically the normalized value:
// float4( , , normalizedBlue, )

However, you can't directly do bitwise math on it, since it's a normalized float rather than base-2 integer notation; instead you have to multiply and divide, etc.
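The same round trip in Python, with the integer-division pitfall flagged (a hedged sketch, not MonoGame code):

```python
myblue = 24

# Byte -> normalized float (what the texture effectively stores).
# In C#, integer division (24 / 255) would collapse this to 0; divide as float.
normalizedblue = myblue / 255

# Normalized float -> byte again; round before truncating to avoid 23.999...
blue = round(normalizedblue * 255)
```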

how would the shader method be achieved specifically for this case?

Load your palette colors[] into a Texture2D with SetData; use a 255 x 255 texture
(that is, up to 65,025 colors).
This will be Texture A on the shader.

Load your image data into an array[]; this will probably be the hardest part.
I don't think MonoGame supports directly loading palette-indexed images, so loading them into an array will be on you. (If you're just generating them, you have the array already.)

Create a texture with the proper Width x Height, the same size as the image; we'll call this Texture B.

Transform the array's index values into the r and b elements of a color array of the same size; this is for the texture.
As previously discussed, this time we do it to encode, not to index:
the 1D index becomes a 2D index stored in the color elements r,b instead.
// 1D encode to 2 color elements, r and b in this case
r = i / 255;
b = i - (r * 255);
Place these found values into a color array.
Use SetData to send the color array to a Texture2D; call it B.
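The encode and its shader-side inverse round-trip cleanly; here is a Python sketch using the same /255 divisor as above (note that b then stays in 0..254 with this scheme):

```python
def encode(i):
    """Encode a palette index into two color bytes (r, b), divisor 255 as above."""
    r = i // 255
    b = i - r * 255
    return r, b

def decode(r, b):
    """Invert the encoding, as the shader-side lookup would."""
    return r * 255 + b

# Every index survives the round trip.
for i in (0, 1, 254, 255, 1000, 65024):
    assert decode(*encode(i)) == i
```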

Send both textures A and B to the shader.

In the pixel shader.

Get the color at the current texture coordinates of B.
Take that color's r and b values, and then…
use those r,b values as x,y coordinates for getting a color from Texture A.
Return that color.

The shader will have all the info it needs, so nothing will need to be decoded or indexed on its side.

Sorry for the late reply,
but here is an example of VS 2017 supporting it.

I just want to know if this can be supported in MonoGame?
It's obviously possible, either by using shaders or by allowing direct access to the image data.

The question is: can any hardware natively support it? The answer is: if you have a 20+ year old graphics card, yes.
Can MonoGame support it? The answer is:
yes, it could, but … why would you?

No one uses paletted textures anymore; the hardware doesn't use palettes underneath.
You are basically performing a de-optimization on modern hardware to use it.

https://people.freedesktop.org/~marcheu/extensions/EXT/paletted_texture.html

Except perhaps for compressing image data, and then only in specific cases without color loss.

Convert the palette-based images into PNGs or JPEGs.
I'm pretty sure you can make a tool with a few lines using System.Drawing's Image or Bitmap classes
to convert all your images to PNG or JPG, or just find a tool to do it.

MonoGame uses the graphics card, and graphics cards basically all use RGBA per pixel; that means every single color can be alpha if you set the alpha value. Not trying to flame you, I'm just trying to show you that you can simulate this, but even that is wasteful.

You can even just manually decode each paletted image using its color palette table right into an array, then set it to a texture using SetData. Which is probably all VS is doing.

I was actually working on a port of Tilengine for Monogame, that would have been able to do exactly what is being described here. But there wasn’t enough interest, so I’m just finishing up support for Unity instead. (where there is more of a need for this sort of support)

While there is plenty of demand for modern rendering approaches, there is still some interest in legacy rendering. (just very niche and small) There are some effects that are better handled through full color palette manipulation.

Tilengine - the 2D retro graphics engine
A community discussion about the plug-in
A recent tutorial I wrote, where I embed Tilengine into Unity

1 Like

@willmotil well, the main reason to support it would be so you don't have to redefine a bunch of copies of the same texture with different colors.
Another reason would be to emulate retro games; paletted pixel art is not dead, it's actually quite popular.

I do not use a 20-year-old graphics card, and Visual Studio renders it just fine.
I really think MonoGame should support this and make it easier for users to import/export paletted textures for their game.

This could be used to make editors, or just for mod support.

Tilengine is an open-source project. I may not be pushing to port it to Monogame at the moment, but everything actually important to make that happen is accessible. (source code, examples, a C# binding to begin working from) It also doesn’t have to be used to replace Monogame’s standard rendering. You could easily cook up a stripped-down version of it solely for the purpose of loading in palette textures, altering their palettes, and using that to render Texture2D objects to use as in-game textures or sprite sheets.

It’s also worth noting that before I stopped pushing to embed Tilengine in Monogame, I was able to get a prototype working that allowed for dynamic palette creation and application. So it isn’t a theoretical solution, I’ve put it into action already.

I might look at it, but I think I am going to try suggesting it for MonoGame first.

Just my 2 cents: I have done something similar for a prototype I was working on months ago.
It's fairly simple: it allows you to swap a Texture2D's colors, while keeping the result in a cache for further access to the same color swap. It is not optimized at all, but maybe that'll help?

public static class SpriteColorSwapCache
{
    private static readonly Dictionary<Texture2D, Dictionary<ColorSwap[], Texture2D>> SwappedTextures;

    static SpriteColorSwapCache()
    {
        SwappedTextures = new Dictionary<Texture2D, Dictionary<ColorSwap[], Texture2D>>();
    }

    public static Texture2D GetSwappedTexture(GraphicsDevice graphicsDevice, Texture2D source,
        params ColorSwap[] swaps)
    {
        if (swaps == null || !swaps.Any())
        {
            return source;
        }

        Dictionary<ColorSwap[], Texture2D> sourceSwaps =
            SwappedTextures.SmartGet(source, new Dictionary<ColorSwap[], Texture2D>());

        Texture2D swappedTexture;

        if (sourceSwaps.Any(kvp => kvp.Key.SequenceEqual(swaps)))
        {
            swappedTexture = sourceSwaps.FirstOrDefault(kvp => kvp.Key.SequenceEqual(swaps)).Value;
        }
        else
        {
            Color[] data = new Color[source.Width * source.Height];
            source.GetData(data);

            foreach (ColorSwap swap in swaps)
            {
                for (int i = 0; i < data.Length; i++)
                {
                    if (data[i].Equals(swap.From))
                    {
                        data[i] = swap.To;
                    }
                }
            }

            swappedTexture = new Texture2D(graphicsDevice, source.Width, source.Height);
            swappedTexture.SetData(data);

            sourceSwaps.Add(swaps, swappedTexture);
        }

        return swappedTexture;
    }
}

public struct ColorSwap
{
    public Color From { get; }
    public Color To { get; }

    public ColorSwap(Color from, Color to) : this()
    {
        From = from;
        To = to;
    }
}
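For what it's worth, the same swap-plus-cache idea can be sketched language-agnostically (a Python sketch with hypothetical names), which may make the caching behavior easier to see:

```python
def swap_colors(pixels, swaps, _cache={}):
    """Replace colors per (from, to) pairs in `swaps`, caching each result.

    pixels and swaps must be tuples so they can serve as a dictionary key.
    """
    key = (pixels, swaps)
    if key not in _cache:
        mapping = dict(swaps)  # from-color -> to-color
        _cache[key] = tuple(mapping.get(p, p) for p in pixels)
    return _cache[key]
```

Repeated calls with the same texture and swap list return the cached result instead of rescanning the pixels, which is the point of the C# cache above.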
1 Like