Glitch with point lights on Android in deferred rendering

Hi, I have a weird problem with point lights in deferred rendering on Android. Exactly the same code works well on PC.

That’s how it looks on Android:

And the light render target:

And for comparison the same pair on PC:


Do you have any guess what’s wrong here? Below is the code for point lights:

float4x4 World;
float4x4 View;
float4x4 Projection;    
//color of the light 
float3 Color; 

//position of the camera, for specular light
float3 cameraPosition; 

//this is used to compute the world-position
float4x4 InvertViewProjection; 

//this is the position of the light
float3 lightPosition;

//how far does this light reach
float lightRadius;

//control the brightness of the light
float lightIntensity = 1.0f;

// diffuse color, and specularIntensity in the alpha channel
texture colorMap; 
// normals, and specularPower in the alpha channel
texture normalMap;
//depth
texture depthMap;

sampler colorSampler = sampler_state
{
    Texture = (colorMap);
    MagFilter = POINT;
    MinFilter = POINT;
    MipFilter = POINT;
    AddressU = Clamp;
    AddressV = Clamp;
};
sampler depthSampler = sampler_state
{
    Texture = (depthMap);
    MagFilter = POINT;
    MinFilter = POINT;
    MipFilter = POINT;
    AddressU = Clamp;
    AddressV = Clamp;
};
sampler normalSampler = sampler_state
{
    Texture = (normalMap);
    MagFilter = LINEAR;
    MinFilter = LINEAR;
    MipFilter = LINEAR;
    AddressU = Clamp;
    AddressV = Clamp;
};

struct VertexShaderInput
{
    float3 Position : POSITION0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 ScreenPosition : TEXCOORD0;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    //processing geometry coordinates
    float4 worldPosition = mul(float4(input.Position,1), World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    output.ScreenPosition = output.Position;
    return output;
}

float2 halfPixel;
float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    //obtain screen position
    input.ScreenPosition.xy /= input.ScreenPosition.w;

    //obtain textureCoordinates corresponding to the current pixel
    //the screen coordinates are in [-1,1]*[1,-1]
    //the texture coordinates need to be in [0,1]*[0,1]
    float2 texCoord = 0.5f * (float2(input.ScreenPosition.x,-input.ScreenPosition.y) + 1);
    //align texels to pixels
    texCoord -= halfPixel;

    //get normal data from the normalMap
    float4 normalData = tex2D(normalSampler,texCoord);
    //transform normal back into [-1,1] range
    float3 normal = 2.0f * normalData.xyz - 1.0f;
    //get specular power
    float specularPower = normalData.a * 255;
    //get specular intensity from the colorMap
    float specularIntensity = tex2D(colorSampler, texCoord).a;

    //read depth
    float depthVal = tex2D(depthSampler,texCoord).r;

    //compute screen-space position
    float4 position;
    position.xy = input.ScreenPosition.xy;
    position.z = depthVal;
    position.w = 1.0f;
    //transform to world space
    position = mul(position, InvertViewProjection);
    position /= position.w;

    //surface-to-light vector
    float3 lightVector = lightPosition - (float3)position;

    //compute attenuation based on distance - linear attenuation
    float attenuation =  saturate(1.0f - length(lightVector)/lightRadius); 

    //normalize light vector
    lightVector = normalize(lightVector); 

    //compute diffuse light
    float NdL = max(0,dot(normal,lightVector));
    float3 diffuseLight = NdL * Color.rgb;

    //reflection vector
    float3 reflectionVector = normalize(reflect(-lightVector, normal));
    //camera-to-surface vector
    float3 directionToCamera = normalize(cameraPosition - (float3)position);
    //compute specular light
    float specularLight = specularIntensity * pow( saturate(dot(reflectionVector, directionToCamera)), specularPower);
    //take into account attenuation and lightIntensity.
    return attenuation * lightIntensity * float4(diffuseLight.rgb,specularLight);
}

technique Technique1
{
    pass Pass1
    {
        VertexShader = compile vs_3_0 VertexShaderFunction();
        PixelShader = compile ps_3_0 PixelShaderFunction();
    }
}
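For reference, the NDC-to-texture-coordinate mapping the pixel shader performs can be sanity-checked outside the shader. A quick Python sketch (the 1280x720 resolution is just an example):

```python
def ndc_to_texcoord(ndc_x, ndc_y, half_pixel=(0.0, 0.0)):
    """Mirror of the shader's mapping: NDC [-1,1]x[1,-1] -> UV [0,1]x[0,1]."""
    u = 0.5 * (ndc_x + 1.0) - half_pixel[0]
    v = 0.5 * (-ndc_y + 1.0) - half_pixel[1]
    return (u, v)

# Without the offset, the NDC corners land exactly on the UV corners:
assert ndc_to_texcoord(-1.0,  1.0) == (0.0, 0.0)   # top-left
assert ndc_to_texcoord( 1.0, -1.0) == (1.0, 1.0)   # bottom-right

# For a 1280x720 target, halfPixel would be (0.5/1280, 0.5/720).
# On D3D9 this subtraction realigns fragments to texel centers; OpenGL-based
# platforms (Android) use a different raster convention, so the same offset
# can shift every sample half a texel off center there.
hp = (0.5 / 1280, 0.5 / 720)
print(ndc_to_texcoord(0.0, 0.0, hp))  # screen center, nudged by half a texel
```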

Still having a problem with this one :frowning: My wild guess, based on how the light render target looks, is that there may be something wrong with precision: the closer the camera is to the models, the fewer artifacts are visible; the farther away it is, the worse it looks. Did anyone have similar issues on Android?

Your color and depth maps are using point sampling, but your normal map uses linear filtering. That seems strange, but it shouldn't cause problems as long as you are hitting pixels dead center.

So yeah, here's my shot-in-the-dark theory:
You are not hitting pixels dead center, so filtering influences the normals, but not the depth values. Is the half-pixel offset maybe wrong, or not needed at all? Does it look any different if you switch to point filtering everywhere?

With point filtering I don't see any noticeable difference, unfortunately :frowning:


The half-pixel value is correct - any change to it introduces additional errors. Please note that on PC the same shader (with the same parameters) gives correct results.

Okay. Are you absolutely sure that those artifact patterns are generated in this shader? The shader just outputs the lighting information for a single point light, and the final image is composed later on, right? Could it be z-fighting or something between the individual point lights, when you finally blend everything together? What happens if you use a single light only?

For me the clue is that you say the artefacts go away when you get closer.

I would have a look at your depth buffer format.

Even though you are in a deferred renderer, the native depth buffer is still used in individual passes.

Single light unfortunately also exhibits the same issues.

Declaration of rendertargets:
_colorRT = new RenderTarget2D(GraphicsDevice, backbufferWidth, backbufferHeight, false, SurfaceFormat.Color, DepthFormat.Depth24);
_normalRT = new RenderTarget2D(GraphicsDevice, backbufferWidth, backbufferHeight, false, SurfaceFormat.Color, DepthFormat.Depth24);
_depthRT = new RenderTarget2D(GraphicsDevice, backbufferWidth, backbufferHeight, false, SurfaceFormat.Single, DepthFormat.Depth24);
_lightRT = new RenderTarget2D(GraphicsDevice, backbufferWidth, backbufferHeight, false, SurfaceFormat.Color, DepthFormat.None);
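To illustrate the precision theory from above: if the driver were to store the SurfaceFormat.Single depth target at reduced precision (say, a 16-bit half float fallback; whether the Android driver actually does this is an assumption), the error in the reconstructed depth would grow rapidly with distance, matching "the farther away, the worse it looks". A Python sketch with assumed near/far planes:

```python
import struct

# Assumed clip planes for illustration; the real ones come from the
# game's projection matrix.
NEAR, FAR = 1.0, 1000.0

def depth_zw(z_view):
    """Post-projection z/w, as written by the depth pass (Depth.x / Depth.y)."""
    return FAR / (FAR - NEAR) * (1.0 - NEAR / z_view)

def view_z(d):
    """Invert z/w back to view-space depth."""
    return NEAR / (1.0 - d * (FAR - NEAR) / FAR)

def to_half(x):
    """Round-trip a value through IEEE 16-bit half-float storage."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# Reconstruction error if the depth texture is silently stored at half
# precision: tiny up close, large at distance (z/w is very non-linear).
for z in (5.0, 50.0, 500.0):
    err = abs(view_z(to_half(depth_zw(z))) - z)
    print(f"view depth {z:6.1f} -> reconstruction error {err:.4f}")
```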

That’s how Color, Normal and Depth are calculated:

float4x4 World;
float4x4 View;
float4x4 Projection;
bool forceRed = false;
bool forceBlue = false;

texture Texture;
sampler diffuseSampler = sampler_state { texture = <Texture>; magfilter = POINT; minfilter = POINT; mipfilter=POINT; AddressU = WRAP; AddressV = WRAP;};

texture SpecularMap;
sampler specularSampler = sampler_state { texture = <SpecularMap>; magfilter = POINT; minfilter = POINT; mipfilter=POINT; AddressU = WRAP; AddressV = WRAP;};


texture NormalMap;
sampler normalSampler = sampler_state { texture = <NormalMap>; magfilter = POINT; minfilter = POINT; mipfilter=POINT; AddressU = WRAP; AddressV = WRAP;};


struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
    float2 TexCoord : TEXCOORD0;
    float3 Binormal : BINORMAL0;
    float3 Tangent : TANGENT0;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float2 TexCoord : TEXCOORD0;
    float2 Depth : TEXCOORD1;
    float3x3 tangentToWorld : TEXCOORD2;
};

struct PixelShaderOutput
{
    float4 Color : COLOR0;
    //float4 Normal : COLOR1;
    //float4 Depth : COLOR2;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output = (VertexShaderOutput)0;

    float4 worldPosition = mul(float4(input.Position.xyz,1), World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    output.TexCoord = input.TexCoord;
    output.Depth.x = output.Position.z;
    output.Depth.y = output.Position.w;

    // calculate tangent space to world space matrix using the world space tangent,
    // binormal, and normal as basis vectors
    output.tangentToWorld[0] = mul(input.Tangent, (float3x3)World);
    output.tangentToWorld[1] = mul(input.Binormal, (float3x3)World);
    output.tangentToWorld[2] = mul(input.Normal, (float3x3)World);

    return output;
}


PixelShaderOutput PixelShaderFunction(VertexShaderOutput input)
{
    PixelShaderOutput output = (PixelShaderOutput)0;
    output.Color = tex2D(diffuseSampler, input.TexCoord);

    float4 specularAttributes = tex2D(specularSampler, input.TexCoord);
    //specular Intensity
    output.Color.a = specularAttributes.r;

    return output;
}
PixelShaderOutput PixelShaderFunctionEditor(VertexShaderOutput input)
{
    PixelShaderOutput output = (PixelShaderOutput)0;
    output.Color = tex2D(diffuseSampler, input.TexCoord);

    float4 specularAttributes = tex2D(specularSampler, input.TexCoord);
    //specular Intensity
    output.Color.a = specularAttributes.r;
	
    if (forceRed)
    {
        output.Color.r = 1.0f;
    }
    if (forceBlue)
    {
        output.Color.b = 1.0f;
    }
    return output;
}
PixelShaderOutput PixelShaderFunctionNormal(VertexShaderOutput input)
{
    PixelShaderOutput output = (PixelShaderOutput)0;
	
	float4 specularAttributes = tex2D(specularSampler, input.TexCoord);
	
    // read the normal from the normal map
    float3 normalFromMap = tex2D(normalSampler, input.TexCoord).rgb;
    //transform to [-1,1]
    normalFromMap = 2.0f * normalFromMap - 1.0f;
    //transform into world space
    normalFromMap = mul(normalFromMap, input.tangentToWorld);
    //normalize the result
    normalFromMap = normalize(normalFromMap);
    //output the normal, in [0,1] space
    output.Color.rgb = 0.5f * (normalFromMap + 1.0f);

    //specular Power
    output.Color.a = specularAttributes.a;

    return output;
}

PixelShaderOutput PixelShaderFunctionDepth(VertexShaderOutput input)
{
    PixelShaderOutput output = (PixelShaderOutput)0;

    output.Color = input.Depth.x / input.Depth.y;
    return output;
}
technique Color
{
    pass Pass0
    {
        VertexShader = compile vs_3_0 VertexShaderFunction();
        PixelShader = compile ps_3_0 PixelShaderFunction();
    }
}
technique Normal
{
    pass Pass0
    {
        VertexShader = compile vs_3_0 VertexShaderFunction();
        PixelShader = compile ps_3_0 PixelShaderFunctionNormal();
    }
}
technique Depth
{
    pass Pass0
    {
        VertexShader = compile vs_3_0 VertexShaderFunction();
        PixelShader = compile ps_3_0 PixelShaderFunctionDepth();
    }
}
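The depth written by PixelShaderFunctionDepth (z/w) is what the point-light shader later inverts to recover the world position, and that round trip can be verified numerically. A Python/NumPy sketch mirroring the row-vector mul(position, Matrix) convention (identity view and example fov/planes assumed):

```python
import numpy as np

def perspective_rh(fov_y, aspect, near, far):
    """XNA-style right-handed perspective matrix laid out for row-vector math (v @ M)."""
    ys = 1.0 / np.tan(fov_y / 2.0)
    xs = ys / aspect
    return np.array([
        [xs,  0.0, 0.0,                          0.0],
        [0.0, ys,  0.0,                          0.0],
        [0.0, 0.0, far / (near - far),          -1.0],
        [0.0, 0.0, near * far / (near - far),    0.0],
    ])

# Identity view assumed, so ViewProjection == Projection here.
view_proj = perspective_rh(np.pi / 3.0, 16.0 / 9.0, 1.0, 1000.0)
inv_vp = np.linalg.inv(view_proj)

world = np.array([2.0, -1.0, -50.0, 1.0])    # a point in front of an RH camera

# Forward pass: what the G-buffer ends up storing.
clip = world @ view_proj
ndc = clip / clip[3]
depth = ndc[2]                               # == Depth.x / Depth.y

# Reconstruction, exactly as the point-light pixel shader does it.
h = np.array([ndc[0], ndc[1], depth, 1.0]) @ inv_vp
reconstructed = h / h[3]

print(reconstructed[:3])                     # ~ [ 2. -1. -50.]
```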

And that’s how everything is combined:

texture colorMap;
texture lightMap;
float ambient;
float2 halfPixel;

sampler colorSampler = sampler_state
{
    Texture = (colorMap);
    AddressU = CLAMP;
    AddressV = CLAMP;
    MagFilter = POINT;
    MinFilter = POINT;
    Mipfilter = POINT;
};
sampler lightSampler = sampler_state
{
    Texture = (lightMap);
    AddressU = CLAMP;
    AddressV = CLAMP;
    MagFilter = POINT;
    MinFilter = POINT;
    Mipfilter = POINT;
};

struct VertexShaderInput
{
    float3 Position : POSITION0;
    float2 TexCoord : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float2 TexCoord : TEXCOORD0;
};


VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    output.Position = float4(input.Position,1);
    output.TexCoord = input.TexCoord - halfPixel;
    return output;
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float3 diffuseColor = tex2D(colorSampler,input.TexCoord).rgb;
    float4 light = tex2D(lightSampler,input.TexCoord);
    float3 diffuseLight = light.rgb + ambient;
    float specularLight = light.a;
    return float4((diffuseColor * diffuseLight + specularLight),1);
}

technique Technique1
{
    pass Pass1
    {
        VertexShader = compile vs_3_0 VertexShaderFunction();
        PixelShader = compile ps_3_0 PixelShaderFunction();
    }
}
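For clarity, the combine pass boils down to diffuse * (light + ambient) + specular, with the scalar specular value from the light map's alpha added equally to all three channels. A tiny Python sketch:

```python
def combine(diffuse_rgb, light_rgb, specular, ambient):
    """Mirror of the combine pass: diffuse * (light + ambient) + specular.
    The specular term is a single scalar (the light map's alpha), so it
    brightens all three channels equally."""
    return tuple(d * (l + ambient) + specular
                 for d, l in zip(diffuse_rgb, light_rgb))

# A mid-grey surface lit by a warm light:
result = combine((0.5, 0.5, 0.5), (0.8, 0.6, 0.4), specular=0.1, ambient=0.2)
print(result)
```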

Guys, thanks for investing your time to help :slight_smile: Hopefully you can find what’s wrong here.

One thing I think you should check.

//specular Power
output.Color.a = specularAttributes.a;

I am wondering if your textures are being handled differently on Android.

I've had a problem where stock MonoGame tries to handle pre-multiplied alpha and (IMHO) gets it wrong.

I would play around with the alpha channel in that shader and see if it makes any difference.

When you say you’re using the same shader in the PC version, are you using DirectX or OpenGL for PC?

In the past I've had lots of problems with MojoShader, which converts the HLSL code into GLSL. Building the PC version with OpenGL made the shader problems appear on PC as well, and it was a lot easier to find a solution on PC than on Android.