3D objects aren't interpolated, just flat shaded

Isn’t linear interpolation the default in the shader?

Because whatever I do, my polygons are rendered flat shaded; the result looks more like ‘nointerpolation’.
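For reference, this is the HLSL modifier I’m thinking of. It’s a Shader Model 4+ feature and not in my actual shader; this is just a sketch to show what I mean by the flat look:

// Sketch only: an output marked 'nointerpolation' is not blended across
// the triangle; the rasterizer takes the value from one vertex and holds
// it constant, which is exactly the look I'm getting.
struct FlatOutput
{
    float4 Position : SV_POSITION;
    nointerpolation float3 Normal : TEXCOORD3; // held constant per triangle
};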
Here’s an image of my 3D object.

And here’s a part of my HLSL code.

struct VertexShaderInput
{
    float4 Position           : POSITION0;
    float4 Pos2DAsSeenByLight : TEXCOORD1;
    float3 Normal             : NORMAL0;
    float2 TextureCoordinate  : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position           : POSITION0;   // The vertex position again
    float4 Position3D         : TEXCOORD2;
    float4 Pos2DAsSeenByLight : TEXCOORD1;
    float2 TextureCoordinate  : TEXCOORD0;
    float3 Normal             : TEXCOORD3;
    float4 ParticleColor      : COLOR0;
};

struct SScenePixelToFrame
{
    float4 Color : COLOR0;
};

VertexShaderOutput VertexShaderCommon(VertexShaderInput input, float4x4 instanceTransform, float4 instanceColor)
{
    VertexShaderOutput output;

    // Transform the model-space vertex position into world space
    // with the instance's world transform
    float4 worldPosition = mul(input.Position, instanceTransform);

    // Apply the camera view to it
    float4 viewPosition = mul(worldPosition, View);

    // And the projection to get the clip-space position
    output.Position = mul(viewPosition, Projection);

    // Do the same from the light's point of view, for the shadow map lookup
    output.Pos2DAsSeenByLight = mul(worldPosition, xLightsViewProjection);

    // Rotate the vertex normal into world space (translation is ignored)
    output.Normal = normalize(mul(input.Normal, (float3x3)instanceTransform));

    // Store the world-space position for per-pixel lighting
    output.Position3D = worldPosition;

    // Copy across the input texture coordinate
    output.TextureCoordinate = input.TextureCoordinate;

    output.ParticleColor = instanceColor;

    return output;
}





SScenePixelToFrame PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    SScenePixelToFrame Output = (SScenePixelToFrame)0;

    // Perspective-divide the light-space position and remap from
    // [-1, 1] clip space to [0, 1] texture space (Y is flipped)
    float2 ProjectedTexCoords;
    ProjectedTexCoords[0] = input.Pos2DAsSeenByLight.x / input.Pos2DAsSeenByLight.w / 2.0f + 0.5f;
    ProjectedTexCoords[1] = -input.Pos2DAsSeenByLight.y / input.Pos2DAsSeenByLight.w / 2.0f + 0.5f;

    float diffuseLightingFactor = 0;
    // Only light pixels that fall inside the shadow map
    if ((saturate(ProjectedTexCoords).x == ProjectedTexCoords.x) && (saturate(ProjectedTexCoords).y == ProjectedTexCoords.y))
    {
        float depthStoredInShadowMap = tex2D(ShadowMapSampler, ProjectedTexCoords).r;
        float realDistance = input.Pos2DAsSeenByLight.z / input.Pos2DAsSeenByLight.w;
        // Small depth bias to avoid shadow acne
        if ((realDistance - 1.0f / 100.0f) <= depthStoredInShadowMap)
        {
            diffuseLightingFactor = DotProduct(xLightPos, input.Position3D, input.Normal);
            diffuseLightingFactor = saturate(diffuseLightingFactor);
            diffuseLightingFactor *= xLightPower;

            // The light texture that will be projected onto the objects
            float lightTextureFactor = tex2D(LightShapeSampler, ProjectedTexCoords).r;
            diffuseLightingFactor *= lightTextureFactor;
        }
    }

    float4 baseColor = tex2D(TextureSampler, input.TextureCoordinate);
    Output.Color = baseColor * (diffuseLightingFactor + xAmbient);

    return Output;
}

What am I doing wrong here? Why do my polygons always turn out flat shaded?
Or might there be something wrong with my objects?

Regards, Morgan

All of a sudden I might have found what is causing it: if the two color values are the same, there’s no visible interpolation, because all the colors in between will be the same.
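In other words (a trivial sketch, just to spell out my reasoning; ‘color’ is a stand-in name):

// Interpolation between two identical endpoints is constant:
// lerp(c, c, t) == c for every t, so identical vertex values mean
// the same value at every pixel of the triangle.
float4 color   = float4(1, 0, 0, 1);         // any value
float4 between = lerp(color, color, 0.5f);   // always equals color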

But if that’s the case, will I have to create my own interpolation then?
Calculating light against each normal in between?

Or is it that my calculations in the vertex shader function are wrong?

I’m lost…

I think the issue is that the normals of your model are not smoothed. Did you make it yourself?

IIRC, in Blender normals are not smoothed by default. I.e. separate vertices are created for each triangle, with normals perpendicular to its surface.


How are you setting up your normals? It looks like each triangle has the same normal direction on every vertex, which is the typical cause of what you’re showing.

Edit: looking at your model a bit closer, I’m even more convinced that your normals are messed up.

To respond to your question:

if the two color values are the same there’s no seen interpolation because all the colors in between will be the same

Don’t be confused: the colors passed into a shader are not used for the lighting calculation, the normals are. Color is instead used for tinting or re-shading.
Each vertex normal dotted against the light’s direction gives the cosine of the angle between them, a value in the range -1 to 1; saturated to the 0 to 1 range, it’s an intensity scalar applied to the texel (texture’s u,v) color.

The rasterizer interpolates the position, the normals and the colors as it moves across a triangle and sends the result to the pixel shader; that’s its modus operandi.
If all 3 normals of a triangle are the same, a light calculation will of course always evaluate to a flat light intensity across the triangle (unless the light is of a positional type and distance-based).
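Here’s a minimal sketch of that diffuse term, assuming a point light at xLightPos and world-space inputs; the local names are just for illustration, and your DotProduct helper presumably does something similar:

// Per-pixel diffuse term: cosine of the angle between the surface
// normal and the direction from the surface to the light.
float3 lightDir  = normalize(xLightPos - input.Position3D.xyz);
float  cosAngle  = dot(normalize(input.Normal), lightDir); // range -1..1
float  intensity = saturate(cosAngle);                     // clamp to 0..1
float4 lit = tex2D(TextureSampler, input.TextureCoordinate) * intensity;

If all three vertex normals of a triangle are identical, the interpolated normal, and therefore the intensity, is the same at every pixel of that triangle: flat shading.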


Concurring with the above. Your vertex normals are probably all the same for each triangle.

As pictures say more than words
https://www.scratchapixel.com/images/upload/shading-intro/shad-face-normals2.png?

You’re all right. I messed it up in Blender: I didn’t set the shading to ‘Smooth’.

Thanks, all of you.
Willmotil, thanks for clearing that up for me. I guess I’ve been guessing too much.

Regards, Morgan


Very ironic sentence :stuck_out_tongue:

Haha, yes, I didn’t even think about it until you pointed it out. :grin:

It really doesn’t help that (IIRC) there are three different kinds of smoothing in Blender, and only one of them is the one you actually want if the smoothing is to survive export.
