3D Models and Geometric Instancing Issue

Hi MonoGame Community,

I have recently transitioned from 2D to 3D development and have been trying to learn hardware instancing for tiling ground objects.

I have followed some examples here, but I am unsure exactly how to resolve an issue. The error I am currently getting when trying to instance a model is:

An error occurred while preparing to draw. This is probably because the current vertex declaration does not include all the elements required by the current vertex shader. The current vertex declaration includes these elements: SV_Position0, NORMAL0, TEXCOORD0, POSITION1, TEXCOORD1.

I have seen a few solutions while searching but haven't been able to resolve it.

The current shader (which is a copy from another user here):

#if OPENGL
#define SV_POSITION POSITION
#define VS_SHADERMODEL vs_3_0
#define PS_SHADERMODEL ps_3_0
#else
#define VS_SHADERMODEL vs_4_0_level_9_1
#define PS_SHADERMODEL ps_4_0_level_9_1
#endif

// Camera settings.
float4x4 World;
float4x4 View;
float4x4 Projection;

// This sample uses a simple Lambert lighting model.
float3 LightDirection;
float3 DiffuseLight;
float3 AmbientLight;
float4 Color;

texture Texture;

sampler CustomSampler = sampler_state
{
    Texture = (Texture);
};

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
    float2 TextureCoordinate : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR0;
    float2 TextureCoordinate : TEXCOORD0;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input, float4x4 instanceTransform : BLENDWEIGHT)
{
    VertexShaderOutput output;

    // Apply the world and camera matrices to compute the output position.
    float4x4 instancePosition = mul(World, transpose(instanceTransform));
    float4 worldPosition = mul(input.Position, instancePosition);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    // Compute lighting, using a simple Lambert model.
    //float3 worldNormal = mul(input.Normal, instanceTransform);
    //float diffuseAmount = max(-dot(worldNormal, LightDirection), 0);
    //float3 lightingResult = saturate(diffuseAmount * DiffuseLight + AmbientLight);

    output.Color = Color;

    // Copy across the input texture coordinate.
    output.TextureCoordinate = input.TextureCoordinate;

    return output;
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    return tex2D(CustomSampler, input.TextureCoordinate) * input.Color;
}

// Hardware instancing technique.
technique Instancing
{
    pass Pass1
    {
        VertexShader = compile VS_SHADERMODEL VertexShaderFunction();
        PixelShader = compile PS_SHADERMODEL PixelShaderFunction();
    }
}

From what I understand, I am not passing something into this shader that it needs? This is the model code I'm currently trying to draw with.

foreach (var mesh in ThisModel.Meshes)
{
    foreach (var meshPart in mesh.MeshParts)
    {
        // Setting buffer bindings here: stream 0 is the model geometry,
        // stream 1 advances once per instance.
        GraphicsDevice.SetVertexBuffers(
            new VertexBufferBinding(meshPart.VertexBuffer, meshPart.VertexOffset, 0),
            new VertexBufferBinding(instanceVertexBuffer, 0, 1)
        );

        GraphicsDevice.Indices = meshPart.IndexBuffer;

        Effect effect = Content.Load<Effect>("modelInstance");
        effect.CurrentTechnique = effect.Techniques["Instancing"];

        effect.Parameters["World"].SetValue(modelBones[mesh.ParentBone.Index]);
        effect.Parameters["View"].SetValue(View);
        effect.Parameters["Projection"].SetValue(Projection);

        effect.Parameters["Color"].SetValue(Vector4.One);
        /*effect.Parameters["AmbientLight"].SetValue(Vector3.One);
        effect.Parameters["DiffuseLight"].SetValue(Vector3.Zero);
        effect.Parameters["LightDirection"].SetValue(Vector3.One);*/

        foreach (var pass in effect.CurrentTechnique.Passes)
        {
            pass.Apply();
        }

        GraphicsDevice.DrawInstancedPrimitives(PrimitiveType.TriangleList, 0, meshPart.StartIndex,
            meshPart.PrimitiveCount, instanceCount);
    }
}

If anyone knows what I might be missing, or could point me in the right direction for understanding what I am doing wrong, please let me know. I have searched for a while and found some examples, but they seemed to cover different issues.

Thanks for any help

You have defined a vertex structure in your class file with the following attributes:
The current vertex declaration includes these elements: SV_Position0, NORMAL0, TEXCOORD0, POSITION1, TEXCOORD1.

In your shader, however, you have a twofold mismatch.

First, the vertex struct in the shader only defines half of those attributes; it's missing POSITION1 and TEXCOORD1:

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
    float2 TextureCoordinate : TEXCOORD0;
};

Second, you are passing a matrix into the shader:
VertexShaderFunction(VertexShaderInput input, float4x4 instanceTransform : BLENDWEIGHT)

When you instance, you define a second struct, one that holds the instancing values. These hold the information that allows the VertexShaderInput to essentially be redrawn, or drawn multiple times, hence "instances" of the VertexShaderInput.
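As a sketch of that pairing (the semantics below match the elements listed in your error message, but the exact field types are my assumption), the shader side ends up with two input structs, one per vertex stream:

```hlsl
// Per-vertex data from the model's vertex buffer (stream 0).
struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
    float2 TextureCoordinate : TEXCOORD0;
};

// Per-instance data from the instance buffer (stream 1).
// The vertex shader runs over the same model vertices once per
// instance, with a new set of these values each time.
struct InstanceInput
{
    float4 Offset : POSITION1;
    float2 AtlasCoord : TEXCOORD1;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input, InstanceInput instance) { ... }
```

And the C# side must declare the instance buffer with matching semantics (offsets here assume a Vector4 followed by a Vector2):

```csharp
// The usage index 1 here is what becomes POSITION1 / TEXCOORD1 in HLSL.
static readonly VertexDeclaration InstanceDeclaration = new VertexDeclaration(
    new VertexElement(0, VertexElementFormat.Vector4, VertexElementUsage.Position, 1),
    new VertexElement(16, VertexElementFormat.Vector2, VertexElementUsage.TextureCoordinate, 1)
);
```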

Look at the very bottom post in the following link; there is a Game1 and an instancing shader to go with it …

When you look at that code, note that each technique only uses one vertex and one pixel shader, so don't be confused by seeing two of each there; those are to show that you can have more than one technique in a shader .fx file to use from Game1. Everything else should be clearly commented.

Thank you for this. I also managed to get it to render after posting, by fixing that second problem. However, the models were all black; I'm guessing that could be related to the first issue you mentioned? I have a bit to learn when it comes to shaders, and I'll read up on the example you gave.

Yep, if that bottom one is too confusing, try the one above it, which also uses instancing but is not fully GPU-based. It's a bit simpler on the shader side. There are quite a few examples in that post.

Thanks, it makes sense and is similar to the example I was basing mine on. I did spend a bit of time understanding shaders after posting.

In those examples you are passing a Texture2D to instance; I am doing it with models. I now have the models instancing, but they all render black, so would it be that I am not passing the model texture into the shader correctly, or could it be something to do with lighting?

My current shader now looks like this:

#if OPENGL
#define SV_POSITION POSITION
#define VS_SHADERMODEL vs_3_0
#define PS_SHADERMODEL ps_3_0
#else
#define VS_SHADERMODEL vs_4_0_level_9_1
#define PS_SHADERMODEL ps_4_0_level_9_1
#endif

matrix WorldViewProjection;

float4 Color;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 1;

texture ModelTexture;
sampler2D textureSampler = sampler_state
{
    Texture = (ModelTexture);
    MagFilter = Linear;
    MinFilter = Linear;
    AddressU = Clamp;
    AddressV = Clamp;
};

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float4 Normal : NORMAL0;
    float2 TextureCoordinate : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR0;
    float3 Normal : TEXCOORD0;
    float2 TextureCoordinate : TEXCOORD1;
};

VertexShaderOutput MainVS(VertexShaderInput input, float4 instanceTransform : POSITION1,
                          float2 atlasCoord : TEXCOORD1)
{
    VertexShaderOutput output;

    float4 pos = input.Position + instanceTransform;
    pos = mul(pos, WorldViewProjection);

    output.Position = pos;

    output.TextureCoordinate = float2((input.TextureCoordinate.x / 2.0f) + (1.0f / 2.0f * atlasCoord.x),
                                      (input.TextureCoordinate.y / 2.0f) + (1.0f / 2.0f * atlasCoord.y));

    return output;
}

float4 MainPS(VertexShaderOutput input) : COLOR0
{
    float4 textureColor = tex2D(textureSampler, input.TextureCoordinate);
    textureColor.a = 1;
    return textureColor * input.Color;
}

technique Instancing
{
    pass P0
    {
        VertexShader = compile VS_SHADERMODEL MainVS();
        PixelShader = compile PS_SHADERMODEL MainPS();
    }
};

EDIT: and here is what it currently renders like:


Without instancing the model (the images are from separate projects; I'm using the first to learn instancing):

The first step is ensuring that you can draw the model once. Then you instance it. This follows in the shader too: the instancing can be thought of as an add-on to a working shader that draws a model once.
That said, it's actually easier to draw something like this manually with primitives and then instance it, versus using MonoGame's Model class…
If you are using the Model class and the texture has loaded, there are still a number of problems with your currently shown shader that could cause it to draw black. From what is shown it's impossible to know the exact cause, but here are some possible issues:

  1. As stated, the texture wasn't correctly loaded. That should be easy enough to check in a variety of ways; the simplest is to just grab the texture from the model and draw it to the screen in a SpriteBatch.Draw call.

  2. Your vertex structure shows that you are passing in a color, and you are using it to multiply against the texture colors per pixel in the pixel shader. If that color were set to black, or its alpha were zero, it would zero the texture colors out to black.

  3. This below is highly likely the culprit; you should simplify it and make sure it isn't a problem:

output.TextureCoordinate = float2((input.TextureCoordinate.x / 2.0f) + (1.0f / 2.0f * atlasCoord.x),
(input.TextureCoordinate.y / 2.0f) + (1.0f / 2.0f * atlasCoord.y));

Just use the regular texture coordinate first and make sure that draws before doing anything fancy, i.e.:

output.TextureCoordinate = input.TextureCoordinate;

  4. Your vertex structure shows that it is expecting a normal to be passed in and then out, but no normals are actually being used, i.e. you aren't doing any lighting calculations. I thought this would have errored out the shader, so there may be something else going on there.
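For point 1, a quick check along those lines might look like this (assuming the default content pipeline attached a BasicEffect to the mesh parts; ThisModel and spriteBatch come from your own code):

```csharp
// Grab the texture the pipeline stored on the model and draw it
// directly, bypassing the custom shader entirely.
var basicEffect = ThisModel.Meshes[0].MeshParts[0].Effect as BasicEffect;
Texture2D modelTexture = basicEffect != null ? basicEffect.Texture : null;

spriteBatch.Begin();
if (modelTexture != null)
    spriteBatch.Draw(modelTexture, Vector2.Zero, Color.White);
spriteBatch.End();
```

If nothing shows up here, the texture never made it through the pipeline, and no shader change will fix that.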

Thanks for the reply.
I tried just output.TextureCoordinate = input.TextureCoordinate; but still no texture.
I think the problem lies in my lack of understanding of shaders and their interactions; there is probably stuff I am still missing. I was going to go through all the shader examples on RB Whitaker's site to try to get an understanding, then revisit this.
If I can get it working on primitives instead of the model, as you said, I might try that.
Thanks again for the help.

One thing: on that page I linked to, those are fully working examples, e.g. all the code is there. You only have to change the namespace names to make them work in your own project, then copy and paste them into your project and your project's shader.

You should run the simpler one and take a look at how the structs in game1 match up to the structs in the shader. They essentially mirror each other.

You can also see how to manually set a texture into the shader to override it. So if you have loaded a texture and drawn it with SpriteBatch.Draw, then before you draw with your shader, set that texture into it via:

myEffect.Parameters["inShaderNamedTexture"].SetValue(textureLoadedInGame1);

If it doesn't draw, then it's something you're doing wrong in the shader, versus the pipeline not pulling the texture in from the model. However, that can happen as well if you are using the MonoGame Model class, which is, in all honesty, outdated and janky.

There is work on getting a better model-loading solution in, specifically glTF, but that won't be ready for some time.
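Putting that together, the override looks roughly like this (the asset name and variable names are placeholders; "ModelTexture" matches the texture declared in the shader posted above):

```csharp
// Load a texture you know is good and force it into the effect,
// instead of relying on the pipeline to pull it from the model.
Texture2D testTexture = Content.Load<Texture2D>("testTexture");
effect.Parameters["ModelTexture"].SetValue(testTexture);

foreach (var pass in effect.CurrentTechnique.Passes)
{
    pass.Apply();
}
```

If the model still draws black with a known-good texture bound, the problem is in the shader rather than the texture.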

In the pixel shader you are using input.Color.

But you are not assigning any color to output.Color in the vertex shader; that's why everything is black.
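Concretely, adding something like this at the end of MainVS (before the return) should change the output, using the Color parameter the shader already declares; this assumes Color is being set to white from the game code:

```hlsl
// Fill in every field of VertexShaderOutput; the uninitialized
// Color interpolator is what the pixel shader multiplies by.
output.Color = Color; // or float4(1, 1, 1, 1) as a quick test
output.Normal = input.Normal.xyz; // carried through even though unused for now
```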

Thanks. I've been trying to learn more about shaders, but even with RB Whitaker's example shader here:
http://rbwhitaker.wikidot.com/first-shader
the models are still only black. My shader is exactly the same as the one in that example, and the model code is just this:

        foreach (var mesh in ThisModel.Meshes)
        {
            foreach (var meshPart in mesh.MeshParts)
            {

                GraphicsDevice.Indices = meshPart.IndexBuffer;      
                Effect effect = ScreenManager.Instance.Content.Load<Effect>("Shaders/shader_02");
                effect.CurrentTechnique = effect.Techniques["Ambient"];
                effect.Parameters["World"].SetValue(transforms[mesh.ParentBone.Index]);
                effect.Parameters["View"].SetValue(ScreenManager.Instance.ViewMatrix);
                effect.Parameters["Projection"].SetValue(ScreenManager.Instance.ProjectionMatrix);
                foreach (var pass in effect.CurrentTechnique.Passes)
                {
                    pass.Apply();
                }

                GraphicsDevice.DrawInstancedPrimitives(PrimitiveType.TriangleList, 0, meshPart.StartIndex, 
                    meshPart.PrimitiveCount, 1);
            }
        }

I'm not sure if there is something I'm supposed to be passing into the shader from the model to get the textures, or if it is, as you said, the Model class being janky.
I found one forum post from a while ago with the same problem: Vertex colors are not displayed with custom shader
However, the linked solution doesn't load, and the other solution seems a bit advanced, as I am already using the content pipeline for a lot of things.

At the end of the day, all I am trying to do is improve game performance as best I can (frustum culling, localized collision, occlusion, instancing where possible) at high object counts. Frustum culling is implemented; localized collision is sorted into a grid (I was using an octree but had issues); occlusion I have working, but I need to get it onto multiple buffers somehow, as it only handles one model at a time, which is too slow; and instancing is for repeated models such as the ground/grass/flowers.