[SOLVED] Monogame effect outputting nothing after adding vertex shader

I’m using MonoGame 3.6 and dove into shaders today. I can’t find much documentation for MonoGame effects. None of the XNA examples work, and even slightly older MonoGame examples don’t do what you’d expect.

This is for a 2D game.

So I started today by following this tutorial: http://joshuasmyth.maglevstudios.com/post/Introduction-to-Pixel-Shaders-with-XNA-and-Monogame-Part2-Sepia-Desaturation-Scanlines-and-other-Techniques2. I got stuck at the scanline part. After a while I discovered it was because of input.Position: since I was using ps_4_0_level_9_1, I couldn’t access the position directly in the pixel shader. On the MonoGame forum I found that you have to pass the position separately from the vertex shader, so I tried adding a vertex shader. After a while I found an answer here on this forum. This should be a basic effect with texture coordinates (which you don’t get if you just make an effect through the MonoGame content pipeline; if you make a new MonoGame SpriteEffect you don’t get a vertex shader):

#if OPENGL
    #define SV_POSITION POSITION
    #define VS_SHADERMODEL vs_3_0
    #define PS_SHADERMODEL ps_3_0
#else
    #define VS_SHADERMODEL vs_4_0_level_9_1
    #define PS_SHADERMODEL ps_4_0_level_9_1
#endif

matrix WorldViewProjection;
sampler2D TextureSampler: register(s0); // added

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR0;
    float2 TexCoords: TEXCOORD0; // added
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR0;
    float2 TexCoords: TEXCOORD0; // added
}; 

VertexShaderOutput MainVS(in VertexShaderInput input)
{
    VertexShaderOutput output = (VertexShaderOutput)0;

    output.Position = mul(input.Position, WorldViewProjection);
    output.Color = input.Color;
    output.TexCoords = input.TexCoords; // added

    return output;
}

float4 MainPS(VertexShaderOutput input) : COLOR
{
    return tex2D(TextureSampler, input.TexCoords);
}

technique BasicColorDrawing
{
    pass P0
    {
        VertexShader = compile VS_SHADERMODEL MainVS();
        PixelShader = compile PS_SHADERMODEL MainPS();
    }
};

This doesn’t output anything at all. Even if I replace the return tex2D(TextureSampler, input.TexCoords); with return float4(1.0f,0.0f,0.0f,1.0f); I still get nothing whatsoever.

Questions:

  • What is wrong with my shader?

  • What is the difference between a SpriteEffect and an Effect?

  • Is there any place where I can find documentation about modern MonoGame effects?

How do you draw things in MonoGame?

Hey,

In the main game class I have a field:

private Effect myEffect;

In LoadContent I load the effect:

myEffect = Content.Load<Effect>("effect");

In Draw I use the spriteBatch with the effect passed as a parameter:

spriteBatch.Begin( transformMatrix: camera.GetViewMatrix(),effect: myEffect);

Here is an example of the scene without a shader. The rectangles are a simple 1x1 white texture that is stretched using a destination rectangle and colored using the Color parameter of the Draw function.

With this shader it’s just completely CornflowerBlue.

I hope this makes my question clearer.

Kind regards

If you draw with your shader through SpriteBatch you don’t need a vertex shader. You can do it like this:

technique BasicColorDrawing
{
    pass P0
    {
        //VertexShader = compile VS_SHADERMODEL MainVS();
        PixelShader = compile PS_SHADERMODEL MainPS();
    }
};

@NBenassou You shouldn’t add a vertex shader for an effect like this. If you do, it has to use the same projection SpriteBatch uses. The WorldViewProjection matrix in your effect is not automatically set to the matrix you pass to spriteBatch.Begin (which is why you see nothing drawn), so you have to set it manually, just like in XNA. Effects applied to SpriteBatch usually only need a pixel shader, which can be done like @kosmonautgames said.
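To see why an unset matrix parameter blanks everything: a parameter that is never assigned reads back as all zeros (an assumption about the default parameter value), so mul(input.Position, WorldViewProjection) collapses every vertex to the origin and no triangles get rasterized. A quick Python sketch of that row-vector multiply:

```python
def transform(pos, m):
    # Row-vector times 4x4 matrix, like HLSL's mul(input.Position, M).
    return tuple(sum(pos[i] * m[i][j] for i in range(4)) for j in range(4))

# An effect parameter that was never set behaves like an all-zero matrix.
zero_matrix = [[0.0] * 4 for _ in range(4)]

clip = transform((100.0, 50.0, 0.0, 1.0), zero_matrix)
print(clip)  # every vertex lands on (0, 0, 0, 0): degenerate triangles
```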

The scanline effect looks very straightforward and should just work. Can you show your rendering code (switching render targets, setting shader variables, and so on)?

Hey,

Thanks for the answers. Here is the scanline shader that I use:

#if OPENGL
	#define SV_POSITION POSITION
	#define VS_SHADERMODEL vs_3_0
	#define PS_SHADERMODEL ps_3_0
#else
	#define VS_SHADERMODEL vs_4_0_level_9_1
	#define PS_SHADERMODEL ps_4_0_level_9_1
#endif

Texture2D SpriteTexture;

//int ImageHeight;

sampler2D SpriteTextureSampler = sampler_state
{
	Texture = <SpriteTexture>;
};

struct VertexShaderOutput
{
	float4 Position : SV_POSITION;
	float4 Color : COLOR0;
	float2 TextureCoordinates : TEXCOORD0;
};

float4 MainPS(VertexShaderOutput input) : COLOR
{
	float4 color = tex2D(SpriteTextureSampler, input.TextureCoordinates);
	int a = saturate((input.Position.y * 1080) % 4); // I just hardcode 1080 here because that is the height of the screen. I can't really pass a texture height because there are different sprites with different heights.
	int b = saturate((input.Position.y * 1080 + 1) % 4); // I would expect the scanlines to then line up across the different textures.
	float m = min(a,b);

	color.rgb *= m;

	return color;
}

technique SpriteDrawing
{
	pass P0
	{
		PixelShader = compile PS_SHADERMODEL MainPS();
	}
};

This gives me the following error:

myEffect.fx(26,34): error X4502: Shader model ps_4_0_level_9_1 doesn’t allow reading from position semantics.

This forum post says that this is simply a limitation of that older shader model. He advises passing the position a second time from the vertex shader, through another semantic, so that you can read it in the pixel shader. That’s why I wanted a vertex shader that does the same as the default one but also passes the position through another variable.
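As an aside, the darkening arithmetic in MainPS can be checked on the CPU. A small Python sketch of the a/b/m computation, under the simplifying assumption that input.Position.y * 1080 lands exactly on whole pixel rows:

```python
def saturate(x):
    # HLSL saturate: clamp to [0, 1]
    return max(0.0, min(1.0, float(x)))

def scanline_factor(row):
    # Mirrors the shader: a = saturate(y % 4); b = saturate((y + 1) % 4); m = min(a, b)
    a = saturate(row % 4)
    b = saturate((row + 1) % 4)
    return min(a, b)

# Brightness multiplier for the first eight pixel rows:
pattern = [scanline_factor(r) for r in range(8)]
print(pattern)  # [0.0, 1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0]
```

Note that rows 3 and 4 (and 7 and 8, and so on) are both darkened, so this arithmetic produces two-pixel-thick dark lines rather than single scanlines.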

@Jjagg
I have tried passing the WorldViewProjection to the effect with the vertex shader like so:

myEffect = Content.Load<Effect>("effect");
myEffect.Parameters["WorldViewProjection"].SetValue(camera.GetViewMatrix());

I don’t do anything fancy. This is the main game class.
> spriteBatch.Begin( transformMatrix: camera.GetViewMatrix(),effect: myEffect);
> player.Draw(spriteBatch);
> spriteBatch.End();

And the player then draws various rectangles like so:

Rectangle destinationRectangle = new Rectangle((int)position.X, (int)position.Y, 16, 16);
spriteBatch.Draw(square, destinationRectangle, color:Color.Yellow);

Since you’re rendering a full-screen quad you can use the texture coordinate instead of the position. Edit: oh, you’re drawing to the backbuffer directly? Then this won’t work. I recommend you render everything to a render target first, then draw that to the backbuffer with the scanline effect applied.

This isn’t correct, because it doesn’t include the projection SpriteBatch uses. The matrix you multiply your view matrix with should be built like the one in SpriteEffect: https://github.com/MonoGame/MonoGame/blob/develop/MonoGame.Framework/Graphics/Effect/SpriteEffect.cs#L66
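Concretely, the projection SpriteEffect builds there is an orthographic off-center matrix over the viewport: it maps pixel coordinates (top-left origin) to clip space. A minimal Python sketch of just that x/y mapping (the 1280×720 viewport size is an assumption for illustration):

```python
def ortho_off_center(left, right, bottom, top):
    """Map screen-space (x, y) to clip-space (x', y'), mirroring the x/y part
    of Matrix.CreateOrthographicOffCenter(left, right, bottom, top, ...)."""
    def project(x, y):
        cx = 2.0 * (x - left) / (right - left) - 1.0
        cy = 2.0 * (y - bottom) / (top - bottom) - 1.0
        return cx, cy
    return project

# SpriteBatch draws with a top-left origin: (0, 0) is the top-left pixel.
project = ortho_off_center(0, 1280, 720, 0)
print(project(0, 0))      # (-1.0, 1.0)  -> top-left corner of clip space
print(project(1280, 720)) # (1.0, -1.0)  -> bottom-right corner
```

Multiplying the camera’s view matrix by this projection (view × projection, in XNA’s row-vector convention) gives the value the effect’s WorldViewProjection parameter expects.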

Either solution should do the trick :slight_smile:

Thank you!
Simply multiplying with a projection matrix did the trick!
