Can't sample arbitrary texture coordinates with tex2D() in effect shader

Hi,

I’m trying to write a shader that will create an outline around a sprite. In order to do this, I want to sample the pixels surrounding the current pixel in the pixel shader. However, when I try to add an offset to the texture coordinate, the shader won’t compile. Here is the code:

#if OPENGL
	#define SV_POSITION POSITION
	#define VS_SHADERMODEL vs_3_0
	#define PS_SHADERMODEL ps_3_0
#else
	#define VS_SHADERMODEL vs_4_0_level_9_1
	#define PS_SHADERMODEL ps_4_0_level_9_1
#endif

Texture2D SpriteTexture;

sampler2D SpriteTextureSampler = sampler_state
{
	Texture = <SpriteTexture>;
};

struct VertexShaderOutput
{
	float4 Position : SV_POSITION;
	float4 Color : COLOR0;
	float2 TextureCoordinates : TEXCOORD0;
};

float4 MainPS(VertexShaderOutput input) : COLOR
{
    float width;
	float height;
    SpriteTexture.GetDimensions(width, height);

    // Compiles fine
    float4 color = tex2D(SpriteTextureSampler, input.TextureCoordinates) * input.Color;

    if ((color.r != 0.0f && color.g != 0.0f && color.b != 0.0f) &&
		  (tex2D(SpriteTextureSampler, input.TextureCoordinates + float2(1.0f / width, 0.0f)).a == 0.0f
		|| tex2D(SpriteTextureSampler, input.TextureCoordinates + float2(-1.0f / width, 0.0f)).a == 0.0f
		|| tex2D(SpriteTextureSampler, input.TextureCoordinates + float2(0.0f, 1.0f / height)).a == 0.0f
		|| tex2D(SpriteTextureSampler, input.TextureCoordinates + float2(0.0f, -1.0f / height)).a == 0.0f))
    {
        color = float4(0.0f, 0.0f, 0.0f, 1.0f);
    }

    return color;
}

technique SpriteDrawing
{
	pass P0
	{
		PixelShader = compile PS_SHADERMODEL MainPS();
	}
};

The first call to tex2D() compiles; however, the calls inside the subsequent if statement fail. I have tried putting the offsets into variables beforehand and using those instead, but the error remains the same.

The error message the MGCB pipeline tool returns is: “cannot map expression to pixel shader instruction set”.

Does anyone know what the problem is here? I am new to writing shaders, so I don’t know if I’m missing something obvious, but resources and documentation for HLSL are pretty sparse, so I haven’t been able to find an answer elsewhere. Thanks.

The issue is the GetDimensions() method. To use it, you need to change to this:

#if OPENGL
  #define SV_POSITION POSITION
  #define VS_SHADERMODEL vs_3_0
  #define PS_SHADERMODEL ps_3_0
#else
  #define VS_SHADERMODEL vs_4_0
  #define PS_SHADERMODEL ps_4_0
#endif

Removing the call to GetDimensions() and setting width and height to constants does make the shader compile. I see from Microsoft’s documentation that GetDimensions() should be supported in the shader models you suggested I switch to, and not in the ones my code was using. However, even with the changes you proposed, the shader fails to compile if I use GetDimensions(). I’m going to look into other ways to get the size of the texture. Thanks for the help; I didn’t suspect at all that this function was the issue.

I realized my project is using the OpenGL backend, so it’s selecting shader model 3.0, which doesn’t support GetDimensions(). Unfortunately, that means I have to convert my project to use DirectX instead, since you can’t use a higher shader model with OpenGL, but at least it should work after that.
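For anyone who finds this later: another option I’m considering (just a rough sketch, I haven’t tested it in my project yet) is to pass the texture size in as an effect parameter instead of querying it in the shader, which should avoid GetDimensions() entirely and stay on ps_3_0. The parameter name TextureSize is just what I picked for this example:

// Sketch: texture size passed in from the game code instead of using GetDimensions().
// "TextureSize" is an arbitrary parameter name chosen for this example.
float2 TextureSize;

float4 MainPS(VertexShaderOutput input) : COLOR
{
	// One texel worth of offset in texture-coordinate space.
	float2 texel = 1.0f / TextureSize;

	float4 color = tex2D(SpriteTextureSampler, input.TextureCoordinates) * input.Color;

	// Same neighbour test as before, but using the passed-in size.
	if ((color.r != 0.0f && color.g != 0.0f && color.b != 0.0f) &&
		  (tex2D(SpriteTextureSampler, input.TextureCoordinates + float2(texel.x, 0.0f)).a == 0.0f
		|| tex2D(SpriteTextureSampler, input.TextureCoordinates + float2(-texel.x, 0.0f)).a == 0.0f
		|| tex2D(SpriteTextureSampler, input.TextureCoordinates + float2(0.0f, texel.y)).a == 0.0f
		|| tex2D(SpriteTextureSampler, input.TextureCoordinates + float2(0.0f, -texel.y)).a == 0.0f))
	{
		color = float4(0.0f, 0.0f, 0.0f, 1.0f);
	}

	return color;
}

On the C# side, the parameter would be set once per texture with something like effect.Parameters["TextureSize"].SetValue(new Vector2(texture.Width, texture.Height)).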

Thanks to bangclash1 for pointing out that GetDimensions() was the issue.