How to update Texture2D Color buffer without using SetData?

Hi,

I am trying to update a Texture2D with my camera's 1920 x 1080 color buffer for rendering.

Is it possible to get the Texture2D pixel buffer handle directly, so that I can copy my buffer straight into the Texture2D?

Right now I am doing three things to get the color buffer from my camera into the Texture2D (roughly as sketched below the list):

  1. Allocate a byte[] array in my application.
  2. Copy the camera's color buffer into the byte array.
  3. Pass the byte array to Texture2D.SetData().
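
Roughly, the code looks like this (a sketch; camera.GetBuffer and PixelFormat.Rgb32 are placeholders for whatever my camera SDK actually exposes):

Texture2D frameTexture = new Texture2D(graphicsDevice, 1920, 1080, false, SurfaceFormat.Color);
byte[] frameBuffer = new byte[1920 * 1080 * 4];   // step 1: allocate once (4 bytes per pixel)

// Every frame:
camera.GetBuffer(PixelFormat.Rgb32, frameBuffer); // step 2: copy the camera data into the byte array (placeholder camera API)
frameTexture.SetData(frameBuffer);                // step 3: upload to the GPU -- this is the slow part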

Texture2D.SetData() takes so long to copy the buffer that it causes frame drops.

Sometimes I am only getting 5 or 6 frames per second…

Is there any way around this issue? Please help…

Thanks,
Bala

Hi,
The SetData/GetData functions go through the CPU, and that’s why they are so slow.
In your case I would use the GPU by drawing a fullscreen quad with a basic HLSL shader that samples the input texture. :slightly_smiling:
E.g.:
float4 PSCopy(VSOutput input) : COLOR0 { return tex2D(SourceTextureSampler, input.TexCoord); }

Hi Sizaar, Thanks for your reply!

Could you please elaborate bit more on how to implement it.

Regards,
Bala

Create a new HLSL shader like:

texture SourceTexture; // This is your input texture.
sampler SourceTextureSampler = sampler_state
{
   Texture = <SourceTexture>;
};

struct VSInput
{
   float4 Position : SV_POSITION;
   float2 TexCoord : TEXCOORD0;
};

struct VSOutput
{
   float4 Position : SV_POSITION;
   float2 TexCoord : TEXCOORD0;
};

// Pass the position and texture coordinates straight through.
VSOutput VSCopy(VSInput input)
{
   VSOutput output = (VSOutput)0;
   output.Position = input.Position;
   output.TexCoord = input.TexCoord;
   return output;
}

// Sample the source texture; this performs the copy on the GPU.
float4 PSCopy(VSOutput input) : COLOR0
{
   return tex2D(SourceTextureSampler, input.TexCoord);
}

technique Copy
{
   pass Pass0
   {
      VertexShader = compile vs_4_0_level_9_1 VSCopy();
      PixelShader = compile ps_4_0_level_9_1 PSCopy();
   }
}

Load the effect with the Pipeline tool, set your parameter (SourceTexture), and then bind the shader as usual with:
myEffect.CurrentTechnique.Passes[0].Apply();

And draw the fullscreen quad:
quadRenderer.RenderQuad(graphicsDevice, -Vector2.One, Vector2.One);
(I’m assuming you’re still using the LightPrePass sample :slightly_smiling:).
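
Roughly, the C# side would look something like this (a sketch; "CopyEffect", cameraTexture and destinationTarget are placeholder names, and quadRenderer is the helper from that sample):

Effect copyEffect = Content.Load<Effect>("CopyEffect");          // the .fx file built by the Pipeline tool
copyEffect.Parameters["SourceTexture"].SetValue(cameraTexture);  // the texture holding the camera frame

graphicsDevice.SetRenderTarget(destinationTarget);               // the RenderTarget2D you want to copy into
copyEffect.CurrentTechnique.Passes[0].Apply();
quadRenderer.RenderQuad(graphicsDevice, -Vector2.One, Vector2.One);
graphicsDevice.SetRenderTarget(null);                            // back to the back buffer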

Texture data is stored on the GPU in VRAM, so you can’t directly copy data into it from CPU memory. Note that data transfer between RAM and VRAM is pretty slow, and reading data back from VRAM is not even supported on all platforms. So when you want to copy data from one texture to another, like @Sizaar said, it’s a lot faster (and always supported) to render one texture onto the other, since the data never leaves VRAM, than to copy the data from VRAM to RAM and back.

You should make the texture you want to render to a RenderTarget2D rather than just a Texture2D so you can actually draw to it.
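
Something like this minimal sketch (assuming a 1920 x 1080 Color target):

RenderTarget2D destination = new RenderTarget2D(graphicsDevice, 1920, 1080, false, SurfaceFormat.Color, DepthFormat.None);

graphicsDevice.SetRenderTarget(destination);
// ... render the source texture here (fullscreen quad + copy shader) ...
graphicsDevice.SetRenderTarget(null);
// 'destination' can now be passed around wherever a Texture2D is expected.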

In what manner are you receiving data from the camera? If it is a byte buffer, then you have no choice but to use Texture2D.SetData(). Obtaining the Direct3D texture handle will not make it any quicker, as you would be doing the same work that SetData() already does, but likely in a less robust manner. There is also no need to allocate a byte buffer and copy the camera color buffer into it; just pass the camera color buffer to SetData() directly. No duplicate buffers are required.
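
In other words, something like this (a sketch; camera.GetColorBuffer() stands in for whatever call your camera API actually provides):

byte[] cameraBuffer = camera.GetColorBuffer(); // placeholder for the camera API call that returns the frame
frameTexture.SetData(cameraBuffer);            // single copy: RAM -> VRAM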

If it is in a texture already, why copy it to another texture? The suggestion of using a shader to render from one texture to a render target requires the camera data to already be in a texture to start with, and the only way that texture would work is if it was created from the Direct3D device context that MonoGame uses. Just use the original texture.

@Sizaar & @Jjagg: Thanks for the replies, but my case is not copying one texture to another texture.

@KonajuGames: Yes. I am getting my byte buffer filled by calling the camera’s API function each frame. [This is one copy.]

I could avoid this unwanted copy by passing the camera’s buffer handle directly to my final Texture2D.SetData().

But the issue is that I declared the Texture2D with the Color format, so it expects RGB32, while the camera’s buffer handle contains data in the YUY2 format. I found no setting in the camera to change the buffer handle’s format.

So I am calling the camera’s GetBuffer() function with the required format; it internally converts the buffer from YUY2 to RGB32, fills my byte buffer, and then I pass that byte buffer to Texture2D.SetData().

Please advise me on this issue!

Thanks and Regards,
Bala

We don’t explicitly support it, but Direct3D does have a YUY2 surface format. I have seen comments that only AMD cards support this, but that was from several years ago. That would be the most efficient way to support it (dump the YUY2 data straight into a texture with that surface format).

In short, we don’t have direct support for it, but it could be added as a feature request.

Thanks @KonajuGames. I will try a few other methods and come back to you with performance results.