Loading vertex positions as float array

Hi there,

I stumbled upon a question regarding VertexBuffers. Hope you can help : )

I want to load an array of floats (v1.x, v1.y, v1.z, v2.x, v2.y, …) representing my positions into a vertex buffer. It somehow works, but the mesh is distorted:

var allVerticesCount = 0;
var allIndicesCount = 0;

foreach( var mesh in meshArray )
{
  allVerticesCount += mesh.Positions.Length / 3;
  allIndicesCount += mesh.TriangleIndicesCount;
}

var vertexBuffer = new VertexBuffer(
  graphicsDevice,
  typeof( VertexPosition ),
  allVerticesCount,
  BufferUsage.None);

var indexBuffer = new IndexBuffer(
  graphicsDevice,
  typeof( int ),
  allIndicesCount,
  BufferUsage.WriteOnly );

var vertexOffset = 0;
var indexOffset = 0;

foreach( var mesh in meshArray )
{
  vertexBuffer.SetData( 
    vertexOffset * vertexBuffer.VertexDeclaration.VertexStride,
    mesh.Positions, 
    0, 
    mesh.Positions.Length, 
    vertexBuffer.VertexDeclaration.VertexStride );
  vertexOffset += mesh.Positions.Length / 3;

  indexBuffer.SetData(
    indexOffset,
    mesh.GetTriangleIndices(), 
    0, 
    mesh.TriangleIndicesCount );
  indexOffset += mesh.TriangleIndicesCount;
}

I am not quite sure I understand what the offset does. I want to load multiple meshes and advance the offset for every new mesh accordingly.
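To make the offset arithmetic in the loop above concrete, here is a small stand-alone sketch (plain C#, no graphics device needed; the 12-byte stride for a VertexPosition, i.e. three floats, is assumed):

```csharp
using System;

// VertexPosition holds one Vector3: 3 floats = 12 bytes per vertex.
const int vertexStride = 12;

// Two hypothetical meshes supplying 9 and 15 position floats (3 and 5 vertices).
int[] positionFloatCounts = { 9, 15 };

int vertexOffset = 0; // counted in vertices, as in the loop above
foreach (int floatCount in positionFloatCounts)
{
    // First SetData parameter: byte offset from the start of the buffer
    // where this mesh's vertices should land.
    int offsetInBytes = vertexOffset * vertexStride;
    Console.WriteLine($"mesh starts at byte {offsetInBytes}");
    vertexOffset += floatCount / 3;
}
// First mesh starts at byte 0, second at 3 * 12 = 36.
```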

Thanks for having a look : )

Cheers,
Georg

Here is a simpler, minimal example:

var vertexBuffer = new VertexBuffer(
  graphicsDevice,
  typeof( VertexPosition ),
  6,
  BufferUsage.WriteOnly);
var indexBuffer = new IndexBuffer(
  graphicsDevice,
  typeof( int ),
  6,
  BufferUsage.WriteOnly );

// Triangle 1
vertexBuffer.SetData( 
  0, 
  new float[]{0,0,0, 1,0,0, 0.5f,1,0}, 
  0, 
  9, 
  stride: 12 );
indexBuffer.SetData( 0, new[]{0,1,2}, 0, 3);

// Triangle 2
vertexBuffer.SetData( 
  3 * 12,
  new float[]{0,0,0, 0.5f,-1,0, 0,0,1}, 
  0, 
  9, 
  stride: 12 );
indexBuffer.SetData( 3 * 4, new[]{3,4,5}, 0, 3);

Unfortunately, only the first triangle works, and only if I set stride to 0.

Found out that this works:

vertexBuffer.SetData( 
  vertexOffset, // offset in bytes from buffer start
  new[] { 0f, 0f, 0f, 500f, 0f, 0f, 250f, 500f, 0f }, // split positions 
  0, // start index in array
  9, // number of array elements
  4  // size of single array element
);

indexBuffer.SetData( indexOffset, new[] { 0, 1, 2 }, 0, 3 );


vertexBuffer.SetData(
  sizeof( float ) * 3 * 3,
  new[] { 0f, 0f, 0f, 250f, -500f, 0f, 500f, 0f, 0f }, 0, 9, 4 );
indexBuffer.SetData( sizeof( int ) * 3, new[] { 3, 4, 5 }, 0, 3 );

It does not seem like a very safe way to do it, though. Can you think of something better?

You changed the stride parameter from 12 to 4, which made it work. So apparently the stride isn’t the size of a vertex, but the size of one element in the array. I didn’t know that either; my array elements generally match the vertex format, so I never noticed. But that seems to be the explanation.
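To spell that out, a small stand-alone sketch of the arithmetic (plain C#, no MonoGame types; assumes the 12-byte VertexPosition stride from the examples above):

```csharp
using System;

// When the source array is float[], the last SetData parameter is the size
// of one ARRAY ELEMENT (4 bytes), not the 12-byte size of a whole vertex.
int elementSize = sizeof(float);        // 4
int vertexStride = 3 * sizeof(float);   // 12: one VertexPosition

// Triangle 2 starts at vertex 3 and supplies 9 floats:
int offsetInBytes = 3 * vertexStride;   // 36
int elementCount = 9;
int bytesWritten = elementCount * elementSize;

Console.WriteLine($"writes bytes {offsetInBytes} to {offsetInBytes + bytesWritten}");
```

So the destination byte offset is still counted in whole vertices (3 × 12 = 36), while the copy size is counted in array elements (9 × 4 = 36 bytes), which is why 12 as the element size wrote past the intended range.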

What exactly do you think is not safe?

That’s what I mean: vertex declaration and array elements do not match. By “not safe” I mean I fear this behaviour could change, since it might not be intended.

But for now it works.

I think what you are doing is supposed to work. It should be safe.

Alright. Thanks.
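For anyone who wants to avoid the raw-float route entirely, one option is to repack the floats into one struct per vertex, so each array element matches the vertex declaration and the plain SetData(data) overload can be used with no byte-offset or element-size bookkeeping. A minimal sketch, using System.Numerics.Vector3 as a stand-in for the framework’s Vector3/VertexPosition:

```csharp
using System;
using System.Numerics;

// Repack a flat (x, y, z, x, y, z, ...) array into one struct per vertex.
Vector3[] ToVectors(float[] positions)
{
    var vertices = new Vector3[positions.Length / 3];
    for (int i = 0; i < vertices.Length; i++)
        vertices[i] = new Vector3(
            positions[3 * i],
            positions[3 * i + 1],
            positions[3 * i + 2]);
    return vertices;
}

var v = ToVectors(new float[] { 0, 0, 0, 1, 0, 0, 0.5f, 1, 0 });
Console.WriteLine(v.Length); // 3 vertices
```

With the real vertex type you would then call vertexBuffer.SetData(vertices) for a single mesh, or the offset overload with the actual vertex stride when packing several meshes into one buffer.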