Also, better i18n support would be nice with SpriteFont. I have to wonder how it handles Arabic and Hebrew, given that removing just one letter from a word in those languages can do some funky things, like making the word look longer or changing the appearance of other letters. iOS actually had a bug with banner notifications and Arabic text that caused the OS to crash if it tried to truncate a specific word sequence to make it fit in the banner (because removing characters from words in these languages CAN make the word take up more space).
You can’t do SM4 and above with OpenGL.
spriteBatch.Draw with Matrix
or skewX, skewY
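For context, SpriteBatch.Draw doesn’t take a Matrix today, but SpriteBatch.Begin does accept a transform matrix that applies to every Draw call in the batch, so a skew can at least be faked per batch. A minimal sketch, assuming a loaded myTexture and an arbitrary shear value:

var skew = Matrix.Identity;
skew.M21 = 0.5f; // shear X by half of Y (a simple skewX)

spriteBatch.Begin(transformMatrix: skew);
spriteBatch.Draw(myTexture, new Vector2(100, 100), Color.White);
spriteBatch.End();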
Make Fonts scalable like Texture2D
You mean with a rectangle? You can already pass a Vector2 for scaling fonts.
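Something like this, as a minimal sketch (myFont and the scale values are just placeholders):

Vector2 scale = new Vector2(2f, 1.5f); // 2x wide, 1.5x tall
spriteBatch.Begin();
spriteBatch.DrawString(myFont, "Hello", new Vector2(50, 50), Color.White,
    0f, Vector2.Zero, scale, SpriteEffects.None, 0f);
spriteBatch.End();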
Passing textures to shaders is kinda what we already do with them; it’s kinda like asking for “cars which we can drive around”.
void SomeKindOfDraw(Texture2D MyTexture)
{
    // Redirect rendering into an off-screen render target (graphics here being the GraphicsDevice).
    graphics.SetRenderTarget(MyRenderTarget);

    // Bind the texture to the effect's Texture2D parameter and apply the pass.
    MyShaderEffect.Parameters["MyTexture2DVariableName"].SetValue(MyTexture);
    MyShaderEffect.Techniques[0].Passes[0].Apply();

    // You want to draw to some kind of vertex object, like a screen-fitted quad made of 2 triangles.
    screenQuad.RenderFullScreenQuad(MyShaderEffect);
}
Please add pipeline support for fx files on Linux and Mac. (Or even better, support compiling GLSL files.)
Something that pops up on a regular basis: collision detection between rotated rectangles, since Rectangle.Intersects() does not support rotation.
There are solutions for this on the web, but it’s still always a bit messy, I think.
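One way to handle it without pulling in a library is the separating axis theorem. A rough sketch, assuming you already have the four rotated corners of each rectangle in order (the method names here are made up, not MonoGame API):

static bool RotatedRectsIntersect(Vector2[] cornersA, Vector2[] cornersB)
{
    foreach (var corners in new[] { cornersA, cornersB })
    {
        for (int i = 0; i < corners.Length; i++)
        {
            // Each edge normal is a candidate separating axis.
            Vector2 edge = corners[(i + 1) % corners.Length] - corners[i];
            Vector2 axis = new Vector2(-edge.Y, edge.X);

            Project(cornersA, axis, out float minA, out float maxA);
            Project(cornersB, axis, out float minB, out float maxB);

            if (maxA < minB || maxB < minA)
                return false; // gap found on this axis -> no overlap
        }
    }
    return true; // no separating axis exists -> they intersect
}

static void Project(Vector2[] corners, Vector2 axis, out float min, out float max)
{
    min = max = Vector2.Dot(corners[0], axis);
    for (int i = 1; i < corners.Length; i++)
    {
        float p = Vector2.Dot(corners[i], axis);
        if (p < min) min = p;
        if (p > max) max = p;
    }
}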
Isn’t this already covered by creating a separate UWP project?
I thought creating a UWP project made it work for Xbox, Windows Phone and Windows 10.
The post you quoted is more than 3 years old.
But AFAIK yes, UWP will work on Xbox One.
Yes, I can confirm UWP runs on XBox. There’s also a closed implementation of MonoGame specifically for XBox One. Tom, the MonoGame lead, can give you access to it if you’re a registered developer.
Oh thanks, will definitely contact Tom when the time comes to create the UWP project.
Is there a difference between running a UWP project on Xbox and using this specific set of libraries you mention?
The UWP implementation is public, you don’t need access or anything. You can’t use the full Xbox capabilities with a UWP project. You can easily find more information on the XBox website.
- Xbox Live Creators Program without registration: Xbox
- ID@Xbox for registered developers: https://www.xbox.com/en-US/developers/id
mesh = new Mesh();
mesh.vertices = vertices;
mesh.triangles = triangles;
mesh.RecalculateNormals();
would be nice
You can kind of do this if you include the MonoGame.Framework.Content.Pipeline lib and use MeshBuilder.
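Roughly like this, as a sketch of the MeshBuilder route. Note it produces a MeshContent for the content pipeline rather than a runtime Model, so it normally lives in a custom processor:

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Content.Pipeline.Graphics;

MeshBuilder builder = MeshBuilder.StartMesh("MyMesh");
int uvChannel = builder.CreateVertexChannel<Vector2>(VertexChannelNames.TextureCoordinate(0));

// Positions are registered first and referenced by index.
int p0 = builder.CreatePosition(new Vector3(0, 0, 0));
int p1 = builder.CreatePosition(new Vector3(1, 0, 0));
int p2 = builder.CreatePosition(new Vector3(0, 1, 0));

// Each triangle vertex sets its channel data, then references a position.
builder.SetVertexChannelData(uvChannel, new Vector2(0, 0));
builder.AddTriangleVertex(p0);
builder.SetVertexChannelData(uvChannel, new Vector2(1, 0));
builder.AddTriangleVertex(p1);
builder.SetVertexChannelData(uvChannel, new Vector2(0, 1));
builder.AddTriangleVertex(p2);

MeshContent mesh = builder.FinishMesh();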
I have a mesh maker here, but I’m not sure it will help with what you were doing previously, as it seemed to me you were trying to keep the mesh quads separate.
You basically feed it a Vector3 array of points, like a height map, and it generates a mesh that will map to a texture, with pretty much everything included: normals and tangents for normal mapping. I didn’t include bi-normals, as I typically calculate them in the shader since it’s cheap.
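Not the linked class, but for anyone wondering what the normal generation boils down to: the usual approach is to accumulate each triangle’s face normal (a cross product of its edges) onto its three vertices and normalize at the end. A minimal sketch with assumed array names:

static Vector3[] ComputeNormals(Vector3[] points, int[] indices)
{
    var normals = new Vector3[points.Length];
    for (int i = 0; i < indices.Length; i += 3)
    {
        int a = indices[i], b = indices[i + 1], c = indices[i + 2];
        // Face normal from two edges of the triangle.
        Vector3 faceNormal = Vector3.Cross(points[b] - points[a], points[c] - points[a]);
        normals[a] += faceNormal;
        normals[b] += faceNormal;
        normals[c] += faceNormal;
    }
    for (int i = 0; i < normals.Length; i++)
        normals[i] = Vector3.Normalize(normals[i]); // average of adjoining faces
    return normals;
}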
Better model importing.
I am thinking about how I can shift the texture position on each quad, but with shared vertices it’s impossible.
Am I right?
Not sure what you mean by shared vertices. Could you open a new topic for this?
He was doing 3D tile mapping, but a major component for him is to have each vertex of a portion of the mesh be conceptually equivalent to a tile, with a separate UV data set so that each tile can have interchangeable textures or texture coordinates, yet the whole thing maps to a contiguous mesh and is built as one.
E.g. if you have a mesh with 6 vertices where vertices 0 to 3 make up quad 1 and vertices 2 to 5 make up quad 2, then the shared UV coordinates at vertices 2 and 3 won’t allow you to use two separate tiles in that mesh. This can be done with multi-texturing and more data per vertex, but that starts to represent an unacceptably large amount of data for all adjoining sides, and possibly even more if the mesh is not comprised of grids.
So he was attempting to separate the quads and align them, so that the edges of quad 1, which is then made up of vertices 0 to 3, touch the edges of another quad made up of vertices 4 to 7, and each has its own separate UV components.
This would all be trivial if it were possible to attach UV coordinates to the index data instead of the vertex data itself, but as far as I know that’s not possible.
The point of this would then be to make a 3D tilemap / heightmap / model, just like any 2D map where you can place tiles at a location with arbitrary texture coordinates or even different textures, except the entire thing would be in 3D model form. Though edge artifacts can be difficult to deal with when the vertices are not truly shared.
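As a rough illustration of that unshared-vertex approach (all names here are made up), each tile adds its own four vertices so its UVs can point at any region of an atlas, and its indices never reach into a neighbouring tile:

static void AddTileQuad(List<VertexPositionTexture> verts, List<int> indices,
                        Vector3 origin, float size, Vector2 uvTopLeft, Vector2 uvSize)
{
    int baseIndex = verts.Count;

    // Four vertices per tile, even where they coincide with a neighbour's corners.
    verts.Add(new VertexPositionTexture(origin, uvTopLeft));
    verts.Add(new VertexPositionTexture(origin + new Vector3(size, 0, 0), uvTopLeft + new Vector2(uvSize.X, 0)));
    verts.Add(new VertexPositionTexture(origin + new Vector3(size, 0, size), uvTopLeft + uvSize));
    verts.Add(new VertexPositionTexture(origin + new Vector3(0, 0, size), uvTopLeft + new Vector2(0, uvSize.Y)));

    // Two triangles per quad; adjacent tiles stay independent in UV space.
    indices.AddRange(new[] { baseIndex, baseIndex + 1, baseIndex + 2,
                             baseIndex, baseIndex + 2, baseIndex + 3 });
}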
I had been meaning to write a version of that mesh class that will do that for tile mapping, but I just never had the time or need as of yet.
I linked to it as he might alter it to do what he wants with a bit of effort.