Ok, so I tried using color.a = 0.5 with BlendState.NonPremultiplied, and that drew it correctly (half alpha):
Then I tried using color.a = input.TextureCoordinates.x and I got this:
The problem has GOT to be the TextureCoordinates. Is it because of the type of Draw call I'm using?
Here's the method called in that Draw line (it's inside the TileSheet class I wrote, which stores the texture and info on the tile sizes):
Variations is an int defining how many variations each tile on the tilesheet has. So if it's 4, every tile on the tilesheet gets 4 cells, each with a slightly different graphic (to keep textures from looking repetitive). I'm reasonably certain my math there is correct: it's supposed to grab the lower-right 16th of a tile, depending on its size. By default a tile is 64x64, so the 16th in this case is 16x16.
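Just to show the kind of math I mean (this is a rough sketch, not the actual TileSheet code — the names and the row/column layout are my shorthand here):

```python
# Sketch of the "lower-right 16th of a tile" math described above.
# TILE_WIDTH/TILE_HEIGHT match the defaults from the post (64x64).
TILE_WIDTH = 64
TILE_HEIGHT = 64

def lower_right_sixteenth(tile_col, tile_row):
    """Pixel source rectangle (x, y, w, h) of the lower-right 16th
    of the tile at column tile_col, row tile_row on the sheet."""
    w = TILE_WIDTH // 4    # 64 / 4 = 16, so 16x16 is 1/16 of the area
    h = TILE_HEIGHT // 4
    x = tile_col * TILE_WIDTH + (TILE_WIDTH - w)
    y = tile_row * TILE_HEIGHT + (TILE_HEIGHT - h)
    return (x, y, w, h)
```

So for the tile at the top-left of the sheet, that comes out to the 16x16 region starting at pixel (48, 48).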
TileWidth is the width of one tile, so in this case, 64. Same with TileHeight.
Width is the width of the entire texture. Same with Height. In this case they are both 1024.
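To put concrete numbers on those sizes (my own back-of-the-envelope math, not a confirmed diagnosis): a 64-pixel tile on a 1024-pixel-wide texture only covers 64/1024 = 0.0625 of the normalized texture-coordinate range, so if the shader uses input.TextureCoordinates.x directly as alpha, a tile near the left edge of the sheet would come out almost fully transparent.

```python
# Normalized UV range covered by one tile column, using the
# sizes from the post: 64-wide tiles on a 1024-wide texture.
SHEET_WIDTH = 1024
TILE_WIDTH = 64

def uv_range_x(tile_col):
    """(u_min, u_max) texture coordinates spanned by one tile column."""
    u0 = tile_col * TILE_WIDTH / SHEET_WIDTH
    u1 = (tile_col + 1) * TILE_WIDTH / SHEET_WIDTH
    return (u0, u1)
```

So the leftmost tile column spans u = 0.0 to 0.0625, and only the rightmost column (column 15) gets anywhere near u = 1.0.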