Hello! I’ve been having some issues with the size of the .xnb files when building my project. I tried doing my research but I could not find anything that would help me with my problem.
This is the topic I found that resembles my issues but none of the answers helped me out.
I have around 30 MB worth of .png images that balloon into 3 GB of .xnb.deploy files. I've tried a lot of things but I can't seem to solve this issue. I'm running the latest MonoGame version (3.6) and tried many different content pipeline settings without any luck. I tried in both Debug and Release mode.
@DexterZ Yea that would probably work, but that's not really solving the problem now, is it xD There's obviously something wrong when my files get about 100 times bigger when converted to .xnb
Some more info that might be useful:
Tried both HiDef and Reach; there was a 3 MB difference… but that doesn't mean much when I'm dealing with 3 GB
Tried loading the same files in a brand new MonoGame project, same result: 8 MB was turned into 1.99 GB
Tried again with Compress set to True and HiDef - Build 274 succeeded, 0 failed, Time elapsed 00:02:58.69.
AND THAT last part somehow gave me 12 MB, I think I almost pissed myself when I saw the file sizes
So I tried it out in my main project with Compress set to True AND HiDef, and hallelujah, it worked. The entire bin folder makes up 25 MB. However, now I have a different issue: the build takes even longer than before. It used to take around 2 minutes; now I'm up to 4 minutes. I mean, I'll take it, but I'm planning on having 10-20 times the amount of content I have right now, so that won't work later.
Build 537 succeeded, 0 failed, Time elapsed 00:04:10.94.
gave me 25 MB
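For reference, these settings live in the header of the Content.mgcb file (you can also change them in the Pipeline Tool's project properties). Roughly, the relevant lines look like this — the platform value here is just an example, yours depends on your target:

```
/platform:Windows
/profile:HiDef
/compress:True
```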
Yo' I'm aware that compiled content (textures) can end up around 8x bigger than the original source; that's mostly why I load my texture content through a stream, and for other content I wrote my own mesh loader instead of using the MG content pipeline tools.
I really don't think there's anything wrong with it; most likely there's additional information added to the compiled content ^_^y
"Any suggestions for the building time?"
IMO I really don't mind how long it takes to build content; what I would mind is how long it takes to load.
But let's hope someone who knows the MonoGame internals comes in to elaborate on the compiled content size…
"Build 537 succeeded, 0 failed, Time elapsed 00:04:10.94."
Yo dude, I hadn't noticed this: you're rebuilding all 537 of your content files, while I only rebuild newly added or modified content. If you're using the "MonoGame Pipeline Tool" you can right-click a single content item you want to compile and select [Rebuild].
For sure, if you're only rebuilding a single newly added or modified item, it won't take 4 minutes ^_^y
I think even if resizing them to a power of 2 worked, it would sadly mess up a lot of stuff in my project
My CPU is an Intel® Core™ i5-6400 @ 2.70 GHz
Alright, that makes sense. I can only find Build and Copy as options in the settings though… or do you mean to just build the individual new textures? Sorry, I'm new to this; in the past I only had a small amount of textures, so this was never an issue before lol
The file sizes are so large because when MG processes a texture, the raw color data is stored without any compression.
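To see why that blows up so much, here's some back-of-the-envelope math (the 1024×1024 size is just an example, not a figure from this thread):

```python
# Raw (uncompressed) RGBA stores 4 bytes per pixel, no matter
# how well the source PNG happened to compress on disk.
width, height = 1024, 1024          # example texture dimensions
bytes_per_pixel = 4                 # R, G, B, A - one byte each

raw_size = width * height * bytes_per_pixel
print(raw_size)                     # 4194304 bytes
print(raw_size / (1024 * 1024))    # 4.0 MiB of raw color data
```

A PNG of that size might be a couple hundred KB on disk, so a 10-20x expansion per texture is completely normal for uncompressed .xnb output.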
To reduce build times, always run Build and not Rebuild. The Pipeline Tool is smart: it stores the last-modified timestamp of processed files, checks whether they changed since they were last built, and skips them if they haven't.
Are DDS files just passed through or do those go through everything as well? To be more specific, are the traits of the source DDS file preserved, such that a DXT3 DDS file w/ mips is not transformed into Color.
I’m not seeing anything related to the headers (for DDS) so I assume so, but just because I don’t see it doesn’t mean they’re not just being restructured. Edit: outside of the content-pipeline that is, which is pants-shittingly nuts in DDS handling.
Just for completeness in this thread, you can set the format of textures to Compressed. This will use a lossy block compression format that can be natively rendered on the platform you're targeting (e.g. DXT for desktop), i.e. the texture does not have to be decompressed to draw it to the screen. The savings in file size are (in almost all cases) not even close to what you'd get using the generic compression, but since the texture data does not have to be decompressed and the textures are smaller when sent to the GPU, loading and rendering the image is faster.
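Concretely, that's the TextureFormat processor parameter. You can set it per texture in the Pipeline Tool, and in Content.mgcb the entry ends up looking roughly like this (the file name here is a placeholder, and I've left out the other processor parameters):

```
#begin Textures/player.png
/importer:TextureImporter
/processor:TextureProcessor
/processorParam:TextureFormat=Compressed
/build:Textures/player.png
```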