MonoGame Pipeline file size too big

Hello! I’ve been having some issues with the size of the .xnb files when building my project. I tried doing my research but I could not find anything that would help me with my problem.

This is the topic I found that resembles my issues but none of the answers helped me out.

I have around 30 MB worth of .png images that balloon to 3 GB of .xnb.deploy files. I’ve tried a lot of things but I can’t seem to solve this issue. I’m running the latest MonoGame version (3.6) and have tried many different content pipeline settings without any luck, in both Debug and Release mode.

Appreciate any help with solving this issue!


Here are my properties:
Gyazo link

Isn’t this related to that thing… umm, where there are multiple sizes of the same image stored in the same file, forgot the name, mipmaps or something… could that be the reason?

I don’t think so, my images are just frames that make up a sprite sheet that I use for animation. My processor parameter for GenerateMipmaps is set to False, not sure if that helps.

What’s their dimensions?

Are they a ^2 format?

I’m afraid not… most of the images are 300x300 or 400x400 or similar resolutions. I’ve also added my settings properties to the main post.
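As an aside, the power-of-two question above is easy to spell out. A quick sketch (hypothetical helper names, just for illustration):

```python
def is_pow2(n: int) -> bool:
    # A positive integer is a power of two iff it has exactly one set bit.
    return n > 0 and (n & (n - 1)) == 0

def next_pow2(n: int) -> int:
    # Smallest power of two >= n, i.e. what the image would have to be
    # padded or resized to (300 -> 512, 400 -> 512).
    p = 1
    while p < n:
        p *= 2
    return p
```

So neither 300x300 nor 400x400 is a ^2 format; both would round up to 512x512.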

Don’t compile it, load it as a stream. You can just manually rename it to .xnb so the end user won’t bother to open it : - D Same size, no harm done, problem solved about the size ^_^y

@DexterZ Yea that would probably work but that’s not really solving the problem now is it xD There’s obviously something wrong when my files get 300 times bigger when converted to .xnb

Some more info that might be useful:

  • Tried in HiDef and Reach, there was a 3 MB difference… but that doesn’t mean much when I’m dealing with 3 GB

  • Tried loading the same files in a brand new MonoGame project, same result. 8 MB was turned into 1.99 GB

  • Tried again with Compress set to True and HiDef - Build 274 succeeded, 0 failed, Time elapsed 00:02:58.69.

AND THAT last part somehow gave me 12 MB, I think I almost pissed myself when I saw the file sizes

So I tried it out in my main project with Compress set to True AND HiDef and hallelujah, it worked. The entire bin folder makes up 25 MB. However, now I have a different issue: the build takes even longer than before. It used to take around 2 minutes, now I’m up to 4 minutes. I mean, I’ll take it, but I’m planning on having 10-20 times the amount of content I have right now, so that won’t work later.

  • Build 537 succeeded, 0 failed, Time elapsed 00:04:10.94.
    gave me 25 MB

Any suggestions for the building time?

Yo’ I’m aware that some compiled content (textures) can be 8x bigger than the original source; that’s mostly why I load my texture content through a stream, and for other content I wrote my own mesh loader instead of using the MG content pipeline tools.

I don’t really think there’s something wrong with it, most likely there is additional information added to the compiled content ^_^y

"Any suggestions for the building time?"

IMO I really don’t mind how long it takes to build content; what I would mind is how long it takes to load.

But let’s hope someone from the MonoGame team comes in to elaborate on the compiled content size…

Cheers ^ _^y

Yea, I’ll take what I can get for now and just deal with the long building time, really appreciate the help, cheers :smile:


Have you tried manually resizing them to a power of 2 and seeing how it affects things?

What CPU are you rocking?

I went for a Core i7 (4C/8T) and that reduced compile times dramatically compared to a 4C/4T i5.

I should probably add, the i5 was a 2nd-gen desktop part and the i7 a 6th-gen i7 HQ.

"Build 537 succeeded, 0 failed, Time elapsed 00:04:10.94."

Yo dude, I hadn’t noticed this: you’re rebuilding all 537 of your content items, while I only rebuild newly added or modified content. If you’re using the “MonoGame Pipeline Tool” you can right-click a single content item you want to compile and select [Rebuild].

For sure, if you’re only rebuilding a single newly added or modified item it won’t take 4 minutes ^_^y

I think even if resizing them to a power of 2 worked, it would mess up a lot of stuff in my project, sadly :confused:

My CPU is an Intel® Core™ i5-6400 @ 2.70 GHz.

Alright, that makes sense. I can only find Build and Copy as options in the Settings though… or do you mean to just build the individual new textures? Sorry, I’m new to this; in the past I only had a low amount of textures, so this was never an issue before lol


The file sizes are so large because when MG processes a texture, the raw color data is stored without any compression.
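To put numbers on that (a rough sketch, assuming the default 32-bit Color format, i.e. 4 bytes per pixel, and the 400x400 frame size mentioned earlier):

```python
def raw_texture_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    # Size of an uncompressed RGBA texture: one 4-byte color per pixel.
    return width * height * bytes_per_pixel

# A single 400x400 frame stores 640,000 bytes (~625 KB) raw,
# regardless of how small the source PNG compresses to on disk.
frame = raw_texture_bytes(400, 400)
```

Since a PNG of a mostly transparent or flat-colored sprite frame can easily be a few KB, a two-orders-of-magnitude blowup like 30 MB → 3 GB is exactly what uncompressed storage predicts.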

To reduce build times, always run Build and not Rebuild. The Pipeline Tool is smart: it stores the last-modified timestamp of processed files, checks whether they changed since they were last built, and skips them if they have not.


Are DDS files just passed through or do those go through everything as well? To be more specific, are the traits of the source DDS file preserved, such that a DXT3 DDS file w/ mips is not transformed into Color.

I’m not seeing anything related to the headers (for DDS) so I assume so, but just because I don’t see it doesn’t mean they’re not just being restructured. Edit: outside of the content-pipeline that is, which is pants-shittingly nuts in DDS handling.

I don’t know how the DDS importer is implemented.

Just for completeness in this thread, you can set the format of textures to compressed. This will use a lossy block compression format that can be natively rendered on the platform you’re targeting (e.g. DXT for desktop), i.e. the texture does not have to be decompressed to draw it to the screen. The savings in file size are (in almost all cases) not even close to what you’d get using generic compression, but since the texture data does not have to be decompressed and the textures are smaller when sent to the GPU, loading and rendering the image is faster.
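To illustrate the block-compression size math (a sketch, assuming the standard DXT block sizes: 8 bytes per 4x4 pixel block for DXT1, 16 bytes for DXT3/DXT5):

```python
def dxt_bytes(width: int, height: int, bytes_per_block: int) -> int:
    # DXT compresses fixed 4x4 pixel blocks; dimensions are rounded
    # up to whole blocks. DXT1 = 8 bytes/block, DXT3/DXT5 = 16 bytes/block.
    blocks_w = (width + 3) // 4
    blocks_h = (height + 3) // 4
    return blocks_w * blocks_h * bytes_per_block

raw  = 400 * 400 * 4            # 640,000 bytes as uncompressed RGBA
dxt1 = dxt_bytes(400, 400, 8)   # 8:1 vs raw RGBA
dxt5 = dxt_bytes(400, 400, 16)  # 4:1 vs raw RGBA
```

So a fixed 4:1 or 8:1 ratio, which is why it can’t match generic (e.g. LZ-style) compression on file size, but the data stays GPU-renderable as-is.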

The texture importer will reprocess DDS files through the same path as the other images. It will throw away the DXT compression & mipmaps and re-create them.

I have a pass-through DDS importer that currently works for DXT1 cubes. You could use it as a template to implement DXT3.
