XML: Loading via pipeline vs manual loading

Hi,
Is there any technical difference between loading XML/TMX files manually using XDocument (or XElement) versus building them via the Content Pipeline Tool? I'm thinking in terms of performance, cross-platform support, etc.

I've just finished porting some of my old XNA 4 games to MonoGame 3.8 OpenGL, and instead of dealing with getting automated XML/TMX serialization working via the Content Pipeline, I just did the mapping manually via XDocument (by setting the files to "Copy" instead of "Build" in the content pipeline tool).
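
For anyone curious, that kind of manual loading is only a few lines. A minimal sketch, assuming a file marked "Copy" in the content project; the file name, element names, and attribute names here are made up for illustration and would need to match the real TMX schema:

```csharp
using System.Xml.Linq;
using Microsoft.Xna.Framework;

public static class TmxLoader
{
    // Loads a map whose .tmx file was set to "Copy" in the content tool.
    // TitleContainer.OpenStream resolves paths relative to the game's
    // output directory on every platform MonoGame supports.
    public static void LoadMap(string relativePath) // e.g. "Content/level1.tmx"
    {
        using (var stream = TitleContainer.OpenStream(relativePath))
        {
            XDocument doc = XDocument.Load(stream);
            XElement map = doc.Root;

            // Hypothetical attributes -- adjust to the actual TMX layout.
            int width  = (int)map.Attribute("width");
            int height = (int)map.Attribute("height");

            foreach (XElement layer in map.Elements("layer"))
            {
                string name = (string)layer.Attribute("name");
                // ... parse the layer's tile data here ...
            }
        }
    }
}
```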

I am old school, very old school, practically ancient…

For me XML is evil, pure evil. Why would I use name value="635388", burning all those precious bytes as text, when I could just have a binary int?

Then there are the CPU t-states I am burning parsing the XML when I could just read an int, and the extra time it takes to read the data from disk.

MADNESS !! PURE MADNESS!!!

Sigh, but that is just me.
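
(For contrast, the binary alternative being described is literally just fixed-size reads and writes, with no parsing at all. A tiny sketch with invented field names:)

```csharp
using System.IO;

public struct MapHeader
{
    public int Width;
    public int Height;
}

public static class BinaryMapIO
{
    // Two ints: 8 bytes on disk, versus dozens of bytes of XML markup.
    public static void Save(string path, MapHeader header)
    {
        using (var writer = new BinaryWriter(File.Create(path)))
        {
            writer.Write(header.Width);
            writer.Write(header.Height);
        }
    }

    // Reading is a couple of fixed-size reads -- no text parsing involved.
    public static MapHeader Load(string path)
    {
        using (var reader = new BinaryReader(File.OpenRead(path)))
        {
            return new MapHeader
            {
                Width  = reader.ReadInt32(),
                Height = reader.ReadInt32(),
            };
        }
    }
}
```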

But if you are forced into using this terrible technology, at least the content pipeline compresses the XML when it builds it into an .xnb file.

xnb files are compressed, xml files are not

In the real world this isn't going to make a lot of difference; the image you ship will get bigger, but these days that is not considered an issue. (My first hard drive held 20 megabytes and I wondered how I would ever fill it; now my phone has 20 gigs.)

Hehe. I'm not a particular fan of XML in general. In these cases it was mainly because the games use Tiled as their tile editor, and maps made in Tiled are then exported in a custom XML format.

I can see your point with the files getting compressed.

Just pointing out, Towerfall uses XML… EVERYWHERE!

I like JSON :innocent:

Go away!

:stuck_out_tongue:

:fire: :rocket:

JSON is almost as verbose as XML.

INI files it is!

THIS!

My preference is JSON, as I can serialize and deserialize to that format and it's more compact than XML by a long shot. If you use a binary format then you have to worry about platform compatibility if you plan on going multi-platform. INI files or just raw text files are easy, but they require writing a bunch of parsing code and don't work well with hierarchical data.
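
For reference, round-tripping an object through JSON.Net (Newtonsoft.Json) looks roughly like this; the MapData type is made up for illustration, and note the caveat further down the thread about JSON.Net on consoles:

```csharp
using System.IO;
using Newtonsoft.Json;

public class MapData
{
    public string Name { get; set; }
    public int Width { get; set; }
    public int Height { get; set; }
}

public static class JsonMapIO
{
    public static void Save(string path, MapData map)
    {
        // Indented output is readable; use Formatting.None to shave bytes.
        File.WriteAllText(path, JsonConvert.SerializeObject(map, Formatting.Indented));
    }

    public static MapData Load(string path)
    {
        return JsonConvert.DeserializeObject<MapData>(File.ReadAllText(path));
    }
}
```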

[Dr. Evil meme image — original link: http://www.relatably.com/m/img/dr-evil-meme-right/s-l1000.jpg]

[Just kidding]

@slakke If you’re ever targeting consoles, you might want to use the content pipeline with pipeline extensions + content readers to have something really cross-platform and a lot faster to load than plain XML.

Source: every piece of data in my game engine is in XML, and I ported it to consoles and learned things the hard way. :sunglasses:

Thanks @monsieurmax. That was the type of input I was looking for! Are you saying that as long as you build the XML/JSON (or whatever format) through the Content Pipeline into its .xnb format, this will safeguard some cross-platform compatibility?

Well, by writing a pipeline importer / reader that serializes/deserializes exactly the way you want, you’ll ensure consistency and also you won’t rely on external libraries.
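
To make that concrete, the build-time half of such a pipeline extension is roughly an importer (reads the raw TMX XML), a processor (turns it into your runtime type), and a type writer (emits compact binary into the .xnb). This is only a sketch: the class names, the TMX attribute names, and the MapData type (the same made-up POCO as in the JSON example above) are all placeholders:

```csharp
using System.Xml.Linq;
using Microsoft.Xna.Framework.Content.Pipeline;
using Microsoft.Xna.Framework.Content.Pipeline.Serialization.Compiler;

// Build-time side: lives in a separate "pipeline extension" class library
// that the MGCB editor references.
// MapData: the simple POCO from the JSON sketch above (Name, Width, Height).

[ContentImporter(".tmx", DisplayName = "TMX Importer", DefaultProcessor = "MapProcessor")]
public class TmxImporter : ContentImporter<XDocument>
{
    public override XDocument Import(string filename, ContentImporterContext context)
    {
        return XDocument.Load(filename);
    }
}

[ContentProcessor(DisplayName = "Map Processor")]
public class MapProcessor : ContentProcessor<XDocument, MapData>
{
    public override MapData Process(XDocument input, ContentProcessorContext context)
    {
        // Hypothetical attribute names -- adapt to the real TMX schema.
        XElement map = input.Root;
        return new MapData
        {
            Name   = (string)map.Attribute("name") ?? "unnamed",
            Width  = (int)map.Attribute("width"),
            Height = (int)map.Attribute("height"),
        };
    }
}

[ContentTypeWriter]
public class MapWriter : ContentTypeWriter<MapData>
{
    protected override void Write(ContentWriter output, MapData value)
    {
        // Whatever is written here must be read back in exactly the same
        // order by the runtime ContentTypeReader.
        output.Write(value.Name);
        output.Write(value.Width);
        output.Write(value.Height);
    }

    public override string GetRuntimeReader(TargetPlatform targetPlatform)
    {
        // Assembly-qualified name of the runtime reader (sketched below).
        return "MyGame.MapReader, MyGame";
    }
}
```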

XML parsing on consoles is painfully slow, and any external lib used has an incompatibility risk. JSON.Net is troublesome on consoles. I think some people are using the Unity version of the lib to make things work.

What about raw XDocument.Load() for consoles and reading nodes manually? Then you have the issue of uncompressed XML files, I guess.

Not sure about that, but given the behaviors I noticed, you'd better head to custom importers / readers.
It's far less complex than it might seem; it's even a bit fun and rewarding :slight_smile:
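
The runtime half is just the matching ContentTypeReader plus an ordinary Content.Load call, roughly like this (same placeholder names as the sketch above):

```csharp
using Microsoft.Xna.Framework.Content;

namespace MyGame
{
    // Runtime side: ships with the game, no pipeline assemblies needed.
    // Must read fields in exactly the order the writer wrote them.
    public class MapReader : ContentTypeReader<MapData>
    {
        protected override MapData Read(ContentReader input, MapData existingInstance)
        {
            return new MapData
            {
                Name   = input.ReadString(),
                Width  = input.ReadInt32(),
                Height = input.ReadInt32(),
            };
        }
    }
}

// Usage from a Game class, assuming the .tmx was built (not copied)
// through the pipeline under the asset name "Maps/level1":
//
//     MapData map = Content.Load<MapData>("Maps/level1");
```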