Subject: RE: Transforming large XML docs in small amounts of memory
From: "Michael Kay" <mike@xxxxxxxxxxxx>
Date: Mon, 30 Apr 2007 12:48:19 +0100
> It's purely XSLT 1.0, using Saxon (on Linux and Windows, if
> that matters...), although suggestions to change this would
> not be shunned.
Just as an experiment, try it on Saxon 8.x rather than Saxon 6.5. It doesn't
always make a difference, but on some occasions I've seen it produce a
dramatic improvement.
> I think that transforming 150Mb of data in 400Mb
> of RAM would be a sensible target (is this sensible?)
That's ambitious. To achieve that, you're going to have to do something that
condenses the input document before transformation.
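One way to condense the input is a streaming pre-pass that copies only the elements the stylesheet actually uses into a smaller document, so the tree built for transformation shrinks accordingly. The sketch below uses the JDK's StAX API; the element name `item` and the wrapper `condensed` are hypothetical placeholders, and attributes and namespaces are dropped for brevity.

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import javax.xml.stream.XMLStreamWriter;
import java.io.Reader;
import java.io.StringWriter;

public class Condense {
    /** Streams the input and copies only <item> elements (a hypothetical
     *  name) under a new root, in constant memory, so the transformer
     *  later builds a much smaller tree. */
    public static String condense(Reader in) throws Exception {
        XMLStreamReader r =
            XMLInputFactory.newInstance().createXMLStreamReader(in);
        StringWriter out = new StringWriter();
        XMLStreamWriter w =
            XMLOutputFactory.newInstance().createXMLStreamWriter(out);
        w.writeStartElement("condensed");
        int keepDepth = 0; // > 0 while inside a kept <item> subtree
        while (r.hasNext()) {
            int ev = r.next();
            if (ev == XMLStreamConstants.START_ELEMENT) {
                if (keepDepth > 0 || r.getLocalName().equals("item")) {
                    keepDepth++;
                    w.writeStartElement(r.getLocalName());
                }
            } else if (ev == XMLStreamConstants.END_ELEMENT) {
                if (keepDepth > 0) {
                    w.writeEndElement();
                    keepDepth--;
                }
            } else if (ev == XMLStreamConstants.CHARACTERS && keepDepth > 0) {
                w.writeCharacters(r.getText());
            }
        }
        w.writeEndElement();
        w.flush();
        w.close();
        return out.toString();
    }
}
```

The pre-pass itself runs in constant memory because StAX never builds a tree; only the condensed result is then handed to the stylesheet.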
> I'm not sure how to tell what proportion of the memory is
> used for the input DOM, output DOM, etc...
If you really mean "DOM", then start by not using a DOM, and using a Saxon
TinyTree instead. (Saxon doesn't use any memory for the final result tree -
it's normally piped straight into the serializer).
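A minimal sketch of the point about avoiding a DOM: feed the transformer a StreamSource rather than a DOMSource, and let the processor build its own internal tree. With Saxon on the classpath (an assumption here; the code below uses only standard JAXP classes, so it also runs against the JDK's bundled processor), that internal tree is a TinyTree. The inline identity stylesheet is just a placeholder.

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.StringReader;
import java.io.StringWriter;

public class StreamTransform {
    // A throwaway identity stylesheet, just to have something to run.
    static final String XSL =
          "<xsl:stylesheet version='1.0' "
        + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
        + "<xsl:template match='@*|node()'><xsl:copy>"
        + "<xsl:apply-templates select='@*|node()'/>"
        + "</xsl:copy></xsl:template></xsl:stylesheet>";

    /** Transforms via StreamSource, never materializing a DOM.
     *  The processor picks its own (more compact) tree model. */
    public static String transform(String xml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
            .newTransformer(new StreamSource(new StringReader(XSL)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)),
                    new StreamResult(out));
        return out.toString();
    }
}
```

The same applies on the output side: writing to a StreamResult lets the result be serialized directly rather than accumulated as a tree.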
Michael Kay
http://www.saxonica.com/