
  • From: "Steven E. Harris" <steven.harris@t...>
  • To: xml-dev@l...
  • Date: Mon, 20 Nov 2000 13:56:57 -0800

Paul Tchistopolskii <paul@q...> writes:

> When processing a 1 MB document and producing a result document of about
> 2 MB, the amount of RAM required is not 3 MB. It is much bigger. *Much*
> bigger. And when using key() 'for speed' (it seems reasonable to use
> key() only for *large* documents, right?), add still more RAM for
> building the in-memory index.

Agreed. I worked for a while on a project with XML files that grew to
over 300 MB. Anything other than stream-based processing with constant
memory usage was impossible.
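As an aside, here is a minimal sketch of that constant-memory, stream-based approach, in Python rather than Perl (the tag name and document are made up for illustration): `xml.etree.ElementTree.iterparse` hands you each element as its end tag is seen, and clearing each element afterwards keeps RAM flat no matter how large the document is.

```python
# Sketch: stream-based XML processing with roughly constant memory.
# iterparse visits elements as they complete; clearing each one
# discards its subtree instead of accumulating the whole tree.
import xml.etree.ElementTree as ET
from io import BytesIO

def count_records(xml_bytes, tag):
    """Count <tag> elements without holding the whole tree in RAM."""
    count = 0
    for event, elem in ET.iterparse(BytesIO(xml_bytes), events=("end",)):
        if elem.tag == tag:
            count += 1
        elem.clear()  # free the subtree we just finished with
    return count

# Hypothetical document: 1000 <rec> elements under one <root>.
doc = b"<root>" + b"<rec>x</rec>" * 1000 + b"</root>"
print(count_records(doc, "rec"))  # prints 1000
```

The memory profile stays flat whether the document is 1 MB or 300 MB, which is the property the full in-memory XSLT model can't offer.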

Whatever happened to that "stream-processing XSLT profile" thread from
way back when? The closest things to an implementation I've seen are
my own Perl modules and the XML::Twig Perl module.

-- 
Steven E. Harris        :: steven.harris@t...
Tenzing                 :: http://www.tenzing.com
