Hello,
XML and XSLT struggle with large files because of the way the files are processed, not the way they are created. The first thing an XSLT engine does is load the entire document into an in-memory tree, and that tree typically costs about 2-5 bytes of memory for every byte of the file. A 100 MB XML file can therefore occupy roughly 200-500 MB of memory, depending on the engine used.
I have done a lot of research trying to get around this, and along the way I found a lot of people discussing the same issue. XSLT simply has serious problems with large files, as do most tree-based XML parsers. For example, try opening a 100 MB XML file in Internet Explorer: it might open after a couple of hours, but the odds are it will just crash with memory errors.
In cases where XSLT simply cannot do the job, a non-parsing technique is needed. In other words, the file has to be split as a plain text file rather than parsed as XML.
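To make the text-splitting idea concrete, here is a minimal Python sketch. It streams the file line by line (so memory use stays flat regardless of file size) and groups complete records into smaller, well-formed output files. The record tag name (`record`), the root tag name, the records-per-chunk count, and the assumption that tags are not split mid-name across lines are all illustrative choices, not part of any standard recipe:

```python
# Split a large XML file into smaller well-formed pieces by treating it
# as plain text, never building a DOM. Tag names and chunk size are
# assumptions for illustration; adjust them to the actual document.

def split_xml(path, record_tag="record", records_per_chunk=1000,
              root_tag="root", out_prefix="chunk"):
    """Stream the file line by line, collecting complete
    <record>...</record> elements into numbered output files,
    each wrapped in the original root tag so it stays valid XML."""
    open_marker = "<" + record_tag      # naive match; would also hit e.g. <records>
    close_marker = "</" + record_tag + ">"
    chunk_idx, count, buf = 0, 0, []
    record_lines, in_record = [], False
    outputs = []
    with open(path, "r", encoding="utf-8") as src:
        for line in src:
            if open_marker in line:
                in_record = True
            if in_record:
                record_lines.append(line)
            if close_marker in line:
                # Record is complete; buffer it and flush a chunk when full.
                in_record = False
                buf.append("".join(record_lines))
                record_lines = []
                count += 1
                if count == records_per_chunk:
                    outputs.append(_write_chunk(buf, chunk_idx, root_tag, out_prefix))
                    chunk_idx += 1
                    count, buf = 0, []
    if buf:  # flush any leftover records
        outputs.append(_write_chunk(buf, chunk_idx, root_tag, out_prefix))
    return outputs

def _write_chunk(records, idx, root_tag, out_prefix):
    name = f"{out_prefix}_{idx}.xml"
    with open(name, "w", encoding="utf-8") as dst:
        dst.write(f"<{root_tag}>\n")
        dst.writelines(records)
        dst.write(f"</{root_tag}>\n")
    return name
```

Each resulting chunk is small enough to feed through an XSLT engine normally, which is the whole point: the splitting step itself never pays the DOM memory cost.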