[xquery-talk] XQuery - good and fast tool
james.fuller.2007 at gmail.com
Tue Mar 10 11:43:19 PST 2009
I would be interested in stuffing this into eXist as a test ... I have
a few optimizations I use with large XML files, which involve splitting
them into a few chunks, but I would need to see the XML to know whether
they apply. It sounds like what you are working on is confidential, so
email me off-list if interested ... but with XML Prague coming up my
time is limited for the next week or so to respond fully.
cheers, Jim Fuller
On Tue, Mar 10, 2009 at 10:15 AM, Michalmas <michalmas at gmail.com> wrote:
> Thanks guys,
> I will check the most promising solutions.
> I will let you know the results of my investigation.
> On Tue, Mar 10, 2009 at 10:13 AM, Michael Kay <mike at saxonica.com> wrote:
>> There are two ways of handling XML that is too large to fit in memory:
>> * with an XML database
>> * with a streaming processor
>> Which you use depends on the overall workload. For example, if you are
>> filtering a data feed and discarding most of the incoming data, then a
>> streaming processor is clearly the right approach.
>> Saxon-SA will execute a subset of XQuery in streaming mode (meaning that
>> you don't need to have the whole source document in memory).
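The streaming approach described above can be sketched in Python with the standard library's `xml.etree.ElementTree.iterparse` (this is not Saxon-SA, just an illustration of the same idea; the element names `record` and `price` are hypothetical placeholders for the real feed's vocabulary):

```python
# Streaming filter: scan a large XML feed without loading the whole
# document into memory, keeping only the data that matches a predicate.
import xml.etree.ElementTree as ET

def filter_records(path, min_price=100.0):
    """Return the ids of <record> elements whose <price> >= min_price."""
    matches = []
    # iterparse yields each element as its end tag is seen, so memory
    # stays bounded as long as we clear elements we no longer need.
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == "record":
            price = float(elem.findtext("price", default="0"))
            if price >= min_price:
                matches.append(elem.attrib.get("id"))
            elem.clear()  # discard the finished subtree to free memory
    return matches
```

Because each record is discarded after it is examined, peak memory use depends on the size of one record, not on the size of the file, which is what makes this viable for multi-gigabyte feeds.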
>> Michael Kay
>> From: talk-bounces at x-query.com [mailto:talk-bounces at x-query.com] On Behalf
>> Of Michalmas
>> Sent: 10 March 2009 07:46
>> To: talk at x-query.com
>> Subject: [xquery-talk] XQuery - good and fast tool
>> Hello guys,
>> I am looking for a good XQuery tool. But there is one key requirement: the
>> XML files may be 5-8 GB or larger.
>> The tools I am using now, like Saxon or Altova XMLSpy, can't really
>> handle files of that size.
>> I have found the database engine MonetDB, for fast XQueries on big data
>> sets. But it seems to be still in the development phase (and a lot of
>> things are missing, like connectors).
>> Do you have any suggestions?