xelnag

Memory issue with XOG file Write process

Discussion created by xelnag on Jul 30, 2013
Latest reply on Jul 30, 2013 by xelnag
Hello,

Currently I need to generate a file, with a GEL script, containing the output of a really big SQL query that could grow even bigger in the future because of the data it feeds from. In order to avoid out-of-memory issues, what I've done is split that output into several files inside the GEL script.

For example, let's say the complete query returns 10k rows (in reality it's way more).

I would generate

File1 - Rows 1 to 1000
File2 - Rows 1001 to 2000
...
File10 - Rows 9001 to 10000
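
The script that produces these chunks is structured roughly like this (heavily simplified; the query, the row_num column, the XOG object and all names are just placeholders for illustration, not my real code):

<gel:script xmlns:core="jelly:core"
            xmlns:gel="jelly:com.niku.union.gel.GELTagLibrary"
            xmlns:sql="jelly:sql">

  <gel:setDataSource dbId="niku"/>

  <!-- one outer iteration per 1000-row chunk (10 chunks for the 10k-row example) -->
  <core:forEach var="chunk" begin="1" end="10">

    <!-- start a fresh XOG document for this chunk -->
    <gel:parse var="xogDoc">
      <NikuDataBus>
        <Header action="write" objectType="someObject" version="12.0"/>
        <someObjects/>
      </NikuDataBus>
    </gel:parse>

    <!-- fetch only the rows that belong to this chunk -->
    <sql:query var="rows">
      SELECT some_id, some_name
      FROM   some_table
      WHERE  row_num BETWEEN (${chunk} - 1) * 1000 + 1 AND ${chunk} * 1000
      ORDER  BY row_num
    </sql:query>

    <!-- append one element per row to this chunk's document -->
    <core:forEach items="${rows.rowsByIndex}" var="row" trim="true">
      <gel:parse var="newRow">
        <someObject id="${row[0]}" name="${row[1]}"/>
      </gel:parse>
      <gel:set insert="true" select="$xogDoc/NikuDataBus/someObjects" value="${newRow}"/>
    </core:forEach>

    <!-- write this chunk to its own file -->
    <gel:serialize fileName="file_${chunk}.xml" var="${xogDoc}"/>

  </core:forEach>

</gel:script>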

However, when looking at the generated files, what actually happens is this:

File1 - Rows 1 to 1000
File2 - Rows 1 to 2000
File3 - Rows 1 to 3000
...
FileN - Rows 1 to N*1000 -> OOM

What really seems to be happening is an incremental write, where each file also contains all of the data from the previous ones, so the last file holds everything. In the end this leads to an out-of-memory error (<file:line (file:line)> Java heap space). I've tried both core:remove and resetting the variables before each write to "reset" what looks like a caching issue, but it didn't solve anything.
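Concretely, the reset I tried around each write looks roughly like this (variable names are just the placeholder ones from the sketch above):

<!-- write the current chunk, then try to drop the document and result variables
     before the next iteration (this did not stop the accumulation) -->
<gel:serialize fileName="file_${chunk}.xml" var="${xogDoc}"/>
<core:remove var="xogDoc"/>
<core:remove var="rows"/>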

Any ideas on how to do it?
