Test Data Manager

  • 1.  Is there a record size limitation in Javelin Bulkcopy?

    Posted Apr 19, 2018 10:32 AM

    Hi,

    We are trying to perform a bulk copy from Oracle to MS SQL using Javelin. We successfully copied ~14K records in about 6 seconds. However, when attempting a complete bulk copy (~156 million records), it froze for over a day and then crashed.

    Does anyone have any clue why this is happening? I don't want to have to create thousands of subsets to copy.

    FYI, there are only 9 columns in the table.

     

    Thanks.



  • 2.  Re: Is there a record size limitation in Javelin Bulkcopy?
    Best Answer

    Broadcom Employee
    Posted Apr 19, 2018 10:52 AM

    You are most likely running out of memory due to the large data set. Can you set the BatchSize parameter to maybe 20K or so and try? This should enable the bulk copy to run in batches rather than processing the full data set at once.

     

    You could also try setting only the AvoidOutofMemmoryIssue parameter to True first, to see if that helps before setting the BatchSize.
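    To see why batching helps, here is a minimal sketch of the idea behind a BatchSize setting: rows are streamed from the source and flushed to the target in fixed-size chunks, so only one batch is held in memory at a time. This is an illustration only (the function and its names are hypothetical, and the source rows are simulated in memory); it is not Javelin's actual implementation.

    ```python
    def copy_in_batches(source_rows, write_batch, batch_size=20_000):
        """Stream rows from a source and flush them in fixed-size batches,
        so at most batch_size rows are held in memory at once."""
        batch = []
        copied = 0
        for row in source_rows:
            batch.append(row)
            if len(batch) >= batch_size:
                write_batch(batch)   # e.g. a bulk insert into the target table
                copied += len(batch)
                batch = []
        if batch:                    # flush the final partial batch
            write_batch(batch)
            copied += len(batch)
        return copied

    # Simulated run: 156 source rows copied in batches of 50
    # produces 4 write calls (50 + 50 + 50 + 6).
    calls = []
    total = copy_in_batches(range(156), calls.append, batch_size=50)
    ```

    With a real 156-million-row copy, the same pattern keeps memory use bounded by the batch size instead of the full table.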



  • 3.  Re: Is there a record size limitation in Javelin Bulkcopy?

    Posted Apr 19, 2018 11:05 AM

    Thank you Anil! Let me try these.



  • 4.  Re: Is there a record size limitation in Javelin Bulkcopy?

    Posted Apr 19, 2018 03:59 PM

    I don't see the AvoidOutofMemmoryIssue parameter, so I tried BatchSize with 50K, and it works!

    Many thanks, Anil!