[Salesforce] Bulk API Query – Splitting a large query into batches

I'm currently using the Bulk API to return all records for a given sObject.

I can make it work via one very large batch, but I think it would be more efficient if I could split it into multiple batches and process them while more are being retrieved.

Is this possible? If so, how?

I know there are various ways to pre-process the records, or to split on a field such as CreatedDate, but that wouldn't be a consistent enough solution to be worth implementing.

Thanks.

Best Answer

Hi Tom, your requirement divides into two parts:

1) Retrieving the data from Bulk API queries; I assume that part poses no problem for you.

The results are received as CSV files, with a maximum size of 1 GB per file.
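To make the retrieval step concrete, here is a minimal Python sketch, assuming the XML flavour of the Bulk API with CSV content: create a query job, submit the SOQL as a batch, poll until the batch completes, then download each result file. The API version, endpoint layout, and the idea that the session id and instance URL come from a prior login are all assumptions, not code from this thread.

```python
# Hedged sketch: retrieving Bulk API query results as CSV files.
# All endpoint paths follow the standard /services/async/<version>/... layout.
import time
import urllib.request
import xml.etree.ElementTree as ET

API_VERSION = "39.0"  # assumption: any reasonably recent API version works
NS = "{http://www.force.com/2009/06/asyncapi/dataload}"

def job_request_body(operation: str, sobject: str) -> str:
    """Build the XML body that creates a Bulk API job."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">'
        f"<operation>{operation}</operation>"
        f"<object>{sobject}</object>"
        "<contentType>CSV</contentType>"
        "</jobInfo>"
    )

def _call(url: str, session_id: str, data: bytes = None,
          content_type: str = None) -> bytes:
    """POST (when data is given) or GET the URL with the session header."""
    req = urllib.request.Request(url, data=data)
    req.add_header("X-SFDC-Session", session_id)
    if content_type:
        req.add_header("Content-Type", content_type)
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def fetch_query_results(instance_url: str, session_id: str,
                        soql: str, sobject: str):
    """Yield each result file's CSV text as soon as it can be downloaded."""
    base = f"{instance_url}/services/async/{API_VERSION}"
    # 1) create the query job
    job = ET.fromstring(_call(f"{base}/job", session_id,
                              job_request_body("query", sobject).encode(),
                              "application/xml; charset=UTF-8"))
    job_id = job.find(f"{NS}id").text
    # 2) add one batch containing the SOQL query
    batch = ET.fromstring(_call(f"{base}/job/{job_id}/batch", session_id,
                                soql.encode(), "text/csv; charset=UTF-8"))
    batch_id = batch.find(f"{NS}id").text
    # 3) poll the batch until it reaches a terminal state
    while True:
        info = ET.fromstring(_call(f"{base}/job/{job_id}/batch/{batch_id}",
                                   session_id))
        if info.find(f"{NS}state").text in ("Completed", "Failed",
                                            "Not Processed"):
            break
        time.sleep(10)
    # 4) list the result files, downloading and yielding each in turn
    results = ET.fromstring(_call(
        f"{base}/job/{job_id}/batch/{batch_id}/result", session_id))
    for res in results.findall(f"{NS}result"):
        yield _call(f"{base}/job/{job_id}/batch/{batch_id}/result/{res.text}",
                    session_id).decode("utf-8")
```

Because `fetch_query_results` is a generator, the caller can start processing the first CSV while later result files are still being fetched.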

2) As soon as the first CSV file is available (with two or three more still queued), parse it and load its records into Salesforce in batches.

The second part can be handled with the function described here: http://boards.developerforce.com/t5/Apex-Code-Development/Batch-Insert-of-Records-in-CSV-file/td-p/131413

That function lets the parsing run in parallel with the retrieval, and the combination of the two completes your task.
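The linked function is Apex, but the batching idea can be sketched outside the platform too. The following is an illustrative Python snippet (the batch size and field names are placeholders, not from this thread) that splits one result CSV into fixed-size batches, so each batch can be processed while further result files are still downloading.

```python
# Hedged sketch: parse a CSV result file into fixed-size record batches.
import csv
import io
from itertools import islice

def csv_batches(csv_text: str, batch_size: int = 200):
    """Yield lists of row dicts, each holding at most batch_size rows."""
    reader = csv.DictReader(io.StringIO(csv_text))
    while True:
        batch = list(islice(reader, batch_size))
        if not batch:
            return
        yield batch

# Usage with a tiny illustrative CSV:
data = "Id,Name\n1,Acme\n2,Globex\n3,Initech\n"
print([len(b) for b in csv_batches(data, 2)])  # → [2, 1]
```

Streaming the rows through `islice` avoids holding a whole 1 GB file's worth of parsed records in memory at once.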