[Salesforce] How to bypass governor limits when inserting more than 10,000 records from the execute method of an Apex Batch

Edit: Adding the code snippet here for review

global class Agg_Batch implements Database.Batchable<sObject>, Database.Stateful, System.Schedulable {

    // Constants referenced below; the values here are assumed, adjust to your org
    private static final String JOB_NAME = 'Agg_Batch';
    private static final String CLASS_NAME = 'Agg_Batch';

    // Stateful collections, carried across batch executions via Database.Stateful
    global List<ParentObject> grpLfcMappedQuarterImports;
    global Set<String> setLFCMappedQyStrings;
    global List<ParentObject> updatedLfcMappedQuarterImports;

    global Agg_Batch() {

        // Initialize the collections and set
        grpLfcMappedQuarterImports = new List<ParentObject>();
        setLFCMappedQyStrings = new Set<String>();
        updatedLfcMappedQuarterImports = new List<ParentObject>(); // used to update the Sync to Report flag on ParentObject

    }

     public static void schedule(){

        String CRON_EXP = '0 0 * * * ?';//every hour
        //cannot be tested in Apex tests, causes AsyncException if class already scheduled.
        if( !Test.isRunningTest() )
            System.schedule(JOB_NAME, CRON_EXP, new Agg_Batch() );

    }


    private Id enqueueBatchProcess(){

        if( BatchUtilityService.getBatchJobRunningCount( CLASS_NAME ) > 0 )
            return null;
        return Database.executeBatch( new Agg_Batch(), 1 );

    }


    global void execute( SchedulableContext sc ){
       enqueueBatchProcess();
    }


    // ParentObject - the parent object; each parent's associated child records can exceed 10,000 rows.
    // ChildObject  - the child records.
    // AggObject    - the aggregate object.
    global Iterable<sObject> start( Database.BatchableContext BC ){

        for( ParentObject mappedImport1 :
                    [SELECT Id, Revision_Status__c, ImportID__c, AggView_Period__c, LFC_Quarter_Id__c,
                     Year__c, Quarter__c, Account_Id__c FROM ParentObject WHERE Sync_To_Report__c = false] )
        {

            // the parent record, once read, needs to be flagged as synced
            updatedLfcMappedQuarterImports.add(new ParentObject(Id = mappedImport1.Id, Sync_To_Report__c = true));

            if (!setLFCMappedQyStrings.contains(mappedImport1.ImportID__c)) {
                setLFCMappedQyStrings.add(mappedImport1.ImportID__c);
                ParentObject imp = new ParentObject(LFC_Quarter_Id__c = mappedImport1.LFC_Quarter_Id__c,
                        Account_Id__c = mappedImport1.Account_Id__c, Year__c = mappedImport1.Year__c, Quarter__c = mappedImport1.Quarter__c);

                grpLfcMappedQuarterImports.add(imp); // group unique ParentObject records by an alternate ID, i.e. ImportID__c
            }
        }

        return grpLfcMappedQuarterImports;

    }

    global void execute(Database.BatchableContext BC, List<sObject> scope) {

        // process only the records in this batch's scope, not the whole stateful list
        for ( ParentObject mappedImport : (List<ParentObject>) scope ) {
            generateAggregateReportData(mappedImport);
        }
    }

    public void generateAggregateReportData( ParentObject mappedImport ) {

        String importID = mappedImport.Account_Id__c + '^' + String.valueOf(mappedImport.Quarter__c) + '^' + (mappedImport.Year__c == null ? '' : String.valueOf(Integer.valueOf(mappedImport.Year__c)));
        String periodString = String.valueOf(mappedImport.Quarter__c) + ' ' + (mappedImport.Year__c == null ? '' : String.valueOf(Integer.valueOf(mappedImport.Year__c)));

        // Generate new AggObject records
        Map<String, Integer> keyToOccurenceCountMap = new Map<String, Integer>();

        Map<String, List<AggregateLineGroup>> aggregateMap = new Map<String, List<AggregateLineGroup>>();
        List<ChildObject> lines = LFCMappedLinesSelector.getMappedLineItemsByMappedKeyImportId( importID ); // this call returns more than 10,000 records
        List<AggObject> aggrInsertList = new List<AggObject>();

        for( ChildObject line : lines ){

            // custom aggregation logic that fills aggrInsertList

        }

        if (!aggrInsertList.isEmpty()) {
            insert aggrInsertList; // this call FAILS [wondering if this can be BATCHED too]
        }

    }

 global void finish(Database.BatchableContext BC) {

        if ( !updatedLfcMappedQuarterImports.isEmpty())
            update updatedLfcMappedQuarterImports;

        System.debug('MappedLineAggregateBatch2 finished Processing');

}
}

Here is the sequence of events I am trying to execute:

In the start method of the Batch, I query the Parent Object (keeping batch size = 1; this is kept to a minimum because we subsequently query a very large set of child records).

We pass each Parent record into the execute method, where we query its child records, aggregate the results, and finally insert them into an Aggregate object (a new object that stores the aggregate results from the child rows).

Now the issue is that in step 2 I am getting more than 10,000 records (we have to fetch all of them, since we aggregate over the entire set), and while trying to insert them I run into the governor limit of 10,000 DML rows per transaction. I am wondering whether the aggregate collection in step 2 can also be batched, or if there is another option to consider. I even tried implementing the Queueable interface, but ran into the governor limit of 100 max jobs.
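For reference, the per-transaction ceiling can be inspected at runtime with the Limits class; a minimal sketch of guarding the insert (assuming aggrInsertList has already been built, as in the snippet above):

```apex
// Sketch: guard the insert against the DML-row governor limit.
// Limits.getLimitDmlRows() is 10,000 in a normal transaction;
// Limits.getDmlRows() is the number of rows already consumed.
Integer remaining = Limits.getLimitDmlRows() - Limits.getDmlRows();
if (aggrInsertList.size() <= remaining) {
    insert aggrInsertList;
} else {
    // More rows than one transaction can insert; the work has to be
    // split across separate transactions (e.g. chained Queueable jobs).
    System.debug('Would exceed DML row limit: ' + aggrInsertList.size());
}
```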

Any advice?

Best Answer

You should do this with a Queueable implementation, but you have to chain the jobs one after another. Each job can process 10k rows, and I believe you can chain up to 50 jobs, please check the documentation. The challenge here is that you need to pass only a subset of the remaining records to each chained job.
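A hedged sketch of that chaining pattern (AggObject is the asker's aggregate object; the chunk size is an assumed safety margin and should be checked against the current limits documentation):

```apex
// Sketch: insert the aggregate records in chained Queueable jobs,
// each transaction staying under the 10,000 DML-row limit.
public class AggInsertQueueable implements Queueable {

    private List<AggObject> remaining;          // records still to insert
    private static final Integer CHUNK = 9000;  // assumed margin under 10k

    public AggInsertQueueable(List<AggObject> records) {
        this.remaining = records;
    }

    public void execute(QueueableContext ctx) {
        // take the next chunk for this transaction
        Integer take = Math.min(CHUNK, remaining.size());
        List<AggObject> chunk = new List<AggObject>();
        for (Integer i = 0; i < take; i++) {
            chunk.add(remaining[i]);
        }
        insert chunk;

        // carry whatever is left into the next chained job
        List<AggObject> rest = new List<AggObject>();
        for (Integer i = take; i < remaining.size(); i++) {
            rest.add(remaining[i]);
        }
        if (!rest.isEmpty() && !Test.isRunningTest()) {
            System.enqueueJob(new AggInsertQueueable(rest));
        }
    }
}
```

From the batch's execute method you would replace the direct insert with `System.enqueueJob(new AggInsertQueueable(aggrInsertList))`. Note that a batch transaction may enqueue only one Queueable job, which is why the chaining has to happen inside the Queueable's own execute method.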
