[Salesforce] First error: Too many query rows: 50001 in a Batch Apex class without any inner query

I am getting a strange "First error: Too many query rows: 50001" when my batch runs. I know this occurs when you execute queries inside the execute method, since those query rows count against the 50,000-row-per-transaction limit. But I am getting the error in the start method, which should normally be able to query millions of records. So why this message?
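For context, here is a simplified sketch of what such a start method typically looks like when it returns an aggregate query (the field aliases here are assumed to match the ones used in my execute method below):

```apex
// Hypothetical sketch: an aggregate query returned as an Iterable.
// Because this is NOT a Database.QueryLocator, the entire result set
// counts against the 50,000-row query limit, even in a batch start method.
global Iterable<SObject> start(Database.BatchableContext BC) {
    return [SELECT CampaignId camId, Status cmStatut, COUNT(Id) countMembers
            FROM CampaignMember
            GROUP BY CampaignId, Status];
}
```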

Here is the code for the execute method:

    global void execute(Database.BatchableContext BC, List<SObject> scope) {
        System.debug('## execute method starts');

        Map<String, Set<String>> cskeyStatutMap = new Map<String, Set<String>>();
        Map<String, Map<String, String>> campaignIdStatutCountMap = new Map<String, Map<String, String>>();
        Map<String, String> statutCountMap = new Map<String, String>();
        Set<String> statutCamMemSet = new Set<String>();
        List<Campaign> campaignList = new List<Campaign>();

        // From the aggregate results, build a map of campaign Id to count per status
        for (SObject so : scope) {
            AggregateResult ar = (AggregateResult) so;

            String camId = String.valueOf(ar.get('camId'));
            String statut = String.valueOf(ar.get('cmStatut'));
            String countMembers = String.valueOf(ar.get('countMembers'));

            if (campaignIdStatutCountMap.get(camId) == null) {
                campaignIdStatutCountMap.put(camId, new Map<String, String>{ statut => countMembers });
            } else {
                campaignIdStatutCountMap.get(camId).put(statut, countMembers);
            }
        }
        System.debug('## campaignIdStatutCountMap size: ' + campaignIdStatutCountMap.size());

        // Loop over the custom setting, building a map of field API name -> statuses
        for (CS88GestionStatutCampagneMember__c cs : CS88GestionStatutCampagneMember__c.getAll().values()) {
            for (String str : cs.Statut__c.split(';')) {
                if (cskeyStatutMap.containsKey(cs.Name)) {
                    cskeyStatutMap.get(cs.Name).add(str);
                } else {
                    cskeyStatutMap.put(cs.Name, new Set<String>{ str });
                }
            }
        }

        // Update each campaign
        for (String campaignId : campaignIdStatutCountMap.keySet()) {

            SObject c = new Campaign();
            c.put('Id', campaignId);
            // Update the batch execution timestamp only on the last step
            if (finalStep) c.put('TECH_DerniereDateMAJStats__c', System.now());

            statutCountMap = campaignIdStatutCountMap.get(campaignId);

            for (String keyStatus : statutCountMap.keySet()) {
                for (String csField : cskeyStatutMap.keySet()) {
                    statutCamMemSet = cskeyStatutMap.get(csField);
                    // Retrieve the API name of the field to be updated
                    if (statutCamMemSet.contains(keyStatus) && statutCountMap.get(keyStatus) != null) {
                        c.put(csField, Decimal.valueOf(statutCountMap.get(keyStatus)));
                    }
                }
            }
            campaignList.add((Campaign) c);
        }

        List<Database.SaveResult> updateResults = Database.update(campaignList, false);

        for (Integer i = 0; i < updateResults.size(); i++) {
            if (updateResults.get(i).isSuccess()) {
                successs += 1;
            } else {
                // DML operation failed
                Database.Error err = updateResults.get(i).getErrors().get(0);
                String s = 'The record with the following Id failed: ' + updateResults.get(i).getId() + '. The error is due to: ' + err.getMessage();
                errorGenerated.add(s);
            }
        }

        System.debug('## AP117_BatchStatsCampagne >> Execute ends');
    }

Best Answer

AggregateResult does not support a QueryLocator, and the start method only supports more than 50,000 rows if you return a QueryLocator. The net result is that you cannot batch-process aggregate results over more than 50,000 rows. Instead, use Database.Stateful and iterate over the individual records to sum them up yourself, then do something with the totals in the finish method (probably chain into a second batch process).
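A minimal sketch of that approach, assuming the same CampaignMember object as in the question (class and variable names here are illustrative, not from the original code):

```apex
// Hypothetical sketch: a stateful batch that iterates raw CampaignMember
// rows through a QueryLocator (no 50,000-row cap) and aggregates manually.
global class CampaignStatsBatch implements Database.Batchable<SObject>, Database.Stateful {

    // Survives across execute() chunks because of Database.Stateful
    global Map<Id, Map<String, Integer>> countsByCampaign = new Map<Id, Map<String, Integer>>();

    global Database.QueryLocator start(Database.BatchableContext BC) {
        return Database.getQueryLocator(
            [SELECT CampaignId, Status FROM CampaignMember]);
    }

    global void execute(Database.BatchableContext BC, List<CampaignMember> scope) {
        for (CampaignMember cm : scope) {
            if (!countsByCampaign.containsKey(cm.CampaignId)) {
                countsByCampaign.put(cm.CampaignId, new Map<String, Integer>());
            }
            Map<String, Integer> statusCounts = countsByCampaign.get(cm.CampaignId);
            Integer current = statusCounts.get(cm.Status);
            statusCounts.put(cm.Status, current == null ? 1 : current + 1);
        }
    }

    global void finish(Database.BatchableContext BC) {
        // Use the accumulated totals here: update the campaigns directly
        // if they fit in one transaction, or chain into a second batch
        // that writes the statistics.
    }
}
```

Note that the state held by a Database.Stateful batch is serialized between chunks, so for very large orgs you may want to keep the accumulated map as compact as possible.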