Apex – Salesforce Bulkification Across Transactions

Tags: apex, bulkification, governor-limits, transaction, visual-workflow

Here is a fun question I have been puzzling over.

I have been learning a lot lately about Salesforce bulkification and governor limits, specifically as they relate to bulk uploads of data into Salesforce, and one thing has me confused. Salesforce performs bulkification when uploading large amounts of data. My understanding is that it splits the large data set into batches of 200 records each and applies governor limits to each of those transactions separately (rather than treating the entire upload as a single transaction, since a large data set could easily blow through the governor limits).
The other thing is that, within a given transaction, Salesforce has bulkification rules that prevent excessive operations. For example, in a flow, the 200 records flow through and stop at a bulkifiable element (such as a DML element) until all of the records have arrived; the operation is then performed once for all the records in that transaction to conserve governor limits. Sounds awesome. But this brings me to my question.
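(For anyone less familiar with the concept, the same principle is what "bulkified" Apex looks like: collect the records first, then issue one DML statement for the whole set instead of one per record. This is just an illustrative sketch, not code from my org:)

```apex
// Illustrative only: bulkified DML in Apex.
// Collect the modified records in a list...
List<Account> toUpdate = new List<Account>();
for (Account acc : [SELECT Id, Rating FROM Account WHERE Rating = null]) {
    acc.Rating = 'Warm';
    toUpdate.add(acc); // no DML inside the loop
}
// ...then perform a single DML statement for all of them,
// consuming one DML statement against the governor limit.
update toUpdate;
```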

I have a process that looks like this:

  • a custom button on a list view (related list) sends the parent record ID to an invoked flow.
  • I then pass the parent record ID to a custom apex action.
  • Within the apex action, I do a soql query to get all the child records, then I loop over the child records.
    • For every 10 records, I use the Flow.Interview class to create a new flow interview for the next flow, which performs extensive logic.
    • The very first thing I do in that next flow is call a pause element.
    • So for example, if I had 100 child records, the next flow would have 10 interviews created, then immediately paused.
    • This in essence gives me 10 brand new transactions (I am doing this to reset governor limits, as the logic I am processing in the next flow is rather extensive and fails if it runs on many more than 10 records).
      • I have confirmed that I now have 10 independent transactions by viewing the Time-Based Workflow queue, and can see that the next flow has 10 pending jobs.

But here is the tricky part. The first thing I do when the paused flow is resumed is:

  • call an Update Records element (which triggers a record-triggered flow that does some logic and record manipulation).
  • This is a bulkifiable event in Salesforce, and I am trying to figure out how the batches are working.
  • I would EXPECT that, since I have 10 independent transactions calling this Update Records element, the update would be called separately for each transaction (and that the resulting record-triggered flow would be invoked 10 separate times).
  • However, is Salesforce smart enough to recognize that I am only updating 10 records at a time, and somehow combine these update calls (within its 200-record batch size limit)?
  • The reason I am confused is that, if the updates were being triggered independently, I would EXPECT 10 separate debug logs (one for each transaction). However, I have just one debug log recording governor limit use.
  • As a side note, the behavior in the log is odd. The SOQL governor limit usage accumulates almost to the point of failure (which is why I want the behavior split), but the CPU time does NOT accumulate.
  • Although I THINK that 10 separate transactions should be running, the logs seem to indicate that maybe that is not happening?

In summary, does Salesforce bulkify Update Records flow elements across transactions?

  • And do I have the correct understanding that a flow Pause element creates separate, independent transactions when flow interviews are started from an Apex loop (one for every 10 records)?

I would appreciate it if someone could clarify some of these details. I have also struggled to find any Salesforce documentation that answers these questions, so pointers to the relevant docs would be a plus. Thanks so much! Here is the initial Apex action where I loop over the records and create a flow interview for every 10 records:

public class TestInterviewSize {
    public class FlowInputWrapper {
        @InvocableVariable(required=false)
        public String listId;
    }
    
    @InvocableMethod(label='Get Accounts by List ID')
    public static void getAccountsByListId(List<FlowInputWrapper> inputList) {
        System.debug('Invoking method');
        System.debug(inputList);
        FlowInputWrapper input = inputList[0];
        String ListId = input.listId;
        Integer iterater = 0;
        List<List_Entry__c> batchedRecords = new List<List_Entry__c>();
        List<ID> batchedRecordIds = new List<ID>();
        Map<String,Object> inputVariables = new Map<String,Object>();
        
        String flowName = 'Create_Accounts_for_List_Entries_v_2';
        
        
        List<List_Entry__c> listEntries = [SELECT Individual_Email__c, Organization_Website__c, Multiple_Org_Matches_Found__c, Multiple_Individual_Matches_Found__c, Create_Individual__c, Create_Organization__c FROM List_Entry__c WHERE List__c = :input.listId AND (Create_Individual__c = true OR Create_Organization__c = true)];
        
        Integer numberOfListEntries = listEntries.size();
        System.debug(numberOfListEntries);
        
        for(List_Entry__c entry : listEntries) {
            batchedRecords.add(entry);
            batchedRecordIds.add(entry.Id);
            iterater += 1;
            if(math.mod(iterater, 10) == 0 || iterater == numberOfListEntries) {
                System.debug('Testing!!!');
                System.debug(iterater);
                System.debug(batchedRecords.size());
                inputVariables.put('ListId', input.listId);
                inputVariables.put('ListofListEntries', batchedRecords);
                inputVariables.put('ListEntryIds', batchedRecordIds);
                Flow.Interview myFlow = Flow.Interview.createInterview(flowName, inputVariables);
                myFlow.start();
                batchedRecords.clear();
                batchedRecordIds.clear();
                inputVariables.clear();
            }
        }
    }

}

Edit: I am attaching the debug log for reference. Notice that the governor limits are stated quite a few times (especially toward the end of the log). Is it possible that Salesforce recognizes that several transactions take place around the same time, and so just generates a single debug log? It is odd, though: the SOQL limit usage continues to accumulate throughout the entire log, behaving like a single transaction, while the CPU time limit does NOT behave this way and seems to reset often. I am not sure if this means I have a single debug log housing multiple transactions (which doesn't make sense with the SOQL limits accumulating), or if Salesforce did indeed get smart enough to batch my 10 transactions together (which doesn't make sense with the CPU limits NOT accumulating). https://drive.google.com/file/d/1yb_LXS2EU-m__AadT5Uyq0gYksc8YUPl/view?usp=sharing

Best Answer

One debug log is one transaction. Full stop. What happened here is that you placed a bunch of interviews into a paused state, so they went to the asynchronous queue. When Salesforce had time, it selected up to 1,000 paused jobs that were due for execution, batched them by user/owner, then ran those in groups of 200.

You can see this in the various parts of the log:

FLOW_START_INTERVIEWS_BEGIN|200
FLOW_START_INTERVIEWS_BEGIN|200
FLOW_START_INTERVIEWS_BEGIN|173
FLOW_START_INTERVIEWS_BEGIN|12
FLOW_START_INTERVIEWS_BEGIN|12
FLOW_START_INTERVIEWS_BEGIN|9
FLOW_START_INTERVIEWS_BEGIN|2
FLOW_START_INTERVIEWS_BEGIN|2

As you can see, 610 interviews in total were executed across 8 batches. It appears that some of the limits reset between each batch, but others do not.

Consider using Queueable Apex (or another asynchronous mechanism) if you want to guarantee that your interviews actually run as separate transactions.
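A minimal sketch of that approach, adapted to the code in the question (the class name and the assumption that the flow only needs the record IDs are mine, not the asker's):

```apex
// Sketch: enqueue each batch of 10 record IDs as its own Queueable job.
// Each job runs in its own transaction with fresh governor limits.
public class StartInterviewBatch implements Queueable {
    private List<Id> entryIds; // the List_Entry__c IDs for this batch

    public StartInterviewBatch(List<Id> entryIds) {
        this.entryIds = entryIds;
    }

    public void execute(QueueableContext context) {
        Map<String, Object> inputVariables = new Map<String, Object>{
            'ListEntryIds' => entryIds
        };
        Flow.Interview myFlow = Flow.Interview.createInterview(
            'Create_Accounts_for_List_Entries_v_2', inputVariables);
        myFlow.start(); // runs in this job's own transaction
    }
}
```

In the invocable method, you would replace `createInterview`/`start` with `System.enqueueJob(new StartInterviewBatch(batchedRecordIds));` for each batch. Keep in mind the limit of 50 `System.enqueueJob` calls per transaction, so this caps you at 50 batches per invocation.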