Prevent CPU Time Exceeded Governor limit

apex, bulk-api, cpulimit, governorlimits, visual-workflow

I have a very small flow. It simply calls an apex action, then an update records element. This is set to run on create or update on the account object. Here's a picture:

(Screenshot: the flow, an Apex Action element followed by an Update Records element.)

Although this flow is pretty simple, I am facing a problem: I consistently get a CPU Time Exceeded error when bulk uploading new accounts. I understand the bulkification process pretty well; Salesforce takes all my bulk records and splits them into chunks of 200 records for processing (that is, the governor limits apply per chunk rather than to the entire upload). So I am a bit puzzled as to why I am hitting the CPU limit with such a small flow. I suspect the issue lies within the Apex action I am calling, but I'm still puzzled, because the Apex is not grossly inefficient and I have written my invocable method to work with bulkification. Here's the code:

public class CompareOrganizationNames {
    
    public class ComparisonParams {
        @InvocableVariable
        public String orgName;
        
        // Default constructor
        public ComparisonParams() {
        }

        // Constructor with orgName parameter
        public ComparisonParams(String orgName) {
            this.orgName = orgName;
        }
    }
    
    @InvocableMethod(label='Compare Organization Names' description='Compares two organization names, regardless of casing, punctuation, and organization suffix (ex. WAL-MART, INC. and walmart would result in a match).')
    public static List<String> compareOrgNames(List<ComparisonParams> requests) {
        System.debug('Starting!');
        List<String> results = new List<String>();
        
        // Regex pattern for punctuation, whitespace, and common trailing company suffixes
        String punctuationRegex = '[\\p{Punct}\\s]+|(?i)(llc|corp|inc|co|edu)$';

        for (ComparisonParams request : requests) {
            String org = cleanOrgName(request.orgName, punctuationRegex);
            results.add(org);
        }
        System.debug('Returning!');
        return results;
    }
    
    private static String cleanOrgName(String organizationName, String punctuationRegex) {
        return organizationName.trim().toLowerCase().replaceAll(punctuationRegex, '');
    }
}

Just as a brief summary of this code: I am expecting 1 to 200 account records (200 in the case of a bulk upload), each with an org name. I want to 'clean' the org name so that I have something to compare against in other automations (which are irrelevant to this problem). The idea is to set the cleaned org name to all lower case, remove any spaces and punctuation, and remove any common trailing company suffix (ex. llc, edu, etc.). Also of note: this flow runs completely fine if I create or update a single record, and it also completes with a limited batch size (50 to 100 records finish with no problem). It only fails when I upload a large number of records and the flow runs on a transaction chunk of 200 records.
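For reference, here is roughly how the invocable gets exercised in Anonymous Apex, the same way the flow hands it a list of requests (the sample org names below are made up):

// Anonymous Apex: build a small request list and call the invocable directly,
// mirroring how the flow passes all interviews to the action in one bulk call.
List<CompareOrganizationNames.ComparisonParams> requests =
    new List<CompareOrganizationNames.ComparisonParams>();
requests.add(new CompareOrganizationNames.ComparisonParams('WAL-MART, INC.'));
requests.add(new CompareOrganizationNames.ComparisonParams('Acme Co'));

List<String> cleaned = CompareOrganizationNames.compareOrgNames(requests);
// Per the description above, 'WAL-MART, INC.' is expected to clean down to 'walmart'.
System.debug(cleaned);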

After some testing, it seems that this Apex gets through about 150 records pretty consistently before erroring out (so pretty close to finishing). However, this is puzzling to me, because the code doesn't seem overly inefficient. I know it is bad to nest loops, and that if I retrieved all the accounts inside the Apex itself it would process far more records because it wouldn't benefit from bulkification, but since I only have a single loop and the invocable method is guaranteed to receive at most 200 records, I am confused as to why it is failing, or even getting anywhere close to, the CPU governor limit of 10 seconds.
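For what it's worth, one way to narrow this down would be to log Limits.getCpuTime() before and after the loop, to see how much of the 10,000 ms budget this action itself consumes (a sketch below, just the existing loop with debug lines added):

// Inside compareOrgNames: bracket the loop with CPU-time debugs so the debug log
// shows how many of the 10,000 ms are spent in this action versus elsewhere.
System.debug('CPU used before loop (ms): ' + Limits.getCpuTime());

for (ComparisonParams request : requests) {
    String org = cleanOrgName(request.orgName, punctuationRegex);
    results.add(org);
}

System.debug('CPU used after loop (ms): ' + Limits.getCpuTime());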

Is there any way to limit the number of records that enter the flow (ideally only 100 records at a time instead of 200; for context, this flow will eventually be invoked as a subflow rather than from the data upload tool)? Or do I possibly need to add some sort of bulkification handling to the Apex itself?

Any thoughts on how to make my code more efficient would be very much appreciated. Thanks!

Best Answer

Your problem isn't the code, it's the flow. An After Save Flow causes all triggers to run a second time when it updates the current records, which nearly doubles the CPU time you're using. It sounds like your trigger logic takes at least 37 ms per record, so when you double that, it's closer to 75 ms. I'd write a unit test to verify. At any rate, changing the flow to a Before Save Flow will reduce the CPU usage by about 50%, and you'll be able to bulk update with impunity.
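A minimal sketch of such a test (the class name is illustrative, and it assumes the record-triggered flow is active in the org where the test runs): insert 200 accounts, the same chunk size that fails, and compare Limits.getCpuTime() before and after.

@isTest
private class AccountFlowCpuTest {
    @isTest
    static void measuresCpuForBulkInsert() {
        // 200 accounts = one full trigger/flow chunk, matching the failing scenario.
        List<Account> accounts = new List<Account>();
        for (Integer i = 0; i < 200; i++) {
            accounts.add(new Account(Name = 'Test Org ' + i + ', Inc.'));
        }

        Test.startTest();
        Integer cpuBefore = Limits.getCpuTime();
        insert accounts; // fires the record-triggered flow and any triggers
        Integer cpuAfter = Limits.getCpuTime();
        Test.stopTest();

        System.debug('CPU consumed by the insert (ms): ' + (cpuAfter - cpuBefore));
        // The synchronous limit is 10,000 ms; the insert should leave some headroom.
        System.assert(cpuAfter - cpuBefore < 10000, 'Bulk insert consumed the full CPU budget');
    }
}

Running it once with the current After Save Flow and again after converting to a Before Save Flow should show the difference directly in the debug output.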
