[Salesforce] “Unable To Obtain Exclusive Access To This Record” Error Occurring Sporadically When Unit Test Is Run

I have a unit test WE_MasterOpportunityTriggerTest for my trigger WE_MasterOpportunityTrigger.

When this test is run (for validation before/during deployment, or several times in the course of a couple of minutes), it hits the following error about 50% of the time:

System.DmlException: Insert failed. First exception on row 0; first error: CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY, WE_MasterOpportunityTrigger: execution of AfterInsert caused by: System.DmlException: Upsert failed. First exception on row 0; first error: UNABLE_TO_LOCK_ROW, unable to obtain exclusive access to this record: [] Class.WE_OppUpdates.recursiveInsert: line 20, column 1 Trigger.WE_MasterOpportunityTrigger: line 20, column 1: []
Stack Trace: Class.WE_MasterOpportunityTriggerTest.testTrigger: line 21, column 1

I'm assuming that, as long as nothing changes in the org's configuration, the unit test's outcome should essentially be the same each time it runs.

I've seen the documentation on data skews, and each of the 200 Opportunities will have 12 child records linked to it by another piece of code. Could the execution of update opps; be happening too soon after the records are inserted (while the sharing rules are still being recalculated), and if so, what's the best way to manage this in the unit test?

If that's not likely to be the cause of the issue, can anyone explain why a test might fail with this error only some of the times it is run?

trigger WE_MasterOpportunityTrigger on Opportunity (
    before insert, after insert, 
    before update, after update, 
    before delete, after delete)
{

    if (Trigger.isBefore) {
        if (Trigger.isInsert) { }
        if (Trigger.isUpdate) {
            WE_OppUpdates.updateOpportunities(Trigger.old,Trigger.new);
        }
        if (Trigger.isDelete) { }
    }

    if (Trigger.isAfter) {
        if (Trigger.isInsert) {
            WE_OppUpdates.recursiveInsert(Trigger.new);

            WE_MRFv3 forecast = new WE_MRFv3(true,Trigger.newMap);
            forecast.generateMRF();

            WE_TargetManager target = new WE_TargetManager(Trigger.newMap);
            target.generateNewTargets();
        } 
        if (Trigger.isUpdate) {
            WE_MRFv3 forecast = new WE_MRFv3(Trigger.oldMap,Trigger.newMap);
            forecast.generateMRF();

            WE_TargetManager target = new WE_TargetManager(Trigger.oldMap,Trigger.newMap);
            target.updateTargets();
        }
        if (Trigger.isDelete) { }
    }
}

@isTest
public class WE_MasterOpportunityTriggerTest {

    static testMethod void testTrigger() {

        User u = WE_TestDataUtility.createStandardUser();

        System.runas(u) {

            List<Opportunity> opps = WE_TestDataUtility.createOpportunities(
                'EU Opps',              // oppRef
                200,                    // noOpps
                'CP Virtual Prepaid MC',// recordTypeName
                'id2',                  // identifier
                '1) Suspect',           // stageName
                date.today()+1,         // closeDate
                'Standard',             // rampProfile
                100.00);                // settlement
            insert opps;

            for(Opportunity o : opps) {
                o.CloseDate = date.today()+2;
            }
            update opps;
        }
    }

}

public class WE_OppUpdates {

    public static void recursiveInsert(Opportunity[] records) {

        Opportunity[] oppsToClone = new Opportunity[]{};

        if(WE_ApexUtility.apxCntrlrOppMrfStatusAssessed == false) {
            WE_ApexUtility.checkApexControllerStatus('Opportunity MRF');
        }
        if (!WE_ApexUtility.forecastDisabled.contains(true)) {
            Id userId = userinfo.getUserId();
            WE_VRC__c setting = WE_VRC__c.getInstance(userId);
            setting.All_Opportunity_Disabled__c = True;
            upsert setting;

            WE_ApexUtility apxUtil = new WE_ApexUtility();
            apxUtil.retrieveValidRecordTypes('EU NA Opps');

            for(Opportunity o : records) {
                if (WE_ApexUtility.validEuNaRecordTypeIds.contains(o.RecordTypeId) ) {
                    oppsToClone.add(o);
                }
            }
            if (oppsToClone.size() > 0) {
                WE_ApexUtility.recursiveOppUpdate(oppsToClone,true);
            }
            setting.All_Opportunity_Disabled__c = False;
            upsert setting;
        }
    }

    public static void updateOpportunities(Opportunity[] oldRecords, Opportunity[] newRecords) {
        if(WE_ApexUtility.apxCntrlrOppMrfStatusAssessed == false) {
            WE_ApexUtility.checkApexControllerStatus('Opportunity MRF');
        }
        if (!WE_ApexUtility.forecastDisabled.contains(true)) {
            WE_ApexUtility apxUtil = new WE_ApexUtility();
            apxUtil.retrieveValidRecordTypes('EU NA Opps');

            for(Integer index = 0, size = newRecords.size(); index < size; index++) {
                Opportunity oldRecord = oldRecords[index], newRecord = newRecords[index];

                if (WE_ApexUtility.validEuNaRecordTypeIds.contains(newRecord.RecordTypeId) ) {
                    Date cd = newRecord.CloseDate;

                    if (WE_ApexUtility.recursiveOppUpdate ||
                        oldRecord.CloseDate != newRecord.CloseDate || 
                        oldRecord.Ramp_Profile__c != newRecord.Ramp_Profile__c) 
                    {
                        if (newRecord.Ramp_Profile__c == 'Immediate') {
                            newRecord.Implementation_Revenue__c = cd.addDays(15);

                            if (oldRecord.Implementation_Revenue__c == null ||
                                oldRecord.Implementation_Revenue__c.month() != newRecord.Implementation_Revenue__c.month() ||
                                oldRecord.Implementation_Revenue__c.year() != newRecord.Implementation_Revenue__c.year()) {
                                newRecord.Revenue_Commencement__c = newRecord.Implementation_Revenue__c.addMonths(1).toStartOfMonth();
                                newRecord.Forecast_Full_Service_Revenue_Date__c = newRecord.Revenue_Commencement__c.addMonths(3);
                                newRecord.Implementation_Revenue_as_a_of_FSR__c = 90/2;
                            }
                        } else if (newRecord.Ramp_Profile__c == 'Standard') {
                            newRecord.Implementation_Revenue__c = cd.addMonths(3);
                            newRecord.Revenue_Commencement__c = newRecord.Implementation_Revenue__c.addMonths(1).toStartOfMonth();
                            newRecord.Forecast_Full_Service_Revenue_Date__c = newRecord.Revenue_Commencement__c.addMonths(3);
                            newRecord.Implementation_Revenue_as_a_of_FSR__c = 90/2;//1st MRF month set to 10%
                        } else if (newRecord.Ramp_Profile__c == 'Medium') {
                            newRecord.Implementation_Revenue__c = cd.addMonths(3);
                            newRecord.Revenue_Commencement__c = newRecord.Implementation_Revenue__c.addMonths(1).toStartOfMonth();
                            newRecord.Forecast_Full_Service_Revenue_Date__c = newRecord.Revenue_Commencement__c.addMonths(6);
                            newRecord.Implementation_Revenue_as_a_of_FSR__c = 90/5;
                        } else {
                            newRecord.Implementation_Revenue__c = cd.addMonths(3);
                            newRecord.Revenue_Commencement__c = newRecord.Implementation_Revenue__c.addMonths(1).toStartOfMonth();
                            newRecord.Forecast_Full_Service_Revenue_Date__c = newRecord.Revenue_Commencement__c.addMonths(12);
                            newRecord.Implementation_Revenue_as_a_of_FSR__c = 90/11;
                        }
                    } else if (oldRecord.Implementation_Revenue__c.month() != newRecord.Implementation_Revenue__c.month() || 
                               oldRecord.Implementation_Revenue__c.year() != newRecord.Implementation_Revenue__c.year()) {
                        newRecord.Revenue_Commencement__c = newRecord.Implementation_Revenue__c.addMonths(1).toStartOfMonth();

                        if (newRecord.Ramp_Profile__c == 'Immediate' || newRecord.Ramp_Profile__c == 'Standard') {
                            newRecord.Forecast_Full_Service_Revenue_Date__c = newRecord.Revenue_Commencement__c.addMonths(3);
                        } else if (newRecord.Ramp_Profile__c == 'Medium') {
                            newRecord.Forecast_Full_Service_Revenue_Date__c = newRecord.Revenue_Commencement__c.addMonths(6);
                        } else {
                            newRecord.Forecast_Full_Service_Revenue_Date__c = newRecord.Revenue_Commencement__c.addMonths(12);
                        }
                    }
                }
            }
        }
    }

}

Best Answer

Trigger recursion is typically handled by a static Boolean, since a static Boolean lasts for the life of an execution context. The issue with your trigger is that your before and after logic both live in the same class. You also have your base trigger logic split into a Before section with Insert, Update and Delete branches, followed by a separate After section with its own Insert, Update and Delete branches, so placement of a Boolean becomes somewhat difficult.
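For reference, the bare-bones version of that pattern looks something like this (a minimal sketch with made-up names, not your code; the class and trigger would of course live in separate files):

public class OppRecursionGuard
{
    // Static, so it persists for the whole execution context (one transaction)
    public static Boolean hasRun = false;
}

trigger SimpleOppTrigger on Opportunity (after update)
{
    if (!OppRecursionGuard.hasRun) {
        OppRecursionGuard.hasRun = true; // blocks the re-fire caused by workflow field updates
        // ... real work goes here ...
    }
}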

In a trigger framework, all records would typically be sent to a dispatcher class, which then routes the work to a different handler for each type of operation: BeforeInsert, BeforeUpdate, and so on. Controlling re-entry into the trigger is much simpler with that kind of architecture. When trying to control re-entry in a bare trigger like yours, I recommend a separate trigger-utility class along the following lines:

public class trigUtilHelper
{

   public static Boolean reEntry1 = false;
   public static Boolean reEntry2 = false;
   public static Boolean reEntry3 = false;

}

What we now have to be careful of, if we use the above, is where to set each flag and where to test it. Your trigger may receive before and after insert, update and delete records all in the same transaction (think of data coming in via Data Loader). In that situation, we may need several static variables rather than one.

We don't want to locate these flags in the class the trigger calls, because new instances of the trigger could be initiated while another instance is running; if that happens, static variables affecting data flow in your classes could prevent those other instances from running. So it would seem we need to test the flag at beforeUpdate, and again at afterInsert and afterUpdate.

Now, here's where we run into a bit of a conundrum. If an after insert trigger runs and workflow then performs a field update, the after update trigger will fire on those records once the workflow completes. You need to decide whether you want the after update logic to operate on those records or not. If you don't, you'll need to add a test for those records, either to stop them reaching your class or to stop them being operated on once they get there.
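One possible version of that test (a sketch only; the Set and its name are my own addition, not part of your code) is to record which Ids the after insert branch has already handled and skip them in the after update branch:

// in trigUtilHelper:
public static Set<Id> handledOnInsert = new Set<Id>();

// in the after insert branch of the trigger:
trigUtilHelper.handledOnInsert.addAll(Trigger.newMap.keySet());

// in the after update branch, before calling your classes:
List<Opportunity> toProcess = new List<Opportunity>();
for (Opportunity o : Trigger.new) {
    if (!trigUtilHelper.handledOnInsert.contains(o.Id)) {
        toProcess.add(o);
    }
}
// then hand only toProcess on to your handler classes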

If you do want them operated on, then there's no issue, except that you could continue to have the row-locking contention. Where I'm going with this is that if you don't, I'd recommend sending all of your records to a class first, sorting them there for handling, and keeping your recursion Boolean as an instance variable of that class so it won't interfere with new instances. That works best with an abstraction such as a trigger pattern built on the ITrigger interface, because each firing then gets a new handler instance.
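Very roughly, that handler-with-an-instance-flag idea looks like this (names are invented and this is only a sketch, not a full ITrigger implementation):

public class OpportunityTriggerHandler
{
    // Instance variable: each new handler instance gets its own flag,
    // so a flag set while one handler is working can't block a freshly
    // constructed handler in another firing of the trigger.
    private Boolean alreadyProcessed = false;

    public void onAfterUpdate(Map<Id,Opportunity> oldMap, Map<Id,Opportunity> newMap) {
        if (alreadyProcessed) {
            return;
        }
        alreadyProcessed = true;

        // sort/filter the records here, then delegate to the existing
        // WE_MRFv3 / WE_TargetManager logic
    }
}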

With all of that said, using your existing strategy, here's what I'd try to do:

trigger WE_MasterOpportunityTrigger on Opportunity (
    before insert, after insert, 
    before update, after update, 
    before delete, after delete)
{

    if (Trigger.isBefore && !trigUtilHelper.reEntry1) {

        if (Trigger.isInsert) { }

        trigUtilHelper.reEntry1 = true;
        // this prevents 2nd time through following Workflow on AfterUpdate

        if (Trigger.isUpdate) {
            WE_OppUpdates.updateOpportunities(Trigger.old,Trigger.new);
        }
        if (Trigger.isDelete) { }
    }

    if (Trigger.isAfter) {
        if (Trigger.isInsert && !trigUtilHelper.reEntry2) {

            trigUtilHelper.reEntry2 = true;

            WE_OppUpdates.recursiveInsert(Trigger.new);

            WE_MRFv3 forecast = new WE_MRFv3(true,Trigger.newMap);
            forecast.generateMRF();

            WE_TargetManager target = new WE_TargetManager(Trigger.newMap);
            target.generateNewTargets();
        } 
        if (Trigger.isUpdate && !trigUtilHelper.reEntry3) {
           // *** See Discussion Below***//                

            trigUtilHelper.reEntry3 = true;

            WE_MRFv3 forecast = new WE_MRFv3(Trigger.oldMap,Trigger.newMap);
            forecast.generateMRF();

            WE_TargetManager target = new WE_TargetManager(Trigger.oldMap,Trigger.newMap);
            target.updateTargets();
        }
        if (Trigger.isDelete) { }
    }
}

*** Discussion *** If you want to prevent the after insert records from going through the after update branch, you need to test for !trigUtilHelper.reEntry3 (which also means setting reEntry3 in the after insert branch), but this also blocks any genuinely new after update records the first time around. What to do? You could have one set of rules for when both Booleans are true and another for when reEntry2 is true and reEntry3 is false. You can also choose to wait until the end of the code block to set reEntry3 = true (my recommendation here if you need to distinguish between the two).
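As a rough skeleton of how the after update branch might distinguish those cases (a sketch only, reusing the trigUtilHelper flags from above):

if (Trigger.isUpdate) {
    if (trigUtilHelper.reEntry2 && trigUtilHelper.reEntry3) {
        // an insert and an earlier update have both been handled in this transaction
    } else if (trigUtilHelper.reEntry2 && !trigUtilHelper.reEntry3) {
        // insert handled, but this is the first after update
        // (typically the workflow-driven re-fire)
    }
    trigUtilHelper.reEntry3 = true; // set at the end, so the tests above still see the first-pass state
}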

If this were in a trigger pattern, you could have the helper set the re-entry Boolean to true for you. You can't do that here, though, because the Boolean isn't located in this code and this isn't a class it can be called from. Setting it in the external class would cause it to be applied to all helper instances, which is not a good thing and is something you want to avoid.
