[SalesForce] Can Queueable Apex Jobs Run in Parallel?

I've got a queueable Apex class that can create parent records for the source records passed into it.

It maintains a map as it goes, so that if two source records should relate to the same parent and no parent is found in the database, it can check whether it has already created one in memory.
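
For reference, here's a minimal sketch of the shape such a queueable might take. The class name and constructor are hypothetical stand-ins; the actual matching logic is the fragment shown further down.

    public class CustomerSourceMatcher implements Queueable {
        private List<Customer_Source__c> records;

        public CustomerSourceMatcher(List<Customer_Source__c> records) {
            this.records = records;
        }

        public void execute(QueueableContext context) {
            // match-or-create logic over this.records goes here
            // (see the code fragment later in this question)
        }
    }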

Issue

This queueable job is enqueued using System.enqueueJob from a trigger context, i.e. after source records are inserted/updated. During a data load the matching failed, and two sources that should have resulted in a single parent actually generated separate parent records.
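
The wiring from the trigger is roughly this (again a sketch with illustrative names):

    trigger CustomerSourceTrigger on Customer_Source__c (after insert, after update) {
        // hand each trigger chunk to the queueable for asynchronous matching
        System.enqueueJob(new CustomerSourceMatcher(Trigger.new));
    }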

Testing

Deleting the parent records and running the same code manually on the two sources together resulted in the correct operation, proving that the cache map works as expected.

Deleting the new parent record and running the code manually on the two source records separately also returned the correct result: an account is created for the first, and the second finds the same parent via SOQL and lives happily ever after.

Questionable Conclusion

The account IDs initially generated were sequential. Given that the cache map works on these source records and produces the correct result, I'm led to believe the two records were processed in separate trigger batches of 200, perhaps at the end of one and the start of the next. But if the queueable jobs still ran sequentially, the second would have picked up the correct account via SOQL.

Either I've got a logic bug that I still need to find and that my testing has failed to reproduce, or queueable jobs can run side by side. The former is quite likely, but I want to know whether two jobs can run simultaneously, because if so I'll have to reconsider the implementation. So: can Apex Queueable jobs run in parallel?

    Map<String, Map<Object, Account>> fieldToValueToAccount = new Map<String, Map<Object, Account>>();
    Map<Customer_Source__c, Account> sourceToAccount = new Map<Customer_Source__c, Account>();
    Map<Id, Account> accountsToUpdate = new Map<Id, Account>();

    activeRuleNumbers.sort();

    if(activeRuleNumbers.size() == 0)
        return;

    for(Customer_Source__c source : records)
    {
        String whereClause = '';
        List<String> ruleFields = new List<String>();

        // only query on rules that are active, and that have a value in the record being tested
        for(Integer ruleNumber : activeRuleNumbers)
        {
            String ruleField = 'MatchRule' + ruleNumber + '__c';

            if(String.isEmpty((String)source.get(ruleField)))
                continue;

            ruleFields.add(ruleField);
            // escape quotes so record data can't break (or inject into) the dynamic SOQL
            whereClause += ' or ' + ruleField + ' = \'' + String.escapeSingleQuotes(String.valueOf(source.get(ruleField))) + '\' ';
        }

        if(ruleFields.size() == 0)
        {
            continue;
        }

        whereClause = whereClause.removeStart(' or ');

        String query = 'select Id, Customer__c from Customer_Source__c '
                            + ' where Customer__c != null and (' +  whereClause + ') '
                            + ' limit 1 ';

        System.debug(query);
        System.debug(source);

        List<Customer_Source__c> matches = Database.query(query);

        if(matches.size() > 0)
        {
            // found one, map it, bam. Next!
            trackAccountChange(source, matches[0].Customer__c, accountsToUpdate);
            source.Customer__c = matches[0].Customer__c;
            sourcesToUpdate.add(source);
            continue;
        }

        Account acct = null;

        // check the populated match fields for this source against our map to see if an account
        // has already been created that this source would relate to
        for(String field : ruleFields)
        {
            String sourceValue = String.valueOf(source.get(field)).toLowerCase();

            System.debug('Searching for ' + sourceValue + ' in map for field ' + field);

            if(fieldToValueToAccount.get(field) == null)
            {
                System.debug('No account map found');
                continue;
            }

            acct = fieldToValueToAccount.get(field).get(sourceValue);

            if(acct != null)
            {
                System.debug('Found account in map: ' + acct);
                break;
            }
        }

        // no account found, create a new one
        if(acct == null)
        {
            System.debug('Creating new account');
            acct = new Account(RecordTypeId = personAccountRTID);
        }

        // map this source's values to the account found/created in the uber map
        for(String field : ruleFields)
        {
            String sourceValue = String.valueOf(source.get(field)).toLowerCase();

            if(fieldToValueToAccount.get(field) == null)
            {
                fieldToValueToAccount.put(field, new Map<Object, Account>{sourceValue => acct});
            }
            else
            {
                fieldToValueToAccount.get(field).put(sourceValue, acct);
            }
        }

Best Answer

Yes, queueable jobs can run in parallel. As a proof of concept, here's some code that I wrote:

public class TenSecondQueueable implements Queueable {
    public void execute(QueueableContext context) {
        System.debug(LoggingLevel.ERROR, DateTime.now());
        Long start = DateTime.now().getTime();
        while(DateTime.now().getTime() - start < 10000); // busy-wait for roughly 10 seconds
        System.debug(LoggingLevel.ERROR, DateTime.now());
    }
}

This job will run for approximately 10 seconds. I then ran this Execute Anonymous script:

System.enqueueJob(new TenSecondQueueable());
System.enqueueJob(new TenSecondQueueable());

The resulting output for both jobs was as follows:

>>>> JOB 1 <<<<
19:56:22.0 (4570498)|USER_DEBUG|[3]|ERROR|2017-03-22 00:56:22
19:56:32.4 (10004427521)|USER_DEBUG|[6]|ERROR|2017-03-22 00:56:32

>>>> JOB 2 <<<<
19:56:22.0 (2618567)|USER_DEBUG|[3]|ERROR|2017-03-22 00:56:22
19:56:32.1 (10001759396)|USER_DEBUG|[6]|ERROR|2017-03-22 00:56:32

As you can see, they both started and ended at the same time, down to the second. Mind you, there's no guarantee this will always happen; however, it's safe to say that it certainly can and will.

Odds are, you'll want to use a locking algorithm to prevent this from happening. I think you'll want to add a sentry object (a record you can use to hold a mutex lock), so that each additional job has to wait its turn. For example, you might add this near the beginning of your code:

    // insert our own lock record, then spin until we can obtain an
    // exclusive row lock on a Mutex__c record
    insert new Mutex__c();
    Mutex__c[] lock;
    while(lock == null) {
        try {
            lock = [SELECT Id FROM Mutex__c LIMIT 1 FOR UPDATE];
        } catch(QueryException e) {
            // another transaction holds the row lock; keep retrying
        }
    }

As well as this cleanup near the end of your code:

    delete lock;

What will happen here is that if the record you inserted was the oldest lock, your code holds the record lock and proceeds immediately. If not, it waits until the prior Queueable has finished doing what it needs to do. (If a job dies with an unhandled exception, the whole transaction rolls back, which both removes the Mutex__c record it inserted and releases the row lock.) This should prevent any race conditions between Queueables that run close together, although it also effectively prevents more than one queueable from running at a time, even when they wouldn't touch the same record(s).

You might also simply be able to get away with adding a FOR UPDATE locking clause to your dynamically generated query. This will "probably" be safe enough for most purposes, but you'll need to do some testing.
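
As a sketch, here's the dynamic query from the question with the lock applied (reusing the whereClause built earlier):

    String query = 'select Id, Customer__c from Customer_Source__c '
                        + ' where Customer__c != null and (' + whereClause + ') '
                        + ' limit 1 for update ';

    // rows returned here stay row-locked until this transaction commits or rolls back
    List<Customer_Source__c> matches = Database.query(query);

One caveat: FOR UPDATE only locks rows the query actually returns, so if neither job finds an existing match, both can still fall through to creating a new Account. That is exactly the race described in the question, which is why the sentry-record approach is the more robust option.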
