[Salesforce] Bulk API and Static Variable State

Our trigger logic has recursion detection built in: static variables act as counters for trigger executions, and when the configured maximum is hit, our triggers exit early to avoid problems with recursive triggers and governor limits.
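For illustration, a stripped-down version of this kind of guard might look like the following (class, trigger, and variable names here are placeholders, not our actual code):

```apex
// Placeholder names only; a minimal sketch of a counter-based recursion guard.
public class TriggerRunCounter {
    public static Integer runs = 0;            // shared by every trigger fire in the transaction
    public static final Integer MAX_RUNS = 5;  // arbitrary cap for the sketch

    public static Boolean exceeded() {
        return runs >= MAX_RUNS;
    }
}

// In a separate trigger file:
trigger ContactAfterUpdate on Contact (after update) {
    if (TriggerRunCounter.exceeded()) {
        return;                                // quit once the cap is hit
    }
    TriggerRunCounter.runs++;
    // ... trigger logic that may itself cause further updates ...
}
```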

We found out the hard way that static variables keep their state across individual Bulk API transactions. This happens in both parallel and serial mode. That means our counters keep getting incremented as triggers fire, because the variables are never reset. This is strange, because governor limits are reset for each individual transaction just as you would expect them to be.

Has anyone else experienced this? Is there any documentation around this that would explain the behavior of static variables during Bulk API transactions?

Also, within a trigger, is there a way to detect whether the current transaction is running within the Bulk API? If there were, we could come up with some sort of custom solution just for the Bulk API context.

Best Answer

Has anyone else experienced this?

I have, and to say it was quite annoying would be an understatement. I needed an update trigger to fire either after a record was inserted or after its second update. I got it working fine, but as soon as the records were inserted through the Data Loader, it broke because of this issue.

I asked a similar question, and the answer I received was the following:

Answered by @ScottW

From the documentation:

Use static variables to store information that is shared within the confines of the class. All instances of the same class share a single copy of the static variables. For example, all triggers that are spawned by the same transaction can communicate with each other by viewing and updating static variables in a related class. A recursive trigger might use the value of a class variable to determine when to exit the recursion.

So apparently, even though the trigger limits are reset, the platform still treats a Bulk API call as a single transaction. For example, a Bulk API insert of 700 records is broken up into chunks of 200 records for the triggers to work on, and each 200-record chunk is given its own governor limits. However, it is all still one transaction, which means the static variables are never reset.
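If you want to watch this happen, a throwaway debug trigger (hypothetical, purely for illustration) makes the chunking visible in the logs:

```apex
// Hypothetical debug-only code to observe the chunking behavior described above.
public class ChunkState {
    public static Integer chunkCount = 0;      // never reset within the transaction
}

// In a separate trigger file:
trigger LogChunks on Account (before insert) {
    ChunkState.chunkCount++;
    // For a 700-record Bulk API insert this logs four chunks
    // (200 + 200 + 200 + 100) with the counter climbing from 1 to 4.
    System.debug('Chunk #' + ChunkState.chunkCount + ', size ' + Trigger.new.size());
}
```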

For objects with complex triggers, it is actually recommended that you avoid loading large batches through the Bulk API and move the logic elsewhere (this was found here):

Minimize Number of Triggers

You can use parallel mode with objects that have associated triggers if the triggers don't cause side-effects that interfere with other parallel transactions. However, salesforce.com doesn't recommend loading large batches for objects with complex triggers. Instead, you should rewrite the trigger logic as a batch Apex job that is executed after all the data has loaded.

Interesting recommendation, eh?
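If you do go down the batch Apex road, the skeleton is fairly small. Here is a generic sketch (object, field, and class names are placeholders) of what moving the logic out of the trigger could look like:

```apex
// Generic sketch only; object, field, and class names are placeholders.
global class PostLoadProcessing implements Database.Batchable<SObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Select the records that were just loaded and still need work.
        return Database.getQueryLocator(
            'SELECT Id FROM Account WHERE Needs_Processing__c = true');
    }

    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        // Each execute() call is its own transaction, so statics and
        // governor limits start fresh here, unlike trigger chunks.
        // ... the former trigger logic goes here ...
    }

    global void finish(Database.BatchableContext bc) {
        // Optional cleanup or notification once everything is processed.
    }
}

// Kicked off after the Bulk API load completes:
// Database.executeBatch(new PostLoadProcessing(), 200);
```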

Also, within a trigger, is there a way to detect whether the current transaction is running within the Bulk API?

From what I could find, no. The only way I can see to work around this is to reset your counter yourself, essentially toggling it when a certain event should fire.
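As a rough illustration of that idea (names are hypothetical, and this is only one possible variation): instead of a raw counter, the guard could track which record IDs it has already seen. That effectively resets the check for each fresh chunk of records while still stopping re-entry on the same records:

```apex
// Hypothetical names; tracks which record IDs were already handled.
public class ProcessedRecords {
    public static Set<Id> ids = new Set<Id>();
}

// In a separate trigger file:
trigger AccountAfterUpdate on Account (after update) {
    List<Account> fresh = new List<Account>();
    for (Account acc : Trigger.new) {
        if (!ProcessedRecords.ids.contains(acc.Id)) {
            ProcessedRecords.ids.add(acc.Id);
            fresh.add(acc);
        }
    }
    if (!fresh.isEmpty()) {
        // Only records not seen earlier in this transaction get processed,
        // so each new 200-record chunk passes through while repeats do not.
        // ... real trigger logic on `fresh` ...
    }
}
```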

I hope that clears some things up.
