[Salesforce] Too many SOQL queries: 101 and Triggers

SHORT VERSION:

I'm getting a "Too many SOQL queries: 101" error in my production environment when updating some Leads through an external application. My theory is that multiple nested triggers are firing, but I have been unable to prove it through unit tests, since the number of SOQL queries I measure caps at 14. I'm not sending more than 200 Lead updates per transaction. Any other ideas about what might be causing the bloat I'm seeing?

LONG WINDED EXPLANATION:

I know that similar questions have already popped up, but I have a pretty specific scenario, and after playing around with unit tests to try out scenarios, I'm at a loss as to how to deal with this.

OK, so I have an external import system. In a certain scenario, this import system updates a Custom Object reference on a chunk of Leads. That fires a trigger which checks whether the update was made by the expected user; if so, it checks whether the Custom Object is now defined, and for all Leads where it is, it calls Database.convertLead with a list of lead-convert requests.
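For context, the trigger described above would look roughly like this sketch. Everything here is an assumption reconstructed from the description: the field name Custom_Object__c, the way the expected integration user is identified, and the trigger name are all illustrative placeholders, not the actual code.

```apex
// Hedged sketch of the trigger pattern described above. Custom_Object__c and
// the hard-coded integration user check are placeholders for illustration.
trigger LeadImportConversion on Lead (after update) {
    // Only react when the expected integration user performed the update
    if (UserInfo.getUserName() != 'integration.user@example.com') {
        return;
    }

    // Query the converted LeadStatus once per trigger invocation
    LeadStatus convertedStatus =
        [SELECT MasterLabel FROM LeadStatus WHERE IsConverted = true LIMIT 1];

    List<Database.LeadConvert> conversions = new List<Database.LeadConvert>();
    for (Lead l : Trigger.new) {
        // Convert only Leads where the Custom Object is now defined
        if (l.Custom_Object__c != null
                && Trigger.oldMap.get(l.Id).Custom_Object__c == null) {
            Database.LeadConvert lc = new Database.LeadConvert();
            lc.setLeadId(l.Id);
            lc.setConvertedStatus(convertedStatus.MasterLabel);
            conversions.add(lc);
        }
    }
    if (!conversions.isEmpty()) {
        Database.convertLead(conversions);
    }
}
```

Note that Database.convertLead itself fires the Lead and Contact triggers mentioned below, all within the same transaction as the original update.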

During the conversion of those Leads, a few triggers fire, associated with Leads and Contacts, and after that 4 more queries are run to update references on other Custom Objects so they stop pointing to the Lead and start pointing to the new Contact. No triggers are associated with these 4 queries that I know of.

So in our production environment we are getting a ton of "Too many SOQL queries" errors. Around 7,000 Lead updates, with the intent of converting them to Contacts, are happening. These aren't being done in a single call; the C# code that calls the Salesforce API already does all upserts in chunks of 200 at most.

Experimenting with unit tests in a sandbox, I've noticed a few things. First, for each block of 200 I'm apparently consuming 14 queries (measured with Limits.getQueries()). If I update more than 200 records, they get automatically partitioned by the trigger, but they still tally up against the SOQL limit: if I update 500, the trigger runs 3 times (two chunks of 200 and one of 100), and my total SOQL usage at the end of it all is 42. That is still well under what is supposedly happening in production, where each update has 200 elements at most.

But I tested a bit more in the unit test, and if I do two separate updates of 100 each, the trigger processes each chunk in a separate invocation of 100, but the SOQL count still adds up across both (28 total SOQL queries). This might just be because the unit test is still a single execution context, but now I'm wondering whether each update really is treated as a separate transaction, because I can't see any other explanation for the behavior in production.

Currently I have a few ideas in mind. The first is to disable triggers when doing these operations; that would certainly reduce the number of SOQL queries, but it also means skipping some operations we would prefer to keep, just in case, to ensure system coherence. The second is to reduce the volume of the calls that come from code, but I'm now unsure whether that would even help if the queries are compounded across multiple upsert requests. The final idea is to use the @future annotation, which in theory sounds like it would solve my issues, but it also leaves me flying blind, since I would no longer get full responses from Salesforce and something important might get lost.
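The @future idea could be sketched roughly as below. This is illustrative only, not the actual code: the class and method names are made up, and it shows the trade-off mentioned above, since any conversion errors would only surface in debug logs or the async job record rather than in the synchronous API response.

```apex
// Hypothetical sketch of deferring the conversion to a @future method.
// Each @future invocation runs in its own transaction with fresh governor
// limits, which is why this can sidestep the 101-query cap.
public class LeadConversionAsync {
    @future
    public static void convertLeads(Set<Id> leadIds) {
        LeadStatus convertedStatus =
            [SELECT MasterLabel FROM LeadStatus WHERE IsConverted = true LIMIT 1];

        List<Database.LeadConvert> conversions = new List<Database.LeadConvert>();
        for (Id leadId : leadIds) {
            Database.LeadConvert lc = new Database.LeadConvert();
            lc.setLeadId(leadId);
            lc.setConvertedStatus(convertedStatus.MasterLabel);
            conversions.add(lc);
        }
        // Failures here are not returned to the original caller -- the
        // "flying blind" trade-off described above.
        Database.convertLead(conversions);
    }
}
```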

Sorry for the gigantic info dump, but I'm at a point where I just don't know how to proceed anymore.

Best Answer

It's important to get some core platform concepts in place to try to troubleshoot this issue. The big one is the distinction between DML operations, trigger invocations, and transactions.

Your transaction is the unit you need to be worrying about. The transaction, if I can summarize ruthlessly, spans everything from your entry point (wherever execution started, be it a UI update, an Anonymous Apex invocation, a unit test being executed, or an API call) to the end of all synchronous functionality spawned as a consequence thereof.

In your scenario, the external API call marks the start of a transaction. All triggers that execute subsequent to that call are part of the same transaction. Any Asynchronous Apex (future methods, Queueable Apex, Scheduled Apex, and Batch Apex) takes place in a separate transaction.

Experimenting with unit tests in a sandbox, I've noticed a few things. First, for each block of 200 I'm apparently consuming 14 queries (measured with Limits.getQueries()). If I update more than 200 records, they get automatically partitioned by the trigger, but they still tally up against the SOQL limit,

This is how the SOQL limit (and many other limits as well) works. It is bounded by the transaction, not by the DML operation or trigger invocation. Your trigger may be called upon to process more than one block of 200 records in any given transaction.
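You can see this accumulation directly in Anonymous Apex. In this sketch, firstChunk and secondChunk stand in for lists of Leads you have already built, and the counts in the comments are only illustrative (they echo the 14-per-chunk figure from the question, which will differ in your org):

```apex
// Anonymous Apex sketch: limits accumulate per transaction, so two
// separate DML statements in one execution share a single SOQL budget.
System.debug(Limits.getQueries());      // 0 at the start of the transaction
update firstChunk;                      // trigger fires, runs its queries
System.debug(Limits.getQueries());      // e.g. 14 after the first chunk
update secondChunk;                     // same transaction, same counters
System.debug(Limits.getQueries());      // e.g. 28 -- the counts keep adding up
System.debug(Limits.getLimitQueries()); // 100, the synchronous transaction cap
```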

if I do two separate updates of 100 each, the trigger processes each chunk in a separate invocation of 100, but the SOQL count still adds up across both (28 total SOQL queries).

The updates are still taking place in the same transaction. Each unit test takes place in one transaction.

Overall, unit tests are unlikely to be the most effective tool in analyzing this whole problem. They may help you understand the limits consumption of your code in specific scenarios, which is good, and help you guard against specific expensive mistakes. But what you need to do is understand the entire path of execution during your external-call-initiated sequence of operations and where, specifically, SOQL is being used in that sequence, with real data or data representative of what your code is seeing in production.

If your real-world code is using an order of magnitude more queries than your unit tests, you should examine your logs to see whether you may have trigger recursion taking place, or a call stack much deeper than you think it is due to misshapen logic that your unit tests are not designed to catch.
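If recursion does turn out to be the culprit, one common mitigation is a static guard that persists for the life of the transaction. This is a generic sketch, not code from the question's org, and it comes with a caveat worth noting in this scenario:

```apex
// Sketch of a simple per-transaction recursion guard. Static variables in
// Apex live for the duration of the transaction, so the flag survives
// across nested trigger invocations but resets for the next transaction.
public class LeadTriggerGuard {
    private static Boolean hasRun = false;

    public static Boolean shouldRun() {
        if (hasRun) {
            return false; // trigger logic already ran in this transaction
        }
        hasRun = true;
        return true;
    }
}
```

Be careful with this exact form here: because Salesforce legitimately re-invokes the same trigger for each chunk of 200 records, a single boolean would also skip the second and later chunks. Guards in bulk scenarios usually track a Set of already-processed record Ids instead.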

All that said, ...

These aren't being done in a single call; the C# code that calls the Salesforce API already does all upserts in chunks of 200 at most.

Turning your batch size down is often a quick-and-dirty solution to this kind of issue.
