Apex – Avoid Governor Limits with Large Data Sets

We have a pretty complex trigger suite in our org, with a lot of custom functionality. I'm starting to worry about governor limits, and I don't really understand how to manage them. I've read through http://wiki.developerforce.com/page/Apex_Code_Best_Practices and written all my triggers accordingly: using as few SOQL and DML statements as possible, querying the database only when I have to, getting all the records I need at once, and so on.

What I can't figure out is – what if I simply need to exceed the governor limit? What if I need to update 4,000 records when a certain other record they're all related to changes? Am I out of luck? Is Salesforce really viable with large data sets?

Right now my client wants to update 20 different Product records, which could potentially update as many as 46,061 related opportunity line item records. They want a trigger that updates every related opportunity line item, which in turn fires triggers that update other custom objects related to those line items – it multiplies fast, and hits governor limits hard. I'm not sure how to implement @future methods, and even those have limits of their own that I might hit.

What do I do?

Best Answer

I would look at using Batch Apex. It is designed to handle large data sets, and I have found the code is generally cleaner to write than trigger logic (e.g., branching on insert vs. update) when the data model is very complex.
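As a rough sketch (the class name and the recalculation logic are placeholders, and OpportunityLineItem.Product2Id assumes a reasonably recent API version – older orgs would go through PricebookEntry), a batch job over the affected line items might look like this:

    // Minimal Batch Apex sketch: recalculate line items for a set of products.
    public class LineItemRecalcBatch implements Database.Batchable<sObject> {

        private Set<Id> productIds;

        public LineItemRecalcBatch(Set<Id> productIds) {
            this.productIds = productIds;
        }

        // start(): define the full record set; Salesforce streams it to execute() in chunks
        public Database.QueryLocator start(Database.BatchableContext bc) {
            return Database.getQueryLocator([
                SELECT Id, UnitPrice, Product2Id
                FROM OpportunityLineItem
                WHERE Product2Id IN :productIds
            ]);
        }

        // execute(): runs once per chunk (up to 200 records by default),
        // and each invocation gets a fresh set of governor limits
        public void execute(Database.BatchableContext bc, List<OpportunityLineItem> scope) {
            for (OpportunityLineItem oli : scope) {
                // apply your recalculation to oli here
            }
            update scope; // one DML statement per chunk
        }

        public void finish(Database.BatchableContext bc) {
            // optional: send a notification or chain another batch
        }
    }

You would kick it off with Database.executeBatch(new LineItemRecalcBatch(changedProductIds), 200), e.g. from the Product trigger or a scheduled job. Because each execute() chunk runs in its own transaction with its own limits, a 46,000-record update becomes tractable.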

Even if your list size is greater than 200, Salesforce will break your lists into chunks of 200 for trigger processing. Each chunk, however, counts against the same shared transaction limits. If you perform one bulkified SOQL query per trigger invocation, you can process 100 * 200 = 20,000 records of that object in a single transaction before the 101st query throws an exception. If you are only issuing DML, you get 150 * 200 = 30,000. That only covers the case where a single query or a single DML statement is issued per chunk.
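To make that arithmetic concrete, here is the shape of a trigger that consumes exactly one query and one DML statement per 200-record chunk (the object and field choices are illustrative, not your actual logic):

    trigger OlisAfterUpdate on OpportunityLineItem (after update) {
        // One bulkified query for the whole chunk, never one per record
        Set<Id> oppIds = new Set<Id>();
        for (OpportunityLineItem oli : Trigger.new) {
            oppIds.add(oli.OpportunityId);
        }
        List<Opportunity> opps =
            [SELECT Id, Description FROM Opportunity WHERE Id IN :oppIds];

        for (Opportunity opp : opps) {
            // modify each parent opportunity here
        }
        update opps; // one DML statement for the whole chunk
    }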

In a complex data model you can easily have many more queries occurring in your triggers, because a single save will trigger saves on other related records (e.g., a roll-up summary field triggering a save on the parent). It is worth keeping in mind, as you design, exactly what happens when a record is saved. If your solution puts triggers on every object, it can be cumbersome to figure out why something on a related record is being updated, and harder to maintain.
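One way to keep those cascades under control is a static guard class. Apex statics live for the duration of the transaction, so a trigger can record what it has already processed and skip re-entrant invocations. A sketch (the names are mine):

    public class TriggerGuard {
        // Ids already handled in this transaction. A Set rather than a Boolean,
        // so later 200-record chunks of the same bulk save are still processed.
        public static Set<Id> processedLineItemIds = new Set<Id>();
    }

    trigger LineItemSideEffects on OpportunityLineItem (after update) {
        List<OpportunityLineItem> toProcess = new List<OpportunityLineItem>();
        for (OpportunityLineItem oli : Trigger.new) {
            if (!TriggerGuard.processedLineItemIds.contains(oli.Id)) {
                TriggerGuard.processedLineItemIds.add(oli.Id);
                toProcess.add(oli);
            }
        }
        // act only on toProcess; a cascaded re-save of the same records is a no-op
    }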

On top of all of that, if your data model really is that complex, you have to consider what happens when you set up unit tests. With triggers on every object issuing many queries and updates, you can easily approach the 100 SOQL query limit just while inserting the test data.
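Test.startTest() and Test.stopTest() help here: the code between them gets a fresh set of governor limits, so queries spent building test data don't count against the code under test. A sketch against the hypothetical batch class above:

    @isTest
    private class LineItemRecalcBatchTest {
        @isTest
        static void recalculatesLineItems() {
            // Insert test products, pricebook entries, opportunities, and line
            // items here; everything consumed so far counts against the setup.
            Set<Id> productIds = new Set<Id>(); // collect the test products' Ids

            Test.startTest();  // fresh governor limits from this point on
            Database.executeBatch(new LineItemRecalcBatch(productIds));
            Test.stopTest();   // forces the async batch to finish before asserts

            // System.assertEquals(...) against the updated line items
        }
    }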

There's a Developer Force blog post that has some high-level summaries on working with large data sets and links to other resources.
