[Salesforce] Is there still a batch size limit in triggers?

While going through Salesforce developer interview questions, I came across this one:

What’s the maximum batch size in a single trigger execution?

Well, I don't know the answer to this one, and I don't even know what "batch size" means. So I googled it and found the following document:
https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql_changing_batch_size.htm

It says:

You can change the batch size (the number of rows that are returned in
the query result object) that’s returned in a query() or queryMore()
call from the default 500 rows.

The maximum batch size is 2,000 records

The batch size will be no more than 200 if the SOQL statement selects
two or more custom fields of type long text. This is to prevent large
SOAP messages from being returned.

So according to this document, it seems the default number of rows returned by a SOQL call is 500 and the maximum we can set is 2,000. However, I do remember having queried more records than that before (maybe I'm remembering wrong?).

Is that limit still there? And is it only for triggers or for everything?

Best Answer

Triggers now come in two sizes, batches of 200 and batches of 2,000. The APIs that chunked down to 100 records per trigger chunk are/will be retired in June 2021. For Platform Events, expect triggers to contain up to 2,000 events, and for all other normal DML triggers, triggers will have at most 200 records per chunk.
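To make the chunking concrete, the arithmetic can be modeled as ceiling division. This is an illustrative Python sketch, not Apex; the 200 and 2,000 chunk sizes come from the answer above:

```python
import math

# Chunk sizes per the answer above: 200 for normal DML triggers,
# 2,000 for Platform Event triggers.
DML_CHUNK = 200
PLATFORM_EVENT_CHUNK = 2_000

def trigger_invocations(record_count: int, chunk_size: int = DML_CHUNK) -> int:
    """How many times a trigger fires for a single DML of record_count rows."""
    return math.ceil(record_count / chunk_size)

def chunk_sizes(record_count: int, chunk_size: int = DML_CHUNK) -> list:
    """Sizes of the individual trigger chunks (all full except possibly the last)."""
    full, rest = divmod(record_count, chunk_size)
    return [chunk_size] * full + ([rest] if rest else [])
```

For example, a single DML statement inserting 450 records fires the trigger 3 times, with chunks of 200, 200, and 50 records.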

As a best practice, you should code the trigger to handle an "unlimited" number of records (barring CPU time), for the best performance and in consideration of governor limits. For example, if you want to do a callout per record, remember that you're limited to 100 callouts per transaction, so you'd use Queueable, Batchable, or future methods to handle groups of 100 records (fewer, if you need additional callouts for tokens, etc).
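The grouping idea can be sketched like this (again illustrative Python, not Apex; the 100-record group size mirrors the 100-callouts-per-transaction limit mentioned above, and in Apex each group would go to its own Queueable, Batchable, or future job):

```python
CALLOUTS_PER_TRANSACTION = 100  # governor limit mentioned above

def callout_groups(record_ids, group_size=CALLOUTS_PER_TRANSACTION):
    """Split record IDs into groups small enough that one callout per
    record stays within the per-transaction callout limit."""
    for start in range(0, len(record_ids), group_size):
        yield record_ids[start:start + group_size]
```

So 250 records would become three async jobs handling 100, 100, and 50 records. If each record also needs an extra callout (for a token, say), you'd halve the group size to 50.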

See the original answer, below, for historic trigger functionality.


You're confusing queries with DML operations. DML operations are always batched in chunks of 200 records maximum (100 on very old API versions, for backwards-compatibility reasons). The 200/500/2,000 row limit applies to the size of a single query result without using queryMore.
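To make the query side of that distinction concrete, here's a hedged Python simulation of query()/queryMore() paging (not the real SOAP API): the batch size only caps how many rows come back per call, not how many rows you can retrieve in total, which is why you can have queried "more than 2,000" records.

```python
def simulated_query_all(rows, batch_size=500):
    """Simulate query()/queryMore() paging: each call returns at most
    batch_size rows, and the client loops until the result set is done.
    Returns (all_rows, number_of_calls)."""
    collected, calls, offset = [], 0, 0
    while offset < len(rows):
        page = rows[offset:offset + batch_size]  # one query()/queryMore() round trip
        collected.extend(page)
        offset += len(page)
        calls += 1
    return collected, calls
```

Retrieving 5,000 rows with a 2,000-row batch size just means three round trips; all 5,000 rows still come back.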

See the Triggers documentation, where this "batch size" is discussed with regard to older API versions:

In API version 20.0 and earlier, if a Bulk API request causes a trigger to fire, each chunk of 200 records for the trigger to process is split into chunks of 100 records. In Salesforce API version 21.0 and later, no further splits of API chunks occur. If a Bulk API request causes a trigger to fire multiple times for chunks of 200 records, governor limits are reset between these trigger invocations for the same HTTP request.

I can't seem to find the documentation that states that triggers execute in chunks of 200 records, but you'll notice that it's pretty much in all the official (and unofficial) literature out there.