Salesforce Flow Limitations – Loop vs Batch Records

Tags: apex, batch, trigger, visual-workflow, workflow

I have recently been digging into Salesforce flows. I created quite a large flow (basically a series of subflows) that runs when one of my objects is created or updated, and it all worked as expected. The flow's purpose is to update a series of fields whenever the object updates, based on a date calculation compared to the current month. As a result, at the turn of the month the fields become outdated. So I decided to add a scheduled flow (my idea was to run it once a month, but scheduled flows only offer daily or weekly frequencies, so I set it to run once a week to keep the data mostly up to date). When I ran the scheduled flow, I hit an interesting problem.

I first built the scheduled flow to run once and loop over all instances of my object (calling a subflow within the loop, so the process obviously took a while). When I ran it through the debugging tool, it failed with an error after about 9–10 seconds. This makes sense, because the Salesforce documentation lists a per-transaction limit of 10,000 ms of maximum CPU time on Salesforce servers. To fix this, I changed the scheduled flow so that it runs a separate interview for each instance of my object, rather than pushing all instances through one loop; I believe this is called running it on a batch of records. After this change, the flow completed correctly in testing.

I am struggling to understand the difference between a flow interview and a transaction, and why the 10-second transaction limit seems to disappear when I set the scheduled flow to run multiple interviews.

Of note, I spent some time reading about bulkification, and it appears that when I set my scheduled flow to run on a batch of records (causing multiple interviews), my flow is bulkified. If that is the case, is it possible that bulkification creates several transactions and therefore resets the governor limits?

To summarize my question:

  1. What is the difference between a transaction and an interview?
  2. Do governor limits apply to a transaction or to an interview?
  3. When I set my scheduled flow to run on a batch of records (therefore causing multiple interviews), how exactly did this differ from my old solution of running a loop that stopped with an error after 10 seconds?
  4. If bulkification is the cause of the dreaded "Now it works but I don't know why!", why exactly does it eliminate the 10-second limit? Is it possibly creating a new transaction after each step in the bulkification process?

I sure would appreciate if someone with more knowledge than me could explain how this process works. Thanks!

Best Answer

What is the difference between a transaction and interview?

A transaction is one complete execution unit as far as the server is concerned; I've always explained it as: one debug log equals one transaction. An interview is a single run of a flow, executing its elements. The two don't map one-to-one: a single transaction can process many interviews at once (that's bulkification), and an interview that pauses can span more than one transaction.

Do governing limits apply to a transaction or an interview?

Governor limits apply per transaction, not per interview.

When I set my scheduled flow to run on a batch of records (therefore causing multiple interviews), how exactly did this differ from my old solution of running a loop that stopped with an error after 10 seconds?

Because each separate execution context is its own transaction, with its own fresh set of limits. When a schedule-triggered flow runs once per record, the platform batches those interviews (up to 200 records per batch) and processes each batch in its own transaction, so the workload is split up. Pause/Wait elements and Asynchronous Paths achieve the same effect: whatever runs after them starts a new transaction.

If bulkification is the cause of the dreaded "Now it works but I don't know why!", why exactly does it eliminate the 10 second limit? Is it possibly creating a new transaction after each step in the bulkification process?

As is hopefully obvious by now: each transaction gets its own 10-second CPU limit, so by breaking the flow up into multiple transactions, you get a fresh limit for each chunk and therefore more total CPU time to complete the workload. Bulkification itself doesn't reset any limits; the reset happens because the batched scheduled flow spreads the interviews across separate transactions.
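
The effect can be sketched with a small simulation (plain Python, not Salesforce code; the 30 ms per-record cost is an invented number for illustration, while the 10,000 ms limit and 200-record batch size reflect the platform behavior discussed above):

```python
# Conceptual sketch: why splitting work across transactions avoids the
# per-transaction CPU limit. Not Salesforce code; per-record cost is a
# made-up illustrative figure.
CPU_LIMIT_MS = 10_000      # CPU time limit per transaction
COST_PER_RECORD_MS = 30    # hypothetical cost of one flow interview
BATCH_SIZE = 200           # records processed per transaction when batched

def run_transaction(record_count):
    """Return CPU used, or raise if this transaction exceeds the limit."""
    cpu_used = record_count * COST_PER_RECORD_MS
    if cpu_used > CPU_LIMIT_MS:
        raise RuntimeError(f"LIMIT_EXCEEDED: {cpu_used} ms > {CPU_LIMIT_MS} ms")
    return cpu_used

records = 1_000

# One big loop: every interview shares a single transaction's limit.
try:
    run_transaction(records)
except RuntimeError as e:
    print("single transaction:", e)   # 30,000 ms blows the 10,000 ms limit

# Batched: each batch of 200 is its own transaction with a fresh limit.
for start in range(0, records, BATCH_SIZE):
    batch = min(BATCH_SIZE, records - start)
    run_transaction(batch)            # 6,000 ms per batch, under the limit
print("batched run: all", records, "records processed")
```

The total CPU spent is the same either way; what changes is how it is accounted. One transaction doing all the work is measured against a single 10,000 ms budget, while five batched transactions are each measured against their own.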