[Salesforce] Debug log – limit summary is inaccurate

(edited after I decided to try out sfdcfox's remarks)

I'm rewriting a batch job to (among other things) make it run faster.

I've attached a debug log to my user and monitored the job at random intervals, and I've noticed that the limits summary is lying. It always shows 0 query rows, 0 DML statements and 0 DML rows.

Here's a sample debug log (with the profiling level on max and the rest as low as possible). I've edited it a bit – marked as (...) – to anonymize fields and skip the boring parts.

26.0 APEX_CODE,ERROR;APEX_PROFILING,FINEST;CALLOUT,ERROR;DB,FINEST;SYSTEM,WARN;VALIDATION,ERROR;VISUALFORCE,ERROR;WORKFLOW,ERROR
14:05:45.014 (14394000)|EXECUTION_STARTED
14:05:45.014 (14436000)|CODE_UNIT_STARTED|[EXTERNAL]|01pM0000000F4wB|BatchRecalculateTeamMembers
14:05:45.033 (33331000)|SOQL_EXECUTE_BEGIN|[49]|Aggregations:0|select (...) from ERP_Account_Team_Member__c where (...) and User__r.isActive = true)
14:05:45.087 (87223000)|SOQL_EXECUTE_END|[49]|Rows:0
14:05:45.159 (159917000)|DML_BEGIN|[126]|Op:Delete|Type:Id|Rows:428
14:05:47.069 (2069878000)|DML_END|[126]
14:05:47.194 (2108452000)|CUMULATIVE_LIMIT_USAGE
14:05:47.194|LIMIT_USAGE_FOR_NS|(default)|
  Number of SOQL queries: 0 out of 200
  Number of query rows: 0 out of 50000
  Number of SOSL queries: 0 out of 20
  Number of DML statements: 0 out of 150
  Number of DML rows: 0 out of 10000
  Number of code statements: 889 out of 1000000
  Maximum CPU time: 0 out of 60000
  Maximum heap size: 0 out of 12000000
  Number of callouts: 0 out of 0
  Number of Email Invocations: 0 out of 10
  Number of fields describes: 0 out of 100
  Number of record type describes: 0 out of 100
  Number of child relationships describes: 0 out of 100
  Number of picklist describes: 0 out of 100
  Number of future calls: 0 out of 0

  (... skipping the STATIC_VARIABLE_LIST block)

14:05:47.194|CUMULATIVE_LIMIT_USAGE_END

14:05:47.249|CUMULATIVE_PROFILING_BEGIN
14:05:47.249|CUMULATIVE_PROFILING|SOQL operations|
Class.BatchRecalculateTeamMembers.execute: line 49, column 1: [SELECT (...)
            FROM ERP_Account_Team_Member__c
            WHERE (...) AND User__r.isActive = true]: executed 1 time in 56 ms

14:05:47.249|CUMULATIVE_PROFILING|No profiling information for SOSL operations
14:05:47.249|CUMULATIVE_PROFILING|DML operations|
Class.BatchRecalculateTeamMembers.processCompanyTeamMembers: line 126, column 1: Database.delete(LIST<Id>, Boolean): executed 3 times in 1926 ms
Class.BatchRecalculateTeamMembers.execute: line 64, column 1: Database.insert(LIST<ERP_Account__Share>, Boolean): executed 1 time in 0 ms

(... skipping the CUMULATIVE_PROFILING|method invocations block)

14:05:47.108 (2108737000)|CODE_UNIT_FINISHED|BatchRecalculateTeamMembers
14:05:47.108 (2108751000)|EXECUTION_FINISHED

So there's clearly:

  • 1 SOQL (0 rows)
  • 1 Database.delete (428 ids – actually executed as 3 chunks of up to 200 rows each, I think)
  • 1 Database.insert (0 rows)

Even if a list of Ids didn't count as DML rows (which I doubt – that would be a nice exploit), I'd expect to see at least 2 DML statements in the summary…

Any idea what the hell is going on?

There was a similar question related to Winter '14, but I think it's a bit different: Debug logs still measuring code statements and not CPU time (there's an interesting comment by sfdcfox there, though).


The batch is supposed to recalculate sharing (AccountShare, AccountTeamMember, Custom_Object__Share) based on data from an external system. It's in a batch and not, say, triggers because quite often the data flies in before the user is activated, and sharing inserts with an inactive user would fail. Similarly, the batch is responsible for "cascade deleting" all team members that aren't found in the master data.

The batch class is on API v26. The profiling summary is the only one in the whole file (there are no triggers; you can't even create triggers on AccountShare etc.) and I didn't hit the *** MAXIMUM DEBUG LOG SIZE REACHED *** marker. No managed packages involved. No Database.Stateful.

Org – full sandbox on Winter'14.

Debug log – I've tried both the normal Setup -> Debug Logs and the Developer Console, with the same results. With so much filtering applied, the logs rarely exceed 5 KB now.


Help?

It's not a big deal really – I can see that it performs faster just by looking at the total execution time. But I'd prefer the summary to be accurate so that in other batches I can, for example, spot SOQL in loops more easily…
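(As a stopgap while the summary is unreliable, one option is to log the governor limit counters yourself from the standard `Limits` class at the end of `execute()`, so you don't depend on the CUMULATIVE_LIMIT_USAGE block at all. A minimal sketch – the helper name and placement are my own, not anything Salesforce prescribes:)

```apex
// Hypothetical helper: call at the end of each batch execute() to dump
// per-chunk limit consumption straight from the Limits class, independent
// of the CUMULATIVE_LIMIT_USAGE block in the debug log.
public static void logLimitUsage() {
    System.debug(LoggingLevel.WARN,
        'SOQL queries: ' + Limits.getQueries() + ' / ' + Limits.getLimitQueries());
    System.debug(LoggingLevel.WARN,
        'Query rows: ' + Limits.getQueryRows() + ' / ' + Limits.getLimitQueryRows());
    System.debug(LoggingLevel.WARN,
        'DML statements: ' + Limits.getDmlStatements() + ' / ' + Limits.getLimitDmlStatements());
    System.debug(LoggingLevel.WARN,
        'DML rows: ' + Limits.getDmlRows() + ' / ' + Limits.getLimitDmlRows());
}
```

(Since the filter above already has SYSTEM at WARN, debugging at LoggingLevel.WARN should still make it into the log without inflating its size much.)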

Best Answer

I presume that your log reached its maximum size, and therefore the information you were looking for was omitted. Limits are output at the end of each execution unit (e.g. a DML statement), so you end up with a log like this:

Code Event           Limits Output
Start Transaction    
Start Trigger 1      
Start Trigger 2
End Trigger 2        Limits for 2
End Trigger 1        Limits for 1 + 2
Start Trigger 3
Start Trigger 4
End Trigger 4        Limits for 4
Start Trigger 5
End Trigger 5        Limits for 5
End Trigger 3        Limits for 3 + 5
End Transaction      Limits for 1 + 2 + 3 + 4 + 5, Cumulative

If your log gets cut off before the end, you'll miss out on the cumulative block, and you won't see your totals correctly. Try setting the profiling level up and the remaining levels down to their minimums to capture just the profiling information.

Related Topic