[Salesforce] How is an Apex REST service able to accept a 50 MB file, and how is the heap size calculated in this case?

I have the code below, in which we accept blob data through a REST Apex class and insert the received document (blob) as a feed item. I was able to send files up to 54 MB, after which it started erroring out.

The governor limits state that the heap size for synchronous transactions is 6 MB, so how is this REST class able to accept files nine times that limit? Even when we print the limits in the debug log, the numbers are not high at all. Where is the file stored when the class receives it? How is the heap size calculated?

@RestResource(urlMapping='/uploadFile/*')
global class REST_StoreAttachment {

    @HttpPost
    global static String postMethod() {

        // The raw POST body arrives as a Blob
        Blob requestBody = RestContext.request.requestBody;

        System.debug('Heap size before' + Limits.getHeapSize() + ' ==== ' + Limits.getLimitHeapSize());

        // Store the payload as a file
        ContentVersion content = new ContentVersion();
        content.Title = 'test';
        content.PathOnClient = 'fileName.txt';
        content.VersionData = requestBody;
        content.Origin = 'H';

        insert content;

        // Post the file to a record's feed
        FeedItem post = new FeedItem();
        post.ParentId = '00000000000000';    // hard-coded to a record Id (placeholder)
        post.Body = 'New attachment added';
        post.RelatedRecordId = content.Id;
        post.Type = 'ContentPost';
        insert post;

        System.debug('Heap size after' + Limits.getHeapSize() + ' ==== ' + Limits.getLimitHeapSize());
        return 'success';
    }
}

System Debug

Heap size before2189 ==== 6000000

Heap size after2514 ==== 6000000

Best Answer

Interesting. My suspicion is that objects returned from certain methods point directly to their underlying Java representations without being processed as part of Apex, and remain that way until they are manipulated. Such objects sometimes behave oddly compared to objects created directly in Apex.

One example I've seen is after JSON deserialization. If a JSON attribute maps to an Integer field in the Apex class I deserialize into, but the JSON contained a number like 1.00, Apex somehow jams a Decimal into the Integer container instead of converting it to the integer 1. Only when you try to actually do Integer math on it does it complain that the value is not an Integer.
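
For instance, here is a minimal sketch of that quirk (the NumberHolder class name is mine, and the exact behaviour and exception message may vary):

public class NumberHolder {
    public Integer count;
}

// 1.00 is parsed as a Decimal in the JSON payload
NumberHolder holder = (NumberHolder) JSON.deserialize('{"count": 1.00}', NumberHolder.class);
System.debug(holder.count);          // no complaint yet; a Decimal may be sitting in the Integer field
Integer doubled = holder.count + 1;  // a TypeException only surfaces here, on actual Integer math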

Just a thought: suppose you redundantly convert the blob to Base64 and back, and assign the result to VersionData. Does the heap limit start being enforced then?
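
Something like the following, spliced into postMethod from the question (a sketch, untested; content is the ContentVersion variable from the original code):

Blob raw = RestContext.request.requestBody;
// Round-tripping through Base64 forces Apex to materialize the data on its own heap
String b64 = EncodingUtil.base64Encode(raw);
Blob roundTripped = EncodingUtil.base64Decode(b64);
content.VersionData = roundTripped;

For a roughly 50 MB upload, the Base64 string alone is about 67 MB (Base64 inflates data by about a third), so if the heap limit is enforced on Apex-created objects, this version should fail immediately and tell you where the accounting actually happens.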