The limit for retrieving files via standard APIs

Tags: files, rest-api, retrieve

We are trying to archive unused files from Salesforce to an external DMS. This is not a one-time task but an ongoing feature to preserve file storage space in Salesforce.

For smaller files we can proactively send the files directly to the external system using synchronous (6 MB limit) or asynchronous (12 MB limit) Apex Callouts.
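For the push path, a minimal sketch of the asynchronous variant might look like the following. This assumes a Named Credential called DMS_Endpoint and a hypothetical /files resource on the DMS that accepts a raw binary POST; all names are placeholders, not a definitive implementation.

```apex
// Sketch only: pushes one file to a hypothetical DMS endpoint.
// Assumes a Named Credential "DMS_Endpoint" is configured for authentication.
public class DmsFilePusher {

    @future(callout=true)
    public static void pushFile(Id contentVersionId) {
        ContentVersion cv = [
            SELECT Title, FileExtension, VersionData
            FROM ContentVersion
            WHERE Id = :contentVersionId
        ];

        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:DMS_Endpoint/files'); // hypothetical DMS resource
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/octet-stream');
        req.setHeader('X-File-Name', cv.Title + '.' + cv.FileExtension);
        // The whole file is loaded into the body, so this stays within the
        // 12 MB asynchronous callout/heap limits only for small files.
        req.setBodyAsBlob(cv.VersionData);

        HttpResponse res = new Http().send(req);
        if (res.getStatusCode() != 200) {
            System.debug(LoggingLevel.ERROR, 'DMS upload failed: ' + res.getStatus());
        }
    }
}
```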

But for larger files we are planning to send a pull request to the DMS containing the ContentVersion Id and let the external system retrieve the file itself via the standard REST API:

<INSTANCE URL>/services/data/v47.0/sobjects/ContentVersion/<ID>/VersionData
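The notification to the DMS could be a small JSON payload carrying the ContentVersion Id and the ready-made download URL. Below is a minimal sketch, again assuming the hypothetical DMS_Endpoint Named Credential and a made-up /pull-requests resource; the external system performs the actual GET itself.

```apex
// Sketch only: tells the DMS which file to pull and where to fetch it from.
public class DmsPullRequestNotifier {

    public static void requestPull(Id contentVersionId) {
        // Build the VersionData URL the external system should fetch.
        String downloadUrl = URL.getOrgDomainUrl().toExternalForm()
            + '/services/data/v47.0/sobjects/ContentVersion/'
            + contentVersionId + '/VersionData';

        Map<String, Object> payload = new Map<String, Object>{
            'contentVersionId' => (String) contentVersionId,
            'downloadUrl'      => downloadUrl
        };

        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:DMS_Endpoint/pull-requests'); // hypothetical DMS resource
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(payload));

        HttpResponse res = new Http().send(req);
        System.debug('DMS pull request response: ' + res.getStatusCode());
    }
}
```

The external system would then issue a GET against that URL with a valid OAuth access token in the Authorization header and stream the binary response to its own storage.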

I have researched the maximum file size that can be retrieved this way and found conflicting results. The documentation for the ContentVersion object mentions a 50 MB compressed file size limit, whereas various other articles suggest that larger files are possible.
What is the upper file size limit when exporting files this way, and is it also possible to export files that are several GB in size?

Thanks in advance

Best Answer

I can't find any documented limit for downloads via the REST API, so I would assume you can retrieve files up to the full 2 GB maximum file size. Salesforce is a professional service, and your requirement doesn't seem unusual.

The Bulk API does mention a maximum file size, but only in the context of chunking large downloads:

Maximum retrieved file size: 1 GB.
If processing of the batch results in 1 GB of retrieved data, then those results are saved to disk, and then the batch is put back on the queue to be resumed later. This also counts as one of the 15 retries.

Maximum number of retrieved files: 15 files.
If the query returns more than 15 files, add filters to the query to return less data. Bulk batch sizes aren’t used for bulk queries.