We're attempting to create multi-batch jobs in the Bulk API 2.0. Our previous single-batch workflow was to do:
POST /services/data/v43.0/jobs/ingest
PUT /services/data/v43.0/jobs/ingest/<job-id>/batches
PATCH /services/data/v43.0/jobs/ingest/<job-id>
to set the status to UploadComplete.
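For context, a minimal sketch of that single-batch flow using only the Python standard library. The endpoint paths are the ones above; `INSTANCE_URL` and `TOKEN` are placeholders, and the requests are built but not sent here:

```python
# Sketch of the single-batch workflow: POST to create the job, one PUT to
# upload the CSV, PATCH to mark the upload complete. INSTANCE_URL and TOKEN
# are hypothetical placeholders.
import json
import urllib.request

INSTANCE_URL = "https://example.my.salesforce.com"  # placeholder instance
TOKEN = "<access-token>"                            # placeholder token
BASE = f"{INSTANCE_URL}/services/data/v43.0/jobs/ingest"
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

def create_job(sobject: str) -> urllib.request.Request:
    """POST /jobs/ingest -- create the job; the response carries the job id."""
    body = json.dumps({"object": sobject, "operation": "insert"}).encode()
    return urllib.request.Request(BASE, data=body, headers=HEADERS, method="POST")

def upload_csv(job_id: str, csv_bytes: bytes) -> urllib.request.Request:
    """PUT /jobs/ingest/<job-id>/batches -- upload the CSV data."""
    headers = dict(HEADERS, **{"Content-Type": "text/csv"})
    return urllib.request.Request(f"{BASE}/{job_id}/batches",
                                  data=csv_bytes, headers=headers, method="PUT")

def close_job(job_id: str) -> urllib.request.Request:
    """PATCH /jobs/ingest/<job-id> -- set the state to UploadComplete."""
    body = json.dumps({"state": "UploadComplete"}).encode()
    return urllib.request.Request(f"{BASE}/{job_id}",
                                  data=body, headers=HEADERS, method="PATCH")
```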
However, if I make multiple PUT requests with my separate batches, the PATCH request that closes the job receives a 400 with the following body:
[
  {
    "errorCode": "INVALIDJOB",
    "message": "Found multipe contents for job: <7500H00000H673g>, please 'Close' / 'Abort' / 'Delete' the current Job then create a new Job and make sure you only do 'PUT' once on a given Job."
  }
]
This indicates that making multiple PUT requests to /batches is not the correct way to upload multiple batches. But the examples in the Bulk API 2.0 docs don't use multiple batches, and the documentation for the PATCH endpoint neither mentions that it should only be called once nor says what you're supposed to do instead.
What is the correct approach here?
Best Answer
Bulk API 2.0 simplifies uploading large amounts of data by breaking the files into batches automatically. All you have to do is upload a CSV file with your record data and check back when the results are ready.
Files must be in UTF-8 format. Files are converted to base64 when received by Salesforce. This conversion can increase the data size by approximately 50%. To account for the base64 conversion increase, upload data that does not exceed 100 MB.
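In other words, under Bulk API 2.0 there is exactly one PUT of CSV content per job, and Salesforce splits that content into batches server-side. So rather than uploading multiple batches to one job, you create one job per CSV upload, keeping each upload under the 100 MB guideline. A rough sketch of splitting rows into appropriately sized uploads (the helper name and the byte-based splitting are assumptions, not part of the API):

```python
# Split CSV rows into chunks that each fit the ~100 MB upload guideline,
# repeating the header row at the top of every chunk. Each chunk would then
# get its own job (POST, a single PUT, PATCH to UploadComplete).
MAX_UPLOAD_BYTES = 100 * 1024 * 1024  # guideline quoted from the docs above

def split_csv(header: str, rows: list[str], limit: int = MAX_UPLOAD_BYTES) -> list[str]:
    header_size = len(header.encode("utf-8")) + 1  # +1 for the newline
    chunks, current, size = [], [header], header_size
    for row in rows:
        row_size = len(row.encode("utf-8")) + 1
        if size + row_size > limit and len(current) > 1:
            # Current chunk is full: emit it and start a fresh one.
            chunks.append("\n".join(current) + "\n")
            current, size = [header], header_size
        current.append(row)
        size += row_size
    if len(current) > 1:  # don't emit a header-only chunk
        chunks.append("\n".join(current) + "\n")
    return chunks
```

With the default limit almost any realistic dataset fits in one chunk; a small `limit` is useful for testing the splitting behavior.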