
How to use Bulk API 2.0

Using Bulk API 2.0 follows a predictable five-step lifecycle: create the job, upload CSV, close the job, poll for completion, retrieve results. Most enterprise integration platforms wrap this lifecycle into a single configuration step. For custom integration code, follow the steps in order and handle the asynchronous job completion correctly with polling and exponential backoff.

By Dipojjal Chakrabarti · Founder & Editor, Salesforce Dictionary · Last updated May 16, 2026

  1. Confirm Bulk is the right tool

    Use Bulk API 2.0 for jobs over 2,000 records; below that, the REST or SOAP API is faster. Even up to around 50,000 records, REST may still fit if you need synchronous, interactive responses. Bulk shines on truly large data loads.

  2. Create the job

    POST /services/data/vXX.X/jobs/ingest with a JSON body specifying the object, the operation (insert, update, upsert, delete, hardDelete), externalIdFieldName (required for upsert), and lineEnding. The response includes the jobId and the contentUrl for the data upload.
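A minimal sketch of building that create-job request in Python. The helper name, the API version value, and the auth handling are assumptions; plug in whatever HTTP client and OAuth flow your integration already uses.

```python
import json

def build_create_job_request(instance_url, api_version, sobject, operation,
                             external_id_field=None):
    """Assemble the method, URL, headers, and JSON body for a Bulk API 2.0
    ingest job. Does not send anything; hand the pieces to your HTTP client."""
    body = {
        "object": sobject,
        "operation": operation,  # insert, update, upsert, delete, hardDelete
        "contentType": "CSV",
        "lineEnding": "LF",
    }
    if operation == "upsert":
        # Upsert requires the External ID field used to match existing records
        body["externalIdFieldName"] = external_id_field
    return {
        "method": "POST",
        "url": f"{instance_url}/services/data/v{api_version}/jobs/ingest",
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }
```

The response to this POST carries the jobId and contentUrl used in the next two steps.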

  3. Upload the CSV data

    PUT the CSV body to the contentUrl returned in the previous step, with Content-Type: text/csv. Maximum 150 MB per job. For master-detail loads, sort the CSV by master ID to reduce lock contention.
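The sort-by-master-ID advice can be sketched with the standard csv module. The parent column name here is an assumption; use whatever lookup field your load actually carries.

```python
import csv, io

def sort_csv_by_parent(csv_text, parent_column):
    """Sort CSV rows by the master/parent ID column so records that target
    the same parent land in the same processing chunk, reducing the chance
    of UNABLE_TO_LOCK_ROW errors on master-detail loads."""
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = sorted(reader, key=lambda r: r[parent_column])
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames,
                            lineterminator="\n")
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()
```

Run this on the CSV before the PUT so contention-prone rows arrive grouped.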

  4. Close the job to mark ready for processing

    PATCH /services/data/vXX.X/jobs/ingest/JOB_ID with state=UploadComplete. Salesforce starts processing asynchronously after this step.
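Sketched the same way as the create-job call, the close step is a one-field PATCH (helper name and URL assembly are assumptions):

```python
def build_close_job_request(instance_url, api_version, job_id):
    """Build the PATCH that marks the job UploadComplete, which tells
    Salesforce to start processing the uploaded CSV asynchronously."""
    return {
        "method": "PATCH",
        "url": f"{instance_url}/services/data/v{api_version}/jobs/ingest/{job_id}",
        "headers": {"Content-Type": "application/json"},
        "body": '{"state": "UploadComplete"}',
    }
```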

  5. Poll the job state until processing completes

    GET /services/data/vXX.X/jobs/ingest/JOB_ID. The state field transitions through Open, UploadComplete, InProgress, JobComplete, Failed, Aborted. Start polling at roughly 30-second intervals and back off exponentially for very large jobs.
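The polling cadence can be sketched as a delay generator plus a terminal-state check. The initial delay, growth factor, and cap are illustrative assumptions, not Salesforce-mandated values.

```python
def poll_delays(initial=30.0, factor=1.5, cap=300.0):
    """Yield polling delays in seconds: start around 30 s and back off
    exponentially, capped so very large jobs are still checked regularly."""
    delay = initial
    while True:
        yield delay
        delay = min(delay * factor, cap)

def is_terminal(state):
    """Terminal job states from the lifecycle described above."""
    return state in {"JobComplete", "Failed", "Aborted"}
```

A polling loop would sleep for each yielded delay, GET the job, and stop once is_terminal(state) returns True.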

  6. Retrieve successful and failed records

    GET /services/data/vXX.X/jobs/ingest/JOB_ID/successfulResults and /failedResults. Both return CSV bodies. Log the failed-record CSV for manual review and retry where applicable.
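A small sketch of splitting the failed-results CSV into errors and original rows. It assumes the Salesforce convention of prefixing result CSVs with sf__Id and sf__Error columns ahead of the submitted fields.

```python
import csv, io

def parse_failed_results(csv_text):
    """Return a list of (error_message, original_row) pairs from a
    failedResults CSV body, stripping the sf__ bookkeeping columns."""
    failures = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        error = row.pop("sf__Error", "")
        row.pop("sf__Id", None)
        failures.append((error, row))
    return failures
```

Logging the returned pairs gives you both the error and the record to fix or retry.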

  7. Build error-handling logic for the failed records

    Each failed record includes the original CSV fields plus error description columns. Parse the errors, classify by type (validation, lock, network), and decide whether to retry. Some errors (validation failures) are not retriable without data fixes.
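The retry decision can be sketched as a lookup on the error-code prefix. The code sets below are illustrative assumptions mirroring the retriable/not-retriable split above; extend them with the codes your loads actually hit.

```python
# Transient errors worth retrying vs. errors that need data fixes first
# (hypothetical starter sets, not an exhaustive Salesforce list)
RETRIABLE = {"UNABLE_TO_LOCK_ROW", "SERVER_UNAVAILABLE",
             "REQUEST_LIMIT_EXCEEDED"}

def classify_error(sf_error):
    """Map an sf__Error value to 'retry' or 'fix-data' by its code prefix.
    Validation-style failures default to 'fix-data' since retrying them
    without changing the record will fail again."""
    code = sf_error.split(":", 1)[0].strip()
    return "retry" if code in RETRIABLE else "fix-data"
```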

  8. Clean up completed jobs

    Salesforce retains job metadata for 7 days; older jobs are removed automatically. For audit purposes, store the job ID and result CSVs in your own logging system before the retention window closes.
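A minimal archiving sketch for beating the retention window. The directory layout and file names are assumptions; swap in S3, a database table, or whatever your logging system uses.

```python
import json, pathlib

def archive_job(job_id, success_csv, failed_csv, archive_dir="bulk_archive"):
    """Persist the job ID and both result CSVs locally before Salesforce's
    7-day retention window closes. Returns the directory created."""
    root = pathlib.Path(archive_dir) / job_id
    root.mkdir(parents=True, exist_ok=True)
    (root / "successfulResults.csv").write_text(success_csv)
    (root / "failedResults.csv").write_text(failed_csv)
    (root / "meta.json").write_text(json.dumps({"jobId": job_id}))
    return root
```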

Key options
Operation (insert, update, upsert, delete, hardDelete, query)

The DML operation the job performs. Upsert needs an externalIdFieldName; hardDelete bypasses the Recycle Bin.

Concurrency Mode

Parallel (default, fast) or Serial (for jobs with inter-record dependencies). Serial trades speed for ordering guarantees.

External ID Field (for upsert)

The field used to match incoming records against existing records. Must be marked External ID and Unique on the target object.

Gotchas
  • Lock contention on master-detail data produces UNABLE_TO_LOCK_ROW errors. Sort the CSV by master ID before upload to put records targeting the same parent in the same chunk.
  • Job retention is 7 days. Failed and successful result CSVs are also retained for 7 days. Store the results in your own logging system if you need them longer.
  • Parallel processing can produce ordering surprises. Records that depend on earlier records being processed in the same job should use Serial concurrency mode instead.
  • hardDelete bypasses the Recycle Bin and is irreversible. Triple-check the input CSV before running a hardDelete job because there is no way to recover deleted records.
  • Bulk API daily limits apply alongside REST and SOAP limits. Monitor combined API consumption to avoid blowing the daily cap during nightly ETL windows.

See the full Bulk API 2.0 entry

Bulk API 2.0 includes the definition, worked example, deep dive, related terms, and a quiz.