A few options, ordered by typical fit:
- Data Loader (or Dataloader.io) — the workhorse for 50k records. Export the records you need, edit the CSV, run an Update operation. Use the Bulk API for large jobs (recommended above ~5,000 rows). Failures land in a per-batch error CSV you can iterate on.
- Data Import Wizard — caps out at 50,000 rows, supports only a small set of standard objects (plus custom objects), and is more guided. Fine for a one-off when the user isn't comfortable with Data Loader.
- Mass Update Records / Mass Transfer — declarative, but limited to specific actions (transfer ownership, mass-update addresses). Useful for the narrow cases they support.
- Flow — a record-triggered flow for ongoing updates, or a schedule-triggered flow that processes records in batches for backfills. Good for keeping data correct going forward; not ideal for a one-time 50k operation.
- Anonymous Apex via Workbench or the Developer Console — query with SOQL, update with DML. Fine for small fixes, painful at 50k: a single transaction is capped at 10,000 DML rows, so you'd be hand-chunking the job anyway.
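Whichever tool runs the update, the prep work is the same: export, keep only `Id` plus the columns you're changing, and respect the Bulk API's 10,000-row batch ceiling. A minimal sketch of that prep in Python (the `Phone` field, record Ids, and the transform are hypothetical — substitute your own export):

```python
import csv
import io

BATCH_SIZE = 10_000  # Bulk API batch ceiling; Data Loader's batch size is configurable


def build_update_rows(exported_csv, field, transform):
    """From an export, keep only Id plus the one field being changed.

    Sending fewer columns means fewer validation rules and
    field-level-security checks you can accidentally trip.
    """
    return [
        {"Id": row["Id"], field: transform(row[field])}
        for row in csv.DictReader(io.StringIO(exported_csv))
    ]


def batches(rows, size=BATCH_SIZE):
    """Split update rows into Bulk API-sized batches."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]


# Example: normalize phone formatting on two exported rows (fake Ids)
export = "Id,Phone\n001A,555 1234\n001B,555 9876\n"
rows = build_update_rows(export, "Phone", lambda v: v.replace(" ", "-"))
# rows[0] -> {"Id": "001A", "Phone": "555-1234"}
```

The same shape works whether you feed the batches to Data Loader as CSVs or to the Bulk API directly; either way, keep the original export untouched so you can diff against it (or roll back) if the error files come back ugly.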
Before clicking Run on any of those, four checks are non-negotiable:
- Sandbox first — never run a 50k update straight against production. Refresh a sandbox or use a Partial Copy and dry-run there.
- Validation rules and required fields — they fire on Data Loader updates. A rule that disallows blank values will fail every row that has a blank in your CSV. Know Data Loader's "Insert null values" setting, too: with it off, blank cells are ignored on update; with it on, they overwrite the field with null.
- Triggers, flows, and process automation — they all fire on update. A 50k update can chain into governor-limit failures if a trigger isn't bulkified. A common mitigation is a "bypass automation" flag — a hierarchy custom setting or a checkbox field your triggers check — that you set before the update and unset after.
- Field history and audit — you'll generate up to 50k history rows per tracked field you change, and those rows count against data storage. Confirm you have the headroom, and warn anyone who relies on field history or LastModifiedDate/LastModifiedBy that a bulk change is coming — every touched record will show your update as its last modification.
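Two of those checks are cheap to script before anything touches the org: scanning the update file for blanks that a validation rule would reject, and estimating the worst-case field-history volume. A sketch (the field names and Ids are hypothetical; the history figure assumes the worst case of every tracked field changing on every row):

```python
import csv
import io


def preflight(update_csv, no_blank_fields, tracked_fields):
    """Dry-run checks against the update CSV itself.

    Returns the Ids of rows that would trip a 'no blanks' validation
    rule, plus a worst-case count of field-history rows the update
    could generate (one history row per changed tracked field per record).
    """
    rows = list(csv.DictReader(io.StringIO(update_csv)))
    blank_rows = [
        r["Id"] for r in rows
        if any(not r.get(f, "").strip() for f in no_blank_fields)
    ]
    history_estimate = len(rows) * len(tracked_fields)
    return blank_rows, history_estimate


# Example: two rows, each with one blank cell that a rule would reject
update = "Id,Phone,Email\n001A,555-1234,\n001B,,a@example.com\n"
blanks, history = preflight(update, ["Phone", "Email"], ["Phone", "Email"])
```

Failing rows caught here cost nothing; the same rows caught by a validation rule mid-run leave you reconciling a half-applied update from the error CSVs.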
