Bulkification means writing trigger code that processes an entire list of records efficiently, not one record at a time. It's essential because triggers fire for every DML operation, and a single trigger invocation can receive up to 200 records.
The anti-pattern (don't do):
```apex
trigger Bad on Account (after update) {
    for (Account a : Trigger.new) {
        Contact[] contacts = [SELECT Id FROM Contact WHERE AccountId = :a.Id]; // SOQL in loop!
        for (Contact c : contacts) {
            c.Phone = a.Phone;
        }
        update contacts; // DML in loop!
    }
}
```
This hits the governor limits — 100 SOQL queries and 150 DML statements per synchronous transaction — once the batch exceeds 100 or 150 accounts respectively, throwing a `System.LimitException`.
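While debugging, you can watch how close a transaction is getting to those limits with the `Limits` system class. A minimal anonymous-Apex sketch:

```apex
// Run as anonymous Apex (e.g. in Developer Console) after some DML
// to see how much of each governor limit the transaction consumed.
System.debug('SOQL queries used: ' + Limits.getQueries()
             + ' of ' + Limits.getLimitQueries());
System.debug('DML statements used: ' + Limits.getDmlStatements()
             + ' of ' + Limits.getLimitDmlStatements());
```

Sprinkling these debug lines around a suspect trigger quickly reveals whether query or DML counts grow with the number of records — the telltale sign of an unbulkified loop.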
The correct pattern:
```apex
trigger Good on Account (after update) {
    // Trigger.new is a List<Account>; the Map constructor keys it by Id,
    // so a separate Set<Id> of account Ids is unnecessary.
    Map<Id, Account> accMap = new Map<Id, Account>(Trigger.new);
    List<Contact> toUpdate = new List<Contact>();

    // One query fetches the contacts for every account in the batch.
    for (Contact c : [SELECT Id, AccountId FROM Contact
                      WHERE AccountId IN :accMap.keySet()]) {
        c.Phone = accMap.get(c.AccountId).Phone;
        toUpdate.add(c);
    }

    update toUpdate; // one DML statement
}
```
One SOQL query and one DML statement, regardless of batch size — it scales to the full 200-record chunk that Apex passes to a trigger.
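The standard way to prove a trigger is bulkified is a unit test that exercises it with a full 200-record batch. A sketch, assuming the `Good` trigger above is deployed (the class and method names here are illustrative):

```apex
@IsTest
private class GoodTriggerBulkTest {
    @IsTest
    static void updates200AccountsInOneTransaction() {
        // Build and insert a full trigger chunk of 200 accounts.
        List<Account> accs = new List<Account>();
        for (Integer i = 0; i < 200; i++) {
            accs.add(new Account(Name = 'Bulk Test ' + i));
        }
        insert accs;

        for (Account a : accs) {
            a.Phone = '555-0100';
        }

        Test.startTest();
        update accs; // fires the trigger once with all 200 records
        Test.stopTest();

        // An unbulkified trigger (SOQL/DML in the loop) would have thrown
        // a LimitException during the update, long before this assertion.
        System.assertEquals(200,
            [SELECT count() FROM Account WHERE Phone = '555-0100']);
    }
}
```

`Test.startTest()`/`Test.stopTest()` reset the governor-limit counters, so the test measures only what the trigger itself consumes.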
Bulkification matters because:
- Data Loader processes records in batches of 200.
- Bulk API can submit 10k+ rows.
- Mass updates in flow or reporting tools push lots of records through your trigger.
Non-negotiable rule: if you write SOQL or DML inside a loop, your trigger is broken at scale.
