Asynchronous processing is crucial for building scalable Salesforce applications that can handle large data volumes without hitting governor limits.
Among the various asynchronous processing options in Salesforce, Queueable Apex stands out as one of the most flexible and powerful tools available to developers.
Queueable Apex is an asynchronous processing framework that allows you to execute long-running operations in the background. It combines the benefits of both @future methods and Batch Apex, providing:
- Monitoring: System.enqueueJob returns a job ID that you can use to track progress through the AsyncApexJob object.
- Non-primitive state: unlike @future methods, which accept only primitive parameters, Queueable classes can hold sObjects, collections, and instances of custom Apex types as member state.
- Chaining: one queueable job can enqueue another, executing multiple jobs in sequence, perfect for complex multi-step processes.
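To make the contrast with @future concrete, here is a minimal sketch (both classes are illustrative, not from a specific org): an @future method can only receive primitives such as record IDs, while a queueable instance carries full records as state:

// @future: parameters are limited to primitives and collections of primitives
public class FutureExample {
    @future
    public static void processAccounts(List<Id> accountIds) {
        // Records must be re-queried inside the method
        List<Account> accounts = [SELECT Id, Name FROM Account WHERE Id IN :accountIds];
        // ...
    }
}

// Queueable: instance members can hold sObjects, collections, and custom types
public class QueueableExample implements Queueable {
    private List<Account> accounts;
    public QueueableExample(List<Account> accounts) {
        this.accounts = accounts;
    }
    public void execute(QueueableContext context) {
        // Work directly with the records captured at enqueue time
    }
}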
Let’s start with a simple example that demonstrates the core concepts:
public class AccountUpdateQueueable implements Queueable {
    private List<Account> accountsToUpdate;
    private String updateReason;

    public AccountUpdateQueueable(List<Account> accounts, String reason) {
        this.accountsToUpdate = accounts;
        this.updateReason = reason;
    }

    public void execute(QueueableContext context) {
        try {
            List<Account> processedAccounts = new List<Account>();
            for (Account acc : accountsToUpdate) {
                // LastModifiedDate is a system audit field and cannot be written,
                // so record the update details in a writable field instead
                acc.Description = 'Updated: ' + updateReason + ' on ' + DateTime.now();
                processedAccounts.add(acc);
            }
            // Perform DML operation
            if (!processedAccounts.isEmpty()) {
                update processedAccounts;
                System.debug('Successfully updated ' + processedAccounts.size() + ' accounts');
            }
        } catch (Exception e) {
            System.debug('Error in AccountUpdateQueueable: ' + e.getMessage());
            // Log error or send notification
            handleError(e, context.getJobId());
        }
    }

    private void handleError(Exception e, Id jobId) {
        // Create error log record or send email notification
        System.debug('Job ID: ' + jobId + ', Error: ' + e.getMessage());
    }
}
// Create and enqueue the job
List<Account> accounts = [SELECT Id, Name, Description FROM Account LIMIT 100];
AccountUpdateQueueable job = new AccountUpdateQueueable(accounts, 'Bulk Update Process');
// Enqueue and get job ID for monitoring
Id jobId = System.enqueueJob(job);
System.debug('Job enqueued with ID: ' + jobId);
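Because enqueueJob returns an ID, the job can be monitored through the AsyncApexJob object:

AsyncApexJob jobInfo = [
    SELECT Status, NumberOfErrors, ExtendedStatus
    FROM AsyncApexJob
    WHERE Id = :jobId
];
System.debug('Status: ' + jobInfo.Status + ', Errors: ' + jobInfo.NumberOfErrors);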
Here’s a more complex example that demonstrates processing large datasets:
public class DataMigrationQueueable implements Queueable {
    private static final Integer BATCH_SIZE = 200;
    private List<Legacy_Record__c> recordsToMigrate;
    private Integer currentBatch;
    private Integer totalBatches;

    public DataMigrationQueueable(List<Legacy_Record__c> records) {
        this(records, 1, (Integer) Math.ceil((Decimal) records.size() / BATCH_SIZE));
    }

    public DataMigrationQueueable(List<Legacy_Record__c> records, Integer batch, Integer total) {
        this.recordsToMigrate = records;
        this.currentBatch = batch;
        this.totalBatches = total;
    }

    public void execute(QueueableContext context) {
        List<Modern_Record__c> newRecords = new List<Modern_Record__c>();
        List<Legacy_Record__c> recordsToUpdate = new List<Legacy_Record__c>();
        try {
            // Process at most BATCH_SIZE records in this transaction;
            // anything beyond that is picked up by the chained job's query
            Integer batchEnd = Math.min(BATCH_SIZE, recordsToMigrate.size());
            for (Integer i = 0; i < batchEnd; i++) {
                Legacy_Record__c legacy = recordsToMigrate[i];
                // Transform legacy record to new format
                newRecords.add(transformRecord(legacy));
                // Mark legacy record as migrated
                legacy.Migration_Status__c = 'Migrated';
                legacy.Migration_Date__c = DateTime.now();
                recordsToUpdate.add(legacy);
            }
            // Insert new records
            if (!newRecords.isEmpty()) {
                insert newRecords;
            }
            // Update legacy records
            if (!recordsToUpdate.isEmpty()) {
                update recordsToUpdate;
            }
            System.debug('Batch ' + currentBatch + ' of ' + totalBatches + ' completed successfully');
            // Chain next batch if there are more records to process
            chainNextBatch();
        } catch (Exception e) {
            System.debug('Error in DataMigrationQueueable: ' + e.getMessage());
            handleMigrationError(e, context.getJobId());
        }
    }

    private Modern_Record__c transformRecord(Legacy_Record__c legacy) {
        return new Modern_Record__c(
            Name = legacy.Name,
            Legacy_ID__c = legacy.Id,
            Transformed_Data__c = legacy.Old_Data__c,
            Migration_Source__c = 'Queueable Migration'
        );
    }

    private void chainNextBatch() {
        // Chained enqueues are not allowed in Apex tests, hence the guard
        if (currentBatch < totalBatches && !Test.isRunningTest()) {
            // Query the next batch of unmigrated records
            List<Legacy_Record__c> nextBatch = [
                SELECT Id, Name, Old_Data__c
                FROM Legacy_Record__c
                WHERE Migration_Status__c != 'Migrated'
                LIMIT :BATCH_SIZE
            ];
            if (!nextBatch.isEmpty()) {
                System.enqueueJob(new DataMigrationQueueable(nextBatch, currentBatch + 1, totalBatches));
            }
        }
    }

    private void handleMigrationError(Exception e, Id jobId) {
        // Send email notification to administrators
        Messaging.SingleEmailMessage email = new Messaging.SingleEmailMessage();
        email.setToAddresses(new String[]{'[email protected]'});
        email.setSubject('Data Migration Error - Job ID: ' + jobId);
        email.setPlainTextBody('Error occurred during data migration: ' + e.getMessage());
        Messaging.sendEmail(new Messaging.SingleEmailMessage[]{email});
    }
}
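Kicking off the migration from Anonymous Apex might look like this, passing the full set so the total batch count can be computed up front (for very large volumes, consider capping the initial query or using Batch Apex instead):

List<Legacy_Record__c> toMigrate = [
    SELECT Id, Name, Old_Data__c
    FROM Legacy_Record__c
    WHERE Migration_Status__c != 'Migrated'
];
if (!toMigrate.isEmpty()) {
    System.enqueueJob(new DataMigrationQueueable(toMigrate));
}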
Queueable chaining allows you to execute multiple jobs in sequence, which is perfect for complex workflows that require ordered processing.
public class ChainedProcessingQueueable implements Queueable {
    private String processStep;
    private Map<String, Object> processData;

    public ChainedProcessingQueueable(String step, Map<String, Object> data) {
        this.processStep = step;
        this.processData = data;
    }

    public void execute(QueueableContext context) {
        try {
            switch on processStep {
                when 'STEP_1' {
                    executeStep1();
                    chainNextStep('STEP_2');
                }
                when 'STEP_2' {
                    executeStep2();
                    chainNextStep('STEP_3');
                }
                when 'STEP_3' {
                    executeStep3();
                    // Final step - no chaining needed
                    System.debug('All processing steps completed successfully');
                }
            }
        } catch (Exception e) {
            System.debug('Error in step ' + processStep + ': ' + e.getMessage());
            handleChainError(e, context.getJobId());
        }
    }

    private void executeStep1() {
        // Step 1: Data validation and preparation
        System.debug('Executing Step 1: Data Validation');
        List<Account> accounts = (List<Account>) processData.get('accounts');
        for (Account acc : accounts) {
            if (String.isBlank(acc.Name)) {
                acc.Name = 'Default Account Name';
            }
        }
        update accounts;
        processData.put('step1_completed', true);
    }

    private void executeStep2() {
        // Step 2: Related record processing
        System.debug('Executing Step 2: Related Record Processing');
        List<Account> accounts = (List<Account>) processData.get('accounts');
        List<Contact> contactsToCreate = new List<Contact>();
        for (Account acc : accounts) {
            contactsToCreate.add(new Contact(
                LastName = acc.Name + ' Contact',
                AccountId = acc.Id,
                Email = 'contact@' + acc.Name.toLowerCase().replaceAll(' ', '') + '.com'
            ));
        }
        insert contactsToCreate;
        processData.put('contacts_created', contactsToCreate.size());
    }

    private void executeStep3() {
        // Step 3: Final cleanup and notifications
        System.debug('Executing Step 3: Final Processing');
        Integer contactsCreated = (Integer) processData.get('contacts_created');
        // Send completion notification
        sendCompletionNotification(contactsCreated);
    }

    private void chainNextStep(String nextStep) {
        // Chained enqueues are not allowed in Apex tests, hence the guard
        if (!Test.isRunningTest()) {
            System.enqueueJob(new ChainedProcessingQueueable(nextStep, processData));
        }
    }

    private void handleChainError(Exception e, Id jobId) {
        System.debug('Chain broken at step: ' + processStep + ', Job ID: ' + jobId);
        // Implement error recovery or notification logic
    }

    private void sendCompletionNotification(Integer recordCount) {
        System.debug('Processing completed. ' + recordCount + ' contacts created.');
    }
}
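Starting the chain from the first step might look like this:

List<Account> accounts = [SELECT Id, Name FROM Account LIMIT 50];
Map<String, Object> processData = new Map<String, Object>{'accounts' => accounts};
System.enqueueJob(new ChainedProcessingQueueable('STEP_1', processData));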
public class StatefulChainQueueable implements Queueable {
    public class ProcessState {
        public Integer currentStep = 1;
        public Integer totalSteps = 5;
        public Map<String, Object> data = new Map<String, Object>();
        public List<String> errors = new List<String>();
        public DateTime startTime = DateTime.now();
    }

    private ProcessState state;

    public StatefulChainQueueable(ProcessState processState) {
        this.state = processState;
    }

    public void execute(QueueableContext context) {
        try {
            System.debug('Executing step ' + state.currentStep + ' of ' + state.totalSteps);
            switch on state.currentStep {
                when 1 { processDataExtraction(); }
                when 2 { processDataTransformation(); }
                when 3 { processDataValidation(); }
                when 4 { processDataLoad(); }
                when 5 { processCleanupAndNotification(); }
            }
            // Continue to the next step if this isn't the last one
            // (chained enqueues are not allowed in Apex tests, hence the guard)
            if (state.currentStep < state.totalSteps) {
                state.currentStep++;
                if (!Test.isRunningTest()) {
                    System.enqueueJob(new StatefulChainQueueable(state));
                }
            } else {
                System.debug('All steps completed in: ' +
                    (DateTime.now().getTime() - state.startTime.getTime()) + 'ms');
            }
        } catch (Exception e) {
            state.errors.add('Step ' + state.currentStep + ': ' + e.getMessage());
            handleStepError(e, context.getJobId());
        }
    }

    private void processDataExtraction() {
        // Extract data from an external source or a complex query
        List<sObject> extractedData = Database.query(
            'SELECT Id, Name FROM Account WHERE CreatedDate = TODAY LIMIT 1000'
        );
        state.data.put('extracted_records', extractedData);
    }

    private void processDataTransformation() {
        List<Account> accounts = (List<Account>) state.data.get('extracted_records');
        for (Account acc : accounts) {
            acc.Description = 'Processed on: ' + DateTime.now();
        }
        state.data.put('transformed_records', accounts);
    }

    private void processDataValidation() {
        List<Account> accounts = (List<Account>) state.data.get('transformed_records');
        List<Account> validAccounts = new List<Account>();
        for (Account acc : accounts) {
            if (String.isNotBlank(acc.Name)) {
                validAccounts.add(acc);
            }
        }
        state.data.put('valid_records', validAccounts);
        state.data.put('validation_count', validAccounts.size());
    }

    private void processDataLoad() {
        List<Account> accounts = (List<Account>) state.data.get('valid_records');
        update accounts;
        state.data.put('updated_count', accounts.size());
    }

    private void processCleanupAndNotification() {
        Integer updatedCount = (Integer) state.data.get('updated_count');
        System.debug('Processing completed. Updated ' + updatedCount + ' records.');
        // Send notification or create log record
        createProcessingLog();
    }

    private void createProcessingLog() {
        // Create a custom log record to track the processing
        Processing_Log__c log = new Processing_Log__c(
            Process_Name__c = 'StatefulChainQueueable',
            Start_Time__c = state.startTime,
            End_Time__c = DateTime.now(),
            Records_Processed__c = (Integer) state.data.get('updated_count'),
            Status__c = state.errors.isEmpty() ? 'Success' : 'Completed with Errors',
            Error_Details__c = String.join(state.errors, '\n')
        );
        insert log;
    }

    private void handleStepError(Exception e, Id jobId) {
        System.debug('Error in step ' + state.currentStep + ': ' + e.getMessage());
        // Decide whether to continue or stop the chain based on error severity
    }
}
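Starting the pipeline is just a matter of enqueuing step 1 with a fresh state object (ProcessState defaults currentStep to 1):

StatefulChainQueueable.ProcessState state = new StatefulChainQueueable.ProcessState();
System.enqueueJob(new StatefulChainQueueable(state));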
public class GovernorLimitAwareQueueable implements Queueable {
    private List<sObject> recordsToProcess;
    private static final Integer MAX_DML_OPERATIONS = 140; // below the hard limit of 150
    private static final Integer MAX_SOQL_QUERIES = 100;   // well below the async limit of 200

    public GovernorLimitAwareQueueable(List<sObject> records) {
        this.recordsToProcess = records;
    }

    public void execute(QueueableContext context) {
        // Monitor consumed limits throughout execution
        for (Integer i = 0; i < recordsToProcess.size(); i++) {
            if (Limits.getDmlStatements() > MAX_DML_OPERATIONS ||
                Limits.getQueries() > MAX_SOQL_QUERIES) {
                // Chain the unprocessed remainder to a new job, which starts
                // with a fresh set of governor limits
                chainContinuation(i);
                return;
            }
            // Process recordsToProcess[i] ...
        }
    }

    private void chainContinuation(Integer startIndex) {
        // Create a new job holding only the records that are still unprocessed
        List<sObject> remaining = new List<sObject>();
        for (Integer i = startIndex; i < recordsToProcess.size(); i++) {
            remaining.add(recordsToProcess[i]);
        }
        System.enqueueJob(new GovernorLimitAwareQueueable(remaining));
    }
}
public class ResilientQueueable implements Queueable {
    private Integer retryCount = 0;
    private static final Integer MAX_RETRIES = 3;

    public void execute(QueueableContext context) {
        try {
            // Risky operation (e.g. a callout or volatile DML)
            performRiskyOperation();
        } catch (Exception e) {
            handleRetryLogic(e);
        }
    }

    private void handleRetryLogic(Exception e) {
        if (retryCount < MAX_RETRIES) {
            retryCount++;
            System.debug('Retrying operation, attempt: ' + retryCount);
            ResilientQueueable retryJob = new ResilientQueueable();
            retryJob.retryCount = this.retryCount;
            // A simple backoff via the delay overload of enqueueJob
            // (API v57.0+; the delay is in minutes, capped at 10 by the platform)
            System.enqueueJob(retryJob, Math.min(10, retryCount * 2));
        } else {
            System.debug('Max retries exceeded, logging failure');
            logFinalFailure(e);
        }
    }

    private void performRiskyOperation() {
        // Implementation here
    }

    private void logFinalFailure(Exception e) {
        // Log to custom object or send notification
    }
}
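The test class below references RobustProcessingQueueable, which pairs a queueable with a transaction finalizer. That class isn't shown in this excerpt, so here is a minimal sketch consistent with the tests, assuming Success_Log__c and Error_Log__c custom objects that each have a Process_ID__c text field:

public class RobustProcessingQueueable implements Queueable, Finalizer {
    private List<Account> accounts;
    private String processId;

    public RobustProcessingQueueable(List<Account> accounts, String processId) {
        this.accounts = accounts;
        this.processId = processId;
    }

    public void execute(QueueableContext context) {
        // Attach this instance as the finalizer so it runs even if execute() fails
        System.attachFinalizer(this);
        if (accounts.size() > 500) {
            throw new IllegalArgumentException('Batch too large: ' + accounts.size());
        }
        // ... processing logic ...
    }

    public void execute(FinalizerContext context) {
        // The finalizer runs in its own transaction with fresh governor limits,
        // so this DML survives even when the parent job threw an exception
        if (context.getResult() == ParentJobResult.SUCCESS) {
            insert new Success_Log__c(Process_ID__c = processId);
        } else {
            insert new Error_Log__c(Process_ID__c = processId);
        }
    }
}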
@IsTest
public class QueueableTest {
    @TestSetup
    static void setupTestData() {
        List<Account> testAccounts = new List<Account>();
        for (Integer i = 0; i < 100; i++) {
            testAccounts.add(new Account(Name = 'Test Account ' + i));
        }
        insert testAccounts;
    }

    @IsTest
    static void testAccountUpdateQueueable() {
        List<Account> accounts = [SELECT Id, Name FROM Account];
        Test.startTest();
        AccountUpdateQueueable job = new AccountUpdateQueueable(accounts, 'Test Update');
        Id jobId = System.enqueueJob(job);
        Test.stopTest();
        // Verify results
        List<Account> updatedAccounts = [SELECT Id, Description FROM Account];
        for (Account acc : updatedAccounts) {
            System.assert(acc.Description.contains('Test Update'), 'Account not updated correctly');
        }
        // Verify job completed
        AsyncApexJob jobRecord = [SELECT Status, NumberOfErrors FROM AsyncApexJob WHERE Id = :jobId];
        System.assertEquals('Completed', jobRecord.Status, 'Job should complete successfully');
        System.assertEquals(0, jobRecord.NumberOfErrors, 'Job should have no errors');
    }

    @IsTest
    static void testQueueableChaining() {
        List<Account> accounts = [SELECT Id, Name FROM Account LIMIT 10];
        Map<String, Object> processData = new Map<String, Object>{'accounts' => accounts};
        Test.startTest();
        // Chained enqueues throw an exception in test context, so exercise the
        // contact-creating step directly rather than relying on the chain
        System.enqueueJob(new ChainedProcessingQueueable('STEP_2', processData));
        Test.stopTest();
        List<Contact> createdContacts = [SELECT Id FROM Contact WHERE AccountId IN :accounts];
        System.assert(!createdContacts.isEmpty(), 'Contacts should be created by STEP_2');
    }

    @IsTest
    static void testFinalizerExecution() {
        List<Account> accounts = [SELECT Id, Name FROM Account LIMIT 5];
        Test.startTest();
        RobustProcessingQueueable job = new RobustProcessingQueueable(accounts, 'TEST_PROCESS');
        System.enqueueJob(job);
        Test.stopTest();
        // Verify finalizer executed (success case)
        List<Success_Log__c> successLogs = [SELECT Id, Process_ID__c FROM Success_Log__c WHERE Process_ID__c = 'TEST_PROCESS'];
        System.assert(!successLogs.isEmpty(), 'Success log should be created by finalizer');
    }

    @IsTest
    static void testFinalizerWithError() {
        List<Account> accounts = new List<Account>();
        // Create more than 500 accounts to trigger the error condition
        for (Integer i = 0; i < 501; i++) {
            accounts.add(new Account(Name = 'Error Test Account ' + i));
        }
        insert accounts;
        Test.startTest();
        RobustProcessingQueueable job = new RobustProcessingQueueable(accounts, 'ERROR_TEST');
        System.enqueueJob(job);
        Test.stopTest();
        // Verify error handling in finalizer
        List<Error_Log__c> errorLogs = [SELECT Id, Process_ID__c FROM Error_Log__c WHERE Process_ID__c = 'ERROR_TEST'];
        System.assert(!errorLogs.isEmpty(), 'Error log should be created by finalizer');
    }
}
public class OptimizedBatchQueueable implements Queueable {
    // Note: the full record list is serialized with every chained job,
    // so this pattern suits lists that comfortably fit in the heap
    private List<sObject> recordsToProcess;
    private Integer batchSize;
    private Integer currentIndex = 0;

    public OptimizedBatchQueueable(List<sObject> records, Integer batchSize) {
        this.recordsToProcess = records;
        this.batchSize = batchSize;
    }

    public void execute(QueueableContext context) {
        List<sObject> currentBatch = new List<sObject>();
        Integer endIndex = Math.min(currentIndex + batchSize, recordsToProcess.size());
        // Collect the slice for this transaction
        for (Integer i = currentIndex; i < endIndex; i++) {
            currentBatch.add(recordsToProcess[i]);
        }
        // Perform DML on current batch
        if (!currentBatch.isEmpty()) {
            processBatch(currentBatch);
        }
        // Chain next batch if more records exist
        // (chained enqueues are not allowed in Apex tests, hence the guard)
        if (endIndex < recordsToProcess.size() && !Test.isRunningTest()) {
            OptimizedBatchQueueable nextBatch = new OptimizedBatchQueueable(recordsToProcess, batchSize);
            nextBatch.currentIndex = endIndex;
            System.enqueueJob(nextBatch);
        }
    }

    private void processBatch(List<sObject> batch) {
        // Optimized processing logic
        update batch;
    }
}
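Enqueuing the first slice is straightforward:

List<Account> records = [SELECT Id, Name FROM Account LIMIT 2000];
System.enqueueJob(new OptimizedBatchQueueable(records, 200));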
public class MemoryOptimizedQueueable implements Queueable {
    private String queryString; // base query; must not contain WHERE or ORDER BY
    private Id lastProcessedId;
    private static final Integer BATCH_SIZE = 1000;

    public MemoryOptimizedQueueable(String query, Id startAfterId) {
        this.queryString = query;
        this.lastProcessedId = startAfterId;
    }

    public void execute(QueueableContext context) {
        // Paginate by Id rather than OFFSET: SOQL OFFSET is capped at 2,000 rows,
        // while keyset pagination works for any data volume
        Id afterId = lastProcessedId;
        String batchQuery = queryString
            + (afterId == null ? '' : ' WHERE Id > :afterId')
            + ' ORDER BY Id LIMIT ' + BATCH_SIZE;
        List<sObject> records = Database.query(batchQuery);
        if (!records.isEmpty()) {
            // Process records
            processRecords(records);
            Id newLastId = records[records.size() - 1].Id;
            // Clear the list to free heap before chaining
            records.clear();
            // Chain the next page (chaining is not allowed in Apex tests)
            if (!Test.isRunningTest()) {
                System.enqueueJob(new MemoryOptimizedQueueable(queryString, newLastId));
            }
        }
    }

    private void processRecords(List<sObject> records) {
        // Process each record individually to minimize peak memory usage
        for (sObject record : records) {
            processIndividualRecord(record);
        }
    }

    private void processIndividualRecord(sObject record) {
        // Individual processing logic
    }
}
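Usage: pass a base query with no WHERE or ORDER BY clause of its own, and start with a null cursor:

System.enqueueJob(new MemoryOptimizedQueueable('SELECT Id, Name FROM Account', null));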
public class MonitoredQueueable implements Queueable, Finalizer {
    private String processName;
    private DateTime startTime;
    private Map<String, Object> metrics;

    public MonitoredQueueable(String name) {
        this.processName = name;
        this.startTime = DateTime.now();
        this.metrics = new Map<String, Object>();
    }

    public void execute(QueueableContext context) {
        // A finalizer only runs if it is explicitly attached
        System.attachFinalizer(this);
        // Record job start
        recordJobStart(context.getJobId());
        try {
            // Your processing logic here
            performProcessing();
            // Record success metrics
            metrics.put('status', 'SUCCESS');
            metrics.put('records_processed', 100); // Example
        } catch (Exception e) {
            metrics.put('status', 'ERROR');
            metrics.put('error_message', e.getMessage());
            throw e;
        }
    }

    public void execute(FinalizerContext context) {
        // Record job completion, whether it succeeded or failed
        recordJobCompletion(context.getAsyncApexJobId(), context.getException());
    }

    private void recordJobStart(Id jobId) {
        Job_Monitoring__c monitor = new Job_Monitoring__c(
            Job_ID__c = String.valueOf(jobId),
            Process_Name__c = processName,
            Start_Time__c = startTime,
            Status__c = 'RUNNING'
        );
        insert monitor;
    }

    private void recordJobCompletion(Id jobId, Exception error) {
        // If execute() threw, its transaction (including the start record) was
        // rolled back, so fall back to creating a fresh record here
        List<Job_Monitoring__c> monitors = [SELECT Id FROM Job_Monitoring__c WHERE Job_ID__c = :String.valueOf(jobId)];
        Job_Monitoring__c monitor = monitors.isEmpty()
            ? new Job_Monitoring__c(Job_ID__c = String.valueOf(jobId), Process_Name__c = processName, Start_Time__c = startTime)
            : monitors[0];
        monitor.End_Time__c = DateTime.now();
        monitor.Duration_Ms__c = DateTime.now().getTime() - startTime.getTime();
        monitor.Status__c = error == null ? 'COMPLETED' : 'FAILED';
        monitor.Error_Details__c = error?.getMessage();
        monitor.Metrics__c = JSON.serialize(metrics);
        upsert monitor;
    }

    private void performProcessing() {
        // Your processing logic
    }
}
public class QueueableLogger {
    private static List<String> logEntries = new List<String>();
    private static String currentJobId;

    public static void setJobId(Id jobId) {
        currentJobId = String.valueOf(jobId);
    }

    public static void log(String level, String message) {
        String timestamp = DateTime.now().format('yyyy-MM-dd HH:mm:ss.SSS');
        String logEntry = timestamp + ' [' + level + '] Job: ' + currentJobId + ' - ' + message;
        logEntries.add(logEntry);
        System.debug(logEntry);
        // Flush logs periodically to keep the buffer (and heap usage) small
        if (logEntries.size() >= 100) {
            flushLogs();
        }
    }

    public static void info(String message) {
        log('INFO', message);
    }

    public static void warn(String message) {
        log('WARN', message);
    }

    public static void error(String message) {
        log('ERROR', message);
    }

    // Hand buffered entries to a caller (e.g. for a finalizer to persist later)
    public static List<String> getEntries() {
        return logEntries.clone();
    }

    // Save entries to a custom log object in the current transaction
    public static void persist(String jobId, List<String> entries) {
        if (entries != null && !entries.isEmpty()) {
            insert new Debug_Log__c(
                Job_ID__c = jobId,
                Log_Entries__c = String.join(entries, '\n'),
                Timestamp__c = DateTime.now()
            );
        }
    }

    public static void flushLogs() {
        persist(currentJobId, logEntries);
        logEntries.clear();
    }
}
// Usage in a Queueable class:
public class LoggedQueueable implements Queueable, Finalizer {
    // Static logger state does not survive into the finalizer's separate
    // transaction, so buffered entries are carried in instance state instead
    private List<String> pendingLogs = new List<String>();

    public void execute(QueueableContext context) {
        // The finalizer only runs if it is explicitly attached
        System.attachFinalizer(this);
        QueueableLogger.setJobId(context.getJobId());
        QueueableLogger.info('Starting queueable execution');
        try {
            // Your processing logic
            QueueableLogger.info('Processing completed successfully');
        } catch (Exception e) {
            QueueableLogger.error('Processing failed: ' + e.getMessage());
            throw e;
        } finally {
            // Capture the buffer before the transaction ends: a failed execute()
            // rolls back its own DML, but instance state still reaches the finalizer
            pendingLogs = QueueableLogger.getEntries();
        }
    }

    public void execute(FinalizerContext context) {
        // Runs in a fresh transaction even if execute() failed, so this DML sticks
        QueueableLogger.persist(String.valueOf(context.getAsyncApexJobId()), pendingLogs);
    }
}
// WRONG - This will cause a Mixed DML error
public class ProblematicQueueable implements Queueable {
    public void execute(QueueableContext context) {
        User newUser = new User(/*user fields*/);
        insert newUser; // Setup object
        Account newAccount = new Account(Name = 'Test');
        insert newAccount; // Non-setup object - MIXED DML ERROR!
    }
}

// CORRECT - Separate setup and non-setup operations
public class CorrectQueueable implements Queueable {
    private Boolean isSetupOperation;

    public CorrectQueueable(Boolean setupOp) {
        this.isSetupOperation = setupOp;
    }

    public void execute(QueueableContext context) {
        if (isSetupOperation) {
            // Handle setup objects (User, Profile, etc.)
            User newUser = new User(/*user fields*/);
            insert newUser;
            // Chain the non-setup operation into its own transaction
            System.enqueueJob(new CorrectQueueable(false));
        } else {
            // Handle non-setup objects
            Account newAccount = new Account(Name = 'Test');
            insert newAccount;
        }
    }
}
public class SafeChainQueueable implements Queueable {
    private Integer chainDepth = 0;
    // Note: Developer Edition and trial orgs cap queueable chain depth at 5;
    // production orgs allow unlimited depth, so enforce your own ceiling
    private static final Integer MAX_CHAIN_DEPTH = 100;

    public SafeChainQueueable(Integer depth) {
        this.chainDepth = depth;
    }

    public void execute(QueueableContext context) {
        // Prevent infinite chaining
        if (chainDepth >= MAX_CHAIN_DEPTH) {
            System.debug('Maximum chain depth reached, terminating chain');
            return;
        }
        // Your processing logic
        // Safe chaining with depth tracking
        if (shouldContinueChaining()) {
            System.enqueueJob(new SafeChainQueueable(chainDepth + 1));
        }
    }

    private Boolean shouldContinueChaining() {
        // Your logic to determine if chaining should continue
        return true; // Placeholder
    }
}
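On newer API versions (v57.0 and later), the platform can enforce a depth ceiling for you through System.AsyncOptions rather than a hand-rolled counter; a brief sketch:

AsyncOptions options = new AsyncOptions();
// The platform blocks enqueues past this queueable stack depth
options.MaximumQueueableStackDepth = 10;
System.enqueueJob(new SafeChainQueueable(0), options);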
Queueable Apex shines in several common scenarios:
- Data migration: migrate large datasets from legacy systems while maintaining system performance and providing detailed progress tracking.
- Multi-step integrations: implement integration processes that require sequential API calls to external systems, with proper error handling and retry mechanisms.
- Complex bulk processing: process large volumes of records with business logic that would exceed synchronous processing limits.
- Scheduled maintenance: perform regular tasks like archiving old records, cleaning up temporary data, or optimizing data structures.
Queueable Apex provides a powerful and flexible framework for asynchronous processing in Salesforce. By understanding and implementing the concepts covered in this guide, you can pass rich state into background jobs, chain multi-step processes in order, stay within governor limits, recover from failures with retries and transaction finalizers, and monitor and test your asynchronous code with confidence.