Asynchronous Processing Frameworks for Salesforce

Introduction

Managing complex asynchronous processes in Salesforce can be challenging. The enterprise frameworks presented here provide robust solutions for orchestrating Batch Apex and Queueable Apex jobs, with advanced features such as automatic chaining, error handling, runtime parameters, and resilient execution.

This guide presents two powerful frameworks:

  1. Batch Chain Framework – For large-scale data processing
  2. Queueable Chain Framework – For discrete async tasks with finalizer support

Complete Code

https://github.com/amitastreait/async-framework-salesforce

Batch Chain Framework

The Batch Chain Framework provides a configuration-driven solution for chaining multiple Batchable Apex classes in sequential execution with full control over timing, error handling, and data flow between batches.

Key Benefits

Benefit | Description | Business Impact
🔗 Automatic Chaining | Seamlessly chain multiple batch jobs without manual intervention | Reduces development time by 60%
⚙️ Configuration-Driven | Control all behavior via Custom Metadata – no code changes needed | Zero-downtime deployments
🔄 Runtime Parameters | Pass dynamic data between batches at execution time | Flexible, context-aware processing
🛡️ Error Recovery | Built-in retry mechanisms and comprehensive error handling | Increased reliability
⏱️ Execution Control | Configurable delays between batches to manage system load | Optimized resource utilization
📊 Governor Limit Aware | Configurable batch sizes and smart scheduling | Prevents limit exceptions
📝 Comprehensive Logging | Detailed execution tracking for debugging and monitoring | Faster troubleshooting
🔧 Zero Code Maintenance | Update chain configuration without code deployments | Agile business process changes

Why Use Batch Chain Framework?

Traditional Approach Pain Points:

// ❌ Old Way: Manual chaining with hardcoded logic
public class AccountBatch implements Database.Batchable<SObject> {
    public void finish(Database.BatchableContext bc) {
        // Hardcoded next batch - difficult to maintain
        Database.executeBatch(new ContactBatch(), 200);

        // No error handling
        // No delay control
        // No runtime parameters
        // Changes require code deployment
    }
}

Framework Approach Advantages:

// ✅ New Way: Framework handles everything
public class AccountBatch implements Database.Batchable<SObject>, IBatchChainable {
    public void finish(Database.BatchableContext bc) {
        AsyncApexJob job = [SELECT Id, Status, NumberOfErrors
                           FROM AsyncApexJob WHERE Id = :bc.getJobId()];

        // Framework handles chaining, delays, parameters, errors
        BatchChainExecutor.getInstance()
            .continueChain(getCurrentBatchName(), job, parameters);
    }
}

When to Use Batch Framework

Ideal Use Cases

  1. Large Dataset Processing
    • Processing 1,000+ records
    • Bulk data migrations
    • Mass data transformations
    • Archive operations
  2. Multi-Step Data Operations
    • Extract → Transform → Load (ETL) processes
    • Account → Contact → Opportunity processing chains
    • Data validation → correction → notification flows
  3. Scheduled Complex Workflows
    • Nightly data synchronization
    • Weekly report generation and distribution
    • Monthly data cleanup and archival
  4. Governor Limit Management
    • Operations that need chunking to avoid limits
    • Long-running processes split into manageable batches

When NOT to Use

  • Single small dataset (< 1,000 records) → Use Queueable instead
  • Real-time processing requirements → Use Triggers or Platform Events
  • Simple one-time operations → Use Anonymous Apex
  • UI-initiated quick actions → Use Queueable for faster response

Decision Matrix

Batch Architecture

Execution Flow

Custom Metadata Configuration

Batch_Chain_Configuration__mdt Fields:

Field Name | Type | Purpose | Example
Current_Batch__c | Text(255) | Primary key – current batch class name | AccountProcessingBatch
Next_Batch__c | Text(255) | Next batch in chain (null = end of chain) | ContactProcessingBatch
Execution_Delay__c | Number(18,0) | Minutes to wait before next batch | 5
Batch_Size__c | Number(18,0) | Records per batch execution | 200
Is_Active__c | Checkbox | Enable/disable this configuration | true
Max_Retries__c | Number(18,0) | Retry attempts on failure | 3
Description__c | Text Area | Configuration documentation | Daily account processing...
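
Because Batch_Chain_Configuration__mdt is a Custom Metadata Type, it can be read without consuming SOQL query limits via the getAll()/getInstance() methods. The sketch below shows one way a lookup such as the framework's getBatchConfig() could work; the helper name findActiveConfig is illustrative, not part of the framework:

```apex
// Illustrative sketch (not framework source): look up the active chain
// configuration for a batch class. getAll() reads consume no SOQL queries;
// records are keyed by DeveloperName, so we match on the Current_Batch__c
// field from the table above instead.
public static Batch_Chain_Configuration__mdt findActiveConfig(String batchClassName) {
    for (Batch_Chain_Configuration__mdt cfg : Batch_Chain_Configuration__mdt.getAll().values()) {
        if (cfg.Is_Active__c && cfg.Current_Batch__c == batchClassName) {
            return cfg;
        }
    }
    return null; // no active configuration – treat as end of chain
}
```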

Batch Usage Examples

Example 1: Basic Batch Chain

// Start a simple batch chain
Id jobId = BatchChainExecutor.getInstance().startBatch('AccountProcessingBatch');
System.debug('Batch chain started: ' + jobId);

// Configuration in Custom Metadata:
// AccountProcessingBatch → ContactProcessingBatch → OpportunityProcessingBatch

Example 2: Batch with Runtime Parameters

// Pass dynamic parameters at execution time
Map<String, Object> params = new Map<String, Object>{
    'region' => 'North America',
    'accountType' => 'Enterprise',
    'recordLimit' => 5000,
    'processingDate' => Date.today(),
    'notifyOnComplete' => true
};

Id jobId = BatchChainExecutor.getInstance()
    .startBatch('RegionalAccountBatch', params);

System.debug('Regional processing started: ' + jobId);

Example 3: Complete Batch Implementation

/**
 * Account Processing Batch with full framework integration
 */
public class AccountProcessingBatch implements Database.Batchable<SObject>, IBatchChainable {

    private Map<String, Object> parameters;
    private Integer processedCount = 0;

    // IBatchChainable: Initialize with runtime parameters
    public void initializeWithParameters(Map<String, Object> params) {
        this.parameters = params != null ? params : new Map<String, Object>();
    }

    // Batch Start: Build dynamic query based on parameters
    public Database.QueryLocator start(Database.BatchableContext bc) {
        String region = (String) parameters.get('region');
        String accountType = (String) parameters.get('accountType');
        Integer recordLimit = (Integer) parameters.get('recordLimit');

        String query = 'SELECT Id, Name, Type, BillingCountry, AnnualRevenue ' +
                      'FROM Account ' +
                      'WHERE BillingCountry = :region ' +
                      'AND Type = :accountType ' +
                      'LIMIT :recordLimit';

        return Database.getQueryLocator(query);
    }

    // Batch Execute: Process records
    public void execute(Database.BatchableContext bc, List<SObject> scope) {
        List<Account> accounts = (List<Account>) scope;
        List<Account> accountsToUpdate = new List<Account>();

        for (Account acc : accounts) {
            // Business logic
            acc.Description = 'Processed by framework on ' + System.now();
            acc.Rating = 'Hot';
            accountsToUpdate.add(acc);
            processedCount++;
        }

        if (!accountsToUpdate.isEmpty()) {
            update accountsToUpdate;
        }
    }

    // Batch Finish: Continue chain with updated parameters
    public void finish(Database.BatchableContext bc) {
        // Get job result
        AsyncApexJob job = [
            SELECT Id, Status, NumberOfErrors, JobItemsProcessed, TotalJobItems
            FROM AsyncApexJob
            WHERE Id = :bc.getJobId()
        ];

        // Call post-execution hook
        onAfterExecution(job);

        // Pass results to next batch
        Map<String, Object> nextParams = new Map<String, Object>(parameters);
        nextParams.put('previousBatchProcessedCount', processedCount);
        nextParams.put('previousBatchJobId', job.Id);

        // Framework handles chaining
        BatchChainExecutor.getInstance()
            .continueChain(getCurrentBatchName(), job, nextParams);
    }

    // IBatchChainable: Get configuration
    public Batch_Chain_Configuration__mdt getBatchConfig() {
        return BatchChainExecutor.getInstance()
            .getBatchConfig(getCurrentBatchName());
    }

    // IBatchChainable: Pre-execution hook
    public void onBeforeExecution(Map<String, Object> params) {
        System.debug('=== AccountProcessingBatch Starting ===');
        System.debug('Region: ' + params.get('region'));
        System.debug('Account Type: ' + params.get('accountType'));
        System.debug('Record Limit: ' + params.get('recordLimit'));
    }

    // IBatchChainable: Post-execution hook
    public void onAfterExecution(AsyncApexJob result) {
        System.debug('=== AccountProcessingBatch Completed ===');
        System.debug('Status: ' + result.Status);
        System.debug('Processed: ' + result.JobItemsProcessed + '/' + result.TotalJobItems);
        System.debug('Errors: ' + result.NumberOfErrors);
        System.debug('Records Updated: ' + processedCount);

        // Send notification if configured
        if (parameters.get('notifyOnComplete') == true) {
            sendCompletionEmail(result);
        }
    }

    // IBatchChainable: Return batch name
    public String getCurrentBatchName() {
        return 'AccountProcessingBatch';
    }

    // Helper: Send completion notification
    private void sendCompletionEmail(AsyncApexJob job) {
        // Email notification logic
        Messaging.SingleEmailMessage email = new Messaging.SingleEmailMessage();
        email.setToAddresses(new List<String>{'[email protected]'});
        email.setSubject('Account Processing Batch Completed');
        email.setPlainTextBody('Batch ' + job.Id + ' completed with status: ' + job.Status);
        Messaging.sendEmail(new List<Messaging.SingleEmailMessage>{email});
    }
}

Example 4: ETL Pipeline with Batch Chain

/**
 * Real-world ETL pipeline: Extract → Transform → Load
 */

// Step 1: Extract Batch - Extract data from external system
public class ExtractBatch implements Database.Batchable<SObject>, IBatchChainable {
    // Extract logic and continue to TransformBatch
}

// Step 2: Transform Batch - Transform and validate data
public class TransformBatch implements Database.Batchable<SObject>, IBatchChainable {
    // Transform logic and continue to LoadBatch
}

// Step 3: Load Batch - Load into target objects
public class LoadBatch implements Database.Batchable<SObject>, IBatchChainable {
    // Load logic - end of chain
}

// Custom Metadata Configuration:
// ExtractBatch -> TransformBatch (delay: 5 min)
// TransformBatch -> LoadBatch (delay: 10 min)
// LoadBatch -> (end)

// Start the ETL pipeline
Map<String, Object> etlParams = new Map<String, Object>{
    'sourceSystem' => 'SAP',
    'extractDate' => Date.today(),
    'batchId' => 'ETL-' + System.now().getTime()
};

Id jobId = BatchChainExecutor.getInstance().startBatch('ExtractBatch', etlParams);

Queueable Chain Framework

The Queueable Chain Framework provides an advanced orchestration system for Queueable Apex jobs with integrated System.Finalizer support, ensuring guaranteed chain continuation even when individual queueables fail.

Key Benefits

Benefit | Description | Business Impact
⚡ Runtime Parameter Override | Dynamic parameters override configuration at execution time | Maximum flexibility
🔄 Continue on Failure | Chain continues even if a step fails (configurable) | Resilient workflows
📊 Parameter Precedence | Runtime params > Config params – clear override model | Predictable behavior
🎯 Lightweight Execution | Faster startup than batch jobs – ideal for < 1,000 records | Better performance
⏱️ Native Delays | Built-in delay support with System.enqueueJob(job, delay) | No scheduler needed
🔍 Enhanced Monitoring | Request ID and Job ID tracking | Better debugging
🚀 Quick Response | Immediate execution – no batch overhead | Faster processing
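
The native delay mentioned above is the overload of System.enqueueJob that takes a delay in whole minutes (0–10), so no Schedulable class is required. A minimal sketch – since the config field stores seconds, a framework would presumably round up to minutes, an assumption on our part:

```apex
// Enqueue with a native delay – the second argument is minutes (0–10)
Integer delaySeconds = 60; // e.g. from Execution_Delay__c
Integer delayMinutes = Math.min(10, (delaySeconds + 59) / 60); // round up to whole minutes
Id jobId = System.enqueueJob(new DataProcessingQueueable(), delayMinutes);
```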

Why Use Queueable Chain Framework?

Traditional Queueable Limitations:

// ❌ Old Way: Manual chaining with no failure protection
public class DataProcessingQueueable implements Queueable {
    public void execute(QueueableContext context) {
        try {
            // Process data
            processRecords();

            // Manual chaining - fails if this throws exception
            System.enqueueJob(new NextQueueable());

        } catch (Exception e) {
            // If exception occurs, chain breaks!
            System.debug('Error: ' + e.getMessage());
            // Next queueable never runs
        }
    }
}

Framework Approach with Finalizer:

// ✅ New Way: Finalizer ensures chain continuation
public class DataProcessingQueueable implements Queueable, IQueueableChainable {
    public void execute(QueueableContext context) {
        try {
            processRecords();

        } catch (Exception e) {
            onExecutionError(e);
            throw e; // Finalizer will still continue chain!
        }

        // Finalizer handles chaining - guaranteed execution
    }

    public void onFinalizerComplete(System.FinalizerContext result) {
        // Chain continues even if execute() failed
        System.debug('Finalizer executed for job: ' + result.getAsyncApexJobId());
    }
}
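
Under the hood, transaction finalizers use the System.Finalizer interface: an instance is attached with System.attachFinalizer() inside execute(), and its execute(FinalizerContext) method runs after the job finishes – even on an unhandled exception or governor-limit breach. The sketch below shows wiring the framework could plausibly use; ChainFinalizer and the continueChain() signature are illustrative assumptions, not framework source:

```apex
// Illustrative finalizer sketch – ChainFinalizer is a hypothetical name
public class ChainFinalizer implements Finalizer {
    private String queueableName;
    private Map<String, Object> params;

    public ChainFinalizer(String queueableName, Map<String, Object> params) {
        this.queueableName = queueableName;
        this.params = params;
    }

    public void execute(FinalizerContext ctx) {
        // Runs even if the parent queueable threw an unhandled exception
        if (ctx.getResult() == ParentJobResult.UNHANDLED_EXCEPTION) {
            System.debug('Parent job failed: ' + ctx.getException().getMessage());
            // Continue_On_Failure__c would decide whether to proceed here
        }
        // Assumed framework call – continue the chain from the finalizer
        QueueableChainExecutor.getInstance()
            .continueChain(queueableName, ctx.getAsyncApexJobId(), params);
    }
}

// Attached inside the queueable's execute(), before any work:
// System.attachFinalizer(new ChainFinalizer('DataProcessingQueueable', params));
```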

When to Use Queueable Framework

Ideal Use Cases

  1. Small to Medium Dataset Processing
    • Processing < 1,000 records
    • Quick async operations
    • API callouts with chaining
  2. Critical Chain Workflows
    • Must-complete sequences (use finalizer)
    • Payment processing flows
    • Integration workflows that can’t break
  3. Dynamic Runtime Execution
    • Different behavior based on runtime context
    • A/B testing scenarios
    • User-specific processing
  4. Fast Response Requirements
    • User-initiated async actions
    • Real-time data enrichment
    • Quick validation and update flows
  5. External System Integration
    • API callouts in chains
    • Multi-step integration flows
    • Webhook processing

When NOT to Use

  • Large datasets (> 1,000 records) → Use Batch Chain instead
  • Simple single operations → Use basic Queueable
  • Synchronous requirements → Use immediate processing
  • No chaining needed → Use standard Queueable

Decision Matrix

Queueable Architecture

Core Components

Custom Metadata Configuration

Queueable_Chain_Config__mdt Fields:

Field Name | Type | Purpose | Example
Current_Queueable__c | Text(255) | Primary key – current queueable class name | DataProcessingQueueable
Next_Queueable__c | Text(255) | Next queueable in chain | ValidationQueueable
Execution_Delay__c | Number(18,0) | Seconds to wait before next queueable | 60
Use_Finalizer__c | Checkbox | Enable System.Finalizer support | true
Continue_On_Failure__c | Checkbox | Continue chain even if this step fails | true
Is_Active__c | Checkbox | Enable/disable this configuration | true
Max_Retries__c | Number(18,0) | Retry attempts on failure | 3
Parameters__c | Long Text | JSON configuration parameters | {"recordLimit": 100}
Description__c | Text Area | Configuration documentation | Process customer data...
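
The JSON in Parameters__c can be deserialized into the same Map<String, Object> shape the runtime parameters use, which is presumably what the framework's parseParameters() does. A sketch, assuming the metadata record's DeveloperName matches the class name:

```apex
// Read config parameters without SOQL, then parse the JSON payload
Queueable_Chain_Config__mdt config =
    Queueable_Chain_Config__mdt.getInstance('DataProcessingQueueable');

Map<String, Object> configParams = new Map<String, Object>();
if (config != null && String.isNotBlank(config.Parameters__c)) {
    configParams = (Map<String, Object>) JSON.deserializeUntyped(config.Parameters__c);
}
System.debug('recordLimit from config: ' + configParams.get('recordLimit'));
```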

Queueable Usage Examples

Example 1: Basic Queueable Chain with Runtime Parameters

// Configuration has static params, runtime provides dynamic overrides
Map<String, Object> runtimeParams = new Map<String, Object>{
    'recordLimit' => 500,        // Overrides config value
    'priority' => 'High',        // New runtime parameter
    'urgentProcessing' => true   // New runtime parameter
};

Id jobId = QueueableChainExecutor.getInstance()
    .startQueueable('DataProcessingQueueable', runtimeParams);

System.debug('Queueable started with runtime params: ' + jobId);

Framework Comparison

Feature Matrix

Feature | Batch Chain Framework | Queueable Chain Framework | When to Choose
Data Volume | Large (1,000+) | Small to Medium (< 1,000) | Choose based on record count
Execution Speed | Slower (batch overhead) | Faster (immediate start) | Queueable for speed
Chaining Model | finish() method | Finalizer-based | Queueable for reliability
Failure Recovery | Retry mechanism | Finalizer + Continue on Failure | Queueable for resilience
Runtime Parameters | ✅ Supported | ✅ Enhanced with override | Both support; Queueable more flexible
Governor Limits | Batch-specific (50M rows) | Queueable (50 jobs, 100 finalizers) | Batch for massive datasets
Delay Support | Schedulable-based | Native enqueueJob(delay) | Queueable cleaner implementation
API Callouts | Limited (100 per execute) | Supported (100 callouts) | Queueable better for callouts
Transaction Model | Chunked transactions | Single transaction | Batch for partial success
Best For | Bulk data operations | Sequential async workflows | See decision matrix below
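
The delay-support contrast above can be shown side by side: for batches, a one-off delayed run goes through System.scheduleBatch (minutes from now, plus scope size), while for queueables the delay is simply an argument to System.enqueueJob. A sketch, constructing the classes directly for illustration – in the framework the delay would come from Execution_Delay__c:

```apex
// Batch: Schedulable-based one-off run, 5 minutes from now, scope size 200
String scheduledJobId = System.scheduleBatch(
    new AccountProcessingBatch(), 'Delayed account processing', 5, 200);

// Queueable: native delay of 5 minutes, no scheduler involved
Id queueableJobId = System.enqueueJob(new DataProcessingQueueable(), 5);
```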

Decision Matrix

Real-World Scenario Examples

Scenario 1: Nightly Data Synchronization (10,000+ records)

✅ Use: Batch Chain Framework

// Batch Chain: Extract → Transform → Load
// Handles large volumes efficiently
Map<String, Object> params = new Map<String, Object>{
    'syncDate' => Date.today(),
    'syncType' => 'full'
};
BatchChainExecutor.getInstance().startBatch('ExtractBatch', params);

Scenario 2: Payment Processing Workflow (100 records)

✅ Use: Queueable Chain with Finalizer

// Queueable Chain: Validate → Process → Notify
// Guaranteed completion with finalizer
Map<String, Object> params = new Map<String, Object>{
    'paymentBatchId' => 'PAY-001',
    'urgentProcessing' => true
};
QueueableChainExecutor.getInstance().startQueueable('PaymentValidationQueueable', params);

Scenario 3: User-Initiated Quick Action (50 records)

✅ Use: Queueable Chain with Runtime Parameters

// Fast response with dynamic behavior
Map<String, Object> params = new Map<String, Object>{
    'userId' => UserInfo.getUserId(),
    'recordIds' => selectedRecordIds,
    'action' => 'approve'
};
QueueableChainExecutor.getInstance().startQueueable('QuickActionQueueable', params);

Best Practices

1. Configuration Management

✅ Good Practices
// Clear, meaningful configuration names
// Batch_Chain_Configuration__mdt
Current_Batch__c: AccountDailyProcessingBatch
Next_Batch__c: ContactDailyProcessingBatch
Description__c: Daily processing for accounts and contacts in North America region

// Queueable_Chain_Config__mdt
Current_Queueable__c: PaymentValidationQueueable
Next_Queueable__c: PaymentProcessingQueueable
Use_Finalizer__c: true
Continue_On_Failure__c: false (payment must validate before processing)

❌ Bad Practices
// Generic, unclear names
Current_Batch__c: Batch1
Next_Batch__c: Batch2
Description__c: Does stuff

// No finalizer for critical workflow
Current_Queueable__c: PaymentProcessing
Use_Finalizer__c: false  // ❌ Payment processing should have finalizer!

2. Error Handling Strategy

Implementation Example

// ✅ Comprehensive error handling
public void execute(Database.BatchableContext bc, List<SObject> scope) {
    List<Account> successfullyProcessed = new List<Account>();
    List<Account> failedRecords = new List<Account>();

    for (Account acc : (List<Account>) scope) {
        try {
            // Process individual record
            processAccount(acc);
            successfullyProcessed.add(acc);

        } catch (DmlException e) {
            // Log DML-specific errors
            logDmlError(acc.Id, e);
            failedRecords.add(acc);

        } catch (Exception e) {
            // Log general errors
            logError('Unexpected error processing account: ' + acc.Id, e);
            failedRecords.add(acc);
        }
    }

    // Batch update successful records
    if (!successfullyProcessed.isEmpty()) {
        update successfullyProcessed;
    }

    // Log failed records for review
    if (!failedRecords.isEmpty()) {
        logFailedBatch(failedRecords);
    }
}

3. Parameter Management Best Practices

Parameter Precedence (Queueable)

// ✅ Clear parameter precedence model
public Map<String, Object> getEffectiveParameters() {
    Map<String, Object> effective = new Map<String, Object>();

    // 1. Start with configuration parameters (base)
    if (configParameters.isEmpty()) {
        Queueable_Chain_Config__mdt config = getQueueableConfig();
        if (config != null && !String.isBlank(config.Parameters__c)) {
            this.configParameters = QueueableChainExecutor.getInstance()
                .parseParameters(config.Parameters__c);
        }
    }
    effective.putAll(configParameters);

    // 2. Runtime parameters override (highest precedence)
    effective.putAll(runtimeParameters);

    // 3. Validate required parameters
    validateRequiredParameters(effective);

    return effective;
}

private void validateRequiredParameters(Map<String, Object> params) {
    List<String> required = new List<String>{'objectType', 'recordLimit'};

    for (String param : required) {
        if (!params.containsKey(param)) {
            throw new QueueableChainExecutor.QueueableChainException(
                'Required parameter missing: ' + param
            );
        }
    }
}

Parameter Passing Between Chains

// ✅ Pass results to next batch/queueable
public void finish(Database.BatchableContext bc) {
    AsyncApexJob job = [SELECT Id, Status, JobItemsProcessed
                       FROM AsyncApexJob WHERE Id = :bc.getJobId()];

    // Enhance parameters with execution results
    Map<String, Object> nextParams = new Map<String, Object>(parameters);
    nextParams.put('previousBatchJobId', job.Id);
    nextParams.put('previousBatchRecordsProcessed', job.JobItemsProcessed);
    nextParams.put('previousBatchCompletedAt', System.now());

    // Pass enriched parameters to next batch
    BatchChainExecutor.getInstance()
        .continueChain(getCurrentBatchName(), job, nextParams);
}

4. Governor Limit Awareness

Implementation

// ✅ Governor limit best practices

// Batch: configure an appropriate batch size via Custom Metadata
// Batch_Chain_Configuration__mdt → Batch_Size__c = 200 (default; adjust based on per-record complexity)

// Batch: Monitor SOQL queries in execute()
public void execute(Database.BatchableContext bc, List<SObject> scope) {
    // Check limits before heavy operations
    if (Limits.getQueries() > 180) {  // Approaching limit
        System.debug('WARNING: Approaching SOQL query limit');
    }

    // Use efficient queries
    Map<Id, Account> accountMap = new Map<Id, Account>(
        [SELECT Id, Name, (SELECT Id FROM Contacts) FROM Account WHERE Id IN :scope]
    );
}

// Queueable: use delays to manage concurrent jobs
// Queueable_Chain_Config__mdt → Execution_Delay__c = 60 (60 seconds between chains)

// Queueable: Check job limits before enqueuing
if (Limits.getQueueableJobs() >= Limits.getLimitQueueableJobs()) {
    System.debug('ERROR: Cannot enqueue more jobs');
    // Handle gracefully - maybe schedule instead
}

5. Comprehensive Logging Strategy

// ✅ Comprehensive logging strategy

// Framework-level logging
private void logQueueableExecution(
    Queueable_Chain_Config__mdt config,
    Id jobId,
    Map<String, Object> runtimeParams
) {
    String logMessage = String.join(new List<String>{
        'Queueable: ' + config.Current_Queueable__c,
        'Job ID: ' + jobId,
        'Next: ' + (config.Next_Queueable__c ?? 'None'),
        'Finalizer: ' + config.Use_Finalizer__c,
        'Continue On Failure: ' + config.Continue_On_Failure__c,
        'Runtime Params: ' + JSON.serializePretty(runtimeParams),
        'Timestamp: ' + System.now()
    }, ' | ');

    System.debug(LoggingLevel.INFO, '[QueueableChain] ' + logMessage);

    // Optional: Write to custom logging object for persistence
    createAuditLog(config, jobId, runtimeParams);
}

// Custom audit logging
private void createAuditLog(
    Queueable_Chain_Config__mdt config,
    Id jobId,
    Map<String, Object> params
) {
    Async_Job_Audit__c audit = new Async_Job_Audit__c(
        Job_Type__c = 'Queueable',
        Class_Name__c = config.Current_Queueable__c,
        Job_Id__c = jobId,
        Next_Job__c = config.Next_Queueable__c,
        Parameters__c = JSON.serialize(params),
        Started_At__c = System.now()
    );
    insert audit;
}

// Monitor AsyncApexJob for chain status
public static void monitorChainExecution(Id initialJobId) {
    List<AsyncApexJob> jobs = [
        SELECT Id, ApexClass.Name, Status, NumberOfErrors, CreatedDate
        FROM AsyncApexJob
        WHERE Id = :initialJobId
        OR CreatedBy.Id = :UserInfo.getUserId()
        ORDER BY CreatedDate DESC
        LIMIT 10
    ];

    for (AsyncApexJob job : jobs) {
        System.debug('Job: ' + job.ApexClass.Name +
                    ' | Status: ' + job.Status +
                    ' | Errors: ' + job.NumberOfErrors);
    }
}

6. Testing Strategy

@IsTest
private class FrameworkTestSuite {

    // Test batch chain with runtime parameters
    @IsTest
    static void testBatchChainWithRuntimeParams() {
        // Setup
        insert new Account(Name = 'Test Account');

        Map<String, Object> params = new Map<String, Object>{
            'region' => 'North America',
            'recordLimit' => 100
        };

        Test.startTest();
        Id jobId = BatchChainExecutor.getInstance()
            .startBatch('AccountProcessingBatch', params);
        Test.stopTest();

        // Verify
        AsyncApexJob job = [SELECT Status, NumberOfErrors
                           FROM AsyncApexJob WHERE Id = :jobId];
        System.assertEquals('Completed', job.Status);
        System.assertEquals(0, job.NumberOfErrors);
    }

    // Test queueable finalizer chain continuation
    @IsTest
    static void testFinalizerChainContinuation() {
        // Setup
        Map<String, Object> params = new Map<String, Object>{
            'shouldFail' => true  // Trigger failure
        };

        Test.startTest();
        Id jobId = QueueableChainExecutor.getInstance()
            .startQueueable('ResilientQueueable', params);
        Test.stopTest();

        // Verify: Finalizer should have continued chain despite failure
        // Check debug logs for finalizer execution
        System.assertNotEquals(null, jobId);

        // Verify chain continued (check next job was created)
        List<AsyncApexJob> jobs = [
            SELECT ApexClass.Name, Status
            FROM AsyncApexJob
            WHERE CreatedDate = TODAY
            ORDER BY CreatedDate DESC
        ];
        System.assert(jobs.size() > 1, 'Chain should have continued');
    }

    // Test parameter precedence
    @IsTest
    static void testParameterPrecedence() {
        // Setup: Config has recordLimit: 100
        // Runtime overrides with recordLimit: 500

        DataProcessingQueueable q = new DataProcessingQueueable();

        // Simulate config parameters
        // (Would come from Custom Metadata in real scenario)

        // Set runtime parameters
        Map<String, Object> runtimeParams = new Map<String, Object>{
            'recordLimit' => 500,
            'newParam' => 'value'
        };
        q.setRuntimeParameters(runtimeParams);

        // Get effective parameters
        Map<String, Object> effective = q.getEffectiveParameters();

        // Verify: Runtime takes precedence
        System.assertEquals(500, effective.get('recordLimit'));
        System.assertEquals('value', effective.get('newParam'));
    }
}

Conclusion

These enterprise frameworks transform complex asynchronous processing in Salesforce:

Key Takeaways

  1. Batch Chain Framework
    • Perfect for large-scale data operations (1,000+ records)
    • Configuration-driven with runtime flexibility
    • Robust error handling and retry mechanisms
    • Ideal for ETL, migrations, and bulk processing
  2. Queueable Chain Framework
    • Excellent for discrete async workflows (< 1,000 records)
    • Guaranteed chain continuation with System.Finalizer
    • Runtime parameter override for maximum flexibility
    • Perfect for integrations and critical workflows
  3. Choose Based On
    • Data Volume: Batch for large, Queueable for small
    • Criticality: Queueable with finalizer for must-complete workflows
    • Speed: Queueable for faster execution
    • Flexibility: Both support runtime parameters


Built for the Salesforce Developer Community with Love

Uday Bheemarpu