Testing a Database.Batchable implementation

I have some processing code that makes use of batch Apex to avoid hitting governor limits. Running a representative test generates this error:

System.UnexpectedException: No more than one executeBatch can be called from within a testmethod. Please make sure the iterable returned from your start method matches the batch size, resulting in one executeBatch invocation.

Testability is an important feature but sadly missing in a few areas of Force.com including this one…

The normal work-around is suggested in the error message. But using that means that the code does not get tested across a batch boundary: such testing is particularly important for Database.Batchable implementations that also implement Database.Stateful to maintain state across batches.
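
To illustrate why crossing a batch boundary matters, here is a minimal (hypothetical) stateful batchable; the AccountCounter name is just for illustration. Its instance field only accumulates correctly across execute calls because Database.Stateful is implemented, and that behaviour is exactly what a single-batch test never exercises:

public class AccountCounter implements Database.Batchable<SObject>, Database.Stateful {
    // This instance state survives from one execute call to the next
    // only because Database.Stateful is implemented
    private Integer total = 0;
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator([select Id from Account]);
    }
    public void execute(Database.BatchableContext bc, List<SObject> scope) {
        total += scope.size();
    }
    public void finish(Database.BatchableContext bc) {
        System.debug('Processed ' + total + ' accounts in total');
    }
}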

My processing code also makes use of batch chaining: in the finish method of the Database.Batchable, sometimes a further instance is created and executed. Unfortunately (but not surprisingly) that chained execution also generates the above error, and there is no easy work-around.

Below is a small class I created to work-around both these problems. Instead of invoking Database.executeBatch, invoke BatchableExecutor.executeBatch. When called from a test, this method makes the start/execute(s)/finish pattern of calls synchronously itself and so avoids the above errors. As long as the test uses a small batch size (e.g. 3) and ensures that a moderate number of records (e.g. 10) are returned by the start method, the Database.Batchable logic can be pretty fully tested without hitting any governor limits.

/*
 * Allows basic testing of a Database.Batchable using more than one batch.
 */
public class BatchableExecutor {
    
    private static final String KEY_PREFIX = AsyncApexJob.SObjectType.getDescribe().getKeyPrefix();
    
    public static Id executeBatch(Database.Batchable<SObject> batchable, Integer scopeSize) {
        
        if (!Test.isRunningTest()) {
            return Database.executeBatch(batchable, scopeSize);
        } else {
            return executeBatchSynchronously(batchable, scopeSize);
        }
    }
    
    private static Id executeBatchSynchronously(Database.Batchable<SObject> batchable, Integer scopeSize) {
        
        // Fake implementation of this interface could be added as needed
        Database.BatchableContext bc = null;
        
        // Invoke start (assumes QueryLocator is being used)
        Database.QueryLocator start = (Database.QueryLocator) batchable.start(bc);
        Database.QueryLocatorIterator iter = start.iterator();
        List<SObject> sobs = new List<SObject>();
        try {
            // Invoke execute
            while(iter.hasNext()) {
                sobs.add(iter.next());
                if (sobs.size() == scopeSize) {
                    // These calls could be wrapped in try/catch too for negative tests
                    batchable.execute(bc, sobs);
                    sobs.clear();
                }
            }
            if (sobs.size() > 0) {
                batchable.execute(bc, sobs);
            }
        } finally {
            // Invoke finish
            batchable.finish(bc);
        }
        
        // Fake id
        return KEY_PREFIX + '000000000000';
    }
}

Bear in mind that this code does an OK job of emulating the happy path only (as demonstrated by the test below). Also it obviously cannot emulate the transactions, asynchronous execution, or object lifecycle of the real mechanism.
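
For the chaining case mentioned earlier, route the finish method through the same helper so that tests also exercise the chained execution. A sketch, where NextBatchable and moreWorkRemains are placeholders for whatever your chaining logic actually is:

public void finish(Database.BatchableContext bc) {
    // Chain a further batch job; when run from a test this executes
    // synchronously via BatchableExecutor instead of failing with the
    // "No more than one executeBatch" error
    if (moreWorkRemains()) {
        BatchableExecutor.executeBatch(new NextBatchable(), 200);
    }
}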

@isTest
private class BatchableExecutorTest {
    
    private class Fixture {
        List<Account> accounts = new List<Account>();
        Fixture addAccounts(Integer objectCount) {
            for (Integer i = 0; i < objectCount; i++) {
                accounts.add(new Account(Name = 'target-' + i, Site = null));
            }
            insert accounts;
            return this;
        }
        Fixture execute(Boolean useDatabaseExecuteBatch, Integer batchSize) {
            Test.startTest();
            // Test that both mechanisms produce the same results
            Id jobId = useDatabaseExecuteBatch
                    ? Database.executeBatch(new BatchableExecutorTestBatchable(), batchSize)
                    : BatchableExecutor.executeBatch(new BatchableExecutorTestBatchable(), batchSize)
                    ;
            Test.stopTest();
            System.assertNotEquals(null, jobId);
            return this;
        }
        Fixture assert(Integer expectedBatches) {
            System.assertEquals(1, [select Count() from Account where Name = 'start']);
            System.assertEquals(expectedBatches,  [select Count() from Account where Name = 'execute']);
            System.assertEquals(1,  [select Count() from Account where Name = 'finish']);
            System.assertEquals(accounts.size(), [select Count() from Account where Name like 'target-%' and Site = 'executed']);
            return this;
        }
    }
    
    @isTest
    static void batchableExecutorExecuteBatch() {
        new Fixture().addAccounts(10).execute(false, 3).assert(4);
    }
    
    @isTest
    static void databaseExecuteBatch() {
        new Fixture().addAccounts(10).execute(true, 10).assert(1);
    }
}
// Has to be a top-level class
public class BatchableExecutorTestBatchable implements Database.Batchable<SObject>, Database.Stateful {
    
    public Database.QueryLocator start(Database.BatchableContext bc) {
        Database.QueryLocator ql = Database.getQueryLocator([select Name from Account order by Name]);
        insert new Account(Name = 'start');
        return ql;
    }
    
    public void execute(Database.BatchableContext bc, List<SObject> scope) {
        for (SObject sob : scope) {
            Account a = (Account) sob;
            a.Site = 'executed';
        }
        update scope;
        insert new Account(Name = 'execute');
    }
    
    public void finish(Database.BatchableContext bc) {
        insert new Account(Name = 'finish');
    }
}

Using one page and controller for both a “Detail Page Button” and a “List Button”

See Correction section below the original post

Original Post

I have a Visualforce page that I want to use both for single objects (via a “Detail Page Button”) and for multiple objects (via a “List Button” with “Display Checkboxes” enabled). The page presents a table of the objects (with one row only for the single object case) and does its processing on each of the objects in the table.

This can be done with a single page and controller if the “URL” and “OnClick JavaScript” options are chosen for the “Content Source” of the custom buttons because then arbitrary parameters can be passed to an arbitrary page. But this post describes a way to get the “Visualforce Page” option for the “Content Source” to work instead. That means that the coupling between the buttons and page is more explicit, and more use is made of standard platform code and conventions.

The controller follows this pattern:

public with sharing class CustomObjectController {

  public CustomObject__c[] customObjects {get; private set;}

  public CustomObjectController(ApexPages.StandardSetController controller) {
    Id id = (Id) ApexPages.currentPage().getParameters().get('id');
    if (id != null) {
      controller.setSelected([
          select Id, Name, CustomField1__c, CustomField2__c
          from CustomObject__c
          where Id = :id
          ]);
    }
    this.customObjects = (CustomObject__c[]) controller.getSelected();
  }

  // Other controller logic
}

which is essentially a standard list controller with the bodge of special-casing the presence of an “id” parameter. That bodge supports the single object case, at the cost of an explicit query being done.

The page also follows the standard list controller pattern by defining the recordSetVar attribute:

<apex:page
        standardController="CustomObject__c"
        extensions="CustomObjectController"
        recordSetVar="customObjects"
        >
    <!--  Page content -->
</apex:page>
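
The page content (elided above) would typically iterate over the customObjects property; a minimal sketch, with the column fields just examples:

<apex:pageBlock>
    <apex:pageBlockTable value="{!customObjects}" var="obj">
        <apex:column value="{!obj.Name}"/>
        <apex:column value="{!obj.CustomField1__c}"/>
    </apex:pageBlockTable>
</apex:pageBlock>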

The good news is that this page can then be selected in the “Detail Page Button” case as well as the “List Button” case, allowing the single Visualforce page and Apex controller to be used in both cases.

Correction

I posted this too soon: I was mistaken that the list controller page could be assigned to a “Detail Page Button”. (A few too many similarly named pages in my development org…)

Here is a not very elegant work-around. The list page shown above (let’s call it “CustomObjectList”) remains unchanged. But a second page is needed (let’s call it “CustomObject”) that does not have the recordSetVar attribute so that it can be assigned to the “Detail Page Button”. All that page does is forward to “CustomObjectList” via this additional code (including a second constructor) added to the CustomObjectController:

public with sharing class CustomObjectController {

  // Other code as above

  private Id id;

  public CustomObjectController(ApexPages.StandardController controller) {
    id = controller.getRecord().Id;
  }
  
  public PageReference forwardToListController() {
    PageReference pr = Page.CustomObjectList;
    pr.getParameters().put('id', id);
    return pr;
  }
}

plus this minimal “CustomObject” page:

<apex:page
        standardController="CustomObject__c"
        extensions="CustomObjectController"
        action="{!forwardToListController}"
        />

Managed packages can take hours to become available

The “upload” of a managed package – the creation of a new version – typically takes a few minutes to complete. But there is a second step – some sort of queued replication – that must also complete before the package version becomes available on other instances (e.g. na14 or eu2 or ap1).

Recently we’ve seen delays of up to 5 hours. The delay also applies to actions like “undeprecating” a version: a customer deployment was stalled for over 2 hours waiting for that change to propagate.

For those of us using continuous integration and the newly introduced automation for installing packages (see e.g. Andrew Fawcett’s post “Look ma, no hands!” : Automating Install and Uninstall of Packages!) these delays can result in a series of broken builds until the version becomes available.

I’ve checked with Salesforce support and they have responded that:

At present there is no SLA on the installation process as it is an async process that depends on the server availability.

which is a clear position. I guess it’s a case of “hope for the best and plan for the worst” as far as these delays are concerned.