Mapping an AngularJS client into a Force.com server

Client-side MVC frameworks such as AngularJS move most of an application into the browser. The server’s job is largely reduced to two things: delivering static files – HTML, CSS, JavaScript, images – normally laid out in a tree of directories, and providing a JSON-based RESTful API that uses HTTP GET/PUT operations to read and write model objects.

Force.com’s @RestResource Apex classes are an effective way of implementing a JSON-based RESTful API. But Force.com is by design not a general purpose web server, so mapping a typical tree of client files into the page and static resource mechanisms that are available is a bit awkward. Below is one approach to that problem.

One of the pleasures of developing this sort of client-side logic is that (at least on a Mac) all it takes to get started is to put a few files in a directory and run “python -m SimpleHTTPServer 8000” there. As you edit and add, you see the results immediately: no 10-second delays while files are copied up to a remote server. You pretty naturally end up with a directory tree, so I treat this layout of the files as the “master”. (A downside of working this way is that browsers like Chrome block requests to servers other than the one the pages came from – such as the remote RESTful API server. Handily, Chrome has a --disable-web-security command-line option that turns this checking off while you are developing.) The same layout can also be zipped and used pretty much directly in tools like Adobe PhoneGap Build, so your HTML5 application can be delivered as an iOS, Android, or other mobile app.
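
For example, the sample Ant configuration further down implies a master layout something like this (purely illustrative – the exact files will of course differ per project):

index.html
css/
    app.css
data/
    Acme-420x114.png
js/
    app.js
lib/
    angular/
        angular.min.js
    bootstrap/
        bootstrap.min.css
partials/
    login.html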

But how do you transform this master layout into Force.com pages and static resources so that the entire app can be served from Force.com? The good news is that, thanks to URLFOR, a static resource can be a zip file and a page can reference files and directories within that zip. So the CSS, JavaScript and images can all go into a single static resource. The HTML, however, has to go into Visualforce pages, because pages are the context in which URLFOR works. The raw HTML also has to be wrapped in Visualforce markup:

<apex:page showHeader="false" sidebar="false" standardStylesheets="false" applyHtmlTag="false">
    ....
</apex:page>

and the naming flattened, as directories are not supported for pages. Visualforce will also report any “well formed XML” violations.

This re-organization of course means that references between the files also have to be adjusted. For a limited size project, the changes can be worked out by hand and treated as a set of search/replace strings (the filter elements below).
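
As an illustration, using the sample filter values shown further down, a reference in a master HTML file such as

<script src="js/app.js"></script>

becomes

<script src="{!URLFor($Resource.appzip, 'js/app.js')}"></script>

in the generated page, and a reference to the partials/login.html template becomes the flattened page name partialslogin.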

Doing all this transformation work by hand is obviously tedious and error-prone. So below is the source code of an Ant task that can be configured like this:

<webtosf fromdir="." todir="../EepServer/src">
	<fileset dir="css"/>
	<fileset dir="data"/>
	<fileset dir="js"/>
	<filter token="lib/bootstrap/bootstrap.min.css" value="//netdna.bootstrapcdn.com/bootstrap/3.0.0/css/bootstrap.min.css"/>
	<filter token="lib/angular/angular.min.js" value="//ajax.googleapis.com/ajax/libs/angularjs/1.2.0-rc.3/angular.min.js"/>
	<filter token="js/app.js" value="{!URLFor($Resource.appzip, 'js/app.js')}"/>
	<filter token="data/Acme-420x114.png" value="{!URLFor($Resource.appzip, 'data/Acme-420x114.png')}"/>
	<filter token="partials/login.html" value="partialslogin"/>
</webtosf>

to do all the work repeatedly and reliably.
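
As with any custom Ant task, webtosf first has to be declared via taskdef; a minimal sketch, assuming the compiled task class has been packaged into a jar (the jar path here is hypothetical):

<taskdef name="webtosf"
        classname="com.claimvantage.ant.WebToSf"
        classpath="ant/webtosf.jar"/>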

Feel free to use this code and change it as you like:

package com.claimvantage.ant;

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

import org.apache.tools.ant.BuildException;
import org.apache.tools.ant.DirectoryScanner;
import org.apache.tools.ant.Project;
import org.apache.tools.ant.Task;
import org.apache.tools.ant.types.FileSet;

/**
 * Transforms a fairly free-form JavaScript web app layout into Force.com resources.
 * A set of text replacements is applied to .css, .js and .html files,
 * e.g. to fix resource references and make the necessary name changes.
 * Any .html files are converted into individual Force.com pages.
 * Other files are all added to a single zip file static resource called appzip.
 */
public class WebToSf extends Task {

    private static final String LF = System.getProperty("line.separator");

    public static class Filter {

        private String token;
        private String value;

        public void setToken(String token) {
            this.token = token;
        }

        public void setValue(String value) {
            this.value = value;
        }
    }

    // Root to read from and write to
    private File fromDir;
    private File toDir;

    // Files that go into the named zip
    private List<FileSet> zipContents = new ArrayList<FileSet>();

    // Replacements
    private List<Filter> filters = new ArrayList<Filter>();

    public void setFromDir(File fromDir) {
        this.fromDir = fromDir;
    }

    public void setToDir(File toDir) {
        this.toDir = toDir;
    }

    public void addFileset(FileSet zipContent) {
        zipContents.add(zipContent);
    }

    public Filter createFilter() {
        Filter filter = new Filter();
        filters.add(filter);
        return filter;
    }

    public void execute() {

        if (fromDir == null) {
            throw new BuildException("fromdir must be set");
        }
        if (!fromDir.exists()) {
            throw new BuildException("fromdir " + fromDir.getAbsolutePath() + " does not exist");
        }
        if (toDir == null) {
            throw new BuildException("todir must be set");
        }
        for (int i = 0; i < filters.size(); i++) {
            if (filters.get(i).token == null) {
                throw new BuildException("token missing from filter index " + i);
            }
        }
        for (int i = 0; i < filters.size(); i++) {
            if (filters.get(i).value == null) {
                throw new BuildException("value missing from filter index " + i);
            }
        }

        try {
            zip();
            pages(fromDir);
        } catch (Exception e) {
            throw new BuildException(e);
        }
    }

    private void zip() throws Exception {
        
        File staticresources = new File(toDir, "staticresources");
        if (!staticresources.exists()) {
            staticresources.mkdirs();
        }

        // Data
        ZipOutputStream zos = new ZipOutputStream(new BufferedOutputStream(
                new FileOutputStream(new File(staticresources, "appzip" + ".resource"))));
        try {
            for (FileSet fs : zipContents) {
                
                if (!fs.isFilesystemOnly()) {
                    throw new BuildException("only filesystem flesets supported");
                }
                
                DirectoryScanner ds = fs.getDirectoryScanner(getProject());
                File baseDir = getProject().getBaseDir();
                File fsDir = fs.getDir(getProject());

                // Keep path as folders
                String path = "";
                for (String part : pathDifference(baseDir, fsDir)) {
                    path += part;
                    path += "/";
                }
                
                for (String fsName : ds.getIncludedFiles()) {
                    
                    String newName = path + fsName;
                    
                    log("zipping dir=" + fsDir + " file=" + fsName + " to=" + newName,
                            Project.MSG_INFO);

                    zos.putNextEntry(new ZipEntry(newName));
                    if (isText(fsName)) {
                        // Replace
                        BufferedReader r = new BufferedReader(new InputStreamReader(
                                new FileInputStream(new File(fsDir, fsName))));
                        try {
                            String line;
                            while ((line = r.readLine()) != null) {
                                zos.write(replace(line).getBytes());
                                zos.write(LF.getBytes());
                            }
                            zos.closeEntry();
                        } finally {
                            r.close();
                        }
                    } else {
                        // Just byte for byte copy
                        BufferedInputStream is = new BufferedInputStream(
                                new FileInputStream(new File(fsDir, fsName)));
                        try {
                            byte[] buf = new byte[4092];
                            int len;
                            while ((len = is.read(buf)) != -1) {
                                zos.write(buf, 0, len);
                            }
                        } finally {
                            is.close();
                        }
                    }
                }
            }
        } finally {
            zos.close();
        }
        
        // Meta
        BufferedWriter ww = new BufferedWriter(new FileWriter(
                new File(staticresources, "appzip" + ".resource-meta.xml")));
        try {
            ww.write(""
                    + "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" + LF
                    + "<StaticResource xmlns=\"http://soap.sforce.com/2006/04/metadata\">" + LF
                    + "    <cacheControl>Public</cacheControl>" + LF
                    + "    <contentType>application/zip</contentType>" + LF
                    + "</StaticResource>"
                    );
        } finally {
            ww.close();
        }
    }

    private boolean isText(String name) {
        
        String lc = name.toLowerCase();
        return lc.endsWith(".css") || lc.endsWith(".js");
    }

    private void pages(File dir) throws Exception {

        File pages = new File(toDir, "pages");
        if (!pages.exists()) {
            pages.mkdirs();
        }

        for (File f : dir.listFiles()) {
            if (f.isDirectory()) {
                pages(f);
            } else {
                if (isHtml(f.getName())) {
                    page(pages, f);
                }
            }
        }
    }

    private boolean isHtml(String name) {
        
        String lc = name.toLowerCase();
        return lc.endsWith(".html");
    }

    private void page(File pages, File f) throws Exception {
        
        // Prepend path
        String name = "";
        for (String path : pathDifference(getProject().getBaseDir(), f.getParentFile())) {
            name += cleanName(path);
        }
        name += cleanName(removeSuffix(f.getName()));
        
        File to = new File(pages, name + ".page");
        File toMeta = new File(pages, name + ".page-meta.xml");
        log("transforming page file=" + f + " to=" + to, Project.MSG_INFO);

        BufferedReader r = new BufferedReader(new FileReader(f));
        try {

            // Data
            BufferedWriter w = new BufferedWriter(new FileWriter(to));
            try {
                w.write("<apex:page showHeader=\"false\" sidebar=\"false\""
                        + " standardStylesheets=\"false\""
                        + " applyHtmlTag=\"false\">" + LF + LF);
                String line;
                while ((line = r.readLine()) != null) {
                    w.write(replace(line));
                    w.write(LF);
                }
                w.write(LF + "</apex:page>");
            } finally {
                w.close();
            }

            // Meta
            BufferedWriter ww = new BufferedWriter(new FileWriter(toMeta));
            try {
                ww.write("" + "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" + LF
                        + "<StaticResource xmlns=\"http://soap.sforce.com/2006/04/metadata\">" + LF
                        + "    <apiVersion>29.0</apiVersion>" + LF
                        + "    <label>" + name + "</label>" + LF
                        + "</StaticResource>"
                        );
            } finally {
                ww.close();
            }
        } finally {
            r.close();
        }
    }

    private String replace(String line) {
        
        for (Filter f : filters) {
            if (line.contains(f.token)) {
                log("... replacing " + f.token + " in line " + line, Project.MSG_INFO);
                line = line.replace(f.token, f.value);
            }
        }
        return line;
    }

    private String cleanName(String name) {

        StringBuffer sb = new StringBuffer();
        for (int i = 0; i < name.length(); i++) {
            char c = name.charAt(i);
            if (Character.isLetter(c) || Character.isDigit(c) || c == '_') {
                sb.append(c);
            }
        }
        return sb.toString();
    }
    
    private List<String> pathDifference(File baseDir, File subDir) {
        
        List<String> parts = new ArrayList<String>();
        for (File f = subDir; !baseDir.equals(f); f = f.getParentFile()) {
            parts.add(f.getName());
        }
        Collections.reverse(parts);
        return parts;
    }
    
    private String removeSuffix(String name) {
        
        int index = name.lastIndexOf('.');
        return index != -1 ? name.substring(0, index) : name;
    }
}

Ant task that automates the installation of managed packages

PS Summer ’13 looks like it adds metadata support for managed packages, so check that out first…

At ClaimVantage we deploy using a mixture of managed packages and source code. The development process for a customer can involve up to this many orgs:

  • one (or more) developer edition orgs that developers create the custom code in
  • a build org that the Jenkins continuous integration server uses to verify that the code committed to the version control system is deployable, including 100% passing tests
  • a QA sandbox org that the customer’s QA team test in
  • a UAT sandbox org that the customer’s business users test in
  • the customer’s production org

And as we typically work in two-week iterations, many deployments get done to each of these orgs over a few months of work for a customer.

The source code can be pushed into orgs using the Ant tools provided by Salesforce. While it is not quick – typically it takes a few minutes – it requires no attention once started so you can get on with other work and check the outcome later.

But managed package installation unfortunately involves a multiple-page wizard with slow transitions between the pages even before the final asynchronous step. And Salesforce have not provided an API or any other tooling to allow this tedious button clicking to be eliminated.

So I am very grateful that my colleague Sinead Coyle has created an Ant task that automates the clicking via Selenium. She is now sharing this in the Google Code project force-managed-package-installer. It allows the managed package installation or upgrade to be just another step in the Ant script ahead of the source code deployment.

When runAllTests="false" actually means runAllTests="true"

After several deployments into test environments, we deployed into a customer’s production environment yesterday. One of the deployment steps after installing some managed packages is to push several profiles into the target org using the Ant deploy task with a package.xml that just includes the profiles. It was an unwelcome surprise that all the unit tests in the production org ran. These are tests written by a third party that could have had dangerous side effects; in previous deployments this had not happened, and the runAllTests="false" setting in the Ant script suggested it should not.
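
For context, the profiles-only deployment used a package.xml along these lines (the member names and API version here are just illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>Admin</members>
        <members>Standard</members>
        <name>Profile</name>
    </types>
    <version>28.0</version>
</Package>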

The explanation is in the Ant deploy runAllTests documentation:

This parameter is ignored when deploying to a Salesforce production organization. Every unit test in your organization namespace is executed.

Whatever the motivation for this behavior, I suggest that returning an error message (containing this text) when runAllTests="false" is specified for a production org would be a better way to handle the situation than silently ignoring the attribute and running the tests.

Quick summary of how to get started with the Metadata WSDL API via Java’s JAXB

The Metadata API is used to manipulate customization information in an org – including Apex classes and Visualforce pages – by code such as the Force.com IDE. I wanted to create an Ant task to automate the creation of a largish number of static resources from files via this API, so, following on from the Quick summary of how to get started with the Enterprise WSDL API via Java’s JAXB, here are a few extra things I discovered relating to the Metadata API.

The first is that the Metadata WSDL obtained from Setup -> Develop -> API -> Generate Metadata WSDL had a schema element that was missing the xmlns="http://www.w3.org/2001/XMLSchema" attribute that puts it in the correct namespace, so this had to be added manually before the JAXB code generation would work. Also, as in the previous post, a JAXB customization file with the targetNamespace of its bindings element set to "http://soap.sforce.com/2006/04/metadata" needed to be created.

The second is that you need classes generated from the Partner WSDL or the Enterprise WSDL to be able to do the login and obtain the session ID that is then set in the session header for the Metadata WSDL calls. I ended up with a project containing code generated from both the Partner and Metadata WSDLs to be able to do this. A bit of care is then needed to use the classes from the correct package.

Lastly, when you cut and paste code that relates to the Partner or Enterprise WSDL, the endpoint you take from the login response may accidentally end up being loginResponse.getResult().getServerUrl() instead of the one needed in the Metadata WSDL case, loginResponse.getResult().getMetadataServerUrl().

And that is it if a “quick summary” is what you are interested in. If you want to see some sample code read on…

This example of how to use the Metadata API is obfuscated a little by the code being in the form of an Ant task. All you need to know to understand the Ant part is that the entry point is the execute method and attributes that have set methods can be configured via XML from a build file.

This base class handles the login and connection leaving the specifics of Metadata API or Partner API operations to extending classes:

package com.claimvantage.ant;

import java.net.URL;

import javax.xml.namespace.QName;
import javax.xml.ws.BindingProvider;

import org.apache.tools.ant.BuildException;
import org.apache.tools.ant.Task;

import com.sforce.soap.metadata.MetadataPortType;
import com.sforce.soap.metadata.MetadataService;
import com.sforce.soap.partner.Login;
import com.sforce.soap.partner.LoginResponse;
import com.sforce.soap.partner.SforceService;
import com.sforce.soap.partner.Soap;

public abstract class ForceApiTaskBase extends Task {
    
    private String un;
    private String pw;
    
    private LoginResponse loginResponse;
    
    private MetadataPortType port;
    private com.sforce.soap.metadata.SessionHeader metaDataSessionHeader;
    
    private Soap soap;
    private com.sforce.soap.partner.SessionHeader partnerSessionHeader;

    public String getUn() {
        return un;
    }

    public void setUn(String un) {
        this.un = un;
    }

    public String getPw() {
        return pw;
    }

    public void setPw(String pw) {
        this.pw = pw;
    }

    public void execute() throws BuildException {
        try {
            doExecute();
        } catch (BuildException e) {
            throw e;
        } catch (Exception e) {
            throw new BuildException(e);
        }
    }
    
    protected abstract void doExecute() throws Exception;
    
    protected MetadataPortType getMetadataPortType() throws Exception {
        createMetadataSession();
        return port;
    }
    
    protected com.sforce.soap.metadata.SessionHeader getMetadataSessionHeader() throws Exception {
        createMetadataSession();
        return metaDataSessionHeader;
    }
    
    protected Soap getPartnerPortType() throws Exception {
        createPartnerSession();
        return soap;
    }
    
    protected com.sforce.soap.partner.SessionHeader getPartnerSessionHeader() throws Exception {
        createPartnerSession();
        return partnerSessionHeader;
    }
    
    private void createMetadataSession() throws Exception {
        
        if (port == null) {
            
            // Login done here
            createPartnerSession();
            
            MetadataService service = new MetadataService(getUrl("metadata.wsdl"), new QName("http://soap.sforce.com/2006/04/metadata", "MetadataService"));
            port = service.getMetadata();
            
            BindingProvider b = (BindingProvider) port;
            b.getRequestContext().put(BindingProvider.ENDPOINT_ADDRESS_PROPERTY, loginResponse.getResult().getMetadataServerUrl());
            
            metaDataSessionHeader = new com.sforce.soap.metadata.SessionHeader();
            metaDataSessionHeader.setSessionId(loginResponse.getResult().getSessionId());
        }
    }

    private void createPartnerSession() throws Exception {
        
        if (loginResponse == null) {
            
            SforceService service = new SforceService(getUrl("partner.wsdl"), new QName("urn:partner.soap.sforce.com", "SforceService"));
            soap = service.getSoap();
            
            // Login
            log("connecting as user " + getUn());
            Login login = new Login();
            login.setUsername(getUn());
            login.setPassword(getPw());
            
            loginResponse = soap.login(login, null, null);
            
            BindingProvider b = (BindingProvider) soap;
            b.getRequestContext().put(BindingProvider.ENDPOINT_ADDRESS_PROPERTY, loginResponse.getResult().getServerUrl());
            
            partnerSessionHeader = new com.sforce.soap.partner.SessionHeader();
            partnerSessionHeader.setSessionId(loginResponse.getResult().getSessionId());
        }
    }
    
    private URL getUrl(String name) throws Exception {
        
        // WSDL must be packaged with code
        URL url = getClass().getResource(name);
        if (url != null) {
            return url;
        } else {
            throw new Exception("Could not access WSDL from " + name);
        }
    }
}

And here is an example of an Ant task that makes use of the Metadata API. (The polling is needed to deal with the asynchronous nature of the API.) The Ant task transfers a set of files to an org as static resources:

package com.claimvantage.ant;

import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

import org.apache.tools.ant.BuildException;
import org.apache.tools.ant.Project;

import com.sforce.soap.metadata.AsyncRequestStateType;
import com.sforce.soap.metadata.AsyncResultType;
import com.sforce.soap.metadata.CheckStatus;
import com.sforce.soap.metadata.Create;
import com.sforce.soap.metadata.Delete;
import com.sforce.soap.metadata.StaticResourceCacheControlType;
import com.sforce.soap.metadata.StaticResourceType;

/**
 * Transfer files into an org as static resources.
 */
public class LoadStaticResources extends ForceApiTaskBase {
    
    // Text between this and the extension (if present) used as the description and stripped from the resource name
    private static final String DESCRIPTION_SEPARATOR = "-";
    
    // Text after this stripped
    private static final String EXTENSION_SEPARATOR = ".";
    
    private String contentType;
    private File folder;

    public void setContentType(String contentType) {
        this.contentType = contentType;
    }

    // All files in this folder pushed to server
    public void setFolder(File folder) {
        this.folder = folder;
    }

    protected void doExecute() throws Exception {
        
        if (contentType == null) {
            throw new BuildException("contentType must be set");
        }
        if (folder == null) {
            throw new BuildException("folder must be set");
        }
        if (!folder.isDirectory()) {
            throw new BuildException("folder must be a directory");
        }
        
        File[] filesArray = folder.listFiles();
        if (filesArray == null || filesArray.length == 0) {
            log(folder + " is empty");
            return;
        }
        List<File> files = Arrays.asList(filesArray);

        // API limits to this many at a time
        final int max = 10;
        for (int batch = 0; true; batch++) {
            
            int from = max * batch;
            int to = Math.min(files.size(), from + max);
            
            doBatch(files.subList(from, to));

            if (to >= files.size()) {
                break;
            }
        }
        log("Done");
    }
    
    private void doBatch(List<File> files) throws Exception {
        
        Delete delete = new Delete();
        for (File file : files) {
            StaticResourceType sr = new StaticResourceType();
            sr.setFullName(resourceName(file.getName()));
            delete.getMetadata().add(sr);
        }
        waitForCompletion(files, getMetadataPortType().delete(delete, getMetadataSessionHeader(), null).getResult(), "delete");
        
        Create create = new Create();
        for (File file : files) {
            StaticResourceType sr = new StaticResourceType();
            sr.setFullName(resourceName(file.getName()));
            sr.setDescription(resourceDescription(file.getName()));
            sr.setContent(read(file));
            sr.setContentType(contentType);
            sr.setCacheControl(StaticResourceCacheControlType.PRIVATE);
            create.getMetadata().add(sr);
        }
        waitForCompletion(files, getMetadataPortType().create(create, getMetadataSessionHeader(), null).getResult(), "create");
    }
    
    private void waitForCompletion(List<File> files, List<AsyncResultType> results, String operation) throws Exception {
        
        CheckStatus check = new CheckStatus();
        for (AsyncResultType result : results) {
            check.getAsyncProcessId().add(result.getId());
        }

        int sleepSeconds = 1;
        boolean notAllDone = true;
        while(notAllDone) {
            notAllDone = false;

            log("waiting for async result - sleeping for " + sleepSeconds + " s");
            Thread.sleep(sleepSeconds * 1000L);
            if (sleepSeconds < 64) {
                sleepSeconds *= 2;
            }

            int i = 0;
            for (AsyncResultType result : getMetadataPortType().checkStatus(check, getMetadataSessionHeader(), null).getResult()) {
                int level = result.getState() == AsyncRequestStateType.ERROR ? Project.MSG_ERR : Project.MSG_INFO;
                log(operation + " " + result.getState() + " " + files.get(i).getName() + " " + (result.getMessage() != null ? result.getMessage() : ""), level);
                if (result.getState() == AsyncRequestStateType.QUEUED || result.getState() == AsyncRequestStateType.IN_PROGRESS) {
                    notAllDone = true;
                }
                i++;
            }
        }
    }

    private String resourceName(String s) {
        
        int slash = s.indexOf(DESCRIPTION_SEPARATOR);
        int dot = s.lastIndexOf(EXTENSION_SEPARATOR);
        
        if (slash == -1) {
            slash = s.length();
        }
        if (dot == -1) {
            dot = s.length();
        }
        return s.substring(0, Math.min(slash, dot));
    }
    
    private String resourceDescription(String s) {
        
        int slash = s.indexOf(DESCRIPTION_SEPARATOR);
        int dot = s.lastIndexOf(EXTENSION_SEPARATOR);
        
        if (slash == -1) {
            return "";
        }
        if (dot == -1) {
            dot = s.length();
        }
        
        return s.substring(slash + DESCRIPTION_SEPARATOR.length(), dot);
    }
    
    private byte[] read(File file) throws IOException {
        
        log("reading " + file);
        
        // This is assuming that the data is < 5M bytes as that is the Force.com limit
        byte[] buffer = new byte[5 * 1024 * 1024];
        BufferedInputStream is = new BufferedInputStream(new FileInputStream(file));
        int size = 0;
        try {
            // Loop because a single read call is not guaranteed to return the whole file
            int len;
            while (size < buffer.length
                    && (len = is.read(buffer, size, buffer.length - size)) != -1) {
                size += len;
            }
        } finally {
            is.close();
        }
        
        byte[] result = new byte[size];
        System.arraycopy(buffer, 0, result, 0, size);
        return result;
    }
}

An Ant task that required the Partner API would just call the getPartnerPortType/getPartnerSessionHeader methods instead of the getMetadataPortType/getMetadataSessionHeader methods.
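
For completeness, the task can be wired into a build file something like this (a sketch: the jar name, folder and property names are assumptions, and the jar is assumed to also contain the generated WSDL classes plus the partner.wsdl and metadata.wsdl files, since getUrl loads them from the classpath; Ant attribute names are case-insensitive, so contenttype maps to setContentType):

<taskdef name="loadstaticresources"
        classname="com.claimvantage.ant.LoadStaticResources"
        classpath="ant/force-api-tasks.jar"/>
<target name="loadResources">
    <loadstaticresources
            un="${sf.username}"
            pw="${sf.password}"
            contenttype="application/zip"
            folder="resources"/>
</target>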

Identifying Apex tests to run using wildcards in sf:deploy

When developing code (including tests) that builds on a managed package, it is typically necessary to exclude the managed package’s tests from the test runs. For example, the new code might add an additional data constraint that causes a managed package test to fail because its data setup violates the constraint. (There is also the secondary issue that the managed package tests may slow the development cycle if they take many minutes to run.)

The sf:deploy Ant task supports a flag to run all tests, or alternatively accepts the names of the test classes to run. If the number of tests involved is small, the latter option works fine. But with a large number of tests it becomes tedious and error-prone to maintain the list. This is the same problem that Java projects using the JUnit Ant task face, and there the solution is to let the set of tests that are run be driven by file name matches in the source tree via the batchtest nested element.

I’ve added this mechanism to the DeployWithXmlReportTask (an extension of sf:deploy) that is available in the force-deploy-with-xml-report-task Google Code project. Below is an example of how to use it in a project where all test classes follow the convention of having their file names end in “Test”.

<path id="ant.additions.classpath">
    <fileset dir="ant"/>
</path>
<target name="deployAndTestAndReport">
    <taskdef
        name="sfdeploy"
        classname="com.claimvantage.force.ant.DeployWithXmlReportTask"
        classpathref="ant.additions.classpath"
        />
    <delete dir="test-report-xml" quiet="true"/>
    <echo message="deploying to ${sf.username}"/>
    <sfdeploy
            username="${sf.username}"
            password="${sf.password}"
            serverurl="${sf.serverurl}"
            deployRoot="src"
            runalltests="false"
            maxpoll="60"
            junitreportdir="test-report-xml"
            >
        <!-- Run all tests that match the include file name pattern (so avoiding running managed package tests) -->
        <batchtest>
            <fileset dir="src/classes">
                <include name="*Test.cls"/>
            </fileset>
        </batchtest>
    </sfdeploy>
</target>

Scripting the Apex Data Loader via Ant

The Apex Data Loader can be scripted using Spring bean XML definitions. Where the configuration needs to vary for each object, that approach makes sense. But I needed to (repeatedly) export many different objects with largely the same configuration, and so wanted to drive the process from Ant and keep the configuration there.

The resulting Ant build.xml file is shown here, where a macro is used that accepts a file name, an object name and a SOQL string, making it simple to add additional object exports as needed:

<?xml version="1.0" encoding="UTF-8"?>
<project name="Export" default="all">
    <macrodef name="export">
        <attribute name="file"/>
        <attribute name="object"/>
        <attribute name="soql"/>
        <sequential>
            <echo message="Exporting @{object}"/>
            <mkdir dir="exports"/>
            <copy file="config/template-process-conf.xml" tofile="config/process-conf.xml" overwrite="true" failonerror="true"/>
            <replace file="config/process-conf.xml">
                <replacefilter token="_object_" value="@{object}"/>
                <replacefilter token="_soql_" value="@{soql}"/>
                <replacefilter token="_file_" value="exports/@{file}.csv"/>
            </replace>
            <java classname="com.salesforce.dataloader.process.ProcessRunner" classpath="lib/DataLoader.jar" failonerror="true">
                <sysproperty key="salesforce.config.dir" value="config"/>
                <arg line="process.name=@{object}"/>
            </java>
        </sequential>
    </macrodef>
    <target name="all">
        <export file="SampleAccounts" object="Account" soql="Select Id, Name, ... From Account"/>
        <export file="SampleContacts" object="Contact" soql="Select Id, Name, ... From Contact"/>
        ...
    </target>
</project>

The Ant macro is replacing tokens in this template-process-conf.xml file to produce a process-conf.xml file for each object export:

<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
    <bean id="_object_" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
        <description>TemplatedCsvExtract extracts to a CSV file.</description>
        <property name="name" value="TemplatedCsvExtract"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.endpoint" value="..."/>
                <entry key="sfdc.username" value="..."/>
                <entry key="sfdc.password" value="..."/>
                <entry key="sfdc.debugMessages" value="false"/>
                <entry key="sfdc.timeoutSecs" value="600"/>
                <entry key="sfdc.extractionRequestSize" value="500"/>
                <entry key="process.operation" value="extract"/>
                <entry key="dataAccess.type" value="csvWrite"/>
                <entry key="sfdc.entity" value="_object_"/>
                <entry key="sfdc.extractionSOQL" value="_soql_"/>
                <entry key="dataAccess.name" value="_file_"/>
            </map>
        </property>
    </bean>
</beans>

The directories required are:

  • “config” containing the template-process-conf.xml file, an empty config.properties file and a log-conf.xml file
  • “lib” containing a copy of DataLoader.jar so that Ant can invoke its com.salesforce.dataloader.process.ProcessRunner class

The output CSV files are written to the “exports” folder.

PS I was working with this again recently on a different machine from the one the encrypt utility was run on. So as well as setting “sfdc.password” to the encrypted version of the password and security token, I also needed to set “process.encryptionKeyFile” to reference a file containing the key used to encrypt the password.
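
In configOverrideMap terms that means two extra entries, something like this (the key file path is just an example):

<entry key="sfdc.password" value="encrypted password plus security token"/>
<entry key="process.encryptionKeyFile" value="config/key.txt"/>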