Sharing code and components across managed packages

If you are creating multiple managed packages and want to re-use some code and components in several of them there is no simple solution. (The approach you might take in the Java world of creating a JAR file that contains many related classes that you use in multiple applications is not available.)

You can put the components in a separate managed package, but that is a coarse-grained approach and has its own set of problems. This post outlines how to use svn:externals (yes this is SVN; I’m unsure about Git) to add shared components into the source tree, so the shared components just become part of each managed package. They pick up the namespace as part of the normal packaging process.

So in the version control system you have an extra project that contains the components you want to share:

  • SharedComponents
  • ManagedPackage1
  • ManagedPackage2

“SharedComponents” can only have dependencies on the core platform; it cannot have dependencies on anything in your managed packages.

Then in the managed package projects, add external file definitions to the svn:externals property of the src folder:

classes/SharedSms.cls              https://.../src/classes/SharedSms.cls
classes/SharedSms.cls-meta.xml     https://.../src/classes/SharedSms.cls-meta.xml
classes/SharedSmsTest.cls          https://.../src/classes/SharedSmsTest.cls
classes/SharedSmsTest.cls-meta.xml https://.../src/classes/SharedSmsTest.cls-meta.xml

where https://... represents the SVN path to the “SharedComponents” project. In this example there are just two classes, but each managed package can opt into as few or as many of the components as it needs. The purpose of the “Shared” prefix is to make it clearer in the managed package source where the components come from. (IDEs like Eclipse also decorate the icons of svn:externals files to distinguish them.)
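Setting the property is a one-off step; it looks something like the following sketch, assuming the definitions above have been saved in a file called externals.txt (the file name is arbitrary):

# attach the externals definitions to the src folder of the managed package project
svn propset svn:externals -F externals.txt src

# commit the property change, then update to pull in the shared files
svn commit -m "Add SharedComponents classes via svn:externals" src
svn update src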

Once the svn:externals definition is in place, an SVN update automatically pulls content from both locations. You need to be using at least version 1.6 of SVN; our Jenkins (Continuous Integration server) was set to use version 1.4 by default so that had to be changed to get the builds to work.

Discipline is needed when modifying components in “SharedComponents” to not break any managed package code that depends on them. Running Continuous Integration builds on all the projects will help give early warning of such problems.

Breaking managed package dependencies

We have several managed packages, with customers sometimes installing just one of them and other times several of them (depending on the set of features they want). Calls can be needed between the packages: global interfaces and classes defined in one package – let’s call it B – are called from another package – let’s call it A.

But once the direct calls are added and new managed package versions created, A cannot be installed without B first being installed because the platform’s package dependency approach is rigid and enforced at installation time. This dependency (illustrated with UML dependency notation) is not what we want:

[Image: “Dependencies” – managed package A depends directly on managed package B]

So how do we allow managed package A to call managed package B without the fixed dependency? The trick is to have no compile-time dependency between A and B, and instead to introduce a third entity C (which could be another managed package or non-namespaced local code) that the calls are made through, and for which a dependency on both A and B is not a problem:

[Image: “Broken” – C depends on both A and B, while A and B remain independent of each other]

Here is an example of the pattern. The API used to illustrate the pattern has a single method to send an SMS (text) message.

The implementation (that we want to use from other packages) is in package B:

global interface Sms {
    global void send(String number, String message);
}

global class Factory {
    global static Sms createSms() {...}
}

In package A the interface is duplicated together with a mechanism to register a type that implements the interface. The package A code references only this interface and class:

global interface Sms {
    global void send(String number, String message);
}

global class Factory {
    global static Sms createSms() {
        // Use name of type from custom setting
        String s = ...;
        Type t = Type.forName(s);
        return (Sms) t.newInstance();
    }
    global static void registerSmsType(Type t) {
        // Store name of type in a custom setting
    }
}

...
    Factory.createSms().send('38383', 'PREZ');
...
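To make the registration mechanism concrete, the elided parts of A’s Factory might be filled in along the lines of the sketch below. It is only a sketch: SmsSettings__c (a list custom setting with a text field TypeName__c) and the ‘SmsType’ record name are invented for illustration, and in a real package the namespace of the implementing class and error handling for a missing registration would also need attention.

global class Factory {
    global static Sms createSms() {
        // Look up the registered implementation class by name and instantiate it
        SmsSettings__c settings = SmsSettings__c.getValues('SmsType');
        Type t = Type.forName(settings.TypeName__c);
        return (Sms) t.newInstance();
    }
    global static void registerSmsType(Type t) {
        // Persist the type name in a (hypothetical) list custom setting record
        List<SmsSettings__c> existing = [SELECT Id FROM SmsSettings__c WHERE Name = 'SmsType'];
        SmsSettings__c settings = existing.isEmpty()
                ? new SmsSettings__c(Name = 'SmsType')
                : existing[0];
        settings.TypeName__c = t.getName();
        upsert settings;
    }
}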

Then in C, a class is implemented that has the signature defined in A and delegates to B to do the work. At some point this class name must be registered with A: if C is a managed package that could be done in an InstallHandler, or it could be a manual configuration step:

global class Sms implements A.Sms {
    global void send(String number, String message) {
        B.Factory.createSms().send(number, message);
    }
}

...
    A.Factory.registerSmsType(Sms.class);
...

So A and B remain independent and C is the “glue” that connects them together. A can be installed on its own and so can B. If they are both installed, then adding the C managed package or non-namespaced local code allows the calls between the packages to be made.

PS

Stephen Wilcox’s Apex Calls Between Independent Packages describes the same pattern.

Salesforce Packaged Modules – what do you think?

If you have worked on server-side JavaScript you will be familiar with the NPM (Node Packaged Modules) site. The Node.js community has managed to corral the loose world of JavaScript into some 74,000 packages that have been downloaded many millions of times. When I wanted an HTTP client it took me just a few minutes to find one in NPM that I liked, and it has been working fine ever since.

In comparison, I know of relatively few libraries of Apex code, and one of the oldest, apex-lang, presently shows only 1100 downloads. So I assume that of the (several) billion lines of Apex that have been written, just about none of it is re-used in more than one org. So not much standing on the shoulders of giants going on.

Yes there is the managed package mechanism, but I think of managed packages as containers for substantial functionality with typically no dependency on other 3rd party managed packages. Perhaps we have all been waiting for Salesforce to introduce some lighter-weight library mechanism.

Some cool features of NPM packages (see the package.json example below) are:

  • automatic dependency chains: if you install a package that requires other packages they will also be automatically installed
  • versioning and version dependency
  • repository locations included
  • very easy to use e.g. “npm install protractor” and 30 seconds later everything you need is installed and ready to run

What might this look like for Salesforce and Apex? There would need to be an spm command-line tool that could pull from (Git) repositories and push into Salesforce orgs (or a local file system) – that seems possible. Some of the 40 characters available for names would have to be used to avoid naming collisions, and there would need to be a single name registry. The component naming convention could be something like spm_myns_MyClassName for an Apex class and spm_myns_Package for the package description static resource (that would contain JSON). Somehow, say, 90% test coverage would need to be mandatory. Without the managed package facility of source code being hidden it would inherently be open source (a good thing in my mind). Perhaps code should have no dependency on concrete SObject types? Perhaps only some component types should be allowed?
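Purely as a thought experiment, the JSON in such an spm_myns_Package static resource might look something like this (every name and value here is invented, loosely mirroring the package.json example further down):

{
  "name": "myns_sms",
  "version": "1.2.0",
  "description": "Send SMS messages through a pluggable provider.",
  "repository": {
    "type": "git",
    "url": "git://github.com/example/spm-myns-sms.git"
  },
  "dependencies": {
    "otherns_http": "~0.3.0"
  }
}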

To get this started – apart from the tooling and a site – some truly useful contributions would be needed to demonstrate the value.

In the company where I work we have talked about sharing source code between managed packages but have not done it in any formalised way; I wonder if systems similar to what is described here are already in use in some companies? Or are already open sourced? Comments very welcome.

For reference, here is a slightly cut down NPM package.json example:

{
  "name": "protractor",
  "description": "Webdriver E2E test wrapper for Angular.",
  "homepage": "https://github.com/angular/protractor",
  "keywords": [
    "angular",
    "test",
    "testing",
    "webdriver",
    "webdriverjs",
    "selenium"
  ],
  "author": {
    "name": "Julie Ralph",
    "email": "ju.ralph@gmail.com"
  },
  "dependencies": {
    "selenium-webdriver": "~2.39.0",
    "minijasminenode": ">=0.2.7",
    "saucelabs": "~0.1.0",
    "glob": ">=3.1.14",
    "adm-zip": ">=0.4.2",
    "optimist": "~0.6.0"
  },
  "devDependencies": {
    "expect.js": "~0.2.0",
    "chai": "~1.8.1",
    "chai-as-promised": "~4.1.0",
    "jasmine-reporters": "~0.2.1",
    "mocha": "~1.16.0",
    "express": "~3.3.4",
    "mustache": "~0.7.2"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/angular/protractor.git"
  },
  "bin": {
    "protractor": "bin/protractor",
    "webdriver-manager": "bin/webdriver-manager"
  },
  "main": "lib/protractor.js",
  "scripts": {
    "test": "node lib/cli.js spec/basicConf.js; ..."
  },
  "license": "MIT",
  "version": "0.16.1",
  "webdriverVersions": {
    "selenium": "2.39.0",
    "chromedriver": "2.8",
    "iedriver": "2.39.0"
  },
  "readme": "...",
  "readmeFilename": "README.md",
  "bugs": {
    "url": "https://github.com/angular/protractor/issues"
  },
  "_id": "protractor@0.16.1",
  "_from": "protractor@0.16.x"
}

Mapping an AngularJS client into a Force.com server

Client-side MVC frameworks such as AngularJS move most of an application into the browser. The server’s job is largely just to deliver static files – HTML, CSS, JavaScript, images – normally located in a tree of directories. And to provide a JSON-based RESTful API using HTTP GET/PUT operations to read and write model objects.

Force.com’s @RestResource Apex classes are an effective way of implementing a JSON-based RESTful API. But Force.com is by design not a general purpose web server, so mapping a typical tree of client files into the page and static resource mechanisms that are available is a bit awkward. Below is one approach to that problem.

One of the pleasures of developing this sort of client-side logic is that (at least on a Mac) all it takes to get started is to put a few files in a directory and then run “python -m SimpleHTTPServer 8000” in that directory. As you edit and add, you see the results immediately: no 10 second delays while files are copied up to a remote server. You pretty naturally end up with a directory tree, so I make this layout of the files essentially the “master”. (A downside of working this way is that browsers like Chrome block requests to servers – such as the remote RESTful API one – other than the server that the pages came from. But handily Chrome has a --disable-web-security command-line option to turn this checking off while you are developing.)

This layout can also be zipped and used pretty much directly in tools like Adobe PhoneGap Build, so your HTML 5 application can be delivered as an iOS or Android etc app.
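For reference, that local workflow amounts to just a couple of commands – a sketch for a Mac, assuming Python 2 and that Chrome is not already running (the flag is for development only):

# serve the current directory over HTTP on port 8000 (Python 2)
python -m SimpleHTTPServer 8000

# launch Chrome with same-origin checks disabled so the pages can call the remote RESTful API
open -a "Google Chrome" --args --disable-web-security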

But how to transform this master layout into Force.com pages and static resources so the entire app can be served from Force.com? The good news is that thanks to URLFOR, a static resource can be a zip file and a page can reference files and directories within that zip file. So CSS, JavaScript and images can all go into a single static resource. However, the HTML that includes references to the other objects has to go into pages as that is the context that URLFOR works in. The raw HTML also has to be wrapped in Visualforce:

<apex:page showHeader="false" sidebar="false" standardStylesheets="false" applyHtmlTag="false">
    ....
</apex:page>

and the naming flattened, as directories are not supported for pages. Visualforce will also report any violations of well-formed XML.

This re-organization of course means that references between the files also have to be adjusted. For a limited size project, the changes can be worked out by hand and treated as a set of search/replace strings (the filter elements below).
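For example, after wrapping and replacement, a page that originally referenced js/app.js and data/Acme-420x114.png might end up looking something like this (a sketch, assuming the zip static resource is named appzip as in the Ant configuration below):

<apex:page showHeader="false" sidebar="false" standardStylesheets="false" applyHtmlTag="false">
<html>
  <head>
    <!-- a reference into the zipped static resource, rewritten to use URLFOR -->
    <script src="{!URLFOR($Resource.appzip, 'js/app.js')}"></script>
  </head>
  <body>
    <img src="{!URLFOR($Resource.appzip, 'data/Acme-420x114.png')}"/>
    ...
  </body>
</html>
</apex:page>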

Doing all this transformation work by hand is obviously tedious and error-prone. So below is the source code of an Ant task that can be configured like this:

<webtosf fromdir="." todir="../EepServer/src">
	<fileset dir="css"/>
	<fileset dir="data"/>
	<fileset dir="js"/>
	<filter token="lib/bootstrap/bootstrap.min.css" value="//netdna.bootstrapcdn.com/bootstrap/3.0.0/css/bootstrap.min.css"/>
	<filter token="lib/angular/angular.min.js" value="//ajax.googleapis.com/ajax/libs/angularjs/1.2.0-rc.3/angular.min.js"/>
	<filter token="js/app.js" value="{!URLFor($Resource.appzip, 'js/app.js')}"/>
	<filter token="data/Acme-420x114.png" value="{!URLFor($Resource.appzip, 'data/Acme-420x114.png')}"/>
	<filter token="partials/login.html" value="partialslogin"/>
</webtosf>

to do all the work repeatedly and reliably.
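The task is not part of Ant itself, so it first has to be declared in the build file; something like this, with the classpath pointing at wherever the compiled class ends up (the path here is just an example):

<!-- declare the custom task (adjust the classpath to the compiled classes) -->
<taskdef name="webtosf"
         classname="com.claimvantage.ant.WebToSf"
         classpath="build/ant-classes"/>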

Feel free to use this code and change it as you like:

package com.claimvantage.ant;

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

import org.apache.tools.ant.BuildException;
import org.apache.tools.ant.DirectoryScanner;
import org.apache.tools.ant.Project;
import org.apache.tools.ant.Task;
import org.apache.tools.ant.types.FileSet;

/**
 * Transforms a fairly free-form JavaScript web app layout into Force.com resources.
 * A set of text replacements are applied to any .css, .js or .html files
 * to e.g. fix resource references and necessary name changes.
 * Any .html files are converted into individual Force.com pages.
 * Other files are all added to a single zip file static resource called appzip.
 */
public class WebToSf extends Task {

    private static final String LF = System.getProperty("line.separator");

    public static class Filter {

        private String token;
        private String value;

        public void setToken(String token) {
            this.token = token;
        }

        public void setValue(String value) {
            this.value = value;
        }
    }

    // Root to read from and write to
    private File fromDir;
    private File toDir;

    // Files that go into the named zip
    private List<FileSet> zipContents = new ArrayList<FileSet>();

    // Replacements
    private List<Filter> filters = new ArrayList<Filter>();

    public void setFromDir(File fromDir) {
        this.fromDir = fromDir;
    }

    public void setToDir(File toDir) {
        this.toDir = toDir;
    }

    public void addFileset(FileSet zipContent) {
        zipContents.add(zipContent);
    }

    public Filter createFilter() {
        Filter filter = new Filter();
        filters.add(filter);
        return filter;
    }

    public void execute() {

        if (fromDir == null) {
            throw new BuildException("fromdir must be set");
        }
        if (!fromDir.exists()) {
            throw new BuildException("fromdir " + fromDir.getAbsolutePath() + " does not exist");
        }
        if (toDir == null) {
            throw new BuildException("todir must be set");
        }
        for (int i = 0; i < filters.size(); i++) {
            if (filters.get(i).token == null) {
                throw new BuildException("token missing from filter index " + i);
            }
        }
        for (int i = 0; i < filters.size(); i++) {
            if (filters.get(i).value == null) {
                throw new BuildException("value missing from filter index " + i);
            }
        }

        try {
            zip();
            pages(fromDir);
        } catch (Exception e) {
            throw new BuildException(e);
        }
    }

    private void zip() throws Exception {
        
        File staticresources = new File(toDir, "staticresources");
        if (!staticresources.exists()) {
            staticresources.mkdirs();
        }

        // Data
        ZipOutputStream zos = new ZipOutputStream(new BufferedOutputStream(
                new FileOutputStream(new File(staticresources, "appzip" + ".resource"))));
        try {
            for (FileSet fs : zipContents) {
                
                if (!fs.isFilesystemOnly()) {
                    throw new BuildException("only filesystem flesets supported");
                }
                
                DirectoryScanner ds = fs.getDirectoryScanner(getProject());
                File baseDir = getProject().getBaseDir();
                File fsDir = fs.getDir(getProject());

                // Keep path as folders
                String path = "";
                for (String part : pathDifference(baseDir, fsDir)) {
                    path += part;
                    path += "/";
                }
                
                for (String fsName : ds.getIncludedFiles()) {
                    
                    String newName = path + fsName;
                    
                    log("zipping dir=" + fsDir + " file=" + fsName + " to=" + newName,
                            Project.MSG_INFO);

                    zos.putNextEntry(new ZipEntry(newName));
                    if (isText(fsName)) {
                        // Replace
                        BufferedReader r = new BufferedReader(new InputStreamReader(
                                new FileInputStream(new File(fsDir, fsName))));
                        try {
                            String line;
                            while ((line = r.readLine()) != null) {
                                zos.write(replace(line).getBytes());
                                zos.write(LF.getBytes());
                            }
                            zos.closeEntry();
                        } finally {
                            r.close();
                        }
                    } else {
                        // Just byte for byte copy
                        BufferedInputStream is = new BufferedInputStream(
                                new FileInputStream(new File(fsDir, fsName)));
                        try {
                            byte[] buf = new byte[4092];
                            int len;
                            while ((len = is.read(buf)) != -1) {
                                zos.write(buf, 0, len);
                            }
                        } finally {
                            is.close();
                        }
                    }
                }
            }
        } finally {
            zos.close();
        }
        
        // Meta
        BufferedWriter ww = new BufferedWriter(new FileWriter(
                new File(staticresources, "appzip" + ".resource-meta.xml")));
        try {
            ww.write(""
                    + "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" + LF
                    + "<StaticResource xmlns=\"http://soap.sforce.com/2006/04/metadata\">" + LF
                    + "    <cacheControl>Public</cacheControl>" + LF
                    + "    <contentType>application/zip</contentType>" + LF
                    + "</StaticResource>"
                    );
        } finally {
            ww.close();
        }
    }

    private boolean isText(String name) {
        
        String lc = name.toLowerCase();
        return lc.endsWith(".css") || lc.endsWith(".js");
    }

    private void pages(File dir) throws Exception {

        File pages = new File(toDir, "pages");
        if (!pages.exists()) {
            pages.mkdirs();
        }

        for (File f : dir.listFiles()) {
            if (f.isDirectory()) {
                pages(f);
            } else {
                if (isHtml(f.getName())) {
                    page(pages, f);
                }
            }
        }
    }

    private boolean isHtml(String name) {
        
        String lc = name.toLowerCase();
        return lc.endsWith(".html");
    }

    private void page(File pages, File f) throws Exception {
        
        // Prepend path
        String name = "";
        for (String path : pathDifference(getProject().getBaseDir(), f.getParentFile())) {
            name += cleanName(path);
        }
        name += cleanName(removeSuffix(f.getName()));
        
        File to = new File(pages, name + ".page");
        File toMeta = new File(pages, name + ".page-meta.xml");
        log("transforming page file=" + f + " to=" + to, Project.MSG_INFO);

        BufferedReader r = new BufferedReader(new FileReader(f));
        try {

            // Data
            BufferedWriter w = new BufferedWriter(new FileWriter(to));
            try {
                w.write("<apex:page showHeader=\"false\" sidebar=\"false\""
                        + " standardStylesheets=\"false\""
                        + " applyHtmlTag=\"false\">" + LF + LF);
                String line;
                while ((line = r.readLine()) != null) {
                    w.write(replace(line));
                    w.write(LF);
                }
                w.write(LF + "</apex:page>");
            } finally {
                w.close();
            }

            // Meta
            BufferedWriter ww = new BufferedWriter(new FileWriter(toMeta));
            try {
                ww.write("" + "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" + LF
                        + "<StaticResource xmlns=\"http://soap.sforce.com/2006/04/metadata\">" + LF
                        + "    <apiVersion>29.0</apiVersion>" + LF
                        + "    <label>" + name + "</label>" + LF
                        + "</StaticResource>"
                        );
            } finally {
                ww.close();
            }
        } finally {
            r.close();
        }
    }

    private String replace(String line) {
        
        for (Filter f : filters) {
            if (line.contains(f.token)) {
                log("... replacing " + f.token + " in line " + line, Project.MSG_INFO);
                line = line.replace(f.token, f.value);
            }
        }
        return line;
    }

    private String cleanName(String name) {

        StringBuffer sb = new StringBuffer();
        for (int i = 0; i < name.length(); i++) {
            char c = name.charAt(i);
            if (Character.isLetter(c) || Character.isDigit(c) || c == '_') {
                sb.append(c);
            }
        }
        return sb.toString();
    }
    
    private List<String> pathDifference(File baseDir, File subDir) {
        
        List<String> parts = new ArrayList<String>();
        for (File f = subDir; !baseDir.equals(f); f = f.getParentFile()) {
            parts.add(f.getName());
        }
        Collections.reverse(parts);
        return parts;
    }
    
    private String removeSuffix(String name) {
        
        int index = name.lastIndexOf('.');
        return index != -1 ? name.substring(0, index) : name;
    }
}

Managed packages can take hours to become available

The “upload” of a managed package – the creation of a new version – typically takes a few minutes to complete. But there is a second step – some sort of queued replication – that must also complete before the package version becomes available on other instances (e.g. na14 or eu2 or ap1).

Recently we’ve seen delays of up to 5 hours. The delay also applies to actions like “undeprecating” a version: a customer deployment was stalled for over 2 hours waiting for that change to propagate.

For those of us using continuous integration and the newly introduced automation for installing packages (see e.g. Andrew Fawcett’s post “Look ma, no hands!” : Automating Install and Uninstall of Packages!) these delays can result in a series of broken builds until the version becomes available.

I’ve checked with Salesforce support and they have responded that:

At present there is no SLA on the installation process as it is an async process that depends on the server availability.

which is a clear position. I guess it’s a case of “hope for the best and plan for the worst” as far as these delays are concerned.

How much harm can one line of code do?

Suppose you are developing a feature that does programmatic manipulation of sharing rules. You go ahead and set the sharing default for your CustomObject to other than “Public Read/Write” in your development org. This automatically creates the CustomObject__Share object to go with your CustomObject__c and so allows you to add a class with a signature such as this:

global class Xyz {
    global void doSomething(List<CustomObject__Share> shares) {
        // ...
    }
}

And then you go ahead and create a managed released package.

The good news is that you now have a managed package that works well for customers who want to configure sharing rules. But the bad news is that you have made sharing rules mandatory for all your customers: the package cannot be installed unless the sharing default for CustomObject is set to other than “Public Read/Write” in the org the package is deployed to.

Woops.

So you go to fix this. The signature you would like to use instead is:

global class Xyz {
    global void doSomething(List<SObject> shares) {
        // ...
    }
}

But then you realize that global method signatures can’t be changed once included in a managed released package. Using @deprecated doesn’t help in this case: it is the presence of CustomObject__Share in the signature (whether the method is marked as deprecated or not) that creates the problem. The bottom line is that you are stuck.

Woops WTF.

Be careful what you put in the signature of global methods.
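One way to follow that advice while still doing the sharing work where it is available is to keep the concrete __Share type out of compile-time code entirely and build the records dynamically. A sketch only (the helper method, field values and the lack of namespace prefix and null checks are all simplifications):

global class Xyz {
    global void doSomething(List<SObject> shares) {
        // ...
    }

    // Builds a share record without naming CustomObject__Share at compile time,
    // so this class can be packaged and installed even in orgs where the sharing
    // default is "Public Read/Write" and the __Share object does not exist
    global static SObject buildShare(Id parentId, Id userOrGroupId) {
        Schema.SObjectType shareType = Schema.getGlobalDescribe().get('CustomObject__Share');
        SObject share = shareType.newSObject();
        share.put('ParentId', parentId);
        share.put('UserOrGroupId', userOrGroupId);
        share.put('AccessLevel', 'Read');
        return share;
    }
}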