Hash code for any Apex type via System.hashCode

If you are writing your own class that you want to use in sets or as a map key, you must add equals and hashCode methods as described in Using Custom Types in Map Keys and Sets. But hashCode is usually implemented by combining the hash codes of the fields of the class and until now not all Apex types exposed a hash code value.
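For context, this is the usual shape of such a class (the class and its fields here are purely illustrative); System.hashCode makes the hashCode part easy to write whatever the field types:

```apex
public class Point {

    public Integer x;
    public Integer y;

    public Point(Integer x, Integer y) {
        this.x = x;
        this.y = y;
    }

    public Boolean equals(Object o) {
        if (o instanceof Point) {
            Point p = (Point) o;
            return x == p.x && y == p.y;
        }
        return false;
    }

    public Integer hashCode() {
        // Combine the hash codes of the fields
        return 31 * System.hashCode(x) + System.hashCode(y);
    }
}
```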

This is now addressed in Summer ’14 by the System.hashCode method. It saves a lot of work for SObjects and provides consistency for primitive types.

This test illustrates the method working:

@isTest
private class HashCodeTest {

    @isTest
    static void testString() {
        // Already had a hashCode method
        System.assertEquals('hello world'.hashCode(), System.hashCode('hello world'));
    }

    @isTest
    static void testDecimal() {
        // Wasn't available before
        System.assertEquals(3843600, System.hashCode(123.987));
        System.assertEquals(3843600, System.hashCode(123.987));
        System.assertEquals(0, System.hashCode(0));
        System.assertEquals(1, System.hashCode(1));
        System.assertEquals(2, System.hashCode(2));
        System.assertEquals(1024, System.hashCode(3.3));
        System.assertEquals(-720412809, System.hashCode(4773463427.34354345));
    }

    @isTest
    static void testSObject() {
        // Wasn't available before
        System.assertEquals(-1394890885, System.hashCode(new Contact(
                LastName = 'Doe')));
        System.assertEquals(-1394890885, System.hashCode(new Contact(
                LastName = 'Doe')));
        System.assertEquals(-1474401918, System.hashCode(new Contact(
                LastName = 'Smith')));
        System.assertEquals(744078320, System.hashCode(new Contact(
                LastName = 'Doe', FirstName = 'Jane')));
        System.assertEquals(744095627, System.hashCode(new Contact(
                LastName = 'Doe', FirstName = 'John')));
    }

    @isTest
    static void testNull() {
        try {
            System.assertEquals(0, System.hashCode(null));
        } catch (NullPointerException e) {
            // null is not accepted: an exception rather than a hash code
        }
    }
}
Thanks to Daniel Ballinger for flagging this via a comment on Expose hashCode on all Apex primitives so hashCode viable for custom classes.

How to pass a large number of selections from a standard list view to a Visualforce page

Buttons can be defined and added to an object’s standard list view and these buttons can access the selected objects:


If only a few items are selected then the IDs can be passed on to a Visualforce page like this (a “List Button” with “Display Checkboxes” checked that has behavior “Execute JavaScript” and content source “OnClickJavaScript”):

var ids = {!GETRECORDIDS($ObjectType.Contact)};
if (ids.length) {
    if (ids.length <= 100) {
        window.location = '/apex/Target?ids=' + ids.join(',');
    } else {
        alert('Select 100 or less');
    }
} else {
    alert('Select one or more Contacts');
}

The limit of 100 selected items is imposed because about 2,000 characters is generally considered the longest URL that works safely everywhere, and a GET requires that each 15-character ID is appended to the URL together with a delimiter (to pass the values).
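As a rough worked example (the base URL length here is an assumption), the arithmetic behind that cap looks like this:

```javascript
// Rough arithmetic behind the 100-record cap; the base URL length is an assumption
var baseLength = 50;  // e.g. 'https://na15.salesforce.com/apex/Target?ids=' is ~45 chars
var perId = 15 + 1;   // each 15-character ID plus a comma delimiter
var maxIds = Math.floor((2000 - baseLength) / perId);
console.log(maxIds);  // works out to around 120, so a cap of 100 leaves a safety margin
```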

So what if you need to support more than 100 selected objects? (The standard list view UI does allow selections to be made on multiple pages and allows a single page to show up to 200 objects.) Using a POST instead of a GET where the ID values are part of the request body rather than the URL avoids the URL length problem. Here is how to do that:

var ids = {!GETRECORDIDS($ObjectType.Contact)};
if (ids.length) {
    var form = document.createElement("form");
    form.setAttribute("method", "POST");
    form.setAttribute("action", "https://c.na15.visual.force.com/apex/Target");
    var hiddenField = document.createElement("input");
    hiddenField.setAttribute("type", "hidden");
    hiddenField.setAttribute("name", "ids");
    hiddenField.setAttribute("value", ids.join(','));
    form.appendChild(hiddenField);
    document.body.appendChild(form);
    form.submit();
} else {
    alert('Select one or more Contacts');
}

This JavaScript creates a form and posts it to the server. An important part of making this work is that the full target URL needs to be specified as described in this Get POST data via visualforce page article, otherwise the Apex controller can’t pick up the posted data via ApexPages.currentPage().getParameters(). An obscure trick to say the least.

This page:

<apex:page controller="Target">
    <apex:repeat value="{!ids}" var="id">
        <apex:outputText value="{!id}"/><br/>
    </apex:repeat>
</apex:page>

and controller:

public with sharing class Target {
    public String[] ids {
        get {
            if (ids == null) {
                String s = ApexPages.currentPage().getParameters().get('ids');
                if (s != null) {
                    ids = s.split(',');
                } else {
                    ids = new String[] {};
                }
            }
            return ids;
        }
        private set;
    }
}

can be used to demonstrate that the IDs are passed correctly for both versions of the JavaScript button.

An @RestResource Apex class that returns multiple JSON formats

The simplest way to write an @RestResource class is to return Apex objects from the @Http methods and leave it up to the platform to serialize these objects as JSON (or XML):

@RestResource(urlMapping='/example') // urlMapping value illustrative
global without sharing class ReportRest {

    public class MyInnerClass {
        public String name;
        public Integer num; // "number" is a reserved word in Apex
    }

    @HttpGet
    global static MyInnerClass get() {
        MyInnerClass instance = new MyInnerClass();
        // ... populate the instance ...
        return instance;
    }
}

This also allows tests to be written that don’t have to deserialize as they can just reference the class instances directly. But the approach imposes these limitations:

  • The response JSON is fixed and determined by the returned classes and their fields so responses that vary depending on the URL requested can’t be produced
  • Error conditions typically get handled by adding error fields to the response object rather than by returning a status code other than 200 and separate error information

Here is an alternate pattern that is a bit more work but in my experience meets the needs of client-side MVC applications (AngularJS in my case) better. The class returns two different JSON formats (depending on the part of the URL after “/report/”):

@RestResource(urlMapping='/report/*')
global without sharing class ReportRest {

    // Signals an error message that is safe to show to an end user
    public class EndUserMessageException extends Exception {}

    public class Day {
        public Date date;
        public Integer hours;
    }

    public class Employee {
        public String name;
        public Day[] approved = new Day[] {};
    }

    public class Claim {
        public String employeeName;
        public String claimNumber;
    }

    @HttpGet
    global static void get() {
        RestResponse res = RestContext.response;
        if (res == null) {
            res = new RestResponse();
            RestContext.response = res;
        }
        try {
            res.responseBody = Blob.valueOf(JSON.serialize(doGet(extractReportId())));
            res.statusCode = 200;
        } catch (EndUserMessageException e) {
            res.responseBody = Blob.valueOf(e.getMessage());
            res.statusCode = 400;
        } catch (Exception e) {
            res.responseBody = Blob.valueOf(
                    String.valueOf(e) + '\n\n' + e.getStackTraceString());
            res.statusCode = 500;
        }
    }

    private static Object doGet(String reportId) {
        if (reportId == 'ac') {
            return absenceCalendarReport();
        } else if (reportId == 'al') {
            return absenceListReport();
        } else if (reportId == 'dl') {
            return disabilityListReport();
        } else {
            throw new EndUserMessageException(reportId + ' not implemented');
        }
    }

    private static Employee[] absenceCalendarReport() {
        Employee[] employees = new Employee[] {};
        // ... query and populate ...
        return employees;
    }

    private static Claim[] absenceListReport() {
        Claim[] claims = new Claim[] {};
        // ... query and populate ...
        return claims;
    }

    private static Claim[] disabilityListReport() {
        Claim[] claims = new Claim[] {};
        // ... query and populate ...
        return claims;
    }

    private static String extractReportId() {
        String[] parts = RestContext.request.requestURI.split('\\/');
        String lastPart = parts[parts.size() - 1];
        Integer index = lastPart.indexOf('?');
        return index != -1 ? lastPart.substring(0, index) : lastPart;
    }
}

Apex classes are still used to represent the returned data, but they are explicitly serialized using a JSON.serialize call. As the overall response is being explicitly built, the returned status code can be set, allowing the client side to vary its logic depending on that status code. In this example, error information – whether intended to be shown to an end user (as signalled by the EndUserMessageException custom exception) or unexpected (and so including a stack trace) – is returned as plain text that can be directly shown to the end user.

Salesforce Packaged Modules – what do you think?

If you have worked on server-side JavaScript you will be familiar with the NPM (Node Packaged Modules) site. The Node.js community has managed to corral the loose world of JavaScript into some 74,000 packages that have been downloaded many millions of times. When I wanted an HTTP client it took me just a few minutes to find one in NPM that I liked, and it has been working fine ever since.

In comparison, I know of relatively few libraries of Apex code, and one of the oldest, apex-lang, presently shows only 1100 downloads. So I assume that of the (several) billion lines of Apex that have been written, just about none of it is re-used in more than one org. So not much standing on the shoulders of giants going on.

Yes there is the managed package mechanism, but I think of managed packages as containers for substantial functionality with typically no dependency on other 3rd party managed packages. Perhaps we have all been waiting for Salesforce to introduce some lighter-weight library mechanism.

Some cool features of NPM packages (see the package.json example below) are:

  • automatic dependency chains: if you install a package that requires other packages they will also be automatically installed
  • versioning and version dependency
  • repository locations included
  • very easy to use e.g. “npm install protractor” and 30 seconds later everything you need is installed and ready to run

What might this look like for Salesforce and Apex? There would need to be an spm command-line tool that could pull from (Git) repositories and push into Salesforce orgs (or a local file system) – seems possible. Some of the 40 characters available for names would have to be used to avoid naming collisions, and there would need to be a single name registry. The component naming convention could be something like spm_myns_MyClassName for an Apex class and spm_myns_Package for the package description static resource (that would contain JSON). Somehow, say, 90% test coverage would need to be mandatory. Without the managed package facility of source code being hidden it would inherently be open source (a good thing in my mind). Perhaps code should have no dependency on concrete SObject types? Perhaps only some component types should be allowed?
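To make this concrete, a hypothetical spm package descriptor might look like the following. Every field here is invented for illustration – no such format exists:

```json
{
  "name": "myns",
  "description": "String utilities for Apex",
  "version": "1.2.0",
  "repository": {
    "type": "git",
    "url": "git://github.com/example/spm-myns.git"
  },
  "dependencies": {
    "otherns": "~2.0.0"
  },
  "components": [
    "spm_myns_MyClassName",
    "spm_myns_MyClassNameTest"
  ]
}
```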

To get this started – apart from the tooling and a site – some truly useful contributions would be needed to demonstrate the value.

In the company where I work we have talked about sharing source code between managed packages but have not done it in any formalised way; I wonder if systems similar to what is described here are already in use in some companies? Or are already open sourced? Comments very welcome.

For reference, here is a slightly cut down NPM package.json example:

{
  "name": "protractor",
  "description": "Webdriver E2E test wrapper for Angular.",
  "homepage": "https://github.com/angular/protractor",
  "keywords": ["..."],
  "author": {
    "name": "Julie Ralph",
    "email": "ju.ralph@gmail.com"
  },
  "dependencies": {
    "selenium-webdriver": "~2.39.0",
    "minijasminenode": ">=0.2.7",
    "saucelabs": "~0.1.0",
    "glob": ">=3.1.14",
    "adm-zip": ">=0.4.2",
    "optimist": "~0.6.0"
  },
  "devDependencies": {
    "expect.js": "~0.2.0",
    "chai": "~1.8.1",
    "chai-as-promised": "~4.1.0",
    "jasmine-reporters": "~0.2.1",
    "mocha": "~1.16.0",
    "express": "~3.3.4",
    "mustache": "~0.7.2"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/angular/protractor.git"
  },
  "bin": {
    "protractor": "bin/protractor",
    "webdriver-manager": "bin/webdriver-manager"
  },
  "main": "lib/protractor.js",
  "scripts": {
    "test": "node lib/cli.js spec/basicConf.js; ..."
  },
  "license": "MIT",
  "version": "0.16.1",
  "webdriverVersions": {
    "selenium": "2.39.0",
    "chromedriver": "2.8",
    "iedriver": "2.39.0"
  },
  "readme": "...",
  "readmeFilename": "README.md",
  "bugs": {
    "url": "https://github.com/angular/protractor/issues"
  },
  "_id": "protractor@0.16.1",
  "_from": "protractor@0.16.x"
}

Serving AngularJS templates from static resources

An AngularJS app typically starts with an “index” page that loads the required JavaScript/CSS and acts as the container for the client-side processed content. The app operates by rendering various templates in response to user interactions into that container.

That “index” page is a good place to obtain information from the “Visualforce” world that can be passed to the “AngularJS” world, and so is best made a Visualforce page. (See Passing platform configuration to an AngularJS app.)

But what about the templates? Typically there are many of these. Should they also be Visualforce pages? At first sight it seems a reasonable thing to do, as the templates are “partial pages”. And Visualforce pages have fixed URLs, whereas static resources have URLs that include a timestamp, making them harder to reference in JavaScript code such as a route provider. And if you use individual static resources per template (rather than a ZIP static resource containing all the templates), each template has its own timestamp.

But provided a clear separation has been made between server-side processing and client-side processing, no Visualforce capabilities are needed for the templates. And using Visualforce pages adds complexity such as requiring profiles to be updated. So how can the static resource timestamp value be handled if static resources are used instead?

The answer is surprisingly simple: it appears that using the current (JavaScript) timestamp is enough to get the latest version. So a $routeProvider templateUrl for a static resource called “xyz_partial” is simply:

templateUrl: '/resource/' + Date.now() + '/xyz_partial'
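In practice this can be wrapped in a small helper; the function name here is my own, not part of any API:

```javascript
// Helper that builds a cache-busting static resource URL (the function name is my own)
function resourceUrl(resourceName) {
    return '/resource/' + Date.now() + '/' + resourceName;
}

// In a $routeProvider configuration this would be used as e.g.:
//   templateUrl: resourceUrl('xyz_partial')
var url = resourceUrl('xyz_partial');
```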

You can see this pattern applied in this (quite new) Salesforce AngularJS sample application created by Pat Patterson.

PS As David Esposito comments, where there are only a small number of resource references, it is arguably cleaner to not use this timestamp approach.

Passing platform configuration to an AngularJS app

Running a JavaScript client-side MVC app such as an AngularJS app in Salesforce presents the problem of how to obtain configuration information from the platform. Most of the app is best located in a static resource zip file as server-side Visualforce processing isn’t needed. Using relative URLs between the various files in the zip then avoids any dependency on the absolute URL of the zip. (That absolute URL includes a timestamp and also a namespace prefix if a managed package is involved so the fewer references to it the better.)

But there are still a few configuration parameters that are easiest to obtain using Visualforce. The index Visualforce page – that dynamic page content is inserted into – is a good single place to obtain that information and make it available to the rest of the app through JavaScript via Angular’s constant mechanism:

<apex:page showHeader="false" sidebar="false"
        standardStylesheets="false" applyHtmlTag="false">
<html lang="en" ng-app="eepApp" ng-controller="AppController">

<script src="{!URLFor($Resource.appzip, 'lib/angular/angular.min.js')}"></script>
<script src="{!URLFor($Resource.appzip, 'js/app.js')}"></script>
<script src="{!URLFor($Resource.appzip, 'js/controllers.js')}"></script>
<script src="{!URLFor($Resource.appzip, 'js/filters.js')}"></script>
<script src="{!URLFor($Resource.appzip, 'js/services.js')}"></script>

<script>
(function() {
    var parts = '{! $CurrentPage.Name }'.split('__');
    var namespace = parts.length == 2 ? parts[0] : null;
    var restPrefix = '{! $Site.CurrentSiteUrl }services/apexrest'
            + (namespace ? '/' + namespace : '');
    var pagePrefix = 'https://{! $Site.Domain }';
    var serverUrls = {
        namespacePrefix: namespace ? namespace + '__' : '',
        configRest: restPrefix + '/eep/config',
        employeesRest: restPrefix + '/eep/employees',
        metaRest: restPrefix + '/eep/meta',
        loginPage: pagePrefix + '{! $Page.Login }',
        logoutPage: pagePrefix + '{! $Page.Logout }'
    };
    console.log('serverUrls=' + JSON.stringify(serverUrls));
    // This configures the Angular app (declared in app.js)
    eepApp.constant('ServerUrls', serverUrls);
})();
</script>

<!-- page body (the AngularJS container markup) goes here -->

</html>
</apex:page>

With this setup, any service or controller that needs to reference one of the configuration values just declares a dependency on the ServerUrls object and references the values from that. The result is a clean separation of concerns.
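The URL derivation that the page script performs can be sketched as a plain function (the page name, site URL, and function name here are all hypothetical):

```javascript
// Plain-JS sketch of the namespace/URL derivation done in the index page script
// (the page name and site URL arguments here are hypothetical)
function buildServerUrls(pageName, siteUrl) {
    // A managed package page name looks like 'myns__Index'
    var parts = pageName.split('__');
    var namespace = parts.length == 2 ? parts[0] : null;
    var restPrefix = siteUrl + 'services/apexrest'
            + (namespace ? '/' + namespace : '');
    return {
        namespacePrefix: namespace ? namespace + '__' : '',
        configRest: restPrefix + '/eep/config'
    };
}

var urls = buildServerUrls('myns__Index', 'https://example.force.com/');
// urls.configRest is 'https://example.force.com/services/apexrest/myns/eep/config'
```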

Connecting DataTables to JSON generated by Apex

I had a quick go at coming up with an answer to this DataTables sAjaxSource question on Salesforce Stackexchange but didn’t get very far. Using DataTable’s sAjaxSource directly runs you into the problem that the Visualforce page containing the table and the Apex REST API that supplies the JSON use different host names and so the same origin security policy blocks the access. (The answer includes one way to work-around this.)

The approach I tried was to use Apex’s @RemoteAction mechanism, where a JavaScript function is generated in the page. So instead of referencing a URL, the JavaScript function is called and how it connects back to the server is its concern. The trick to getting it to work is to recognise that the JavaScript function that you add (via the fnServerData property) is intended to provide an interception point for the request to the URL you specify (via the sAjaxSource property). So both must be specified, even though in this case the URL is not used.

This is an example of the output (using default styles):


Here is the Visualforce page; most of the JavaScript is just cleaning up the JSON data returned from the controller to suit DataTables:

<apex:page controller="DataTableController">

<!-- jQuery and DataTables script/stylesheet includes go here -->

<apex:sectionHeader title="DataTables"/>

<table id="table" cellpadding="0" cellspacing="0" border="0">
    <thead>
        <tr><th>Name</th><th>Birthdate</th><th>Phone</th><th>Email</th><th>Salary</th></tr>
    </thead>
</table>

<script>
var j$ = jQuery.noConflict();

var fields = ['Name', 'Birthdate', 'Phone', 'Email', 'Salary__c'];

var aoColumns = [];
for (var i = 0; i < fields.length; i++) {
    aoColumns.push({'mData': fields[i]});
}

j$(document).ready(function() {
    j$('#table').dataTable({
        'aoColumns': aoColumns,
        'bProcessing': true,
        'bServerSide': true,
        'bFilter': false,
        'sAjaxSource': 'fakeUrl',
        'fnServerData': function(sSource, aoData, fnCallback) {
            // Call the @RemoteAction JavaScript function
            DataTableController.contacts(aoData, function(result, event) {
                if (event.type != 'exception') {
                    for (var i = 0; i < result.aaData.length; i++) {
                        var r = result.aaData[i];
                        for (var j = 0; j < fields.length; j++) {
                            var field = fields[j];
                            if (r[field] == undefined) {
                                // DataTables pops a dialog for undefined values
                                r[field] = null;
                            } else if (field == 'Birthdate') {
                                // Dates transmitted as longs
                                var d = new Date(r[field]);
                                r[field] = ''
                                        + (d.getMonth() + 1)
                                        + '/'
                                        + d.getDate()
                                        + '/'
                                        + d.getFullYear();
                            }
                        }
                    }
                    // Call back into the DataTable function
                    fnCallback(result);
                } else {
                    alert(event.message);
                }
            });
        }
    });
});
</script>

</apex:page>

Most of the complexity in the Apex code is in interpreting the request parameters sent by DataTables including things like the multi-column sorting. Note that the conversion from JSON to Apex objects and Apex objects to JSON is left to the platform.

// See https://datatables.net/usage/server-side
global class DataTableController {

    // Defines shape of JSON response
    global class Response {
        public Integer sEcho;
        public Integer iTotalRecords;
        public Integer iTotalDisplayRecords;
        public SObject[] aaData;
        Response(Integer echo, Integer total, SObject[] sobs) {
            this.sEcho = echo;
            this.iTotalRecords = total;
            this.iTotalDisplayRecords = total;
            this.aaData = sobs;
        }
    }

    // DataTable passes JSON definition of what server should do
    private class Params {
        Map<String, Object> m = new Map<String, Object>();
        Integer echo;
        Integer start;
        Integer length;
        String[] columns;
        Integer[] sortColumns;
        String[] sortDirections;
        Params(List<Map<String, Object>> request) {
            for (Map<String, Object> r : request) {
                m.put((String) r.get('name'), r.get('value'));
            }
            echo = integer('sEcho');
            start = integer('iDisplayStart');
            length = integer('iDisplayLength');
            columns = stringArray('mDataProp');
            sortColumns = integerArray('iSortCol');
            sortDirections = stringArray('sSortDir');
        }
        String[] stringArray(String prefix) {
            String[] strings = new String[] {};
            for (Object o : array(prefix)) {
                strings.add(o != null ? esc(String.valueOf(o)) : null);
            }
            return strings;
        }
        Integer[] integerArray(String prefix) {
            Integer[] integers = new Integer[] {};
            for (Object o : array(prefix)) {
                integers.add(o != null ? Integer.valueOf(o) : null);
            }
            return integers;
        }
        Object[] array(String prefix) {
            Object[] objects = new Object[] {};
            for (Integer i = 0; true; i++) {
                Object o = m.get(prefix + '_' + i);
                if (o != null) {
                    objects.add(o);
                } else {
                    break;
                }
            }
            return objects;
        }
        Integer integer(String name) {
            Object o = m.get(name);
            if (o instanceof Decimal) {
                return ((Decimal) o).intValue();
            } else if (o instanceof Integer) {
                return (Integer) o;
            } else {
                return null;
            }
        }
        // Guard against SOQL injection
        String esc(String s) {
            return s != null ? String.escapeSingleQuotes(s) : null;
        }
    }

    @RemoteAction
    global static Response contacts(List<Map<String, Object>> request) {
        Params p = new Params(request);

        String soql = ''
                + ' select ' + String.join(p.columns, ', ')
                + ' from Contact'
                + ' order by ' + String.join(orderBys(p), ', ')
                + ' limit :length'
                + ' offset :start';
        System.debug('>>> soql=' + soql);

        Integer start = p.start;
        Integer length = p.length;
        return new Response(
                p.echo,
                [select Count() from Contact limit 40000],
                Database.query(soql));
    }

    private static String[] orderBys(Params p) {
        Map<String, String> soqlDirections = new Map<String, String>{
                'asc' => 'asc nulls last',
                'desc' => 'desc nulls first'
        };
        String[] orderBys = new String[] {};
        Integer min = Math.min(p.sortColumns.size(), p.sortDirections.size());
        for (Integer i = 0; i < min; i++) {
            orderBys.add(''
                    + p.columns[p.sortColumns[i]]
                    + ' '
                    + soqlDirections.get(p.sortDirections[i]));
        }
        return orderBys;
    }
}