My week with Heroku Connect

Heroku Connect replicates and synchronises data between a Salesforce org and a Heroku Postgres database. So you can build an app on Heroku using one of seven technologies (including Node.js), hook that app up to the Postgres database, and then your app users will be able to see and modify the same data that the Salesforce users see and modify. Bear in mind that this mechanism allows you to share data: it does not allow you to share business logic.

As this solution is built in/on Heroku, it is incredibly easy to set up and get working; I had the Demo Edition set up and the first couple of SObjects working inside an hour. The mapping UI involves a lot of checkbox clicking, but once done the configuration can be exported and imported. The UI provides an overview of, and drill-down into, what is going on. One minor problem I had was quickly and efficiently sorted out by Heroku support. All good.

The clean and simple scenario is where the Postgres data is treated as largely read-only. Then (besides whatever technologies you are writing the Heroku app in) the main thing to get used to is that you are writing SQL, not SOQL. Also there are two identifier values: the SFID, which is the 18-character (case-insensitive) Salesforce ID we are used to, and a local Postgres integer ID. Foreign key fields from Salesforce reference the SFID, not the ID.
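
To make that concrete, here is a minimal pg-promise sketch of a read-only query. It assumes Heroku Connect's default salesforce schema and a standard DATABASE_URL config var; the table and column names are illustrative, not taken from any particular mapping.

const pgp = require('pg-promise')();
const db = pgp(process.env.DATABASE_URL);

// "id" is the local Postgres integer primary key; "sfid" is the familiar
// 18-character Salesforce ID. Relationship columns such as "accountid" hold
// the parent's sfid, so joins go via sfid, not the local id.
function contactsForAccount(accountSfid) {
    return db.any(
        'SELECT c.id, c.sfid, c.name' +
        ' FROM salesforce.contact c' +
        ' JOIN salesforce.account a ON c.accountid = a.sfid' +
        ' WHERE a.sfid = $1',
        [accountSfid]);
}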

Things get more awkward when you want to insert or update the Postgres data. Heroku Connect handles the synchronisation in that direction too, but some aspects of the implementation leak out:

  • The SFID is not available in the transaction where an insert is done (as it is allocated in Salesforce). In my very limited tests, it was available about a second later; a polling sketch follows this list.
  • Fields such as CreatedDate, and any fields populated by Salesforce logic such as insert or update triggers, remain empty until the next synchronisation; in the basic polling mode that can be up to 10 minutes later. So users who insert data may see blank fields in the UI that only become populated when the UI is refreshed after that synchronisation.
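
Here is a sketch of how the first point plays out in code, again using pg-promise; the table name, column names and polling interval are illustrative assumptions rather than anything prescribed by Heroku Connect.

const pgp = require('pg-promise')();
const db = pgp(process.env.DATABASE_URL);

// Insert a row and wait (briefly) for Heroku Connect to write the allocated
// Salesforce ID back into the sfid column.
async function insertAccount(name) {
    const row = await db.one(
        'INSERT INTO salesforce.account (name) VALUES ($1) RETURNING id, sfid',
        [name]);
    // row.sfid is null here: the ID is allocated in Salesforce, not locally.
    for (let i = 0; i < 10; i++) {
        await new Promise(function (resolve) { setTimeout(resolve, 500); });
        const check = await db.one(
            'SELECT sfid FROM salesforce.account WHERE id = $1', [row.id]);
        if (check.sfid) {
            return check.sfid; // appeared after about a second in my tests
        }
    }
    return null; // not yet synchronised; trigger-populated fields take longer
}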

Inserting objects into Postgres that have parent/child relationships is awkward. For master/detail, the required pattern is documented in Inserting records with master/detail relationships using external IDs. It requires an “External ID” field to be added to the Salesforce objects (so is a little intrusive on that side), and the Heroku code needs to populate that field and an automatically added foreign key column with a matching unique value. This then allows Salesforce to create the master/detail relationship using the normal Salesforce identifier. I don’t know whether the Salesforce foreign key is pushed back into Postgres almost immediately like the SFID is, but I hope so. There is no mention of how to accomplish this for lookup relationships.
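
Below is a sketch of that documented pattern as I understand it. The object names, the external ID field name and the exact name of the automatically added foreign key column all depend on your own mapping, so treat every identifier here as an illustrative assumption.

const pgp = require('pg-promise')();
const db = pgp(process.env.DATABASE_URL);
const { randomUUID } = require('crypto');

// Insert a parent and child in one transaction, linked by a shared unique
// value placed in the parent's "External ID" field and in the child's
// automatically added foreign key column.
function insertParentAndChild() {
    return db.tx(async function (t) {
        const externalId = randomUUID();
        await t.none(
            'INSERT INTO salesforce.parent__c (name, external_id__c) VALUES ($1, $2)',
            ['Parent record', externalId]);
        await t.none(
            'INSERT INTO salesforce.child__c (name, parent__r__external_id__c) VALUES ($1, $2)',
            ['Child record', externalId]);
        return externalId;
    });
}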

I must mention the pleasure and productivity of writing the full application stack in one language, JavaScript: AngularJS for the client and Node.js/Express/pg-promise/Passport for the server. All of it was quick to get running against existing Salesforce data thanks to Heroku and Heroku Connect.


Picklist values by record type for AngularJS UI

A convenient Apex API to get the picklist value sub-set per record type has yet to appear. Visualforce applies the sub-setting automatically, but that is no help if you are building the UI in some other technology such as AngularJS and want your select lists to reflect it.

Here is a work-around for that situation, where the aim is to provide JSON to the AngularJS client-side code via a static resource. An API that provides access to both record types and static resources is the Metadata API. Thanks to Andrew Fawcett‘s work (see financialforcedev/apex-mdapi) and improvements in the underlying Salesforce API, this API is now quite simple to call from Apex code.

In the code below, the readPicklists method reads the record type information for 5 record types of a custom SObject called cve__BenefitClaimed__c. (This SObject and its record types are in a managed package that has the namespace prefix cve. Note also that it is the developer name of the record type that is used.) The picklist value data is extracted and placed in nested maps where the first key is the SObject type name, the second key the record type (developer) name and the third key the field name with the value a list of valid picklist values. The updateStaticResource method updates a pre-existing static resource (in a managed package that has the namespace prefix cveep) with the JSON string version of the nested maps.
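For example, the serialised JSON might look something like this (the field names and picklist values shown are illustrative assumptions, not taken from the real package):

{
    "cve__BenefitClaimed__c": {
        "cve__Accident": {
            "cve__Status__c": ["Open", "Approved", "Denied"]
        },
        "cve__LongTermDisability": {
            "cve__Status__c": ["Open", "Closed"]
        }
    }
}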

I run this from a Visualforce admin page. If the picklist assignments are changed, the code is manually re-run.

The result is that the AngularJS code can use its $http service to get the static resource. The data is pre-created and so available quickly, and it is already in JSON format, so it is easy for the AngularJS code to consume.
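
As a sketch of that client side (the module name, controller name, resource URL and field names below are illustrative assumptions):

angular.module('claimApp', []).controller('ClaimController', function ($http, $scope) {
    // Fetch the pre-built static resource: nested maps keyed by SObject type
    // name, then record type developer name, then field name.
    $http.get('/resource/cveep__RecordTypePicklistValues').then(function (response) {
        var picklists = response.data;
        $scope.statusOptions =
            picklists['cve__BenefitClaimed__c']['cve__Accident']['cve__Status__c'];
    });
});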

Here is the code; sorry it’s rather wide…

public PageReference updatePicklists() {
    
    final String[] recordTypeFullNames = new String[] {
            'cve__BenefitClaimed__c.cve__Accident',
            'cve__BenefitClaimed__c.cve__LongTermCare',
            'cve__BenefitClaimed__c.cve__LongTermDisability',
            'cve__BenefitClaimed__c.cve__ShortTermDisability',
            'cve__BenefitClaimed__c.cve__Survivor'
            };
    
    final String staticResourceFullName = 'cveep__RecordTypePicklistValues';
            
    MetadataService.MetadataPort service = new MetadataService.MetadataPort();
    service.SessionHeader = new MetadataService.SessionHeader_element();
    service.SessionHeader.sessionId = UserInfo.getSessionId();
    
    String jsonString = readPicklists(service, recordTypeFullNames);
    updateStaticResource(service, staticResourceFullName, jsonString);
    
    return null;
}

private String readPicklists(MetadataService.MetadataPort service, String[] recordTypeFullNames) {
    
    Map<String, Map<String, Map<String, List<String>>>> sobMap = new Map<String, Map<String, Map<String, List<String>>>>();
    for (MetadataService.RecordType rt : (MetadataService.RecordType[]) service.readMetadata('RecordType', recordTypeFullNames).getRecords()) {
        if (rt.fullName != null && rt.picklistValues != null) {
            String[] parts = rt.fullName.split('\\.');
            String sobjectType = parts[0];
            String recordType = parts[1];
            Map<String, Map<String, List<String>>> rtMap = sobMap.get(sobjectType);
            if (rtMap == null) {
                rtMap = new Map<String, Map<String, List<String>>>();
                sobMap.put(sobjectType, rtMap);
            }
            Map<String, List<String>> fieldMap = rtMap.get(recordType);
            if (fieldMap == null) {
                fieldMap = new Map<String, List<String>>();
                rtMap.put(recordType, fieldMap);
            }
            for (MetadataService.RecordTypePicklistValue picklist : rt.picklistValues) {
                if (picklist.values != null) {
                    List<String> valueList = fieldMap.get(picklist.picklist);
                    if (valueList == null) {
                        valueList = new List<String>();
                        fieldMap.put(picklist.picklist, valueList);
                    }
                    for (MetadataService.PicklistValue value : picklist.values) {
                        valueList.add(value.fullName);
                    }
                }
            }
        }
    }
    
    return JSON.serialize(sobMap);
}

private void updateStaticResource(MetadataService.MetadataPort service, String staticResourceFullName, String jsonString) {
    
    MetadataService.StaticResource sr = new MetadataService.StaticResource();
    sr.fullName = staticResourceFullName;
    sr.contentType = 'text/json';
    sr.cacheControl = 'public';
    sr.content = EncodingUtil.base64Encode(Blob.valueOf(jsonString));
    
    MetadataService.SaveResult[] results = service.updateMetadata(new MetadataService.StaticResource[] {sr});
    for (MetadataService.SaveResult r : results) {
        if (!r.success) {
            String[] errors = new String[] {};
            if (r.errors != null) {
                for (MetadataService.Error e : r.errors) {
                    errors.add('message=' + e.message + ' statusCode=' + e.statusCode + ' fields=' + e.fields);
                }
            }
            throw new EndUserMessageException('Error: ' + String.join(errors, '; '));
        }
    }
}

PS A maximum of 10 record types can be read in one readMetadata call, so make multiple calls if you require more than 10.