


MWLug 2016 Round-Up

So I’m back from my first MWLUG, jetlagged to hell and shattered, trying to answer the question that LDC Via always asks after a paid conference: “was it worth it?”

For me the big revelation was the PSC presentation on migration strategy. Not for the strategic partnership announcement with LDC Via, which I of course knew about as I was there as LDC Via’s representative, but for the “this is the truth of the world we live in, it is time to accept it” statement. That truth has been an undercurrent of our yellow world for so long, but has lacked someone articulate enough to say it in a non-IBM-bashing way. Also, Mr Head managed in one slide to sum up LDC Via’s position in the application structure better than the 4 techs that wrote it had managed in 2 years…

The conference itself was excellent, with some stunning content, and very enjoyable from both a learning and a social angle. I could have done without the heat of Austin, but the city itself is amazing and I can see why the residents are so proud of it.

The session with Gab went well, with a good attendance and no hecklers (security people can be weird). My individual session was the last of the day and, thanks to the conference running a bit on the late side, was not well attended (I went and checked if the attendance was as bad in the other sessions, and it was), but EVERYBODY in the session asked questions and multiple people came up and thanked me again later… Weeeeeee

LDC Via was well received, and people are starting to see what it can do and that it is not a threat either to IBM or to their jobs and ecosystem, more of an evolution of storage. But I really should have brought more marketing stuff to give away :(

I also attended my first Penumbra meeting and ended up doing hardly any of the full day’s work I had brought with me (I hate meetings), because the conversation was so interesting and engaging. I now understand why people join.

And thus the answer to the question at the beginning of the post is: “Yes, it was very much worth it”


To Find The Perfect Office

I find that I am more than a little picky about offices: you’d think that all you want are a few simple basics, but it turns out that none of the shared or small office providers want to provide what you want in the way you want it.

I tried most of the leading communal office companies, as a truly private office would be too expensive and I would go stir-crazy, but every time there would be something fundamentally wrong with them. Faults ranged from one office advertising itself as a “dynamic shared environment”, which turned out to be a lobby coffee shop >:( 1, through to places that were so keen to get you in that it felt like you had hired yet another project manager to be on your case. This happened four or five times, but I was determined to find a good work base 2, and my perfect office turned out to be a place called Purple Patch.

Within a day of trailing into the building, fresh from an outrage at a previous office provider, I determined to settle there like an ugly toad under a rock. If you want the low-down, I recommend you go and look at their web site.

Let’s walk through my must-haves:

24-hour access: This is a big thing for me, in fact THE big thing. Most shared offices close at 6pm, and you’re waiting at the door at 8am to be let in. Purple Patch, however, really is 24-hour access: they give you a bunch of keys and security fobs after checking your background, and then you can come in at any time, day or night. Perfect for those late-night deadlines.

Impresses Clients: The place must look and feel good: a lot of shared offices have a slightly tired feel about them, a bit like airport lounges. At Purple Patch everything is bright and clean, and I thought at first it was because the place was new—but no, it’s been there for 15 years and simply has regular revamps. This means that you can bring clients on-site with a sense of pride, and its excellent location means that there are tons of great places to eat and drink within idle wandering distance.

Hard Lines: I know that everyone just lives on wifi, but not me. Wifi is fine and everything but I want a good speed, I want to go like the clappers, and so each desk has a network port in it that gives you 100Meg down, and the same up. This means that only individual clients’ VPN speeds slow me down.

Layout: I like open-plan, but not OPEN PLAN. Again, Purple Patch scores: the building itself is like a cheerful Gormenghast (or as Ben Poole said, “an eclectic second hand book shop”). Desks are grouped into bays and natural nooks, you don’t feel isolated, but similarly you are not forced to endure every word of a nearby loudmouth’s day.

Humanity: My office provider must be human. One of the Purple Patch clients has a small dog she sometimes brings in, and no-one howls (sorry) about rules. Parents bring in kids and I have never been disturbed by them (the layout helps with this, as does the chill-out area with table-top football that gives kids something to do). No-one brings in jaguars, and what can and can’t be done is decided on a case-by-case basis to keep everyone happy, rather than by following a soulless set of rules.

Now the little things:

Nice toilet roll holders: Don’t look at me like that. How many times have you been at a client’s, or any kind of office, and had to deal with a toilet roll holder that looks straight out of Parkhurst prison? Something that does not allow more than 2 sheets to be pulled off before it jams and rips, and you end up having a little tantrum, reaching in and ripping the offending roll out, which is less than professional. It turns out that Purple Patch agrees and has proper toilet roll holders that hold multiple rolls and don’t damn well jam.

Proper coffee in proper places: Purple Patch has different coffee in different places in the building (yes, yes, tea as well!). If you want to wait at a machine while it goes “pluck pluck pluck” then fine, there’s one of those. Personally I like the old-style filter coffee pots, and they are always kept topped-up to ensure optimal coding hyperactivity. I have watched Ben Poole consume his entire body weight in coffee within approximately ten minutes of arriving at the office.

… and that’s it. Really I am just relieved to have found an office I like and can do business in :-)

(BTW: I’m not getting paid or anything for this review!)

This is where I skulk. Your desk is static and you get a locking cupboard (there are personal lockers as well)

Lots of little drop-in meeting nooks

Coffee, COFFEE!!! (kept topped-up by the lovely lovely staff)

You would not believe that the meeting rooms are actually cheaper than horrible scabby chain franchise ones

A nice shower at an office, there’s an idea

Even LDC Via can enjoy a meeting there (and maybe the pub afterwards)

  1. Meaning I couldn’t leave my laptop unattended therefore had to take it with me every time I went to the toilet 

  2. I could not really use home as my work place, as I need to meet with clients and with clients of companies I am sub-contracting for. LDC Via also work together a lot, and frankly I start to lose productivity if I work from home more than around one day every two weeks. 

A Little Thing Done Right

Last week the ‘swag’ from being an IBM Champion arrived and, to my utter surprise, it was just perfect. Yes, I had picked it from a catalogue and knew at least one of the items was from a brand I knew, but that did not detract from the fact that it represents something a bit deeper to me than just a giveaway to keep some evangelists sweet.

Branded stuff like this is really supposed to be used where clients can see it (on site ideally), but recent IBM marketing stuff has been of very poor quality: just somewhere to slap the logo on and hope for the best. The best example of this is the backpacks that were given away at recent IBM Connect events; they were not even worth taking home, whereas the 2005 and 2003 editions are still in use and treated as a fine vintage. Whoever looked at the bags and decided to skip them for this year’s event was a wise person. Anyway, the swag that just arrived represents, in my opinion, just what IBM is aiming at with their champions:

  • High quality outsourcing: IBM obviously did not do it themselves, but the picking and delivery was, for me, a simple and flawless exercise.

  • Best of breed: The backpack is Wenger, the notebooks were Moleskine, the t-shirt was Nike. Good competent brands, not too flash, but not some no-name knock-off that falls to bits.

  • To be seen in public: I am already using my stuff on site, and with the same pride I would a MongoDB t-shirt or an LDC Via power pack.

I expect I am reading too much into this and it’s simply the result of a single individual doing their job very well, but even if that’s the case it’s a good example of a rejuvenating IBM.

SalesForce for Domino Dogs 3: Web Query Save Agents

“WebQuerySave” / “PostOpen” and all their siblings have been a bastion of Domino and Notes development since time out of mind, and indeed they exist in near-identical form in Salesforce, just called Triggers.

Just like Notes/Domino has different events that let code ‘do stuff’ to records, e.g. “WebQueryOpen”, “OnLoad”, “WebQuerySave”, etc., Salesforce has the same sort of thing; in their case they are broken down into 2 parts: Timings and Events.

Timings: Before and After

Before: The event has been started but the record has not been saved; this maps broadly to the “Query” events in Domino.

If you want to calculate fields and change values in the record you are saving, this is the time to do it. You don’t have to tell it to save or commit the records as you normally would; a save runs after your code has finished.
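As a minimal sketch of what that looks like (the object and field names here are invented for illustration), a before trigger can just edit the records in hand and the changes are saved automatically:

```apex
// Hypothetical "before" trigger: editing the records in Trigger.New directly
// is enough; no explicit save or commit is needed in a before event.
trigger InvoiceDefaults on Invoice__c (before insert, before update) {
    for (Invoice__c inv : Trigger.New) {
        if (String.isBlank(inv.Status__c)) {
            inv.Status__c = 'Draft';
        }
    }
}
```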

After: The record has been saved and all field values have been calculated; then the After event is run.

If you want to update other objects on the basis of this record being created/saved, do it here. You can’t edit the record you are saving, but lots of useful bits, such as the record id and who saved it, are available in the After event1

Events: Insert, Update, Delete and Undelete

These are exactly what they say they are: Insert is like a new document creation, Update is editing an existing document, etc.

This then gives us a total set of event types:

  • before insert
  • before update
  • before delete
  • after insert
  • after update
  • after delete
  • after undelete2

Now you can have a separate trigger for each of these events, but I have found that this bites you in the bum when they start to argue with each other, and it is hard to keep things straight when they get complex. So I tend to have one trigger for all events, with a bit of logic in it to determine what’s going to happen when.

Here is the basic template I start with for all my triggers:

trigger XXXXTriggerAllEvents on XXXX (
    before insert,
    before update,
    before delete,
    after insert,
    after update,
    after delete,
    after undelete) {
    if (Trigger.isInsert || Trigger.isUpdate) {
        if (Trigger.isUpdate && Trigger.isAfter) {
            MYScriptLibarary.DoStuffAfterAnUpdate(Trigger.New, Trigger.OldMap);
        } else if (Trigger.isInsert) {
            // Do some stuff here for when a new document is being created, like sending emails
        }
    }
}

As you can see, you can determine which event you are dealing with by testing for “.isInsert” or “.isAfter”, and then run the right bit of code. Again, I like to keep everything in easy sight, so I use functions whenever I can, with nice easy-to-understand names.

In the above case, I want to check a field after an update to see if it has been changed from empty to containing a value. You can do this with the very, very useful ‘Trigger.New’ and ‘Trigger.OldMap’, as you can see below:

public with sharing class MYScriptLibarary {

    public static void DoStuffAfterAnUpdate(List<XXXX> newXXXX, Map<ID, XXXX> oldXXXX) {
        for (XXXX curentXXXX : newXXXX) {
            if (!String.isBlank(curentXXXX.MyField) && String.isBlank(oldXXXX.get(curentXXXX.Id).MyField)) {
                System.debug('OMG!!! MyField changed DO SOMETHING');
            }
        }
    }
}



So we are taking the list of objects3 that caused the trigger to run, i.e. “Trigger.New”, looping through them, and comparing them to the values in Trigger.OldMap (which contains the old values) to see if things have changed.

So that is the theory over. You can see existing triggers by entering Setup and searching for “apex triggers”.

BUT you can’t make them from there; you make them from the object you want them to act on.

Let’s take the Case object as an example.

In setup you search for case, and click on “Case Triggers” and then on “New”

That will give you the default trigger… let’s swap that out for the all-events trigger I showed above.

Better, then just click save and your trigger will be live. simples..

Now there is an alternative way to make triggers, and you do sometimes have to use it when you want to create a trigger for an object that does not live in Setup, such as the Attachment object.

You will first need to open the Developer Console up (Select your Name in the top right and select “Developer Console”), then select File —> New —> Apex Trigger

Select “attachment” as the sObject and give it a sensible name.

And now you can do a trigger against an object that normally you don’t see.

Final Notes:
  1. Salesforce Process flows can fight with your triggers: if you get “A flow trigger failed to execute” all of a sudden, go and look to see whether your power users have been playing with the process flows.
  2. Make sure you have security set correctly, particularly with community users; both security profiles and sharing settings can screw with your triggers if you can’t write or see fields.
  3. As always, make sure your code works if there are millions of records in Salesforce. CODE TO CATER TO LIMITS.
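As a sketch of what point 3 means in practice (the object and field usage here is invented for illustration), keep SOQL and DML out of loops, so a chunk of 200 records costs the same number of queries as a single one:

```apex
public with sharing class CaseTriggerHelper {
    // Limit-friendly pattern: one query for the whole chunk,
    // and no SOQL or DML statements inside the loops.
    public static void stampAccountName(List<Case> newCases) {
        Set<Id> accountIds = new Set<Id>();
        for (Case c : newCases) {
            if (c.AccountId != null) {
                accountIds.add(c.AccountId);
            }
        }
        Map<Id, Account> accounts = new Map<Id, Account>(
            [SELECT Id, Name FROM Account WHERE Id IN :accountIds]);
        for (Case c : newCases) {
            if (accounts.containsKey(c.AccountId)) {
                c.Description = 'Raised for ' + accounts.get(c.AccountId).Name;
            }
        }
    }
}
```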

  1. You know that pain-in-the-butt thing you sometimes have to do in Domino, where you have to use the NoteID rather than the Document ID before a document is saved? This gets round that issue. 

  2. Yes, eagle eyes: there is no “before undelete”. 

  3. You are best off handling all code in terms of batches rather than the single document you are used to in Domino. We will cover batching in a later blog post, but just take my word for it for the moment. 

Presenting at MWLUG

Hooray!!! I have been accepted to speak at MWLUG this year.

I will be presenting 2 sessions:

1) “The SSL Problem And How To Deploy SHA2 Certificates” with Gabriella Davis

This session went down well at Connect, and we are hoping that Austin will love this changed and updated version. Gab is awesome to present with.

2) “Salesforce for Domino Dogs”

Now, if you saw this at Engage I urge you to come again, as this is an evolving presentation that changes dramatically with each iteration (depending on presenters and the ever-changing world of Salesforce):

  • Version 1: Balanced Architect (Engage 2016)
  • Version 2: Happy Evangelist (DNUG 2016)
  • Version 3: Rabid Developer <— This is the one I will be presenting

It will be my first trip out there, and besides presenting I will be manning the stand (the rest of the team are insisting I wear a shirt and everything).


I’m looking for someone to room share/split cost with ( I sleep on the floor so there never seems to be a point to getting a room for myself ) …. I can provide references…

SalesForce for Domino Dogs 2: Scheduled Agents

Welcome to the second part of the Salesforce for Domino Dogs series. This one is a monster, but don’t worry we will be revisiting and clearing up some of the complex parts in other blog posts. What was a simple thing in Domino is quite complex in Salesforce and for a variety of very good reasons. So… scheduled agents.

Scheduled Agents: These little sods are the pain of many a Domino admin’s life. Personally I blame them for the lock-down of many a Domino server from the free-for-all that was so empowering to users, but sometimes there is no other way to get round limits or deal with certain triggered activities.

In Salesforce scheduled processes are a bit more complex than you might be used to, and this is not just a Salesforce thing, but a cloud thing—no cloud provider wants their platform just churning along in the background eating up cycles and I/O time.

So let’s break it down:

  1. The code that does stuff
  2. The scheduled task that the code sits in
  3. The schedule itself

1) The Code

So this CAN just be any bit of Apex you want, but most of the time you will actually end up using Batch Apex. Batch Apex is a whole set of articles in its own right, but in this case it’s just a way of getting round the Apex limits.

… hmmm that does not help. OK let me explain:

You know how with Domino scheduled agents, they will only run for so long before the agent manager shuts them down? This is to stop you writing rubbish code that screws up the system. Apex has a load of limits just like that, and the one that bites quite often is the limit that you can only send 10 emails using Send() in a given transaction (you can send 1,000 bulk emails per day). To get round this limit you have to “batch”, or break your code up into chunks. In Domino this would be like saying we want to process a whole view’s-worth of documents, but in chunks of, say, five documents at a time.

An empty bit of batch apex looks like this:

global class NotifiyAllUsersInAView implements Database.Batchable<sObject> {

    // The start method is called at the beginning of the code and works out which objects this code is going to run against.
    // It uses an SOQL query to work this out
    global Database.QueryLocator start(Database.BatchableContext BC){
    }

    // The execute method is called for each chunk of objects returned from the start function.
    global void execute(Database.BatchableContext BC, List<Contact> scope){
    }

    // The finish method is called when all the batches have been processed.
    global void finish(Database.BatchableContext BC){
    }
}



Let’s take it apart. First we will use the “start” function to get the list of objects we want to work through, so we take the empty function:

    // The start method is called at the beginning of the code and works out which objects this code is going to run against.
    // It uses an SOQL query to work this out
    global Database.QueryLocator start(Database.BatchableContext BC){
    }


… and add a search to get all “contacts” in Salesforce. We only need the email address for these contacts1, so we add that as one of the fields, which gives us:

    // The start method is called at the beginning of the code and works out which objects this code is going to run against
    // It uses an SOQL query to work this out
    global Database.QueryLocator start(Database.BatchableContext BC){
        return Database.getQueryLocator([SELECT Id, Email FROM Contact]);
    }

Next we want the empty “execute” function which will do whatever we want with each chunk of objects it is sent:

    // The execute method is called for each chunk of objects returned from the start function
    global void execute(Database.BatchableContext BC, List<Contact> scope){
    }

So in this horrible bit of code, the chunk of objects is passed in via a reference called “scope”; we then just iterate the objects, sending an email for each contact (you can see the email address stipulated in the “start” query being used via “c.Email”):

    // The execute method is called for each chunk of objects returned from the start function
    global void execute(Database.BatchableContext BC, List<Contact> scope){
        for (Contact c : scope) {
            Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
            String[] toAddresses = new String[] { c.Email };
            mail.setToAddresses(toAddresses);
            mail.setSubject('Another Annoying Email');
            mail.setPlainTextBody('Dear XXX, this is another pointless email you will hate me for');
            Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
        }
    }

Finally we need an empty “finish” function which runs when all the batches are done:

    // The finish method is called when all the batches have been processed.
    global void finish(Database.BatchableContext BC){
    }

So let’s send a final email notification to the admins:

    // The finish method is called when all the batches have been processed
    global void finish(Database.BatchableContext BC){
        // Send an email to admin to say the agent is done.
        Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
        String[] toAddresses = new String[] { emailAddress };
        mail.setToAddresses(toAddresses);
        mail.setSubject('Agent XXX is Done.');
        mail.setPlainTextBody('Agent XXX is Done.');
        Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
    }

Put it all together and you get:

global class NotifiyAllUsersInAView implements Database.Batchable<sObject> {

    // String to hold email address that emails will be sent to.
    // Replace its value with a valid email address.
    static String emailAddress = '';

    // The start method is called at the beginning of the code and works out which objects this code is going to run against.
    // It uses an SOQL query to work this out
    global Database.QueryLocator start(Database.BatchableContext BC){
        return Database.getQueryLocator([SELECT Id, Email FROM Contact]);
    }

    // The execute method is called for each chunk of objects returned from the start function.
    global void execute(Database.BatchableContext BC, List<Contact> scope){
        for (Contact c : scope) {
            Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
            String[] toAddresses = new String[] { c.Email };
            mail.setToAddresses(toAddresses);
            mail.setSubject('Another Annoying Email');
            mail.setPlainTextBody('Dear XXX, this is another pointless email you will hate me for');
            Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
        }
    }

    // The finish method is called when all the batches have been processed.
    global void finish(Database.BatchableContext BC){
        // Send an email to admin to say the agent is done.
        Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
        String[] toAddresses = new String[] { emailAddress };
        mail.setToAddresses(toAddresses);
        mail.setSubject('Agent XXX is Done.');
        mail.setPlainTextBody('Agent XXX is Done.');
        Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
    }
}

So now we need to call this code.

2) The Scheduled “Agent”

The code we have just written won’t run on a schedule on its own; we need to wrap it in a bit of code that can run on a schedule and decide how big the chunks will be. In this case they can’t be more than 10, as we would hit the Apex limit for sending emails. An empty schedule wrapper looks like this (I have called mine ‘Scheduled_Agent’ but you can call it anything):

global class Scheduled_Agent implements Schedulable {
    global void execute(SchedulableContext SC){
    }
}

Now let’s create a new instance of the batchable code we created in section 1, tell it we want it to run in batches of 5 records or objects, and tell it to execute.

global class Scheduled_Agent implements Schedulable {
    global void execute(SchedulableContext SC){
        Integer batchSize = 5;

        NotifiyAllUsersInAView batch = new NotifiyAllUsersInAView();
        Database.executeBatch(batch, batchSize);
    }
}
Code bit all done!

3) The Schedule

Now it comes time to actually schedule the code to run at a certain time. You can set this up via the user interface by going into Setup, searching for “Apex Classes”, and selecting the result:

Select “Scheduled Apex”

As you can see, the options are limited to, at most, a daily run; you can’t make it any more frequent. However, we need it to run more often than that2.

First open up your developer console, by selecting your name on the top right and picking it from the drop-down.

Now open up the “Execute Anonymous Window” from the debug menu.

You can now run Apex code manually, and as such you can schedule jobs with a lot more precision using a cron string. In this case we want to run the agent every 10 minutes within the hour, so we create a new instance of our “Scheduled_Agent” class and schedule it appropriately:
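For reference, the Execute Anonymous code looks something like the sketch below (the job names are arbitrary; the Apex cron format is Seconds Minutes Hours Day-of-month Month Day-of-week). Because one cron expression can only name a single minute offset per hour, “every 10 minutes” means scheduling the job six times:

```apex
// Schedule Scheduled_Agent at minutes 0, 10, 20, 30, 40 and 50 of every hour.
// Each System.schedule call creates one scheduled job, so this uses six
// of the org's scheduled-job slots.
for (Integer minute = 0; minute < 60; minute += 10) {
    System.schedule(
        'Scheduled_Agent ' + minute,   // unique job name
        '0 ' + minute + ' * * * ?',    // sec min hour day month weekday
        new Scheduled_Agent());
}
```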

Click “Execute” and you can see the jobs have been scheduled. It should be noted that you can only have 100 of these in your org and this uses up 6 of them, so some planning would be good.

And there you go, scheduled agents. Let the legacy of horror continue!

  1. When you get an object via SOQL you ask for all the fields you want; this is not like getting a Notes Document, where you just get access to all the document’s fields automatically. 

  2. Well, we don’t, but you just know someone will demand it be sent more often. 

SalesForce for Domino Dogs 1: Profile Documents

Following on from the initial session “Salesforce for Domino Dogs” that Paul Mooney and I did at Engage (a modified version of which has just been presented at #DNUG), I figured that a series of dev articles on how you would do a job in Salesforce that you had always taken for granted in Domino might be a good idea, because:

  1. It would give an honest way of comparing features between the 2 systems, shorn of the hype/marketing/platform bashing that frankly gets on my thungers from both sides.
  2. It will hopefully help people trying to integrate the 2 systems.
  3. As IBM are one of the largest Salesforce consultancies in the world, it is something a champion should do.
  4. The Salesforce community is very short on this kind of thing given its size in comparison to traditional IBM communities and with people like René in it I want to try and help improve it.

These articles are not in any order and are not meant to represent any form of training guide.

So let’s get started. First up: Profile Documents!!

In Domino you tend to store config settings for an app, all your one-off settings, in a profile document1.

To get the same feature in Salesforce you use a ‘custom setting’, which does exactly the same job and has one huge advantage over using a normal Salesforce custom object (which could also do the job).

(It should be noted that Domino profiles are nothing like Salesforce profiles)

To create a custom setting, go into Setup and search for “Custom Settings”

Click on the “New” button and fill in some sane details. For normal configs, i.e. stuff you would use system-wide, select a setting type of “List”; if you want to use them for things like default values in fields and formulas, then select “Hierarchy”.

Click ‘Save’ and you now have a custom setting object; you can add one or more fields to it just as you would with any other object in Salesforce.

Added all the fields you want? Let’s put in some data. If you go back to the list of custom settings you will see that you now have a “Manage” link; click this.

Then click “New”

Fill in the fields just like you would on a normal form. If this is a setting there is only going to be one of, I tend to give it the same title as the name of the object to keep things simple, in this case “General Settings”; if you are going to use it multiple times, then give it a name that will make sense in the context of your code.

All done. Now we can use the setting in our code and see the reason why we would use custom settings vs. a normal custom object.

As you can see from the code below, you don’t have to use a Select statement, which means getting the settings won’t count against your Apex limits. HOORAY!!!

You just call “getInstance” with the name of the record you created to get the document back.

General_Settings__c generalSettings = General_Settings__c.getInstance('General Settings');
String thirdPartAppURL = '';
if (null != generalSettings) {
    thirdPartAppURL = generalSettings.Third_Part_App_URL__c;
}

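If you picked the “Hierarchy” type instead, the same no-SOQL access applies, but values can be overridden per profile or per user (the setting name here is made up for illustration):

```apex
// Hypothetical Hierarchy custom setting: getInstance() returns the most
// specific record for the running user, getOrgDefaults() the org-wide one.
Defaults__c mine = Defaults__c.getInstance(UserInfo.getUserId());
Defaults__c orgWide = Defaults__c.getOrgDefaults();
```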

  1. Well, you are supposed to, but you never do, thanks to the evil caching and the fact that they are a sod to copy around, so you end up with a config document and a view you store them in. 

Editable Salesforce templates

Editing an email that is generated via a Salesforce email template BEFORE it is sent is something that I have had a few clients grumble over. The feature is baked into the Quote object, which makes sense, but you try telling clients they can’t have it for Order objects…

So this is a basic solution that gets round the problem and is the basic set-up for expanding it to solve all the issues. In this case I am solving the most common issue I have come across: you are generating an order receipt or delivery note PDF using the ‘renderAs=”PDF”’ option in an email template, and want to send the email with a custom message to one or more people that you decide at the time of sending.

To do that we are going to create a new Visualforce page that resembles a normal email and use it to fill in the details before we generate the email template.


1) Create a new email custom object

This is just a basic object with the fields we need for the email, and a lookup field to the parent Order

New Custom Email Object

Note: For brevity I’m leaving out the creation of the layout for the custom object and the adding of the related list to the Order Layout

2) Add some extra Fields to your Order object

These are the temporary fields that we use to actually send the template email. Resist the urge to make To/CC/BCC email fields, as that would stop you having multiple recipients.

Extra Fields

Obviously these fields have to be editable by all, but not part of the order layouts, so they don’t show up.

3) Create a new “Template Editor” Visualforce Page

This page can be as simple or as complex as you want. Mine here takes email addresses separated by commas, but yours could have a really posh contact lookup.

<apex:page controller="sendOrderPDFEmail">
    <apex:messages />
    <apex:pageBlock title="Email for Order: {!order.Name}">
    <p>Fill out the fields below and click "Send email"</p>
    <apex:form ><br/><br/>

        <apex:outputLabel value="Template To Attach" for="chooseTemplate"/>: <br/>
        <apex:selectList id="chooseTemplate" value="{!template}" size="1">
            <apex:selectOption itemValue="Template1" itemLabel="Template1"/>
            <apex:selectOption itemValue="Template2" itemLabel="Template2"/>
            <apex:selectOption itemValue="Template3" itemLabel="Template3"/>
        </apex:selectList><br/>
        <apex:outputLabel value="To" for="To"/>: <br/>
        <apex:inputText value="{!to}" id="To" maxlength="255" style="width: 300px;"/><br/>
        <apex:outputLabel value="CC" for="CC"/>: <br/>
        <apex:inputText value="{!cc}" id="CC" maxlength="255" style="width: 300px;"/><br/>
        <apex:outputLabel value="BCC" for="BCC"/>: <br/>
        <apex:inputText value="{!bcc}" id="BCC" maxlength="255" style="width: 300px;"/><br/>
        <apex:outputLabel value="Subject" for="Subject"/>: <br/>
        <apex:inputText value="{!subject}" id="Subject" maxlength="255" style="width: 500px;"/><br/>
        <apex:outputLabel value="Body" for="Body"/>: <br/>
        <apex:inputTextarea value="{!body}" id="Body" richText="true" rows="20"/><br/>
        <apex:commandButton value="Send Email" action="{!send}"/>
    </apex:form>
    </apex:pageBlock>
</apex:page>
4) Create the “Send” code (don’t forget to create your test code).

I have just put comments in the code, as it’s not complex and follows the flow chart at the top of the blog.

public class sendOrderPDFEmail {

    public String template { get; set; }
    public String cc { get; set; }
    public String bcc { get; set; }
    public String to { get; set; }
    public String subject { get; set; }
    public String body { get; set; }
    public Account contactLookup { get; set; }
    private final Order order;

    public sendOrderPDFEmail() {
        // Get the Id of the order that we are working on
        template = '';
        String orderId = ApexPages.currentPage().getParameters().get('id');
        order = [SELECT Name, Id FROM Order WHERE Id = :orderId];
    }

    public Order getOrder() {
        return order;
    }

    public PageReference send() {
        if (!String.isBlank(cc)) {
            cc = cc.trim();
        }
        if (!String.isBlank(bcc)) {
            bcc = bcc.trim();
        }
        if (!String.isBlank(to)) {
            to = to.trim();
        }

        // Log the email in our custom email object (the field assignments are
        // elided here -- copy to/cc/bcc/subject/body into whatever fields your
        // orderemail__c object has)
        orderemail__c con = new orderemail__c();
        insert con;

        // Store the temp values in the Order so the email template can read them
        order.le_Subject__c = subject;
        order.le_Body__c = body;
        update order;

        if (String.isBlank(to)) {
            to = '';
        }

        // Construct the list of emails we want to send
        List<Messaging.SingleEmailMessage> lstMsgs = new List<Messaging.SingleEmailMessage>();
        Messaging.SingleEmailMessage msg = new Messaging.SingleEmailMessage();

        // Set the email template from the name chosen on the email form
        EmailTemplate[] emailTemplate = [SELECT Id FROM EmailTemplate WHERE Name = :template];
        if (emailTemplate.size() > 0) {
            msg.setTemplateId(emailTemplate[0].Id);
        } else {
            msg.setPlainTextBody('No Template Provided');
        }

        // Trim and convert the comma-delimited strings into suitable recipient lists for the email
        String[] trimmedtoarray;
        if (!String.isBlank(to)) {
            String[] toarray = to.split(',');
            trimmedtoarray = new String[toarray.size()];
            Integer k = 0;
            for (String singleEmail : toarray) {
                trimmedtoarray[k++] = singleEmail.trim();
            }
            msg.setToAddresses(trimmedtoarray);
        }
        if (!String.isBlank(cc)) {
            String[] ccarray = cc.split(',');
            String[] trimmedccarray = new String[ccarray.size()];
            Integer i = 0;
            for (String singleEmail : ccarray) {
                trimmedccarray[i++] = singleEmail.trim();
            }
            msg.setCcAddresses(trimmedccarray);
        }
        if (!String.isBlank(bcc)) {
            String[] bccarray = bcc.split(',');
            String[] trimmedbccarray = new String[bccarray.size()];
            Integer j = 0;
            for (String singleEmail : bccarray) {
                trimmedbccarray[j++] = singleEmail.trim();
            }
            msg.setBccAddresses(trimmedbccarray);
        }

        // Templates do need a target object (the Contact that the PDF will be
        // generated against) for things like first name etc., so we are going
        // to pick the first contact on the "to" list
        Contact c = [SELECT Id, Email FROM Contact WHERE Email = :trimmedtoarray[0] LIMIT 1];
        msg.setTargetObjectId(c.Id);
        msg.setWhatId(order.Id);

        // Send the email
        lstMsgs.add(msg);
        Messaging.sendEmail(lstMsgs);

        // Clear out the temp fields we used to generate the email
        order.le_Subject__c = null;
        order.le_Body__c = null;
        update order;

        // Send the user back to the Order
        PageReference backToQuotePage = new PageReference('/' + order.Id);
        return backToQuotePage;
    }
}
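Step 4 above says don't forget the test code, so here's a minimal sketch of what such a test class might look like. This is an illustration only: the Visualforce page name and the required Order fields are assumptions, and your org's validation rules may demand more setup data.

```apex
@isTest
private class sendOrderPDFEmailTest {
    static testMethod void testSend() {
        // Minimal test data -- your org may require different fields on Order
        Account acc = new Account(Name = 'Test Account');
        insert acc;
        Contact con = new Contact(LastName = 'Tester', Email = 'test@example.com');
        insert con;
        Order o = new Order(AccountId = acc.Id, EffectiveDate = Date.today(), Status = 'Draft');
        insert o;

        // Load the page with the order id, just as the button on the Order would
        Test.setCurrentPage(Page.sendOrderPDFEmail); // page name is an assumption
        ApexPages.currentPage().getParameters().put('id', o.Id);
        sendOrderPDFEmail ctrl = new sendOrderPDFEmail();
        ctrl.to = 'test@example.com';
        ctrl.subject = 'Test Subject';
        ctrl.body = 'Test Body';

        Test.startTest();
        PageReference result = ctrl.send();
        Test.stopTest();

        // We should be handed a redirect back to the Order
        System.assert(result != null);
    }
}
```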

5) Create a button in the Order object to launch your “Email” screen

(screenshot: the Send Email button on the Order)

6) Update your Email template

Now update your email templates to use the “le_Body__c” and “le_Subject__c” fields for their email contents:

<messaging:emailTemplate recipientType="Contact" relatedToType="Order">
    <messaging:htmlEmailBody >
        <c:EmailStyle />
        <apex:outputText value="{!RelatedTo.le_Body__c}" escape="false"/>
    </messaging:htmlEmailBody>

    <messaging:plainTextEmailBody >
    <apex:outputText value="{!RelatedTo.le_Body__c}" escape="false"/>
    </messaging:plainTextEmailBody>

    <messaging:attachment renderAs="PDF" filename="GeneratedEmail.pdf">
        <html>
        <head>
        <style type="text/css">
          @page {
            size: A4 landscape;
            margin-top: 1.0cm;
            margin-left: 1.0cm;
            margin-right: 1.0cm;
            margin-bottom: 7cm;
            @bottom-center {
              content: element(footer);
            }
          }
        </style>
        </head>
        <body>
        <c:EmailStyle />
        <!-- the attachment content itself goes here -->
        </body>
        </html>
    </messaging:attachment>
</messaging:emailTemplate>

So PDFs generated by email templates in Salesforce are not saved and you can't get hold of them, which is why we are not saving them in the custom email object. If you do need that, you will have to convert your email templates to Visualforce pages, render those as PDFs, and save the results in the custom objects.
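As a rough illustration of that route, the sketch below renders a Visualforce page as a PDF and attaches it to the custom email record. The page name `OrderPDF` is an assumption (it would be a page with `renderAs="pdf"` that takes the order id as a parameter); `order` and `con` are the Order and `orderemail__c` records from the controller earlier in this post.

```apex
// Sketch only: "OrderPDF" is a hypothetical Visualforce page with renderAs="pdf"
PageReference pdfPage = Page.OrderPDF;
pdfPage.getParameters().put('id', order.Id);

Blob pdfBlob;
if (Test.isRunningTest()) {
    // getContentAsPDF() is not allowed in test context, so stub it out
    pdfBlob = Blob.valueOf('test pdf body');
} else {
    pdfBlob = pdfPage.getContentAsPDF();
}

// Parent the PDF to the orderemail__c record we inserted in send()
Attachment att = new Attachment(
    ParentId = con.Id,
    Name = 'GeneratedEmail.pdf',
    Body = pdfBlob
);
insert att;
```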

New Platform Type New Client Type

I have been doing a lot of cloud dev in recent months. Not Internet-facing work (I've been doing that for over a decade), but proper work on various cloud platforms (4+ of them), and they have turned out to require a shifting of mental gears: not from a technical aspect, nor from a platform or paradigm shift (SaaS), but from dealing with a different type of client.

Now that seems odd: sure, your cloud clients are nearly all from the business rather than IT, but lots of my work is direct with the business, and it is often a relief to do so, as you can deliver a product that best matches the exact needs of the people who use it.

So why?

After a lot of head scratching and reviewing of the projects, I have come up with the following reasons:

  1. Sales before IT: With cloud-based projects the sales team have very often just finished with the customer, so the customer arrives with the expectation that the platform is a PERFECT fit for everything they might want, that it might just need a tiny update to match their needs, and that the update will only take an hour or two…. But as is always true, the devil is in the details, so when we look at their requirements and say that it's going to take a week of hard work and then they will have to spend time testing, you have suddenly upset both their time frames and budgets. 1

  2. Client reflexes: A lot of the cloud clients are sales/marketing people or from another branch where haggling and negotiations are built-in, these people live in a fast moving world and have never liked the iterative and somewhat slow moving nature of traditional IT projects “I just want it to work how I want”. For such people paying “not a penny more” and getting more than you paid for are Good Things. A side-effect of this is that such clients are quick to anger when they require a change that will take more money or time. Small changes are non-stop with cloud projects where the client can see the work as it is done: I have heard phrases like “Just one more thing”, “It will only take 5 minutes”, “I had assumed” and — my absolute favourite — “It’s just common sense, it should do XXX,” more over the last six months than I have in the previous six years combined.

  3. They have already paid: Decent cloud services are not cheap, and clients have often already paid a fair lump before they get to customising their environment, so every penny you want is money they feel is an extra. It's very much like extras at a hotel: we all enjoy them, but we are really unhappy to see them on the bill. 2

  4. Re-tooling: All of the new cloud platforms are feature-rich and do a lot of things very quickly, but often only within the boundaries of a given tool or feature. I can see why this is so 3: you are aiming for the old 80/20 rule. So when a customer says “I just need it to do xxxxx” and you simply can't make the tool do that, you have to use a different tool and spend a load of time reconfiguring the new one to look like the old one, just so you can add the one missing feature. It does not matter how clever that is or how hard you have worked: from the client's point of view you have turned a simple 5-minute job into a 5-day job and are not providing value for money 4.

  5. Client rapport: Most cloud customisations are quick things, so as a developer you have had very little time to get to know the client: what they mean vs what they say, what pressures they are under, whether they have budget for these changes, etc. And they often view you as someone just getting in the way of their shiny new cloud platform.

These relatively new changes in the client-developer relationship mean you have to change your way of dealing with clients.

So how do we fix this??

This is the hard bit. I have lain awake for a number of nights wondering how to fix this, my time-honoured method of working my guts off having failed me.

So far I have come up with:

  1. Make it human: Try to make the relationship one between humans: site visits if possible, Skype video if not, so clients feel that they are working with people, and more importantly people whose professional opinion they can trust.

  2. Speed up interactions: Not speed up coding, as that has actually not got much faster with the new platforms, but speed up the feedback you give clients. A quick Agile Scrum with the client each morning can head a lot of bad things off and make them feel far more in the loop 5. Use it also to keep them informed on how much of their budget / project pool they have used and how much is left, even if they have pushed for a fixed price; additionally, clients can cut their losses early if a small change is going to take a long time.

  3. Be firm: I'm rubbish at this part, but with cloud clients there is an underlying expectation that you get loads for free, and that includes any changes they might want to make after a spec has been agreed. There is a middle ground between “nickel-and-diming” and being used as a doormat; try to build a rapport with your client so that you both know where that is.

Anyone else got any good ideas?

  1. Both Matt and Julian have been REALLY serious about avoiding this kind of thing at LDC Via and have made the phrase “it will be easy, it will only take an hour or so” a capital offence. 

  2. One thing I have found after multiple quotes is that honesty works even less with cloud-based quotes than it does with traditional IT quotes. I have had at least 3 occasions where I was genuinely puzzled that a quote I had done was not picked for a spec and a much cheaper quote was accepted; on each occasion I questioned the ability to deliver on a quote that low (even using offshore staff) and was told each time that the competition just use the quote to get in the door, then nickel-and-dime the project to death…. I hate that, I really hate it >:( 

  3. Hell, I'm one of the co-developers of a cloud platform, and when we are coding new stuff it is always with an eye to “how can we spend our time on stuff that will get the most use”. 

  4. BTW the phrase “I’m trying to do what you asked” does not help here. 

  5. But be firm that the meeting is only for keeping everyone on track, that it is not a place to add a few new requirements into the spec every day (ohhh boy, don't they love to try that), and that each person only gets 2-5 minutes; if they want a longer meeting, book it later in the day. 

Engage 2016

Where to start….

I should have done a proper blog after IBM Connect 2016, but it's been so busy I did not make time; however, a number of the points here that were just supposition at Connect have now turned into facts.

Engage the Event

Theo Heselmans, who for all intents and purposes IS Engage, managed to do the impossible and top last year's conference, firmly establishing Engage as the best event after Connect, with a huge array of varied content and no subject taboo; all that is required is that the content is GOOD!! There were a number of times when I really would have liked to be in two places at the same time, and that has not happened at a conference in ages.

The venue was the crazy Evoluon in Eindhoven, which seemed custom-built for such an event (if only they could get a grip on the wifi).

In a cheeky moment Theo did a backpack give-away (because, for the first time, Connect did not this year).

Back Pack


We went live with LDC Via at the last Engage, one year ago, so it holds a special place for us. In that year a lot has happened and we have delivered tons more, but we got a lot more attention this year and found that we even have some competition; we have become established in people's minds and people realise we offer a great product. All 4 of us were there and did not seem to have a quiet second. As always, the speed sponsoring was exhausting, and Julian did a great job on the new 30-second speed sponsoring to the whole conference. Matt White was bogged down with a cold but still fought his way through the conference and drove us both there and back, while at the same time giving his flipping cold to the rest of us.

This year's give-away, “Colin”, did sterling work and won the unofficial “best conference give-away” from a number of sources.


We tried a different tack with the sponsor session, showing how many ways you could integrate LDC Via into different frameworks (slide deck below)

LDC Via building a new app from Mark Myers

IBM and Champions

Well, bugger me… Inhi Cho Suh really impressed us all. From the little we saw of her at Connect (she was only introduced to the community right at the end of the conference) we guessed that she might not only be sane but sane and HUMAN, and it turned out that she was!! Far more than we had all dared to hope: no “two tassel talk”1, she had fresh new ideas and had already started acting on them (such as moving Connect to San Francisco in 2017). We had a proper round table event where she dealt well with the hysterical gibbering we blasted her with, and she left us feeling that it might actually be possible to finally turn things around and start IBM ICS again (somebody pinch me).

On that note being a champion meant something at Engage and made me really think that I must buck my ideas up and start delivering far more.

The Community

Sadly I was doing another round of late night work while I was there so missed out on the parties and such (hence no photos), but judging from the state of my fellow LDC Via colleagues, the events and evenings were great fun and even members who were no longer in the IBM world turned up to revive past glories.

I got happy gifts from friends: Carl Tyler played mule for the amazing vegan mayonnaise that I can't get in Europe, and Rene Winkelmeyer got me the most amazing energy drink.

Engage Food

Gabriella Davis got me this totally apt t-shirt (which has since been pinched by my wife)



The session I was lucky enough to get accepted for was with Paul Mooney and was on Salesforce, a relatively new skill set for me. We were standing room only (mainly due to Paul), but people liked it and we were asked lots of good questions.

Saleforce For Domino Dogs from Mark Myers

Other than that, it was mayhem: I had terrible client overflow and was running around all over the place stressing every second, but frankly I would not change it for the world. Roll on next year.

  1. “Two tassel talk”: the long stream of corporate platitudes (sort of the corporate version of what politicians say) that you get from senior managers who are just marking time in their current role before, they hope, being promoted away. 
