
Salesforce Pardot: Multi Completion Rules

I do Pardot implementations amongst my other Salesforce work, which always seem to involve a number of hacks to bend the Salesforce and Pardot functions to meet the existing behaviours that the client wants.

Now Pardot has “completion actions”: these are very useful jobs that are performed when a form is submitted, but limited in that they are not conditional, i.e. you can’t say “if field X = Y then do action W, else do action Z”. Pardot themselves show you a way round this using form-field-based completion actions.

This is a nice trick and can be used as the basis for more complex things, namely running MULTIPLE completion actions. An example is shown below:

<script type="text/javascript">
// Grab the submitted email address from the Pardot merge field
var email = encodeURIComponent('%%email{js}%%');

// Completion action set 1: call a different form handler depending on CheckBox_1
switch('%%CheckBox_1{js}%%')
{
case 'true': document.write('<iframe src="FORM_HANDLER_1_URL?email=' + email + '" height="1px" width="1px"></iframe>');
break;
case '': document.write('<iframe src="FORM_HANDLER_2_URL?email=' + email + '" height="1px" width="1px"></iframe>');
break;
}

// Completion action set 2: the same again for CheckBox_2
switch('%%CheckBox_2{js}%%')
{
case 'true': document.write('<iframe src="FORM_HANDLER_3_URL?email=' + email + '" height="1px" width="1px"></iframe>');
break;
case '': document.write('<iframe src="FORM_HANDLER_4_URL?email=' + email + '" height="1px" width="1px"></iframe>');
break;
}
</script>


This means that you can run a lot of conditional rules based on the fields in the one form. The most common use I make of this is for custom Email Preference Center pages, ones where a user can fill in more details than just their email address.

Oh, one final note: you will see in the Pardot instructions that they close their iframes with “/>”. This won’t work if you have multiple iframes on one page; you need to close the iframes properly with “</iframe>”.

Razer DeathAdder Chroma Sensitivity on Linux

A silly little post, and more an aide-memoire than anything. Recently I have been doing a bit of work on a VERY secure site, one that does not permit Bluetooth in any form, even for such things as mice, resulting in a quick pound down to PC World. After a squint down the rows of mice I realised that wired is really dead: there were none that I would not feel ashamed to use. Dead, that is, apart from serious gaming, where there was a very pleasant, quality mouse in the form of the Razer DeathAdder Chroma.

All was fine till I plugged the darn thing in and discovered its ultra sensitivity meant it was nearly uncontrollable, and the normal UI sensitivity settings went nowhere near far enough down to make it usable. So back to the command line we go.

So first let’s get a list of the input devices on the system with:

xinput list


Now let’s check this list for the first instance of “Razer Razer DeathAdder Chroma” in the pointers list and get its id number (in this case 12), then change its sensitivity as below. I find a value of 3 takes it down to the same level as a normal mouse:

xinput set-prop 12 "Device Accel Constant Deceleration" 3



Terminal Screen Shot

There we go, a usable mouse.

MWLug 2016 Round-Up

So I’m back from my first MWLUG, jetlagged to hell and shattered, trying to answer the question that LDC Via always asks after a paid conference: “was it worth it?”

For me the big revelation was the PSC presentation on migration strategy. Not for the strategic partnership announcement with LDC Via, which I of course knew about as I was there as LDC Via’s representative, but for the “this is the truth of the world we live in, it is time to accept it” statement, which has been an undercurrent of our yellow world for so long but has lacked someone articulate enough to say it in a non-IBM-bashing way. Also, Mr Head in one slide managed to sum up LDC Via’s position in the application structure better than the four techs that wrote it had managed in two years…

The conference itself was excellent, with some stunning content, and very enjoyable from both a learning and a social angle. I could have done without the heat of Austin, but the city itself is amazing and I can see why the residents are so proud of it.

The session with Gab went well, with a good attendance and no hecklers (security people can be weird). My individual session was the last of the day and, thanks to the conference running a bit on the late side, was not well attended (I went and checked if the attendance was as bad in the other sessions, and it was), but EVERYBODY in the session asked questions and multiple people came up and thanked me again later… Weeeeeee

LDC Via was well received, and people are starting to see what it can do and that it is not a threat either to IBM or to their jobs and ecosystem, more of an evolution of storage. But I really should have brought more marketing stuff to give away :(

I also attended my first Penumbra meeting and ended up doing hardly any of the full day’s work I had brought with me (I hate meetings), because the conversation was so interesting and engaging. I now understand why people join.

And thus the answer to the question at the beginning of the post is: “Yes, it was very much worth it.”



New Anime Series: Thunderbolt Fantasy

I don’t tend to do my anime reviews any more, mainly due to lack of time but also because they don’t really add much to the internet. However, this one I just felt I had to flag up. It’s called Thunderbolt Fantasy, and it’s a puppet show done in the style of an anime. I am aware of the long history of puppet shows in many cultures, but the action puppet show as a TV format was first really introduced with Thunderbirds by Gerry Anderson, and culminated as far as I was concerned with the creation of Star Fleet X-Bomber (a Japanese show by Go Nagai), which I ADORED!!

Now Thunderbolt Fantasy is an updated form of that kind of series. It is a mix of puppets and CGI special effects, feeling more like a movie or a classic epic tale than an anime, and it is fascinating to watch. The plot line is totally feudal Power Rangers with an epic music score 1, and the puppets themselves and the sets are works of art, so much so that I have had to watch each episode more than once, as the first time through I’m too busy staring to read the subtitles.

Obviously effects have come on a lot since the 80s, and while it still has that charming puppet feel, you don’t just see the top half of the puppets: full body shots and foot shots are common, and mixed flawlessly with things like fire and magic.

I’m just praying it’s going to be at least a 24-episode series.








  1. I’m not sure it will beat Brian May’s opening theme music, but it’s going to give it a run for its money 

To Find The Perfect Office

I find that I am more than a little picky with offices: you would think that all you want are a few simple basics, but it turns out that none of the shared or small office providers want to provide what you want in the way you want it.

I tried most of the leading communal office companies as a truly private office would be too expensive, and I would go stir-crazy, but every time there would be something fundamentally wrong with them. Faults ranged from one office advertising itself as a “dynamic shared environment” which turned out to be a lobby coffee shop >:( 1 through to places that were so keen to get you in that it felt like you had hired yet another project manager to be on your case. This happened four or five times, but I was determined to find a good work base 2, and my perfect office turned out to be a place called Purple Patch.

Within a day of trailing into the building, fresh from an outrage at a previous office provider, I determined to settle there like an ugly toad under a rock. If you want the low-down, I recommend you go and look at their web site.

Let’s walk through my must-haves:

24-hour access: This is a big thing for me, in fact THE big thing. Most shared offices close at 6pm, and you’re waiting at the door at 8am to be let in. However Purple Patch is really 24 hour access. They give you a bunch of keys and security fobs after checking your background, and then you can come in at any time, day or night. Perfect for those late night deadlines.

Impresses Clients: The place must look and feel good: a lot of shared offices have a slightly tired feel about them, a bit like airport lounges. At Purple Patch everything is bright and clean, and I thought at first it was because the place was new—but no, it’s been there for 15 years and simply has regular revamps. This means that you can bring clients on-site with a sense of pride, and its excellent location means that there are tons of great places to eat and drink within idle wandering distance.

Hard Lines: I know that everyone just lives on wifi, but not me. Wifi is fine and everything but I want a good speed, I want to go like the clappers, and so each desk has a network port in it that gives you 100Meg down, and the same up. This means that only individual clients’ VPN speeds slow me down.

Layout: I like open-plan, but not OPEN PLAN. Again, Purple Patch scores: the building itself is like a cheerful Gormenghast (or as Ben Poole said, “an eclectic second hand book shop”). Desks are grouped into bays and natural nooks, you don’t feel isolated, but similarly you are not forced to endure every word of a nearby loudmouth’s day.

Humanity: My office provider must be human. One of the Purple Patch clients has a small dog she sometimes brings in, and no-one howls (sorry) about rules. Parents bring in kids and I have never been disturbed by them (the layout helps with this, as does the fact there is a chill-out area with table-top football that gives kids something to do). No-one brings in jaguars, and what can and can’t be done is decided on a case-by-case basis to keep everyone happy, rather than by following a soulless set of rules.

Now the little things:

Nice toilet roll holders: Don’t look at me like that. How many times have you been at a client’s, or any kind of office, and had to deal with a toilet roll holder that looks straight out of Parkhurst prison? Something that does not allow more than 2 sheets to be pulled off before it jams and rips, and you end up having a little tantrum, reaching in and ripping the offending roll out, which is less than professional. It turns out that Purple Patch agrees, and has proper toilet roll holders that hold multiple rolls and don’t damn well jam.

Proper coffee in proper places: Purple Patch have different coffee in different places in the building (yes, yes, tea as well!). If you want to wait at a machine while it goes “pluck pluck pluck” then fine, there’s one of those. Personally I like the old-style filter coffee pots, and they are always kept topped-up to ensure optimal coding hyperactivity. I have watched Ben Poole consume his entire body weight in coffee within approximately ten minutes of arriving at the office.

… and that’s it. Really I am just relieved to have found an office I like and can do business in :-)

(BTW: I’m not getting paid or anything for this review!)



This is where I skulk. Your desk is static and you get a locking cupboard (there are personal lockers as well)



Lots of little drop-in meeting nooks



Coffee, COFFEE!!! (kept topped-up by the lovely lovely staff)



You would not believe that the meeting rooms are actually cheaper than horrible scabby chain franchise ones



A nice shower at an office, there’s an idea



Even LDC Via can enjoy a meeting there (and maybe the pub afterwards)


  1. Meaning I couldn’t leave my laptop unattended therefore had to take it with me every time I went to the toilet 

  2. I could not really use home as my work place as I need to meet with clients and clients of companies I am sub-contracting for. LDC Via work together a lot as well and frankly I start to lose productivity if I work from home more than around one day every two weeks. 

A Little Thing Done Right

Last week the ‘swag’ from being an IBM Champion arrived, and to my utter surprise it was just perfect. Yes, I had picked it from a catalogue and knew at least one of the items was from a brand I knew, but that did not detract from the fact that it represents something to me a bit deeper than just a giveaway to keep some evangelists sweet.

Branded stuff like this is really supposed to be used where clients can see it (on site ideally), but recent IBM marketing stuff has been of very poor quality: just somewhere to slap the logo on and hope for the best. The best example of this is the backpacks that were given away at recent IBM Connect events; they were not even worth taking home, whereas the 2005 and 2003 editions are still in use and treated as a fine vintage. Whoever looked at the bags and decided to skip them for this year’s event was a wise person. Anyway, the swag that just arrived represents, in my opinion, just what IBM is aiming at with their Champions:

  • High quality outsourcing: IBM obviously did not do it themselves, but the picking and delivery was for me a simple and flawless exercise.

  • Best of breed: The backpack is Wenger, the notebooks were Moleskine, the t-shirt was Nike. Good, competent brands, not too flash, but not some no-name knock-off that falls to bits.

  • To be seen in public: I am already using my stuff on site, and with the same pride I would a MongoDB t-shirt or an LDC Via power pack.

I expect I am reading too much into this and it’s simply the result of a single individual doing their job very well, but even if that’s the case it’s a good example of a rejuvenating IBM.











SalesForce for Domino Dogs 3: Web Query Save Agents

“WebQuerySave” / “PostOpen” and all their siblings have been a bastion of Domino and Notes development since time out of mind, and indeed they exist in near-identical form in Salesforce, just called triggers.

Just like Notes/Domino has different events that let code ‘do stuff’ to records, e.g. “WebQueryOpen”, “OnLoad”, “WebQuerySave”, etc., Salesforce has the same sort of thing. In their case they are broken down into 2 parts: timings and events.

Timings: Before and After

Before: The event has been started but the record has not yet been saved; this maps basically to the “Query” events in Domino.

If you want to calculate fields and change values in the record you are saving, this is the time to do it. You don’t have to tell it to save or commit the records as you normally would; the save runs after your code.

After: The record has been saved and all field values have been calculated; then the After event is run.

If you want to update other objects on the basis of this record being created/saved, do it here. You can’t edit the record you are saving, but lots of useful bits, such as the record id and who saved it, are available in the After event 1
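To make the Before timing concrete, here is a minimal sketch (the object, field and default value are mine, made up for illustration): in a before trigger you edit the records in Trigger.new directly, and the changes are saved automatically with no explicit save or DML statement.

trigger CaseDefaults on Case (before insert) {
    // Hypothetical example: default a field in the "before" timing.
    // Records in Trigger.new can be edited directly here and the changes
    // are written as part of the save, no update statement needed.
    for (Case c : Trigger.new) {
        if (String.isBlank(c.Origin)) {
            c.Origin = 'Web';
        }
    }
}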

Events: Insert, Update, Delete and Undelete

These are exactly what they say they are: Insert is like creating a new document, Update is editing an existing document, etc.

This then gives us a total set of different event types of:

  • before insert
  • before update
  • before delete
  • after insert
  • after update
  • after delete
  • after undelete2

Now, you can have a separate trigger for each of these events, but I have found that this bites you in the bum when they start to argue with each other, and it is hard to keep straight when things get complex. So I just tend to have one trigger for all events, with a bit of logic in it to determine what’s going to happen when.

Here is the basic template I start with on all my triggers:

trigger XXXXTriggerAllEvents on XXXX (
    before insert,
    before update,
    before delete,
    after insert,
    after update,
    after delete,
    after undelete) {
            if(Trigger.isInsert || Trigger.isUpdate) {
                if (Trigger.isUpdate && Trigger.isAfter) {
                   MyScriptLibrary.DoStuffAfterAnUpdate(Trigger.New, Trigger.OldMap);
                } else if (Trigger.isInsert) {
                    //Do some stuff here for when a new document is being created, like sending emails
                }
            }
}


As you can see, you can determine which event you are dealing with by testing for “.isInsert” or “.isAfter”, and then run the right bit of code for what you want. Again, I like to keep everything in easy sight, so I use functions whenever I can, with nice, easy-to-understand names.

In the above case, I want to check a field after there has been an update, to see if it has been changed from empty to containing a value. You can do this with the very, very useful ‘Trigger.New’ and ‘Trigger.OldMap’, as you can see below:

public with sharing class MyScriptLibrary {

    public static void DoStuffAfterAnUpdate(List<XXXX> newXXXX, Map<ID, XXXX> oldXXXX) {

        // Loop through the updated records and compare each one to its old values
        for (XXXX currentXXXX : newXXXX) {
            if (!String.isBlank(currentXXXX.MyField) && String.isBlank(oldXXXX.get(currentXXXX.Id).MyField)) {
                System.debug('OMG!!! MyField changed, DO SOMETHING');
            }
        }

    }

}


So we are taking the list of objects3 that have caused the trigger to run, i.e. “Trigger.New”, looping through them, and comparing them to the values in “Trigger.OldMap” (which contains the old values) to see if things have changed.


So that is the theory over. You can see existing triggers by entering Setup and searching for “apex triggers”.



BUT you can’t make them from there; you make them from the object you want them to act on.

Let’s take the Case object as an example.



In Setup you search for case, click on “Case Triggers” and then on “New”.



That will give you the default trigger… let’s swap that out for the all-events trigger I showed above.



Better. Then just click save and your trigger will be live. Simples…

Now, there is an alternative way to make triggers, and you do sometimes have to use it: when you want to create a trigger for an object that does not live in Setup, such as the Attachment object.



You will first need to open up the Developer Console (select your name in the top right and choose “Developer Console”), then select File —> New —> Apex Trigger.



Select “Attachment” as the sObject and give the trigger a sensible name.



And now you can write a trigger against an object that you don’t normally see.

Final Notes:
  1. Salesforce process flows can fight with your triggers. If you get “A flow trigger failed to execute” all of a sudden, go and look to see if your power users have been playing with the process flows.
  2. Make sure you have security set correctly, particularly with community users; both security profiles and sharing settings can screw with your triggers if you can’t write or see fields.
  3. As always, make sure your code works if there are millions of records in Salesforce. CODE TO CATER TO LIMITS (a sketch of the pattern follows below).
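As a rough sketch of what “cater to limits” means in practice (the object, field and class names here are invented for illustration), the usual pattern is to collect your changes and issue a single DML statement per trigger invocation, rather than one per record:

public with sharing class CaseEscalationHelper {

    // Hypothetical example: flag the parent Account of every high priority Case.
    // The changes are collected into a map (which also de-duplicates parents)
    // and issued as ONE update, instead of calling update inside the loop,
    // which would blow the DML governor limit on a large batch of records.
    public static void escalateParents(List<Case> newCases) {
        Map<Id, Account> accountsToUpdate = new Map<Id, Account>();
        for (Case c : newCases) {
            if (c.AccountId != null && c.Priority == 'High') {
                accountsToUpdate.put(c.AccountId, new Account(Id = c.AccountId, Description = 'Has a high priority case'));
            }
        }
        if (!accountsToUpdate.isEmpty()) {
            update accountsToUpdate.values(); // one DML statement for the whole batch
        }
    }
}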

  1. You know that pain-in-the-butt thing you sometimes have to do in Domino, where you have to use the NoteID rather than the Document ID before a document is saved? This gets round that issue. 

  2. Yes eagle eyes, there is no “before undelete”. 

  3. You are best off handling all code in terms of batches rather than the single document you are used to in Domino. We will cover batching in a later blog post, but just take my word for it at the moment. 

Presenting at MWLUG

Hooray!!! I have been accepted to speak at MWLUG this year.

I will be presenting 2 sessions:

1) “The SSL Problem And How To Deploy SHA2 Certificates” with Gabriella Davis

This session went down well at Connect, and we are hoping that Austin will love this changed and updated version. Gab is awesome to present with.

2) “Salesforce for Domino Dogs”

Now, if you saw this at Engage I urge you to come again, as this is an evolving presentation that changes dramatically with each iteration (depending on the presenters and the ever-changing world of Salesforce):

  • Version 1: Balanced Architect (Engage 2016)
  • Version 2: Happy Evangelist (DNUG 2016)
  • Version 3: Rabid Developer <— This is the one I will be presenting

It will be my first trip out there, and besides presenting I will be manning the stand (the rest of the team are insisting I wear a shirt and everything).

P.S.

I’m looking for someone to room-share/split costs with (I sleep on the floor, so there never seems to be a point in getting a room to myself)… I can provide references…

SalesForce for Domino Dogs 2: Scheduled Agents

Welcome to the second part of the Salesforce for Domino Dogs series. This one is a monster, but don’t worry, we will be revisiting and clearing up some of the complex parts in later blog posts. What was a simple thing in Domino is quite complex in Salesforce, for a variety of very good reasons. So… scheduled agents.


Scheduled Agents: These little sods are the pain of many a Domino admin’s life. Personally I blame them for the lock-down of many a Domino server from the free-for-all that was so empowering to users, but sometimes there is no other way to get round limits or deal with certain triggered activities.

In Salesforce scheduled processes are a bit more complex than you might be used to, and this is not just a Salesforce thing, but a cloud thing—no cloud provider wants their platform just churning along in the background eating up cycles and I/O time.

So let’s break it down:

  1. The code that does stuff
  2. The scheduled task that the code sits in
  3. The schedule itself

1) The Code

So this CAN just be any bit of Apex you want, but most of the time you will actually end up using Batch Apex. Batch Apex is a whole set of articles in its own right, but in this case it’s just a way of getting round the Apex limits.

… hmmm that does not help. OK let me explain:

You know how with Domino scheduled agents, they will only run for so long before the agent manager shuts them down? This is to stop you writing rubbish code that screws up the system. Apex has a load of limits just like that, and the one that hits quite often is the limit that you can only send 10 emails using Send() in a given transaction (you can send 1,000 bulk emails per day). To get round this limit you have to “batch”, or break up your code into chunks. In Domino this would be like saying we want to process a whole view’s worth of documents, but in chunks of, say, five documents at a time.

An empty bit of batch apex looks like this:

global class NotifyAllUsersInAView implements Database.Batchable<sObject> {

    // The start method is called at the beginning of the job and works out which objects this code is going to run against.
    // It uses a SOQL query to work this out
    global Database.QueryLocator start(Database.BatchableContext BC){

    }

    // The execute method is called for each chunk of objects returned from the start method.
    global void execute(Database.BatchableContext BC, List<Contact> scope){

    }

    // The finish method is called once all the batches have been processed.
    global void finish(Database.BatchableContext BC){

    }

}

Let’s take it apart. First we will use the “start” method to get the list of objects we want to work through, so we take the empty method:

    // The start method is called at the beginning of the job and works out which objects this code is going to run against.
    // It uses a SOQL query to work this out
    global Database.QueryLocator start(Database.BatchableContext BC){

    }

… and add a query to get all “contacts” in Salesforce. We only need the email address for these contacts1, so we add that as one of the fields to return:

    // The start method is called at the beginning of the job and works out which objects this code is going to run against
    // It uses a SOQL query to work this out
    global Database.QueryLocator start(Database.BatchableContext BC){
        return Database.getQueryLocator([SELECT Id, Email FROM Contact]);
    }

Next we want the empty “execute” method, which will do whatever we want with each chunk of objects it is sent:

    // The execute method is called for each chunk of objects returned from the start method
    global void execute(Database.BatchableContext BC, List<Contact> scope){

    }

So in this horrible bit of code, the chunk of objects is passed in a parameter called “scope”. We then just iterate over the objects and send an email for each contact (you can see the email address we asked for in “start” being used via “c.Email”):

    // The execute method is called for each chunk of objects returned from the start method
    global void execute(Database.BatchableContext BC, List<Contact> scope){
      for(Contact c : scope){
          Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
          String[] toAddresses = new String[] {c.Email};
          mail.setToAddresses(toAddresses);
          mail.setSubject('Another Annoying Email');
          mail.setPlainTextBody('Dear XXX, this is another pointless email you will hate me for');
          Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
       }
    }

Finally we need the empty “finish” method, which runs when all the batches are done:

    //The finish method is called once all the batches have been processed.
    global void finish(Database.BatchableContext BC){

    }

So let’s send a final email notification to the admins:

    //The finish method is called once all the batches have been processed
    global void finish(Database.BatchableContext BC){
        // Send an email to admin to say the agent is done.
        Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
        String[] toAddresses = new String[] {emailAddress};
        mail.setToAddresses(toAddresses);
        mail.setSubject('Agent XXX is Done.');
        mail.setPlainTextBody('Agent XXX is Done.');
        Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
    }

Put it all together and you get:

global class NotifyAllUsersInAView implements Database.Batchable<sObject> {

    // String to hold the email address that the final notification will be sent to.
    // Replace its value with a valid email address.
    static String emailAddress = 'admin@admin.com';

    // The start method is called at the beginning of the job and works out which objects this code is going to run against.
    // It uses a SOQL query to work this out
    global Database.QueryLocator start(Database.BatchableContext BC){
        return Database.getQueryLocator([SELECT Id, Email FROM Contact]);
    }

    // The execute method is called for each chunk of objects returned from the start method.
    global void execute(Database.BatchableContext BC, List<Contact> scope){
      for(Contact c : scope){
          Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
          String[] toAddresses = new String[] {c.Email};
          mail.setToAddresses(toAddresses);
          mail.setSubject('Another Annoying Email');
          mail.setPlainTextBody('Dear XXX, this is another pointless email you will hate me for');
          Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
       }
    }

    // The finish method is called once all the batches have been processed.
    global void finish(Database.BatchableContext BC){
        // Send an email to the admin to say the agent is done.
        Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
        String[] toAddresses = new String[] {emailAddress};
        mail.setToAddresses(toAddresses);
        mail.setSubject('Agent XXX is Done.');
        mail.setPlainTextBody('Agent XXX is Done.');
        Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
    }

}

So now we need to call this code.

2) The Scheduled “Agent”

The code we have just written won’t run on a schedule on its own; we need to wrap it up in a bit of code that can run on a schedule and decide how big the chunks will be. In this case they can’t be more than 10, as we would hit the Apex limit for sending emails. An empty schedulable wrapper looks like this (I have called mine ‘Scheduled_Agent’ but you can call it anything):

global class Scheduled_Agent implements Schedulable{
    global void execute (SchedulableContext SC){

    }
}

Now let’s create a new instance of the batchable code we created in section 1, tell it we want it to run in batches of 5 records, and tell it to execute:

global class Scheduled_Agent implements Schedulable{
    global void execute (SchedulableContext SC){
      Integer batchSize = 5;

      NotifyAllUsersInAView batch = new NotifyAllUsersInAView();
      Database.executeBatch(batch, batchSize);
    }
}

Code bit all done!
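(As an aside: if you just want to fire the batch once, right now, to test it rather than waiting for a schedule, you can run the same call from the Developer Console’s Execute Anonymous window, covered in the next section. A one-line sketch, assuming the class above:)

// Run the batch immediately, in chunks of 5, with no schedule involved
Database.executeBatch(new NotifyAllUsersInAView(), 5);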

3) The Schedule

Now it comes time to actually schedule the code to run at a certain time. You can set this up via the user interface by going into Setup, searching for “Apex Classes”, and selecting the result:


Select “Scheduled Apex”


As you can see, the options are limited to, at most, a daily run; you can’t specify anything more frequent. However, we need it to run more often than that2.

First open up your developer console, by selecting your name on the top right and picking it from the drop-down.


Now open up the “Execute Anonymous Window” from the debug menu.


You can now run Apex code manually, and as such you can schedule jobs with a lot more precision using a cron string. In this case we want to run the agent every 10 minutes within the hour, so we create a new instance of our “Scheduled_Agent” class and schedule it appropriately:
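In case the screenshot is hard to read, here is a sketch of the sort of thing that goes in the window, assuming the Scheduled_Agent class from section 2 (the job names are arbitrary). Apex cron strings run Seconds Minutes Hours Day_of_month Month Day_of_week, so one job is scheduled for each 10-minute mark of the hour:

// Schedule the job six times, once per 10-minute mark within the hour
System.schedule('Agent run 00', '0 0 * * * ?', new Scheduled_Agent());
System.schedule('Agent run 10', '0 10 * * * ?', new Scheduled_Agent());
System.schedule('Agent run 20', '0 20 * * * ?', new Scheduled_Agent());
System.schedule('Agent run 30', '0 30 * * * ?', new Scheduled_Agent());
System.schedule('Agent run 40', '0 40 * * * ?', new Scheduled_Agent());
System.schedule('Agent run 50', '0 50 * * * ?', new Scheduled_Agent());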


Click “Execute” and you can see that the jobs have been scheduled. It should be noted that you can only have 100 scheduled jobs in your org, and this approach uses up 6 of them, so some planning would be good.


And there you go, scheduled agents. Let the legacy of horror continue!


  1. When you get an object via SOQL, you ask for all the fields you want; this is not like getting a Notes document, where you just get access to all the document’s fields automatically. 

  2. Well, we don’t, but you just know someone will demand it be sent more often. 

SalesForce for Domino Dogs 1: Profile Documents

Following on from the initial session “Salesforce for Domino Dogs” that Paul Mooney and I did at Engage, a modified version of which has just been presented at #DNUG, I figured that a series of dev articles on how you would do a job in Salesforce that you had always taken for granted in Domino might be a good idea, because:

  1. It would give an honest way of comparing features between the 2 systems, shorn of the hype/marketing/platform bashing that frankly gets on my thungers from both sides.
  2. It will hopefully help people trying to integrate the 2 systems.
  3. As IBM are one of the largest Salesforce consultancies in the world, it is something a champion should do.
  4. The Salesforce community is very short on this kind of thing given its size in comparison to traditional IBM communities, and with people like René in it I want to try and help improve it.

These articles are not in any order and are not meant to represent any form of training guide.

So let’s get started. First up: Profile Documents!!


In Domino you tend to store all the one-off config settings for an app in a profile document1.

To get the same feature in Salesforce you use a ‘custom setting’, which does exactly the same job and has one huge advantage over a normal Salesforce custom object doing the same job.

(It should be noted that Domino profiles are nothing like Salesforce profiles)

To create a custom setting, go into Setup and search for “Custom Settings”



Click on the “New” button and fill in some sane details. For normal configs, i.e. stuff you would use system-wide, select a setting type of “List”; if you want to use them for things like default values in fields and formulas, then select “Hierarchy”.



Click ‘Save’ and you now have a custom setting object; you can add one or more fields to it just as you would with any other object in Salesforce.



Added all the fields you want? Let’s put in some data. If you go back to the list of custom settings you will see that you now have a “Manage” link; click it.



Then click “New”



Fill in the fields just like you would on a normal form. If this is a setting there is only going to be one of, I tend to give it the same name as the object, to keep things simple, in this case “General Settings”; if you are going to use it multiple times, then give each record a name that will make sense in the context of your code.



All done. Now we can use the setting in our code and see the reason why we would use one vs. a normal custom object.

As you can see from the code below, you don’t have to use a SELECT statement, which means getting the settings won’t count against your Apex limits. HOORAY!!!

You just call “getInstance” on the setting object with the name of the record you created, and you get the document back.

General_Settings__c generalSettings = General_Settings__c.getInstance('General Settings');
String thirdPartAppURL = '';
if (null != generalSettings) {
    thirdPartAppURL = generalSettings.Third_Part_App_URL__c;
}


Simples..
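(If you picked the “Hierarchy” type instead of “List”, there is a matching getOrgDefaults method. A minimal sketch, assuming a hierarchy setting with the same hypothetical name and field as above:)

// Hierarchy custom settings are not looked up by name: getOrgDefaults()
// returns the org-wide record, while getInstance() with no arguments
// returns the record for the current user, respecting any profile or
// user level overrides.
General_Settings__c defaults = General_Settings__c.getOrgDefaults();
if (null != defaults) {
    System.debug(defaults.Third_Part_App_URL__c);
}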


  1. Well, you are supposed to, but you never do, thanks to the evil caching and the fact that they are a sod to just copy around, so you end up with a config document and a view you store them in. 
