XPages review

Here at LDC towers we are quite a diverse bunch, each member taking a primary role in a few technologies and then acting as secondary and tertiary to other members to back them up when needed. With people like Matt White and Ben Poole in the team handling the bulk of the XPages work, my primary UI skills were Spring web/view/MVC and Flex (as well as the classic Domino we all still know like the back of our hands).

During this time I kept up on XPages by watching Matt's videos on XPages101.net and following the rest of the blogosphere's posts, but quite frankly demand has outstripped supply and now I'm XPaging in anger, so I figured a review of XPages was due for anybody else who is late to the show (I know you lot, some of you are still not doing proper Java in your apps!!!).

Sooo…what do I think?

Well frankly I like it: you produce really nice apps chock full of functionality, cleanly and at breakneck speed. The only real hurdle is how you approach the development cycle, and this depends on your point of view:

From a classic Domino person's point of view: XPages are not another component like Pages or Navigators (remember them), they don't work that way. Think of XPages as a separate product that you have installed in your Domino Designer and server, one that provides you with a whole new layer of features. I found it easiest to think of them as an IBM plug-in to the classic Lotus product (which helped me resolve some of the integration mismatches and different ways of doing things).

From a non-Domino person's point of view: IBM have written another implementation of JSF (just like Spring Faces) and have glued it to a legacy NoSQL database with integrated security and a dedicated server platform, to enable that platform to serve up content in an up-to-date form (just like they did with the HTTP task in 4.6).

Once you have your point of view and get down to writing code, you will find you don't actually write XPages; they are just the final container. You actually write custom controls and then plug them into the relevant XPage when you're ready, like little UI and code modules sitting in a hierarchy (a bit like Spring MVC). Understanding this is the cornerstone to not tripping yourself up. On Designer itself, it's obvious that the framework architects have been given considerably more time to get their sh*t together than the IDE designers, and there is a lot of “you need to do that in the source view” or “change that in the ‘All properties’ section”. If you can't figure out how to do something, don't worry, they WILL have catered for it; it just won't be in the Designer UI yet.
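To illustrate that hierarchy: in the source view an XPage is typically little more than a shell that pulls in your custom controls via the xc: namespace (the control names below are made up for the example, not from any real app):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xp:view xmlns:xp="http://www.ibm.com/xsp/core"
         xmlns:xc="http://www.ibm.com/xsp/custom">
    <!-- Each xc: tag is a custom control you wrote; the XPage just assembles them -->
    <xc:ccLayoutHeader />
    <xc:ccMainContent />
    <xc:ccLayoutFooter />
</xp:view>
```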

Lists are easier to read than chunks of text, so here are the goods and bads of XPages as I see them.

Goods

  • SSJS (server-side JavaScript): I love this, all the power of JavaScript running nice and securely at the back end with nearly all the functionality of @Formulas added to it (makes me want to go off and learn node.js).
  • Flexible: unlike other modern JSF/JSP frameworks, XPages thankfully inherits its ‘classic’ ancestor's ability to be completely adaptable to what you want, which is a breath of fresh air after Spring, in which you are basically told “we know what's best, so you can't do that”.
  • The security is still there: it's still far easier than in other frameworks to do good security, and it inherits well from classic (though not perfectly).
  • Expandability and plug-ins: IBM have gone with a constant upgrade path in which they develop add-ons which are released to the community as plug-ins, which in turn will eventually get rolled into the main product (the best example being the “XPages Extension Library”), giving you a nice balance of speed and supportability (thumbs up).

Bads

  • SSJS: it's not ECMA-compliant and I've already hit a few WTF moments when coding functions. Debugging is also not the friendliest, and please, for the love of all that's holy, can I have an auto-format key shortcut.
  • Still a fair amount of hacking: there is quite a lot of “how the hell do I do that?” followed by some strange convoluted action for dealing with simple problems. The old Domino people are used to it, but I can imagine it being a real head-scratcher for new people.
  • It has that IBM “I'm still slightly a prototype” feel about it. With a lot of IBM products you have the feeling they went “cool, it works, ship it” while gently bypassing testing. Now I know they are really trying (I got a tweet reply to my bitching about an error in about 30 mins), but still, it's an ongoing process.
  • The IDE is slow and hates to share with other developers at the same time (turn off automatic build). I use plain Eclipse most of the time, as well as Adobe Flash Builder and, for my sins, IBM RAD, and all of them are faster than 8.5.3.

All in all, what with the XWork server, if you can get a client to ignore the “Lotus” memory then we have a real contender on our hands. I am building apps at the same speed (if not faster) as I used to with classic Domino, but they are easy to make look really good, are technologically up to date, and built with a great deal less hassle than with something like Spring. IBM just need to clean up the rough edges and convince us they are not going to dump the whole thing for pure Connections.

Old Comments

Mark Barton(09/07/2012 09:35:35 GDT)

Thanks Mark some good stuff there and some valid points.

Let's hope IBM listen to the grumblings about DDE; I can't imagine new developers would adopt it out of choice.

Would be interesting to see some best practice design patterns with regards to where the business logic for a component sits. Java over SSJS?

We both know it makes no sense to put all of your eggs in one basket with regards to technology, and any client-side developer worth their salt will be keeping up with the trends.

Jason Hook(09/07/2012 10:11:57 GDT)

My first project was a steep learning curve even having Java and JSP experience.

Now in later projects I’m leveraging an awesome amount of business logic I wrote in managed beans in that first project. Because of this my speed of development is rapidly increasing.

I completely agree with your sentiments on DDE. We deserve a much better IDE. There are plenty of great IDE’s: Visual Studio, CODA, Flux4. I feel that once you’re up and running with XPages a decent text editor with solid code complete might be better than DDE.

Managed beans are great! It means you are writing Java but frankly on the level that most of us need that’s not so hard. Must write down a tip or two on using them.

Scoped variables & being able to maintain state!

Geeks that build software for geeks, that’s IBM for me. Things always have that complicated & not quite finished feel, and then here comes the next shiny thing.

I know that Connections is the current big shiny thing, but some of us work for small to medium-sized enterprises that can't justify that kind of spend. So to continue to maintain and grow market share in that space I expect that they (IBM):

Will continue to support & articulate clearly the future direction of XPages and XWorks;

Will focus on marketing the product so that it looks new and shiny for the next generation of IT Managers that will be buying the product;

Will reach out to Developers and Administrators to help them get the most out of the innovations they are making, rather than developing stuff and waiting for us to make sense of it. LUGs seem like the perfect opportunity to do that.

I deliberately made those items sound more positive because they need the encouragement. I’ve invested huge amounts of time acquiring skills and I want IBM to match that commitment we are all making.

Must get back to work (writing the next application with XPages)!

Mark(09/07/2012 10:28:56 GDT)

bleeding heck Mr Hook, that comment is longer than the post, thanks for putting the effort in, lots to check on after reading it

Mark: yes some standards would be a good idea.

Keith Strickland(09/07/2012 15:23:47 GDT)

“XPages are not another component like pages or Navigators (remember them), they dont work that way, think of XPages as a separate product that you have installed in your domino designer and server that provides you with a whole new layer of features”

That statement is key in clearing your mind of all the old Lotus Domino hacks we used to have to incorporate just to get a simple feature to work on the web. I've been saying this for a while now: XPages are not another design element, it is a new platform altogether. The sooner someone accepts this, the sooner they will start becoming productive with XPage development.

Mark(10/07/2012 05:49:11 GDT)

@keith yup yup

Mark(10/07/2012 12:37:53 GDT)

@michael one of the things I gauge DDE's speed against is how long it takes before I can code, rather than merely open the app; that's where DDE seems to suffer. YES my apps are nearly always on a server, but if it's for a non-XPage fix, I can normally open R7.0.4, do a minor fix and shut down before 8.5.3 has got its knickers sorted out

Michael Bourak(10/07/2012 10:48:06 GDT)

Agree with much… but DDE being slow is a mystery for me. On my laptop, 5 years old (though with 4GB of RAM), it's fast… at least MUCH, MUCH faster than RAD, for example.

Are you sure you disabled automatic compilation? Are you running against a server over low-end bandwidth?

On the same subject, I often see people complain about DDE stability. On my laptop, it has crashed maybe… 5 times in 6 months of daily usage…

Michael Bourak(10/07/2012 13:12:21 GDT)

The majority of “wait time” is due to Java tools initializing (eclipse stuff) and time to get the project from network…

In 8.5.4 there is a new feature coming that will help keep DDE open and not relaunch / close it…(not sure I can say more…but I use it via beta version and like it a lot)

Ben Poole(10/07/2012 20:40:08 GDT)

The Java tooling in Eclipse does take time, but DDE still has its own pain points. I’ve found that there are a few things that make it faster:

1. Turn off automatic builds
2. Increase default JVM heap / max heap sizes
3. Ensure the host machine / VM has plenty of RAM

It's still extremely slow compared with proper IDEs though.

Michael Bourak(11/07/2012 09:09:33 GDT)

@Ben: running DDE or any disk-intensive software inside a VM will suffer a lot from poor virtualisation disk performance…

Can you give sample of “proper IDE” ? If you mean notepad++, sure

On which OS are you running DDE ? What’s your config ?

Mark(11/07/2012 08:41:18 GDT)

@michael looking forward to 8.5.4!

@Ben yup, I took your advice on those bits a while back and it has helped a lot

Ben Poole(25/07/2012 22:33:01 GDT)

running DDE or any disk intensive software inside a VM will suffer a lot from poor virtualisation disk performances

Pish!

Re OS, you can run DDE on Win7 (64 bit), Win7 (32 bit) and WinXP. Doesn’t matter, it will still be slow.

As for “proper” IDEs, there are a few out there:

– Visual Studio
– Eclipse (vanilla)
– Netbeans

but yes, generally my preference is for the simpler tools like Coda 2 and Sublime Text 2.

Classic Domino Search Trick

There is less classic Domino work around than there used to be (pure Java, mobile and XPages taking up most of my time), but there is still plenty of maintenance work for those who can provide what the client wants. Anyway, this is something I just updated on an existing client's classic app, and it struck me that it MUST be known by just about everyone, but either no one has blogged it, or my Google-fu is weak at the moment, so I decided to post it in case it might be of some use.

Problem: you have a classic search form, and you want to analyse or work with the results (making totals or stuff like that), but need to somehow get hold of the values in order to do so. I can remember this being a complete PITA years ago, but with a fresh set of eyes it's dead easy.

Solution:

1) Make your searching view a HTML one (via the view properties).

2) Make a column that returns a hidden field for each row for the value(s) that you want to access.

"<td><input name=\"PostCodeSearchValue\" value=\"" + PostCode + "\" type=\"hidden\" /></td>"

3) Now when you search, you can go and fetch all these values with client-side JavaScript.

var coll = document.getElementsByName("PostCodeSearchValue");
var arr = [];
for (var i = 0; i < coll.length; i++) {
    arr[i] = coll.item(i).value;
}

and do whatever you want with them.

Now that lacked a bit description-wise, so here is a demo and working download for you to take to bits.

Demo: Here. Just click “Search” to get a bunch of data to work with, and then “Find Unique Postcodes” to run a bit of JavaScript over the result to find and display the unique postcodes in the returned data.
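The “Find Unique Postcodes” part is essentially just a de-duplication pass over the values collected in step 3; a minimal sketch of that idea (the helper name here is mine, not from the demo download):

```javascript
// Reduce a list of collected values to the unique set. Kept as a plain
// function so it can be tested outside the DOM.
function uniqueValues(values) {
  var seen = {};
  var unique = [];
  for (var i = 0; i < values.length; i++) {
    if (!seen[values[i]]) {
      seen[values[i]] = true;
      unique.push(values[i]);
    }
  }
  return unique;
}

// In the browser you would feed it the hidden fields from step 3:
// var coll = document.getElementsByName("PostCodeSearchValue");
// var codes = [];
// for (var i = 0; i < coll.length; i++) { codes.push(coll.item(i).value); }
// var uniquePostCodes = uniqueValues(codes);
```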

Download: You can get the file Here

ahhh, History…

update: Matt White has pointed out that this produces invalid HTML (working but invalid), and suggests using a JS framework and doing the same sort of “find” but using a CSS class name… a fine point

Old Comments
————
##### Mark(07/06/2012 16:24:22 GDT)
Does that mean you are making a separate search call? rather than a local bit of JS dealing with the data you already have? if you were wanting to total a value for the items the existing search had returned how would you be doing that in the context of a $$search form?
##### Mark Barton(07/06/2012 15:59:29 GDT)
Yes – rather than HTML in the search view columns just construct JSON data, or am I missing something?
##### Mark Barton(07/06/2012 15:28:30 GDT)
Couldn’t you just return JSON?
##### Mark(07/06/2012 15:31:32 GDT)
as part of the search form data?

Being nice to other developers with REST services

REST services are really nice, really fast and easy to create, but sometimes we forget in our haste to get them out the door that a little bit of structure will help both ourselves and any third-party developer that might use them. E.g., if, when a call is made to your service,

[]

is returned, what does that mean? Was it successful but there is no data, or was there an error in your call? A bit of context would go a long way. The easiest way round this is a little wrapper class, nothing fancy, just enough to throw the front-end devs a bone. My basic one looks like this:

package com.ldc.classes;
public class RESTReturn {  
    public static final Integer SUCCESS = 0;
    public static final Integer FAILED_GENERAL = 1;
    public static final Integer FAILED_INVALID_PARAMETER_VALUES = 2;
    public static final Integer FAILED_MISSING_PARAMETER = 3;
    public static final Integer FAILED_AUTHENTICATION_FAILURE = 4;
    public static final Integer FAILED_AUTHORISATION_FAILURE = 5;
    public static final Integer FAILED_AUTHENTICATION_EXPIRED = 6;
    public RESTReturn() {
    }
    public RESTReturn(int returnStatusParm) {
        returnStatus = returnStatusParm;
        returnStatusDesc = statusToString(returnStatusParm);
    }
    private int returnStatus;
    private String returnStatusDesc;
    private Object payload;
    public int getReturnStatus() {
        return returnStatus;
    }
    public void setReturnStatus(int returnStatus) {
        this.returnStatus = returnStatus;
    }
    public String getReturnStatusDesc() {
        return returnStatusDesc;
    }
    public void setReturnStatusDesc(String returnStatusDesc) {
        this.returnStatusDesc = returnStatusDesc;
    }
    public Object getPayload() {
        return payload;
    }
    public void setPayload(Object payload) {
        this.payload = payload;
    }
    public static String statusToString(Integer wsReturn) {
        if (SUCCESS.equals(wsReturn)) {
            return "It Worked";
        } else if (FAILED_GENERAL.equals(wsReturn)) {
            return "Error: This call caused an error";
        } else if (FAILED_INVALID_PARAMETER_VALUES.equals(wsReturn)) {
            return "Error: Something you gave me was rubbish";
        } else if (FAILED_MISSING_PARAMETER.equals(wsReturn)) {
            return "Error: You left something out";
        } else if (FAILED_AUTHENTICATION_FAILURE.equals(wsReturn)) {
            return "Error: The user name or password was wrong";
        } else if (FAILED_AUTHORISATION_FAILURE.equals(wsReturn)) {
            return "Error: You can log on, but you don't have rights to do that";
        } else if (FAILED_AUTHENTICATION_EXPIRED.equals(wsReturn)) {
            return "Error: Your logon has expired, sorry";
        } else {
            return null;
        }
    }
}

and I would use it a bit like this

package com.ldc.classes;
import java.util.List;
public class Getaddress {
    public RESTReturn getAddressInJSON() {
        int callStatus = RESTReturn.SUCCESS;
        List<Address> someAddresses = null;
        try {
            someAddresses = doSomethingToGetALoadOfAddresses();
        } catch (Exception e) {
            callStatus = RESTReturn.FAILED_GENERAL;
        }
        RESTReturn restReturn = new RESTReturn();
        restReturn.setReturnStatus(callStatus);
        restReturn.setReturnStatusDesc(RESTReturn.statusToString(callStatus));
        restReturn.setPayload(someAddresses);
        return restReturn;
    }
}

so instead of

[]

I would get

{"returnStatus":1,"returnStatusDesc":"Error: This call caused an error","payload":null}

which at least tells you that something is not right with the world

Obviously you can get as creative as you want, and a lot depends on how good your internal functions are at displaying their unhappiness, but every little helps, and you will make better friends with your client-side devs for a bit of consideration like this.

Thanks to Ben Poole for pointing out to me that I’m behind the times as normal and this is standard practice, doh!!

Old Comments
————
##### Mark Barton(01/06/2012 14:32:21 GDT)
I wonder though if you should return an HTTP status != 200, e.g. 404, along with the custom exception error message.

It's then trivial to catch an error using something like jQuery.
##### Mark(01/06/2012 14:37:24 GDT)
Good point. In this case I have found that this kind of structure is most beloved of the mobile-apps lot, who prefer to handle such errors separately due to the somewhat unreliable nature of mobile networks, but it would be a good idea to use HTTP status codes (goes off to look)

Exposing existing classes for REST: top tips

Bolting new functionality onto older systems always has its fair share of fun and games, but exposing internal data structures to external parties is an even greater horror. It does happen though, particularly when clients want to access previously locked-down systems via web services or REST interfaces, or you want to deliver the same functionality via a new client, e.g. B2B, mobile, etc. I have been doing this recently with REST services using the Jersey library, which, like nearly all the good service libraries, serialises and deserialises classes for you automatically. However, nothing is ever perfect straight out of the box, so here are some of the things you can do to make your REST services perfect (these work with any JAX/Jackson implementation and often even CXF-based systems).

I'll put all the code in first and then explain it later (this code represents the internal class that I am exposing via the services, plus the annotations that I have added to it):

package com.ldc.classes;
import java.util.Date;
import org.codehaus.jackson.map.annotate.JsonSerialize;
import org.codehaus.jackson.annotate.JsonProperty;
public class Address {
    @JsonProperty("referenceId")
    private int Id;
    private Date whenTheyMovedIn;
    private String guessWhenTheyMovedIn;
    private String firstLine;
    private String secondLine;
    private String postCode;
    @JsonSerialize(using = CustomDateSerializer.class)
    public Date getWhenTheyMovedIn() {
        return whenTheyMovedIn;
    }
    public void setWhenTheyMovedIn(Date whenTheyMovedIn) {
        this.whenTheyMovedIn = whenTheyMovedIn;
    }
    @JsonSerialize(using = CustomTextToDateSerializer.class)
    public String getGuessWhenTheyMovedIn() {
        return guessWhenTheyMovedIn;
    }
    public void setGuessWhenTheyMovedIn(String guessWhenTheyMovedIn) {
        this.guessWhenTheyMovedIn = guessWhenTheyMovedIn;
    }
    public String getFirstLine() {
        return firstLine;
    }
    public void setFirstLine(String firstLine) {
        this.firstLine = firstLine;
    }
    public String getSecondLine() {
        return secondLine;
    }
    public void setSecondLine(String secondLine) {
        this.secondLine = secondLine;
    }
    public String getPostCode() {
        return postCode;
    }
    public void setPostCode(String postCode) {
        this.postCode = postCode;
    }
    public int getId() {
        return Id;
    }
    public void setId(int Id) {
        this.Id = Id;
    }
}

package com.ldc.classes;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import org.codehaus.jackson.JsonGenerator;
import org.codehaus.jackson.JsonProcessingException;
import org.codehaus.jackson.map.JsonSerializer;
import org.codehaus.jackson.map.SerializerProvider;
public class CustomDateSerializer extends JsonSerializer<Date> {
    @Override
    public void serialize(Date value, JsonGenerator gen, SerializerProvider prov)
            throws IOException, JsonProcessingException {
        SimpleDateFormat formatter = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ssz");
        String formattedDate = formatter.format(value);
        gen.writeString(formattedDate);
    }
}

package com.ldc.classes;
import java.io.IOException;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import org.codehaus.jackson.JsonGenerator;
import org.codehaus.jackson.JsonProcessingException;
import org.codehaus.jackson.map.JsonSerializer;
import org.codehaus.jackson.map.SerializerProvider;
public class CustomTextToDateSerializer extends JsonSerializer<String> {
    @Override
    public void serialize(String value, JsonGenerator gen, SerializerProvider prov)
            throws IOException, JsonProcessingException {
        Date date;
        try {
            date = new SimpleDateFormat("MM/dd/yyyy", Locale.US).parse(value);
        } catch (ParseException e) {
            // An unparseable date would otherwise surface later as a
            // NullPointerException, so fail the serialisation cleanly instead
            throw new IOException("Unparseable date: " + value, e);
        }
        SimpleDateFormat formatter = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ssz");
        String formattedDate = formatter.format(date);
        gen.writeString(formattedDate);
    }
}

Right! The first one is name changes, i.e. you have a field whose name is just fine in your internal class, but you want to change it when you expose it via REST services (in this case it's the “Id” field; this was because “id” is a reserved word under iOS, which makes using it a PITA for the client coders). All you have to do is put the following @annotation, with the alias you want to use, just above the field you want to change, and bingo, a new name ONLY for REST services:

@JsonProperty("referenceId")
    private int Id;

Next is date formatting. The default JSON date format is not loved by many people I know, and a lot of people prefer ISO 8601. You can use a custom class to do this conversion for you quite easily, again without touching the internal workings of your existing classes. Just put this annotation above the ‘getter’ for that field and it will do the conversion on the fly (you can see the CustomDateSerializer class code above):

@JsonSerialize(using = CustomDateSerializer.class)
    public Date getWhenTheyMovedIn() {
        return whenTheyMovedIn;
    }

Finally, you can take the custom classes a bit further and change type, not just format, for example converting a short-form date (here “MM/dd/yyyy”) stored in a String into a proper ISO-formatted date on the way out. You again just declare a custom class, but beef up your code and you can turn anything into anything, funky!

@JsonSerialize(using = CustomTextToDateSerializer.class)
    public String getGuessWhenTheyMovedIn() {
        return guessWhenTheyMovedIn;
    }

Looking back on this last project, and talking with clients about a future one, there is maybe the option of having a dedicated set of classes for external services: far more control, and more secure (as, if someone adds a field that really should NOT be exposed, it won't automatically flow through to the external service). Mind you, if you do it that way you will have to weigh up the not inconsiderable advantages of speed and ease of maintenance offered by using the internal classes.
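A minimal sketch of that dedicated-class approach, with hypothetical names (none of this is from the project): the external class only ever contains what you explicitly copy across, so a newly added internal field cannot leak by accident.

```java
// Hypothetical internal class: note the field that must never be exposed.
class InternalAddress {
    String firstLine;
    String postCode;
    String internalNotes; // internal only, must NOT reach the service
}

// Dedicated external representation: the serialiser only ever sees this
// class, so adding a field to InternalAddress exposes nothing by default.
class AddressDto {
    public String firstLine;
    public String postCode;

    static AddressDto from(InternalAddress in) {
        AddressDto dto = new AddressDto();
        dto.firstLine = in.firstLine;
        dto.postCode = in.postCode;
        // internalNotes deliberately not copied
        return dto;
    }
}
```

The cost, of course, is exactly the maintenance overhead mentioned above: a hand-written mapping for every exposed class.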

but either way, these tips might help.

Old Comments
————
##### Karsten Lehmann(31/05/2012 15:37:18 GDT)
Nice!
I hope you never try to use this on a Domino server with Extension Library installed. IBM picked Apache Wink as JAX-RS provider and JAX-RS only allows one per JVM (an extremely bad idea).

I did some investigation in this area when we started developing REST APIs for a framework. Somewhere in the JVM, there is a static field that contains the JAX-RS provider and this can only be set one time – with one value.
##### Mark(31/05/2012 18:07:21 GDT)
Karsten: oh poot, hmmmm, there must be a way round that, grrrrrrr

Chris: im off the booze, im on java drugs now
##### chris(31/05/2012 17:46:32 GDT)
dont understand a word of it you must be drinking something stronger than me

Drobo or NAS to Amazon S3 with rsync

I have written before about syncing desktop stuff to a NAS (in my case a Drobo) using rsync for backup reasons, but let's now take it a step further and sync our Drobo/NAS stuff with Amazon S3. In this example I will be backing up my precious happy hardcore and audio book collection, which all lives in a directory called “Audio” on one of my Drobo shares.

1) First create a mount point for your Drobo/NAS connection, I created a folder called “/media/localAudio” (ensuring that the local user you will be backing up as has write rights to the folder)

2) Next ensure you have the samba file sharing utilities installed (smbfs), you can do this on a terminal prompt with

sudo apt-get install smbfs

3) See if you can now mount your share with “sudo mount -t smbfs //192.168.0.XXX/myshare/Audio/ /media/localAudio -o username=stickfight,password=password”.
This assumes that I want to map the “Audio” directory on the “myshare” share at the IP address 192.168.0.XXX, and that you have to log on to your share to be able to read/write to it; if you don't, just miss out the “-o username=stickfight,password=password” bit.

4) Now let's connect to Amazon S3. You will first need an Amazon S3 bucket for this (or use an existing one); go here for instructions on creating one. Mine is called “stickfight-audio”.

5) This bit is less easy: you need to install s3fs; you can get clear instructions from here.

6) Right, s3fs uses FUSE to perform its connections, but we have to tell it where your security credentials for your S3 bucket are, so create a text file .passwd-s3fs in your home directory and put your credentials in it in the following format: bucketName:accessKeyId:secretAccessKey, e.g.

stickfight-audio:0VWEOIEWOIUREWOIUFDS2:t4SyQ6pGjldoi4898dsoierelke/auw2wB4Rs+

(no, these aren't my bloody credentials)
and give it the following permissions:

chmod 600 ~/.passwd-s3fs

7) Next we want a mount point for the s3 bucket on our system, I created a folder called “/media/s3Audio” (ensuring that the local user you will be backing up as has write rights to the folder)

8) Now we can mount our S3 bucket as a local drive with s3fs bucket_name /mount/point, e.g.

s3fs stickfight-audio /media/s3Audio

9) Next make sure you have rsync installed with

sudo apt-get install rsync

10) Finally you can run your Rsync command to do the backup e.g.

rsync -r -t -u --progress  /media/localAudio/ /media/s3Audio

NOTES:
“-r” = copies all the sub-directories and files; normally you would use “-a”, but that copies the file permissions as well, which in this case I don't want.
“-t” = preserves file modification times, which “-u” relies on to tell what has changed.
“-u” = update: it only copies new or recently changed files.
“--progress” = makes the terminal output far more readable and tells you how far it has got.
“/media/localAudio/ /media/s3Audio” = source and target directories (the “/” at the end of localAudio stops it creating a localAudio folder in the root of your S3 bucket, thus keeping your directory structures at the same level).
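If you want to see what those flags actually do before pointing rsync at the S3 mount, you can rehearse them on a pair of throwaway local directories (the paths here are temp dirs, nothing from the setup above):

```shell
# Rehearse the rsync flags locally on two temp directories.
src=$(mktemp -d)
dst=$(mktemp -d)
mkdir -p "$src/albums"
echo "v1" > "$src/albums/track.txt"

# -r recurse, -t keep timestamps, -u don't overwrite files that are newer at the target
rsync -r -t -u "$src/" "$dst"
cat "$dst/albums/track.txt"    # the file has been copied across

# Because of -u, a copy that is newer at the target is left alone:
sleep 1
echo "local edit" > "$dst/albums/track.txt"
rsync -r -t -u "$src/" "$dst"
cat "$dst/albums/track.txt"
```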

Doing this will back up your data perfectly, but it will beat the hell out of your bandwidth. A program such as trickle can limit the damage; install it via

sudo apt-get install trickle

Then use it to wrap your S3 mount command, so that it limits the upload speed (in this case to 512KB/s, but you can change it to whatever suits you):

trickle -u 512 s3fs stickfight-audio /media/s3Audio

If you get an error along the lines of “trickle: Could not reach trickled, working independently: No such file or directory”, ignore it; it's just a badly worded advisory.

So that's it working. I've rolled all this up into a script file that I can run when it suits me (it's too big for a schedule):

sudo mount -t smbfs //192.168.0.XXX/myshare/Audio/ /media/localAudio -o username=stickfight,password=password
trickle -u 512 s3fs stickfight-audio /media/s3Audio
rsync -r -t -u --progress  /media/localAudio/ /media/s3Audio

There you go.