FileZilla Download Issue

FileZilla is one of the best FTP clients ever; however, while FileZilla itself does the correct thing, not everybody else does.

What this means is that if an FTP site isn’t configured correctly, then trying to download files that have special characters or accents in their names using the FileZilla client will fail. The FileZilla team don’t want to work around it because they’re doing the correct thing standards-wise (UTF-8 settings on the server) and it’s other people’s failings that are causing the problem, but other clients DO support this kind of behaviour, e.g. Firefox, which is a shame…

This means that we need to find a way around the problem within FileZilla. The first sign that something has gone wrong will be in your failed transfers tab.

A quick glance at the file usually shows you that there is a special character or accent in its name, in this case an “À”.
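This kind of mangling is classic mojibake: the name’s UTF-8 bytes getting decoded with the wrong single-byte code page somewhere along the chain. A quick Python sketch (using Windows-1253/Greek purely as an illustrative wrong code page) shows how a single “À” turns into two junk characters:

```python
# "À" encoded as UTF-8 is the two bytes C3 80; decoding those bytes with a
# legacy single-byte code page (here Windows-1253, Greek) produces junk.
original = "À"
mangled = original.encode("utf-8").decode("cp1253")
print(mangled)  # → Γ€

# Reversing the mistake recovers the original character.
recovered = mangled.encode("cp1253").decode("utf-8")
print(recovered)  # → À
```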

To get round this, the easiest way I have found is simply to select all of the items that have failed, right-click, and export them.



This will give you an XML file. Open it up and look for the “File” sections, then find and replace the special characters in the “LocalFile” element with non-special characters. Save the file, then import it into FileZilla (File → Import), and you will find that it will download all the missing files.

It would be nice if there was an automated version of this, but as far as I can tell the FileZilla guys are sticking to standards. 🙁

<File>
    <LocalFile>/mnt/Slow/DownLoads/Anime/Series/Ongoing/Kirakira Precure A La Mode/[anon] KiraKira Precure À La Mode - 47 [1280x720].mkv</LocalFile>
    <RemoteFile>[anon] KiraKira Precure À La Mode - 47 [1280x720].mkv</RemoteFile>
    <RemotePath>1 0 5 Anime 6 Series 7 Ongoing 26 Kirakira Precure A La Mode</RemotePath>
    <Download>1</Download>
    <Size>571273852</Size>
    <ErrorCount>1</ErrorCount>
    <DataType>1</DataType>
</File>
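The find-and-replace step can also be scripted rather than done by hand. The Python sketch below is a minimal example, assuming the export is saved as queue.xml and that stripping accents down to plain ASCII is an acceptable substitution; it rewrites every LocalFile element and writes a new file ready for import:

```python
import unicodedata
import xml.etree.ElementTree as ET

def asciify(name):
    """Replace accented characters with their closest ASCII equivalents."""
    decomposed = unicodedata.normalize("NFKD", name)
    return decomposed.encode("ascii", "ignore").decode("ascii")

def fix_queue(path_in, path_out):
    """Rewrite every LocalFile element in a FileZilla queue export."""
    tree = ET.parse(path_in)
    for elem in tree.iter("LocalFile"):
        elem.text = asciify(elem.text)
    tree.write(path_out, encoding="utf-8", xml_declaration=True)
```

Something like fix_queue("queue.xml", "queue-fixed.xml") then gives you a file to import. Note that only LocalFile is touched; RemoteFile and RemotePath are left alone so FileZilla still asks the server for the original name.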

Christopher Odd YouTube Channel Review

This is a social review post, done when I should be doing technical posts, but I REALLY enjoy this YouTube channel, and as this particular YouTuber is moving to running his channel full time, it seemed like a good opportunity to do a review.

The channel is called @ChristopherOdd and it’s a ‘let’s play’ channel, or ‘long play’ as I used to know them, in which someone plays full video games while narrating what they are doing. There are hundreds of them out there, so what differentiates this one from the others?

  1. Christopher Odd’s voice: His voice is far more like an audiobook narrator’s than a gamer’s. He doesn’t get hysterical, he doesn’t make you wince, and he doesn’t get over-excited in a fake way (though he suffers badly from jump scares in horror games); it’s a pleasant, dulcet tone to listen to. As someone that listens to a huge number of audiobooks, I’ve come to really appreciate the quality of a narrator, and as far as I’m concerned Christopher Odd is the best narrator for games that I have currently met on any platform.
  2. Mental speed: I don’t know his age and I haven’t looked into it (it feels a bit stalkerish to start looking at people’s ages and things like that), but he’s about my mental speed; he solves problems at about the speed I do. I don’t want to jump up and down and yell for him to do things faster or slower; he just thinks at a nice pace, and after he has completed a game I feel that I have seen all of it, and that it is as complete a play-through as I would make.
  3. Being correct without being too correct: Christopher is constantly correcting himself while he’s playing; you can see from one video to another that he is learning and changing his opinions as things grow and change. Cultural slips are corrected and things said in the heat of the moment are amended instantly, and this at a time when his popularity is at about 300,000 YouTube followers, when a lot of YouTubers think they can get away with saying offensive things without any backlash. It makes you think he is a good person in real life.

These, and the choices of the games he plays, result in a channel I can watch all day, a rarity in this day and age 🙂

But even if he is a joy to listen to, why would I want to watch a video of someone playing computer games? For me there are two reasons:

  1. He plays a series of games that I would never play!! I do not want to play Dark Souls; the whole Dark Souls series is massively frustrating to play, and that’s not something I am interested in experiencing, but I am fascinated with their worlds and want to learn about them.[1]
  2. He plays most of his games for their story, and that story is often better than a lot of TV series. To me, watching him play is as enjoyable as a TV series, in fact more so, because I can predict most elements of a TV storyline (as can most people), but when watching a game there will be surprise elements based purely on his gameplay, over and above the story; the same cannot be said of TV.

All this has meant that I am now a patron of his channel and treat the news of him going full time with great delight. His channel is by far the one I watch the most, and at the current quality I see no reason for that to change in the future. 🙂

References
1 After watching the play through I now feel invested in the game and From Software actually get money out of me when they normally wouldn’t as I have bought Dark Souls books and merchandise.

Copying NSF files to the cloud

This is an old tip that I never thought I would use again but has come back to life with the advent of the cloud:

Scenario:

We are migrating multiple servers from onsite to the cloud. The bandwidth available means that the copy won’t happen within 24 hours, and it definitely won’t happen within the maintenance window we have. Normally, with Lotus Notes migrations from one server to another, this wouldn’t be an issue, as Domino replication has been a model of stability and ease of use for well over 10 years. HOWEVER, there are tons, and I do mean tons, of complex replication settings in this client’s setup, a lot of them unknown or unremembered by the client, so they have found that using replication means they will miss some of these, and after having a look at them I tend to agree. So file copy it is…

NSF files tend to be a bit bulky and zip up really rather well, so zipping them up before moving them over makes sense, but we don’t want to do one large zip, because 1) the target file system doesn’t have a lot of extra space on it, and 2) the actual copy will take several days, so we want to do it one chunk at a time.

First, let’s get a list of all of the NSF files we want to copy over. This has the double advantage of giving us an indication of numbers and size, as well as giving us something that we can actually work through so that we can do one group at a time:

dir *.nsf /s /b > f:\filelist.txt

Once we have the list, the following little script, popped in a batch file on a system with WinRAR installed, will give you an exact mimic of your Notes folder structure, but with each NSF zipped up and in the correct place:

@ECHO OFF
setlocal enableextensions
for /F "tokens=*" %%A in (f:\filelist.txt) do (
    FOR %%i IN ("%%A") DO (
        md "F:\NotesZIP%%~pi"
        "C:\Program Files (x86)\WinRAR\RAR.exe" a -r -dh "F:\NotesZIP%%~pi%%~ni.rar" "%%A"
    )
)

You can then copy them over by whatever method you prefer and unzip as suits you. This method may seem a little Noddy, but this is the third time I’ve used it, and every time the Notes movement has been the easiest part of a migration.
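For what it’s worth, the same mirror-and-zip trick can be sketched in Python for anyone without WinRAR or a Windows box to run it on. This is a rough equivalent rather than the script I used: the folder names are hypothetical, and it uses plain zip instead of RAR:

```python
import zipfile
from pathlib import Path

def zip_tree(source_root, target_root, pattern="*.nsf"):
    """Mirror source_root's folder structure under target_root,
    zipping each matching file into its corresponding place."""
    source_root = Path(source_root)
    target_root = Path(target_root)
    for nsf in source_root.rglob(pattern):
        relative = nsf.relative_to(source_root)
        target = target_root / relative.with_suffix(".zip")
        target.parent.mkdir(parents=True, exist_ok=True)
        with zipfile.ZipFile(target, "w", zipfile.ZIP_DEFLATED) as archive:
            # Store just the file name inside the archive, as RAR.exe does above.
            archive.write(nsf, arcname=nsf.name)
```

Something like zip_tree("/notes/data", "/staging/NotesZIP") then gives you the same one-zip-per-NSF layout to copy over in chunks.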

Missing A Conference

Over the last few weeks my social media stream has been filled with pictures and memories of times gone by at Lotusphere/Connect. These memories have been more than a little bit painful, as they were all great times: a meeting place for great friends, as well as for a community spirit that I’ve never met in any other technology, not Salesforce, nor Java, Node, or MongoDB.

This community does still exist, even though it has shrunk over the last couple of years, and there is hope that with the recent changes, the hoped-for reinvigoration by IBM/HCL, and the constant work of core community leaders such as Gabriella Davis, it will return and maybe even grow. Having basically opted out of the community for the last year or so, through a mixture of client demands and an ever-increasing workload, I am now reminded by these pictures and memories how important such a community is, not just to work and to business, but to friendship and general sanity.

Long live the yellow bubble!!!

 

⇑ My first Lotusphere, young, fresh-faced and not fat



⇑ Presenting for the first time at Lotusphere

⇑ On the Piss with good friends

⇑ The famous “all bloggers” photo

Salesforce Same Code, Different Triggers

In Salesforce, the same bit of code can be triggered in a lot of different ways, and when calls to third parties are involved there are different rules for the different ways of calling it.

For example, take this bit of code. In it, we are just passing a contact ID, and it is going to go and talk to a third-party web service; inside “setUpRequest” it’s going to update the third party with the details of the Salesforce contact and in return receive some bits and bobs from the third party to update the Salesforce side. Basic syncing between two parties:

public class BlogFramework {
    public static void UpdateContactFromExternalWebService(String contactID) {
        Http h = new Http();
        HttpRequest request = setUpRequest(contactID);
        HttpResponse response = h.send(request);
    }
}

 

We want this to happen at two different times:

1. When a user manually updates a contact and then just saves it: we want the sync to happen instantly so the user can see immediately what’s happened and what’s been updated.

2. On schedule: the contact might not be updated in Salesforce at all; all changes might happen in the third party, but the details still have to be kept up to date for reports and views etc.

So this bit of code has to be callable both from a schedule and from a save trigger.

Let’s take the save trigger first. As it is now, it won’t work; you will get the error “Callout from triggers are currently not supported.” if you try. Normally, you would just pop the annotation “@Future(callout=true)”[1] at the top of this function and that would solve it, but as you will see later on we can’t do that, so what we’re going to do instead is have a little wrapper function that has the @Future annotation and, from that, call our real function.

@Future(callout=true)
public static void UpdateContactFromExternalWebServiceTrigger(String contactID) {
    BlogFramework.UpdateContactFromExternalWebService(contactID);
}

 

We can then put that wrapper function in our contact save trigger and everything will work perfectly:

trigger ContactTriggerAllEvents on Contact (
    before insert,
    before update,
    //before delete,
    after insert,
    after update
    //after delete,
    //after undelete
    ) 
    {
        for(Contact cnt:Trigger.new)
        {
            BlogFramework.UpdateContactFromExternalWebServiceTrigger(cnt.ID); 
        }        
    }

 

Next comes calling it from a schedule. If we had put the @Future annotation on the actual function, this would fail, because you cannot call a future method from a scheduled batch, but we don’t have that issue now. What you DO have to do is bolt “Database.AllowsCallouts” onto your batch class, as seen below:

global class UpdateFromAzpiral implements Database.Batchable<sObject>, Database.AllowsCallouts{
    // Get all the contacts
    global Database.QueryLocator start(Database.BatchableContext BC){
        return Database.getQueryLocator([SELECT Id FROM Contact]);
    }
    // The execute method is called for each chunk of records returned from the start method.
    global void execute(Database.BatchableContext BC, List<Contact> scope){
      for(Contact c : scope){
         BlogFramework.UpdateContactFromExternalWebService(c.ID);
      } 
    }
    //The finish method is called once all batches have been processed.
    global void finish(Database.BatchableContext BC){
    }
}

 

Now your batch class will be allowed to do callouts.
Putting all these bits together means you can have a single function that calls out to third parties and that can be triggered from either a schedule or an ordinary trigger.

References
1 The “@Future(callout=true)” annotation basically means that the Salesforce code does not stop and wait before doing other things; this means that calls to third parties do not slow down the Salesforce UI.

A Year In Review 2017

So here comes the year in review blog post. I think it’s fair to say that I’ve never had a year that’s been so head-down, teeth-gritted.

Most of what would be considered the fluff, the interesting things that you do in a year, has been missing entirely: no conferences, no training courses, no anything other than client work. That is not to say I haven’t done new things; each week, each month seems filled with new technology, new things to code, new things to learn, but it’s all been work that has to be delivered, work that has to be produced on time.

This has resulted in the company itself doing well, and thanks to that I’m in a better position than I think I’ve ever been before, which is all to the good, because with Brexit happening soon, UK companies face an uncertain future, so using the next two years to prepare for that is going to be really important.

Looking forward to the coming year, it looks as if I have finally reached a point in my career where I do not have a major on-site client; all of my work can be done remotely, which is a goal I’ve been aiming at for some time, but it’s still a little bit stunning to finally reach it. It will mean I need even more focus in how I work, and that will give me a couple of extra blog posts as I formalise the way I behave on a day-to-day basis, but I finally have the flexibility I’ve been after.

It’s always good to look back on the year and try to update your CV: what are you an expert in? What can you sell yourself as? What are you aiming for going forward?
To be frank, knowing technology by rote plays less and less of a part in what I provide my clients (both LDC and non-LDC).

What I’m good at is learning new[1] things, and adapting and providing clients with what they want, when they want it. The stuff that I’ve been hired to do going forward this year, and the stuff that I’ve been doing for the last 8 months at least, hasn’t really been solely technologically orientated; no one has said “oh, are you an expert in x”, they have just hired me to solve a problem, to make an issue go away, and how that is done has either been irrelevant or has already been set in stone by corporate decision.

However, I wouldn’t be me unless I still loved new technology and rolled in it like freshly cut grass.

  1. JavaScript frameworks: JavaScript is still the internet darling it has been for the last couple of years, but the frameworks come so fast that I now just poke my nose into each one that passes, to see if it does anything groundbreaking or genuinely replaces one I am currently using.
  2. Networks and encryption: This has been an odd one to go back to and get up to date with. So many of the solutions I have had to provide this year have not been code related, or rather, code has not been the best way to solve the problem; hardware and network performance issues don’t just go away with platform-as-a-service, if anything they get more complex, as they are not as transparent.
  3. Salesforce offshoots: Salesforce keeps buying things and integrating them into its ecosystem, so things like AMPScript have become commonplace.

But there have been losses too, and this year I lost my IBM Champion status. There was a brief pang of “Bugger”, but writing this, I can’t say I disagree with the decision: I did no conferences this year[2], and thus no speaking gigs, this blog was very quiet on the IBM front, and all the stuff I did for IBM was behind the scenes at client sites, and a Champion really does have to be seen…. C’est la vie.

2018

What do I think I will be doing in 2018?

  1. Practical Cloud – The cloud has changed so many things and made them better, but in some ways we have gone backwards, and there is a lot of work in such regressions; for example, inter-machine network speeds that had reached really rather fast rates on internal networks have suddenly tanked when measured between existing onsite kit and new cloud services.
  2. Hard Decisions – Over the last year, I have seen a growing trend of business actually having the budget and gritting their teeth over modernizing apps that have been around for 10+ years.
  3. Security – Even things that have been trusted for years have failed in the last 12 months, and while there are lots of security people around who will load up your network and apps with new standards and firewalls, there does not seem to be anyone willing to fix the trashed performance once the heavy boots of the security forces have been in and done their work.

I suspect quite a lot of my year will be spent using both new and old tech to get things working again after someone has enacted the latest company edict……. 🙂

References
1 And by “New” I just mean new to me.
2 LotusSphere/IBM Connect/IBM Think was in direct conflict with a client delivery, I’m not travelling to the US unless I have to for personal reasons, so that cuts out the regional US conferences, UKICON was cancelled, and I screwed up my passport for Engage.