The Timing of Workshops in a Big Project

 

Workshops, or the simple act of sitting down with the client or customer in a single room and thrashing out a problem, are absolutely invaluable in any decent-sized project.

They are a necessity, both for actually getting the feel of a given bit of work and figuring out local issues 1, and for building rapport with the business, so they know that you're all on the same side when trying to deliver.
The only issue is timing. The human element of a workshop, the rapport side of it, demands that you do them straight out of the gate to make a good first impression: "we are your kind of people", "tell us what you need, and how we can help", all that kind of stuff dictates that you should do a workshop first thing in a project.

But from a technical and practical side this does not work, because you haven't yet turned over all the rocks and looked under them. Everyone is fresh to the project and enthusiastic, but very little of the true cost or issues that a project is going to face will surface this early, and frankly, from a morale and engagement point of view, you don't want the cynical techs raising such points in initial calls.
This is one of the reasons why, when you have large consulting houses come in and do an evaluation of an infrastructure or what have you, they always miss things. It's not their fault; those things simply have not been found yet.

You need to go down into the project's dependent systems, turn over every rock, find the real people who are doing things rather than the people who were introduced as being in charge, and find all the nooks and crannies where the Dragons be, before you have a true picture of any existing system or processes.

So, for large-scale projects, I would recommend two sets of workshops. First, an initial workshop that is human-focused: it develops rapport, gives initial starting positions, and works out where people hope to end up.

Then, further down the project timeline, book in a review or "step back" workshop, to be done once you've had a few months in: the techs have had a chance to talk to each other, you've had a chance to thrash out real processes, found those hidden systems that everyone had forgotten about, and found the people who do all the little manual interventions to make a process run in the real world.

After that, put those findings into the project plan and cater for any knock-backs they will cause. This is the crucial point: you will have planned for these knock-backs and timing changes up front. That gives the client the appreciation and understanding that you have been down this road before, that you know this is what's going to happen, and that you can predict it and roll with it to still deliver on time. And should you run the review workshop and find nothing new, it can be used as a back-patting exercise for the client: it shows everyone how well they knew their stuff to start with, and how pleasurable they are to work with.

So yes, I would always include two sets of workshops.

  1. if it’s a large international corporation[]

1st Tuesday Club No:191

I really should have done this post last month, as that was the First Tuesday Club's 20th anniversary, and I have been going for the full 20 years. The event should really have been a big song and dance, but, as always with the First Tuesday Club, everything is subtle.

However, I'm glad I'm doing my periodic post for this particular meetup. It was at a new venue and with a new sponsor 1, and it was a particularly good one. Not only were there a number of new people there who had never been before, but they all had a far better time than they expected. Of particular note was a conversation with one new attendee who was coming because their office is moving back to full time in the city, and therefore these kinds of social events suddenly leap back to their old prominence.

The vendor Checkmarx did a particularly excellent job. They were in evidence and were working the floor as any good vendor would, but only in a very social way. The rule of no hard sell was very much kept. I met three people from the vendor's team, who were genuinely fun people to chat to and very knowledgeable; good contacts to know. One of them in particular was very, very funny, despite being a salesperson dropped into a room full of security techs.

Conversations, as always, varied hugely, from serious ones on how politics affects security implementation in corporations, through to the daftest travel stories.

The highlight conversation of the night for me was a very involved one on exactly how vector databases work:2 how they work in their base form, as well as how their search associations function. This sounds boring unless you were watching us do it, because we commandeered a chunk of the bar and, watched by a number of very, very patient bar staff, tried to explain Euclidean distance and dot product using wine and beer glasses.
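For anyone who wasn't at the bar: the two measures we were acting out with glassware can be sketched in a few lines of Python. This is a toy illustration, not any particular vector database's implementation, and the "embeddings" are made-up two-dimensional points.

```python
import math

def euclidean_distance(a, b):
    # Straight-line distance between two embedding vectors:
    # smaller means "closer together", i.e. more similar.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def dot_product(a, b):
    # Measures how strongly two vectors point the same way;
    # on normalised embeddings this is cosine similarity.
    return sum(x * y for x, y in zip(a, b))

# Toy 2-D "embeddings" (the wine and beer glasses, if you like):
rioja  = [1.0, 0.0]
merlot = [0.9, 0.1]
lager  = [0.0, 1.0]

# The two wines sit close together; the lager sits far away.
print(euclidean_distance(rioja, merlot) < euclidean_distance(rioja, lager))  # True
print(dot_product(rioja, merlot) > dot_product(rioja, lager))                # True
```

Real vector databases index millions of high-dimensional vectors, but the search association is still built on measures like these two.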

The venue was the Hispania Restaurant, a new one on me, and particularly fine. When I mentioned it to a colleague, it turned out to be their favourite restaurant, and they were very jealous that I was getting to spend the evening there for free. The facilities were perfect, with a steady flow of drinks and food all night.

All in all, it turned out to be one of the most pleasant work evenings I have had in ages. My LinkedIn has more contacts than it did yesterday, and all the people I met were keen to come back next month.

  1. For the actual First Tuesday Club, at least[]
  2. Vector databases being the backbone of modern AI.[]

Top Tip: Bulk buying Cloud Services

This is a useful little tip that could have saved multiple companies I have worked with lots of money, particularly at the beginning of a migration or evaluation of a new cloud service project.

So you have done an initial proof of concept, you've checked with Gartner to see whether everything fits your needs before taking on a cloud service, and then you start to talk to the vendor about money. The salesman then offers you an amazing deal if you spend X amount of money, or pre-buy a large chunk of services at a huge discount.

Before you make that commitment, think of two things:

  1. A lot of cloud services have an expiry time on them. So say they give you a 2-for-1 offer if you spend £1 million this year, and you end up with £2 million worth of services. But originally you were hoping, or planning, to spend only £500,000 on services, and you only have the people allocated to work at that speed. Now you have four times the amount and only a year to spend it, or it will go to waste. Eeek.
  2. We all work as if our projects are going to succeed, and we all behave as if they're a 100% sure thing. But when making responsible decisions with other people's money, we must think about what happens if the project gets stopped, cancelled, or otherwise prevented from getting over the line. In that case, what would you say to the finance people who ask, "Hang on a minute, you've spent a million pounds worth of services; what are we going to do with that now?"
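To make the arithmetic in point 1 concrete, here is a back-of-the-envelope sketch. The figures are the illustrative ones from above, not real vendor pricing.

```python
# Rough sanity check before committing to a bulk cloud-credit deal.
# All figures are illustrative, matching the 2-for-1 example above.

planned_annual_spend = 500_000        # £ you actually planned to consume
commitment = 1_000_000                # £ the deal requires you to pay
credits_received = 2 * commitment     # 2-for-1: £2M worth of services
expiry_months = 12                    # credits vanish after a year

required_monthly_burn = credits_received / expiry_months
planned_monthly_burn = planned_annual_spend / 12

# How much faster than plan you must consume services to avoid waste:
speed_up = required_monthly_burn / planned_monthly_burn
print(f"Must burn credits {speed_up:.0f}x faster than planned")  # prints 4x
```

If that speed-up number is more than your team can realistically absorb, the "amazing deal" is really a write-off in waiting.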

Conclusion

The cloud is a beautiful world and offers so many amazing things, but a vendor salesman is still a salesman. They are ultimately on the vendor's side, not yours. Think of your client or employer, and spend the money they have put in your care responsibly.

London’s Calling 2023

 

London's Calling for Salesforce is a classic community conference, with all the things that you don't normally get at a vendor or marketing conference.

Initial impressions, as someone attending for the first time, were that the conference was run by a long-time group of friends, 1 and it gave off that warm community feel while still being run professionally. Given that Salesforce is still a growing trend, it was well catered for, with all the freebies, everything from massages to good giveaways.

The venue was excellent, helped by fantastic weather, but with obvious backups in case it all went rainy.

The content was far more what I like to see at a conference, with tons of deep-dive material on the product and how to use and expand it. As you can see from the screenshot, there were loads of concurrent sessions going on, and the vendor area was well placed, because you went through it between sessions to grab a coffee.

The vendor area itself was nicely laid out, and it fit in well with the conference timetable, giving you a good chunk of time to talk to a specific vendor when you needed it.

 

There was a giant floor decoration of all the speakers in the main entrance, which was a nice touch.

Good T-shirt Swag, solid make good art and slightly unusual (front)

Good T-shirt Swag, all Serious sponsors (Back)

You could not get your t-shirt until at least lunchtime, so you couldn't run off early, which is always a known problem when hosting an event inside a major metropolis full of interesting stuff.

The session content itself was slightly different to the other community conferences I go to, which are very, very geeky with live coding, meaning you have to research before you even go to a session. This was far more welcoming to people who didn't know much about Salesforce or its community. I think the phrase to sum it up would be "No Click Code", which seems to be a buzzword used by a number of the vendors.

The best session of the day was "What happens when you click the Save button?" by Simon Connock. I would have killed for this lesson five years ago.

Personally, I do wish there had been one channel where you could go and do some really, really geeky stuff written by the pure devs, but I'm aware that's because that's what I am. Everything else seemed to be catered for perfectly, with good food and even an after-conference party.

Now this is how you label snacks and Food at a conference.

Something that should be noted is that they've managed to achieve near-perfect gender diversity in both attendees and presenters, far better than any of the big conferences or even previous community ones. However they've done it, other conferences should be copying it.

Conclusion

Frankly, I have not shut up about this conference to my colleagues who do Salesforce, and I will sure as hell be going next year, as well as dragging multiple people with me. It is very much recommended.

  1. just like the IBM ones I used to attend[]

Elephant in the room: The data cost of AI

Now, you will forgive the slightly overdramatic nature of this title, but it's something that had me practically jumping up and down during the main demo at the latest Salesforce conference, and it's something that comes to light in every single AI demo that I've seen.

And that is the use of AI with data in a demo versus in a corporate environment, or when doing something serious. Ultimately the problem is data access. As Salesforce themselves have just said, "AI is only as good as the data", and I'm not talking about bias or AI hallucinations or any of that. I'm simply talking about large-scale AI use in a corporate environment and the expectations of the business.
AI has been sold as basically the solution to everything in life on Earth, but to deliver on that, all of your data has to be available to the AI near instantly, and everyone in IT knows that is not the case.

Let's take the Salesforce demo as a perfect example. They were showing that AI had access to four different data sources, and it was using that access to construct the perfect interactions with customers and other humans. Brilliant. Perfect. But all of that data was already in Salesforce. AWS's demo was exactly the same. All of the vendors showing these beautiful AIs using data are effectively doing it from local sources, in modern formats, and with high-quality data. Move this back to our corporate environment, and the sheer volume of legacy data scattered all over the place and used in various different systems would initially make this impossible. If you don't believe me, think of how much of a pain it is merely trying to run reports across data sources.
But impossible is not where we stop. Before selling AI to the business, make sure you set expectations and work out how you are going to get round the current limitations.

These limitations are grounded in the following:

  1. Interfaces: A lot of legacy data does not have standard interconnects or standard APIs to get it all into one place, particularly for real-time access. Sorting this out is additional cost and time on your integrations.
  2. The underlying infrastructure: A lot of the legacy data you will be pulling in 1 will not work in the way you think it does. So if you're thinking your AI will be able to access all data 24/7, you are in for a shock. Some legacy systems will be down or slow for backups. Some will be running at 80% or 90% utilisation because no one has bought them bigger servers. Some will be suffering under their own heavy loads from quarter-end and year-end reports. Source systems are often already under tremendous strain, and you adding additional load is not going to make the business happy.
  3. Data Transfer: There is also the matter of data transference and accountability. When feeding AIs, we will often be pulling data from all over the planet. That's fine most of the time, but you do have to be careful. The EU tends only to be bothered when you're shovelling data outside its borders, but there are countries that get really, really hot and sweaty about moving data across borders; Italy and Turkey are two that leap to mind, and German works councils are famously very fighty when it comes to moving people's data around and exposing it in different ways. So that has to be handled. Some companies, like Salesforce, are aware of these problems: they made a big thing of their recent secure layer, but that secure layer is for data leaving Salesforce, not the other way round. Be careful, people, there be dragons!
  4. Syncing: While we are talking about moving data, don't think you're going to get away with merely syncing data to a giant central repository. It can get huge and out of control very quickly, the costs can skyrocket, and sooner or later an accountant is going to ask you to justify maintaining the same data twice. Then you have the constant syncing itself to deal with, which can trigger some of the points I mentioned above; and finally, a lot of legacy data stores do not maintain transactional logging, or even update logging, a feature needed for reliable syncing.
  5. Cost: All this data pumping around for integration costs money, and the nearer you want it to real time, the more it costs. Are the new AI features going to give you a good return on investment or a competitive advantage?
  6. Silo Owners: Lastly, there are the owners of the current data sets. These source systems are often that person's (or team's) career and job, and they will fight tooth and nail to maintain control of them, introducing a political element into your deliverable that can get very messy.
  7. Metadata & Data Quality: 2 Mark Forster made the following wise addition:

    I would add another issue to those described here, which is metadata management. Are all data used being described in the same way, with rigorous and applicable ontologies? In many cases this is not true, and data from different sources are described in different ways. Maybe it's trivial, in that the units differ (kg vs stones). Maybe the same attributes are given different names. As always, data must be cleaned to a high standard to achieve meaningful results.

    I could not agree with this comment more, and it's why I spend so much time on the Insurance Dictionary; data definitions and quality are a terrible plague on integrations.
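The kg-vs-stones case in Mark's comment is exactly the kind of thing that bites during integration. A minimal sketch of the fix, with made-up record names and values purely for illustration, might look like this:

```python
# Two sources both report "weight", but one uses kilograms and the other
# stones. Without a declared unit in the metadata, merging them silently
# corrupts the data; with one, you can normalise before merging.

STONE_TO_KG = 6.35029318  # 1 stone in kilograms

def normalise_weight(value, unit):
    """Convert a weight reading to kilograms based on its declared unit."""
    if unit == "kg":
        return value
    if unit == "stone":
        return value * STONE_TO_KG
    raise ValueError(f"Unknown unit: {unit!r}")

# Hypothetical records from two different source systems:
source_a = [{"name": "policy_holder_1", "weight": 80.0, "unit": "kg"}]
source_b = [{"name": "policy_holder_2", "weight": 12.6, "unit": "stone"}]

merged = [
    {"name": r["name"], "weight_kg": normalise_weight(r["weight"], r["unit"])}
    for r in source_a + source_b
]
for row in merged:
    print(row["name"], round(row["weight_kg"], 1))
```

The point is not the conversion factor; it is that the `unit` field has to exist in the metadata of both sources before any AI, report, or sync can merge them meaningfully.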

And that's it; this was merely a short rant. It's not to say that AI is not going to produce all of the awesomeness that has been promised. But you've got to manage the business's expectations of what it can do with the information you can actually get to it. And the core of that is making sure it has all the data, in a timely fashion, costed out, and confirmed on the legal side.

  1. and I say this from decades of real-life experience[]
  2. Edit: This entry was added after I cross posted to linkedin[]