
The Disability Price Tag

How bad service design punishes those with a disability.

If you've never heard of the 'Disability Price Tag', the simplest way to describe it is the price you pay for costly mistakes caused by the symptoms (or byproducts) of your disability.

The ADHD tax – as described by the ADHD Treatment Centre

People with ADHD are hit by the disability price tag so often that it's been nicknamed the 'ADHD Tax'. Some obvious examples are parking and traffic tickets, late fees, high-interest debt (e.g. credit cards), and low credit scores (leading to higher-interest debt, inability to get loans, problems renting apartments and buying cars, etc.). But this isn't the only way people with disabilities are being hit financially.

Heating, insurance, equipment and care costs are all higher for disabled people; the Scope Extra Costs Commission found in 2014 that disabled people faced an extra £550 a month of disability-related expenditure on top of their living expenses. A 2015 study by MacInnes, Tinson, Hughes, Born and Aldridge (Monitoring Poverty and Social Exclusion) found the poverty rate among people in families where someone is disabled is 8% higher than in families where no one is disabled.

For able-bodied and/or neurotypical folks, these costs can be really hard to comprehend; and even when they do comprehend them, the truth is many have little sympathy.

But even if we acknowledge that disabled people face the Disability Price Tag, there's little recognition of the fact that bad service design can make these problems significantly worse.

Many companies deliberately make it hard to cancel your subscription after a free trial period, or to register a complaint or appeal an unjustified fine; this is a fact everyone can easily recognise. What people fail to recognise is that this disproportionately affects people with disabilities, who are likely to find it much harder to navigate these systems.

Complicated cancellation instructions

Let's examine an example my family dealt with recently: paying for parking at Manchester's Arndale shopping centre, which has recently swapped to a new paperless (but still not digital) parking system.

The new system requires you to enter your registration number on the machine and pay, but doesn't issue you a ticket. There are a number of flaws here that make the system inaccessible and user-unfriendly, so let's consider the issues it could cause:

  1. You have to guess how long you'll need to park; if you need longer, there is no easy way to top up your time without physically returning to the machine to extend it. Not very practical for those with a physical impairment, or those with caring responsibilities that would make nipping back to the car difficult.
  2. As users aren't issued a ticket, and the machine doesn't capture your mobile number or offer any way of issuing an e-ticket or reminder, users have to remember what time their parking ends; this does not support those with a cognitive disability that may make remembering information difficult.
  3. The system requires you to enter your registration number from memory; there are no reminders or prompts, just a normal alphanumeric keyboard, so it can be very easy to enter your registration wrong. And as the system doesn't issue a ticket or receipt, nor email you your ticket information, there is no way to check you entered it correctly. This can easily trip up people with a learning difficulty like Dyslexia, or those for whom the English alphabet is not their first.

My partner has Dyslexia and ASC and I have ADHD (always a winning combination when it comes to the Disability Price Tag). When we parked and went to purchase the ticket, we accidentally mixed up an O for a 0; because we didn't receive a copy of our ticket, we had no way of knowing we'd made this mistake, and simply assumed everything was fine. On returning to our car we spotted the fine sitting on the window. We went to speak to the parking warden, who acknowledged we did indeed have a valid ticket on the system, but that the reg was technically wrong, and therefore we'd been fined. The warden acknowledged this was a very common issue that caught many people out, and recommended we appeal.
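A small amount of defensive design in the payment system could prevent exactly this failure. As a purely hypothetical sketch (my illustration, not the Arndale system's actual code), a parking system could fold visually confusable characters together when checking a warden's scan against paid tickets, so a plate typed with an O still matches a scan containing a 0:

```python
# Hypothetical sketch: fold visually confusable characters together so a
# plate typed as "AB12 CDO" still matches a warden's scan of "AB12CD0".
CONFUSABLE = str.maketrans({"O": "0", "I": "1", "L": "1", "S": "5", "B": "8"})

def normalise(plate: str) -> str:
    """Strip spaces, uppercase, and map look-alike characters to one form."""
    return plate.replace(" ", "").upper().translate(CONFUSABLE)

def has_valid_ticket(scanned_plate: str, paid_plates: set[str]) -> bool:
    """Match the scanned plate against paid tickets after normalisation,
    rather than demanding a character-exact match."""
    target = normalise(scanned_plate)
    return any(normalise(paid) == target for paid in paid_plates)

paid_today = {"AB12 CDO"}                       # what the customer typed
print(has_valid_ticket("AB12CD0", paid_today))  # True: O and 0 are folded
```

Even simpler fixes, like showing the entered registration back for confirmation or issuing an e-receipt, would catch the same class of error.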

Appealing in and of itself is not an easy process; what many able-bodied and neurotypical folks don't understand is how much privilege it takes to appeal things, and how many spoons it can take to do so. Let's use our example of the parking fine again:

  • The details of the fine and how to appeal it were only available on a piece of paper, and you have to wait up to 24 hours for the system to be updated before you can appeal (a piece of paper is very easy to lose for those with ADHD etc.);
  • the print is extremely small (not good for those with a visual impairment and/or Dyslexia etc.);
  • and the reference number is not only small but also long and complex (very easy to get wrong for those with visual impairments or ADHD/Dyslexia etc.).

Of course, to make matters more frustrating, Manchester City Council then rejected the appeal, saying it's the user's responsibility to make sure they have bought their parking ticket correctly; a prime example of the higher price tag disabled people can face for doing something so many would consider relatively simple.

So, how can we do better when we're designing services? The answer is easier than you think: carry out research and test with users! User research and testing aren't only applicable when we're developing web services; they are equally important when developing and rolling out new systems in the physical world. Within disabled service provision, this co-production of services is increasingly seen as important.

Co-production is a term used to describe the partnership between people with disabilities or health conditions, carers and citizens, and those who develop and run public services. While the upfront costs of co-production may seem higher, by designing services with users' needs at the heart we can significantly reduce the financial and emotional burden inaccessible services place on disabled people, in turn improving their quality of life.

How do we make legacy transformation cool again?

Guest blog first published as part of #TechUK's Public Sector week here, on the 24th of June 2022.

Legacy Transformation is one of those phrases: you hear it and just… sigh. It conjures up images of creaking tech stacks and migration plans more complex, and longer-lasting, than your last relationship.

Within the public sector, over 45% of IT spend goes on legacy tech. Departments have been trying to tackle legacy transformation for more than 20 years, but it remains the number one blocker to digital transformation.

Black and white servers

So why is it so hard and what can we do about it?   

The fundamental problem with legacy transformation is that, as an approach, it's outdated.

The problem organisations are trying to solve is that their technology systems need modernising or replacing; usually (at least in the public sector) these programmes come about because a contract is coming to an end and/or the platform the organisation's technology was built upon is effectively burning and can no longer be maintained.

The problems with this approach are:  

  • It so often ends in a big-bang transition, driven by the desire to avoid the complexity of running the old and new services side by side during migration.
  • The architecture of the new system is constrained by the need to remain consistent with the technical architecture used across the organisation.
  • Transformation programmes can easily fall into the trap of delivering a 'like for like' solution that misses opportunities for innovation; often because a cliff-edge contract leaves them rushing to find a replacement quickly.
  • The programmes are developed in silos, considering only the technical changes needed, not the wider business change required to make transformation stick.
  • Value is only delivered once the new service goes live and the old system is turned off; this leaves many organisations needing to run both systems at once, but not wishing to because of the large cost implications.

Because of these issues, the big-bang delivery often ends up much later than planned, costing significantly more while meeting neither user nor business needs, and quickly becoming outdated itself.

Don't forget: the latest thing you've just updated will itself be considered legacy in five years. So do we need to start thinking about legacy transformation differently? Is there an iterative approach to legacy transformation that works, and how should we approach it?

Within Kainos we've worked hard to bring the user-centred design principles we've used to successfully deliver digital services to high-impact legacy transformation programmes. By understanding user needs and business requirements we can plan early for 'just enough' legacy change to support the transformation: prioritising and identifying where and when value can be added; building scalable and extensible services that maximise automation opportunities; and carefully evaluating transition options and data migration dependencies, so that we meet user needs and add value at each stage without risking business disruption.

User-centred design

This incremental, user-centred approach allows us to identify opportunities for innovation and truly enable digital transformation that focuses on the business benefits, reducing overall costs whilst realising value early and often.

By thinking about business change and taking this iterative approach to realise value early and often, we've been able to stop assuming that every element of the old legacy service needs throwing out and replacing. Instead, we identify the elements that can be kept, with just a bit of love and care to update them and make them work, and the elements where we need to deliver something new. By prioritising where we focus our effort, whether on something old, something new, or a combination of the two, we can meet those critical user and business needs.
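One well-known way to make this keep-some, replace-some approach concrete is a strangler-fig style facade. The sketch below is purely illustrative (the handlers and route table are invented for the example, not taken from any real delivery): a thin routing layer decides, capability by capability, whether the legacy system or its modernised replacement handles a request, so each slice can migrate on its own schedule instead of in one big bang.

```python
# Illustrative strangler-fig facade: route each capability to either the
# legacy implementation or its modernised replacement, one slice at a time.

def legacy_handler(case_id: str) -> dict:
    """Stand-in for a call to the old system."""
    return {"case": case_id, "served_by": "legacy system"}

def modern_handler(case_id: str) -> dict:
    """Stand-in for a call to the new service."""
    return {"case": case_id, "served_by": "new service"}

# The migration state lives in one place; flipping an entry moves that
# capability to the new service with no big-bang cutover.
ROUTES = {
    "lookup": modern_handler,   # already migrated
    "update": legacy_handler,   # kept on the old system for now
}

def handle(capability: str, case_id: str) -> dict:
    """Send the request wherever that capability currently lives."""
    return ROUTES.get(capability, legacy_handler)(case_id)

print(handle("lookup", "ABC-123"))  # served by the new service
print(handle("update", "ABC-123"))  # still served by legacy
```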

Up-cycling doesn't just work for vintage furniture and clothes; maybe it's time we took the same mindset to technical transformation, reinventing something old and making it into something better and new. After all, tech changes faster than ever; if we don't change our mindset and approach, we won't just fall out of fashion, we'll be outdated.

By adapting our approach to legacy transformation, Kainos are able to build excellent services that are secure and that users want to use: transforming business processes to fully embrace digital channels, building microservices architectures that reduce future legacy risk, and optimising costs to benefit from public cloud platforms.

Maximising the Lean Agility approach in the Public Sector

First published on the 26th June 2022 as part of #TechUK's Public Sector week here; co-authored by Matt Thomas.

We are living in a time of change, characterised by uncertainty. Adapting quickly has never been more important than today, and for organisations, this often means embracing and fully leveraging the potential of digital tools.

A lot has been said about Lean Agility but for an organisation in the Public Sector facing the prospect of a digital transformation, it is still difficult to understand what to do and how.

In our mind, while lean helps to solve the right problems, agility supports quick adaptability and the ability to change course whenever necessary.

Build, Measure, Learn

Working in the Digital Advisory team at Kainos, the one problem we hear about repeatedly from clients is the difficulty of delivering the right thing at pace, and how they struggle to maximise their efficiency. Some of the typical red flags we see when we begin to understand why clients are struggling to deliver effectively are:

  • evergreen delivery projects that never end, with no end product in sight, or a product nobody uses being constantly tweaked, as opposed to teams delivering units of quantifiable value;
  • lack of prioritisation: everything is a priority, so everything is in flight at the same time;
  • stalled or slow development, with poor delivery confidence and large gaps between releases;
  • traditional long-term funding cycles requiring a level of detail that doesn't match near-term agile planning and responsive delivery;
  • ineffective communication and a lack of experienced delivery leadership, so decisions are made on gut feel and who shouts loudest, rather than being firmly tied to desired business outcomes;
  • siloed pockets at various stages of Agile adoption, maturity and effectiveness, making coordinated planning and collaboration difficult.

Within Kainos, our belief was that by introducing Lean-Agility Management we could scientifically remove waste and inefficiency whilst increasing delivery confidence, employee job satisfaction and visibility of the work being undertaken. As such, we introduced a lightweight and straightforward Lean-Agility approach that could be adopted across multiple portfolios.

Our approach does not just focus on Agile coaching (although that’s part of it) or other isolated elements of a transformation, but on 4 distinct pillars: Lean-Agility Management, Lean-Analytics & Dashboarding, Product & Design Coaching and Agile Coaching & Architecture.  This gives us the opportunity to build sustainability and in-house expertise to continue this journey. 

Recently we've been working with an integrated energy super-major to help them improve in several of these key areas. We were asked to bring consistently high standards to delivery culture and ways of working through Lean and Agility, while contributing to the wider agility transformation.

The results have delighted the client; we have managed to improve delivery speed by over 70%, delivery confidence by more than 50% and job satisfaction by over 20%.

This approach is one we’re using with several other clients in the commercial sector, all with similar positive effects; but it’s not something we encounter being used within the Public Sector much; either by us or by other consultancies.

How can this approach help the public sector and what is needed to make this a success?

From our experience, we have found the key elements to getting this right are:  

  • Starting with a Proof of Value (POV) – we tend to pick two volunteer squads to test with and prove this approach can work and add value.
  • Senior buy-in and time – agility transformation lives and dies by the clarity and direction of its leaders; teams need clear leadership, and the support and empowerment to innovate and improve.
  • Pod structure – connecting the transformation from exec to squads.
  • Multi-disciplined Agility team – with knowledge of Product, Design and DevSecOps as well as Agility.
  • Desire to change culture – we don't just mean continuous improvement, everybody does that; the difference is evolving a resolute passion to rigorously improve everything.
  • Data at the core – clear metrics give teams a direction of travel and an idea of where targeted improvements could add real value (see the sketch after this list).
  • Consider the people – we track job satisfaction because it's important. Improvements come from your people; if you keep losing them, you're constantly going to be in a state of hiring and retraining, which is costly in time and money. Happy people innovate and perform better.
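To show what 'data at the core' can look like in practice, here is a minimal sketch (the ticket data is invented, and it assumes each work item records a start date and a done date) computing two common lean delivery metrics, throughput and cycle time:

```python
# Hypothetical sketch of "data at the core": compute two lean delivery
# metrics (throughput and cycle time) from a team's ticket history.
from datetime import date
from statistics import mean

# Each ticket records when work started and when it was delivered.
tickets = [
    {"id": "T-1", "started": date(2022, 5, 2), "done": date(2022, 5, 6)},
    {"id": "T-2", "started": date(2022, 5, 3), "done": date(2022, 5, 12)},
    {"id": "T-3", "started": date(2022, 5, 9), "done": date(2022, 5, 11)},
]

# Cycle time: how long each item spent in flight, in days.
cycle_times = [(t["done"] - t["started"]).days for t in tickets]

print(f"Throughput: {len(tickets)} tickets delivered")
print(f"Average cycle time: {mean(cycle_times):.1f} days")
print(f"Worst cycle time: {max(cycle_times)} days (a target for improvement)")
```

Tracked over time rather than as a one-off, numbers like these give a team the direction of travel the bullet above describes.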

Our Lean-Agility approach is very much an Agile approach to an Agile transformation: we start small, prove the value, learn your business, customise and adapt. Lean-Agility is something we mould to you rather than a theory we try to plug and play; in that sense, Lean-Agility for you will look and feel different from Lean-Agility for a different client, and so it should!

Product vs Service vs Programme?

How we define a product vs. a service is a debate that comes up regularly, as proved by Randal Whitmore (Deputy Director of New Propositions at the UKHSA) today on Twitter.

In fact, it comes up so regularly, I could have sworn I’d blogged about it before; but if I have, it isn’t on here! So, what is the difference and does it matter?

If you search online for 'Product vs. Service' you'll get a very dry (and, in my opinion, not that helpful) answer that "A product is a tangible item that is put on the market for acquisition, attention, or consumption, while a service is an intangible item, which arises from the output of one or more individuals. … In most cases services are intangible, but products are not always tangible."

There you go, question answered!

OK, so let's say you actually wanted a useful response; that's understandable. What's the answer? The best analogy I've ever found to describe this is one I heard Ben Holliday use once, and I've since stolen and reused it any time anyone asks me this question (which is pretty regularly)!

So, let’s talk about going on holiday!

Dreaming of a sunny holiday

A service is all about someone delivering the outcome you want to achieve; it's the holistic wrapper that contains all the end-to-end steps needed to enable you to achieve that desired outcome.

Let's say you want to go on holiday; you can choose to use a travel agency like Tui, who offer holidays as a service. Should you decide you want a package holiday, you can book and pay for your entire holiday through Tui and they will organise everything for you. Or you may decide you want to do all the organisation yourself and just need to book some flights, going directly to KLM or EasyJet. The services these companies offer are all similar (Tui will let you book just flights, for example) but they will all differ in some ways, which is generally where the products that make up the service come in.

Products are the individual components that are part of that holistic service wrapper.

For our example of a package holiday, you can choose your flights, how much luggage you want to take with you, which hotel you want to stay at, whether you want to go on any excursions, and so on. These are all products a travel agency offers as part of its wider service, and you can choose which products you wish to use. But it's not only that: you can also choose how you book your holiday. You can book via the app or the website; you could call and book over the phone; or you could book in one of their shops (well, OK, not so much nowadays, but for our hypothetical example let's say you still can).

Let's say it's the day before your holiday. A few years ago Tui released a new product, their app, which included lots of new features that customers could choose from. Nowadays you can check in online, download your boarding pass to your phone, choose your seats, request special assistance and check your bags in, all before you get to the airport, via the app.

Come Fly Away

We've talked about the customer-facing products and features that make up the holiday service a travel agency offers; but there is obviously a lot more to it than that. As part of developing each of these products, the travel agency had to think about how they would all fit together to form the holistic service. There's also all the back-end integration to think about: to offer their holiday service, Tui need to work with other suppliers (like the airports and hotels, which partner with Tui but are not owned or controlled by them). Should your flight get cancelled or delayed because of bad weather or congestion at the airport, the travel agency will first need to be notified, and then to notify you as their customer and give you options on what to do next.

When they decided to launch the app, or to open up holiday options in a new country, a programme could have been set up to manage this. A programme is one way an organisation may choose to manage multiple work streams or teams that are working to deliver something. Programmes are entirely internal, and make no difference to the end user's experience.

So there you have it:

A service is about the desired (intangible) outcome; it’s holistic and made up of many products etc.

A product is a succinct (tangible) element that delivers value; it is made up of many features. A product can stand alone or alongside other products as part of a holistic service.

A feature is a component of a product that adds value as part of the wider product but offers little value when used alone.

A programme is an organisational governance mechanism that can be used to organise and manage teams to deliver an outcome.
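To make those definitions concrete, here's a minimal sketch (my own illustration, reusing the hypothetical holiday example above) of how the service/product/feature relationship might be modelled as data types. A programme is deliberately absent: it's internal governance, not part of the user-facing structure.

```python
# Minimal sketch of the service -> product -> feature hierarchy,
# using the holiday example from above.
from dataclasses import dataclass, field

@dataclass
class Feature:
    """A component of a product; adds value as part of the wider product."""
    name: str

@dataclass
class Product:
    """A tangible element that delivers value; made up of many features."""
    name: str
    features: list[Feature] = field(default_factory=list)

@dataclass
class Service:
    """The holistic wrapper: a desired outcome made up of many products."""
    outcome: str
    products: list[Product] = field(default_factory=list)

holiday = Service(
    outcome="Go on holiday",
    products=[
        Product("Flights", [Feature("Online check-in"), Feature("Seat selection")]),
        Product("Hotel booking"),
        Product("Mobile app", [Feature("Digital boarding pass")]),
    ],
)
print(f"'{holiday.outcome}' is delivered through {len(holiday.products)} products")
```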

Assessing Innovation

(co-written with Matt Knight)

Some background, for context

Just over a month ago I was approached to see if I could provide some advice on assessments to support phase two of the GovTech Catalyst (GTC) scheme. For those who aren't aware of the GovTech Catalyst scheme, there's a blog here that explains how it was designed to connect private sector innovators with public sector sponsors, using Innovate UK's Small Business Research Initiative (SBRI) to help find promising solutions to some of the hardest public sector challenges.

Looking for innovation

The sponsor we were working with (one of the public sector sponsors of the scheme) had put two suppliers through to the next phase and allocated funding to see how and where tech innovation could help drive societal improvements in Wales. As part of spend approval for the next phase, the teams had to pass the equivalent of a Digital Service Standard assessment at the six-month point in order to get funding to proceed.

For those who aren't aware, there used to be a lovely team in GDS who would work with the GTC teams to provide advice and run the Digital Service Standard assessments for the projects; unfortunately this team was stood down last year, after the recent GTC initiatives had started, leaving the teams with no one to talk to about assessments, and no one in place to assess them.

The sponsor had reached out to both GDS and NHS Digital to see if they would be willing to run the assessments or provide advice to the teams, but had no luck, which left them a bit stuck; and that's where I came in. I've blogged before about the Digital Service Standards, which led the sponsor to reach out to me to ask whether I'd be willing and able to help them out, or whether I knew any other assessors who might be.

Preparing for the Assessments

As there were two services to assess, one of the first things I did was talk to the wonderful Matt Knight to see if he'd be willing and able to lead one of the assessments. Matt's done even more assessments than me, and I knew he would be able to give some really good advice to the product teams to get the best out of them and their work.

Matt and I sat and discussed how to ensure we were approaching our assessments consistently, and how to honour and adhere to the core tenets of the Digital Standards whilst also assessing the teams' innovation and the value for money their services could deliver, in line with the criteria for the GovTech scheme.

What quickly became apparent was that, because this was to support the GTC scheme, the teams doing the work were fully private sector, with little experience of the Digital Service Standards. A normal assessment, with the standard 'bar' we'd expect teams to meet, wouldn't necessarily work well; we'd need to be a little flexible in our approach.

Obviously, no matter what type of assessment you're doing, the basic framework stays the same (start with user needs, then think about the end-to-end service, then you can talk about the team and design and tech, and along the way ask about the awkward stuff like sustainability, open source, accessibility and metrics) and can be applied to almost anything to produce a useful result, regardless of sector, background or approach.

As the services were tasked with trying to improve public services in Wales, we also wanted to take account of the newly agreed Welsh Digital Standards, using them alongside the original Digital Standards. The main difference was the parts of the Welsh Standards covering the well-being of people in Wales and promoting the Welsh language (standards 8 and 9); you can read more about the Well-being of Future Generations Act here.

The assessments themselves 

User journey mapping

The assessments themselves ran well (with thanks to Sam Hall, Coca Rivas and Claire Harrison, my co-assessors). While the service teams were new to the process, they were both fully open and willing to talk about their work: what went well, what didn't, and what they had learnt along the way. There was some great work done by both the teams we assessed, and it's clearly a process that everyone involved learned a lot from, both the service teams and the sponsor team; it was great to hear how they'd collaborated to support user research activities. Both panels went away to write up their notes, at which point Matt and I exchanged notes to see if there were any common themes or issues; interestingly, both assessments had flagged the need for a Service Owner from the sponsor to be more involved, to help the teams identify success measures.

When we played the recommendations and findings back to the sponsor, this led to an interesting discussion. The sponsor had nominated someone to act as the link for the teams, to answer their questions and provide guidance and steer where they could; but because of the terms of the GTC scheme, the rules on what steers they could and couldn't give were quite strict, to avoid violating the terms of the competition. Originally the GTC team within GDS would have helped sponsors navigate these slightly confusing waters in terms of competition rules and processes; without an experienced team to turn to for advice, sponsors are left in a somewhat uncomfortable and unfamiliar position, although this sponsor had clearly done their best (and the recommendations in this blog are general comments on how we can improve how we assess innovation across the board, not specifically aimed at them).

Frustratingly this meant that even when teams were potentially heading into known dead-ends, while the sponsor could try to provide some guidance and steer them in a different direction, they couldn't force the teams to pivot or change; the only option would be to pull the funding. While this makes sense from a competition point of view, it makes little to no sense from a public purse point of view, or from a Digital Standards point of view. It leaves sponsors stuck when things have gone a little off track: rather than being able to get teams to pivot, they are left choosing between potentially throwing away some great work, or investing money in projects that may not be able to deliver.

Which then raises the question; how should we be assessing and supporting innovation initiatives? How do we ensure they’re delivering value for the public purse whilst also remaining fair and competitive? How do we ensure we’re not missing out on innovative opportunities because of government bureaucracy and processes? 

In this process, what is the point of a Digital Service Standard assessment? 

If it’s like most other assessment protocols (do not start Matt on his gateway rant), then it’s only to assess work that has already happened. If so, then it’s not much good here, when teams are so new to the standards and need flexible advice and support on what they could do next etc.   

If it’s to assess whether a service should be released to end users, then it’s useful in central government when looking to roll out and test a larger service; but not so much use when it’s a small service, mainly internal users or a service that’s earlier on in the process aiming to test a proof of concept etc. 

If it’s to look at all of the constituent areas of a service, and provide help and guidance to a multidisciplinary team in how to make it better and what gaps there are (and a bit of clarity from people who haven’t got too close to see clearly), then it’s a lot of use here, and in other places; but we need to ensure the panel has the right mix of experts to be able to assess this. 

While my panel was fantastic, and we were able to assess the levels of user research the teams had done, their understanding of the problems they were seeking to solve, their ability to integrate with legacy tech solutions and how their team was working together, none of us had any experience in assessing innovation business cases or judging whether teams had done the right due diligence on their financial funding models. The standards specify that teams should have their budget sorted for the next phase and a roadmap for future development; in my experience this has generally been a fairly easy yes or no, and I certainly wouldn't know a good business accelerator if it came and bopped me on the nose. So while we could take a very high-level view on whether we thought a service could deliver some value to users, and whether a roadmap or budget looked reasonable, a complex discussion on funding models and investment options was a little outside our wheelhouse, and not an area we could offer any useful advice or recommendations on.

How can we deliver and assess innovation better going forward? 

If we’re continuing to use schemes like the GTC scheme to sponsor and encourage private sector innovators to work with the public sector to solve important problems affecting our society, then we obviously need a clear way to assess their success. But we also need to ensure we’re setting up these schemes in such a way that the private sector is working with the public sector; and that means we need to be working in partnership; able to advise and guide them where appropriate in order to ensure we’re spending public money wisely. 

There is a lot of great potential out there to use innovative tech to help solve societal issues; but we can't just throw those problems at the private sector and expect them to do all the hard work. While the private sector can bring innovative and different approaches and expertise, we shouldn't ignore the wealth of experience and knowledge within the public sector either. We need people within the public sector with the right digital skills, who are able to prioritise and understand the services being developed, in order to ensure the public purse doesn't pay for stuff that already exists to be endlessly remade.

Assessment can have a role in supporting innovation, as long as we take a generous rather than nitpicking (or macro rather than micro) approach to the service standard. Assessments (and the standards themselves) are a useful format for structuring conversations about services that involve users (hint: that's most of them); just the act of starting with user needs (point 1) rather than tech changes the whole conversation.

However, to make this work and add real value, 'solve a whole problem for users' (point 2 of the new UK government standard) is critical; that involves having someone who can see the entire end-to-end process for any new service, and devise and own success measures for it. The best answer to both delivering innovation and assessing it is bringing the private and public sector together to deliver real value, creating a process that builds capacity, maturity and genuine collaboration within the wider public sector. A space to innovate and grow solutions. True multidisciplinary collaboration, working together to deliver real value.

“Together, We Create”

Big thanks to Matt for helping collaborate on this; if you want to find his blog (well worth a read), you can do so here.

Cost vs. Quality

A debate as old as time, and a loop that goes around and around; or so it seems in the Public Sector commercial space.

Every few years, often every couple of spend control cycles, the debate of cost vs. quality rears its head again, with commercial weighting flip-flopping between quality as the most important factor and cost (or lowest cost) as the highest priority.

When quality is the most important factor, government departments will prioritise the outputs they want to achieve, weighting their commercial scores to the areas that indicate quality: things like 'value add', 'delivering quality', 'culture' and 'delivering in partnership'. We see more output-focused contracts coming onto the market, with organisations clear on the vision they want to achieve and the problems they need to solve, looking for the supplier that can best help them achieve that.

When reducing costs becomes the highest priority, the commercial weighting moves to ‘Value for Money’. Contracts are more likely to be fixed price and are often thinly veiled requests for suppliers to act as body shops rather than partners with commercial tenders scoring day rate cards rather than requesting the cost for overall delivery of outcomes.

Unfortunately, a lot of the time, when the priority switches to cost over quality, we end up with a lot of projects not being delivered, outcomes being missed, and user needs not being met. In order to cut more and more costs, offshoring resource can become the only way to deliver cheaply, with departmental project teams working out of sync with their offshore delivery partners; this makes co-design and delivery much harder, and makes it almost impossible to achieve the required quality. This goes in a cycle, with departments toing and froing between "offshore as much as possible to cut costs" and "the only way to deliver quality is for everyone to be co-located in the office 100% of the time"; full co-location of the teams inevitably drives the costs up again.

So, does that mean that in order to get quality we have to have high costs? Surely there is a sweet spot we're all looking for, where cost and quality align; but why does it seem so hard to achieve within the public sector, and what do we need to look at to achieve it?

When the government commercial function (and GDS) shook up the public sector digital world nearly a decade ago, they introduced things like the Digital Marketplace and implemented the spend control pipeline, with the aim of moving departments away from the large SIs that won 90% of government contracts. These suppliers often charged a fortune and rarely seemed to deliver what was actually needed. (This blog gives the details of what they intended, back in 2014.)

Lots of SME suppliers began to enter the market, win contracts and change how contracts were delivered. As competition increased, costs decreased, quality partnerships formed between new suppliers and government departments, and the quality of delivery increased as new options, solutions and ways of working were explored.

However, this left departments managing lots of individual contracts, which grew increasingly complex and time-consuming to manage. In order to reduce the number of contracts they had to manage, the scale of the contracts began to increase, with more and more multi-million-pound contracts emerging.

As the size and value of the contracts increased, SMEs began to struggle to win them, as they couldn't stand up the teams needed quickly, nor could they demonstrate experience of delivering contracts at that scale. This became a bit of a self-fulfilling prophecy: the larger SIs continued to win the larger contracts, as they were the only ones able to provide evidence they could staff and deliver them, and their costs remained high.

This left SMEs facing three options:

  • Decide not to try for the larger contracts, reducing the amount of competition (potentially increasing costs and decreasing quality in the long run);
  • Form partnership agreements with a number of other SMEs or a larger supplier (again reducing competition) in order to stand up the teams needed and enable delivery of larger contracts. However, a consortium of suppliers not used to working together could complicate delivery, which could in turn decrease the quality or speed of delivery if not carefully managed; as such, not all contracts allowed consortium or partnership bids, due to the perceived complexity they could bring;
  • Or aim to grow in order to be able to win and deliver the larger contracts. As SMEs grew, however, they would often have to either increase their costs in order to run a larger organisation that could still deliver the same quality as before, or keep their costs low and let their quality decrease.

Throughout the pandemic the focus has been on delivery, and there's been a healthy mix of both small and large contracts coming out, meaning lots of competition. While costs have always been a factor, the pandemic allowed both departments and suppliers to remove much of the costly admin and bureaucratic approval processes in favour of lightweight approaches to bringing on suppliers and managing teams' outputs, encouraging innovation in delivery and cost. With lockdowns ensuring co-location was out of the question, many suppliers were able to reduce their rates to support the pandemic response, with both departments and suppliers agreeing that the priority was delivering quality products and services to meet organisations' and users' urgent needs. The removal of co-location as a prerequisite also opened up the market to more suppliers bidding for work, and more individuals applying for roles, which increased competition and inevitably improved the quality of the outputs produced. This led to a lot of innovation throughout the pandemic, which has benefited us all.

As we move out of the pandemic and into the next spending review round, the signs are that the focus is about to swing back to cost as the highest priority, with larger contracts coming out looking for cheaper day rates to allow departments to balance their own budgets. But as the economy bounces back and departments begin to insist that teams return to the office, most suppliers will want to increase their rates to pre-pandemic levels. If we're not careful, the focus on cost reduction will decrease the quality and innovation that has been delivered throughout the pandemic, and could cost taxpayers more in the long run. Look at DWP's first attempt to deliver Universal Credit for how badly things can go wrong when cost is the highest priority and the commercial team runs the procurement process with minimal input from delivery, with cost driving the commercial and delivery decisions more than quality.

To find the sweet spot between cost and quality we need to create the best environment for innovation and competition. Allowing flexibility on where teams can be based will support this, as will supporting and encouraging SMEs and medium-sized suppliers to bid for and win contracts by varying contract sizes and values; focusing on outputs over body-shopping; and looking for the value suppliers can add in terms of knowledge transfer and partnership, rather than simply prioritising whoever is cheapest.

It’s important we all work together to get the balance between cost and quality right, and ensure we remain focused on delivering the right things in the right way.

Seesaw

Service Owner vs. Programme Manager vs. Product Lead

What’s the difference? Does the name matter?

Over a year ago, following an interesting chat with David Roberts at NHSBSA, I got to thinking about the role of the Service Owner; and why the role wasn’t working in the way we intended back in the dawn of the Service Manual. This in turn (as most things do for me) led to a blog in order to try and capture my thoughts in the vague hope they might be useful/interesting for anyone reading them.

Ironically, for what was a random think-piece, it has consistently been my most popular blog, getting at least a dozen reads every day since I wrote it. Which got me thinking again: what is it about that blog that resonates with people? The fact is, the role of the Service Owner is no better or more consistently understood today than it was then. Confusion over the Service Owner's role and responsibilities is still one of the most common things I get asked about. What's the difference between a Service Owner and a Service Manager (is there one)? How and why is the role different to that of the Product Lead? What is the difference between a Service Manager and a Programme Manager? Is the Service Owner different to the SRO? What do all these different role titles mean?

What's in a name?

Every department/agency within the public sector seems to have implemented the role of the Service Owner differently, which makes it very hard for those in the role (or considering applying for it) to understand what they should be doing and what their responsibilities are. This is probably why, as a community of practice within DDaT, it certainly used to be one of the hardest communities to bring together, as everyone in it was doing such different roles.

Some clients I've been working with use the roles of Service Owner and Lead Product Manager interchangeably; some have Service Owners who sit in Ops and Service Managers who sit in Digital (or vice versa); some have Service Managers sitting alongside Programme Managers, or Service Owners alongside Programme Directors, all desperately trying not to stand on each other's toes.

So what is the difference?

The obvious places to look for clarity are surely the Service Manual and the DDaT capability framework. The Service Manual specifies that it is the responsibility of the Service Owner to be "the decision-making authority to deliver on all aspects of a project", who also:

  • has overall responsibility for developing, operating and continually improving your service
  • represents the service during service assessments
  • makes sure the necessary project and approval processes are followed
  • identifies and mitigates risks to your project
  • encourages the maximum possible take-up of your digital service
  • has responsibility for your service’s assisted digital support”

When the DDaT capability framework was first written, the Service Manager was more akin to a Product person, and originally sat as a senior role within that capability framework; yet they were also responsible for the end-to-end service (a very big ask for anyone below the SCS working as an SRO). But the role often got confused with that of the IT Service Manager, and (as previously discussed in last year's blog) the responsibilities and titles were changed to create the role of Service Owner instead.

Interestingly, the Service Manual's reference to the Service Owner being the person responsible for the end-to-end service has now been removed; instead it focuses on them being the person responsible for delivering the project. While I imagine this is because it's very hard for any one person (below SCS level) to be responsible for an end-to-end service in the public sector, given the size of the products and services the public sector delivers, it does mean the new role description in the Service Manual brings the role of Service Owner closer to that of the Programme Manager.

However, in contrast to the description in the Service Manual, the DDaT capability framework does still specify that the Service Owner is "accountable for the quality of their service, and you will be expected to adopt a portfolio view, managing end-to-end services that include multiple products and channels." Obviously the onus here has changed from being responsible for the end-to-end service to managing it; but even that is clearly different from being responsible for delivering a project, as the manual describes it.

Some elements of the new Service Owner role description in the manual do still align to the traditional responsibilities of Product people (mainly things like assisted digital support and ensuring maximum take-up of your service); but the Service Manual has now removed those responsibilities from the Product Manager. The Product Manager now seems intended to be focused much more on user needs and user stories, rather than the longer-term uptake and running of the service. But again, confusingly, the capability framework for Product Management still expects Product people to be responsible for ensuring maximum take-up of the service.

It seems that in trying to clarify the role of the Service Owner, the Service Manual and the capability framework disagree on exactly what the responsibilities of the role are; and rather than clarifying the difference between Product people and Service Owners, the waters have been muddied even more. Nor have they made it any clearer what the difference between a Service Owner and a Programme Manager is, if there is one.

The Project Delivery capability framework states that "there are many other roles that are needed to successfully deliver projects. These roles are not included in our framework but you will find information on them within the frameworks of other professions, such as, Digital, Data & Technology framework". Frustratingly, it doesn't give any clarity on how and when roles like SRO or Programme Manager might overlap with roles within the DDaT framework, nor how they could best work together. Both the Service Owner role and the Programme Manager role state responsibility for things like stakeholder management, business case development/alignment, risk management and governance adherence. Admittedly the language is slightly different, but the core themes are the same.

So is the assumption that you don't need both a Programme Manager and a Service Owner? Is it an either/or that has never been clearly specified? If you're using PRINCE2 you get a Programme Manager, and if you're Agile it's a Service Owner? I would hope not, mainly because we all know that in reality most public sector digital programmes are a blend of methodologies, and never that clear-cut. So are we not being clear enough about what the role of the Service Owner is? Does it really matter if we don't have that clarity?

Evidence has shown that when teams aren't clear on the roles and responsibilities of their teammates, especially the people responsible for making key decisions, bottlenecks begin to occur. Teams struggle to know who should be signing off what. Hierarchy and governance become essential to achieving any progress, but inevitably delays occur while approvals are sought, which simply slows down delivery.

So can we get some clarity?

At the start of the year DEFRA advertised a role for a Service Owner which (I thought) clearly articulated the responsibilities of the role, and made it clear how it would sit alongside and support the Product team, and work with Programme professionals, to ensure effective delivery of services that meet user needs. Sadly this clarity of role seems few and far between.

I would love, when travel allows, to see a workshop happen mapping out the roles of Service Owner, SRO, Programme Manager, Product Lead and so on: looking at what their responsibilities are, providing clarity on where there is overlap and how it could be managed better, so that we can get to the point where we have consistency in these roles, and a better understanding of how they can work together without duplication or confusion over the value they each add.

For now, at least, it's each organisation's responsibility to ensure they are being clear about the responsibilities of these roles and the people working in them. We need to stop pretending the confusion doesn't exist and do our best to provide clarity to our teams and our people; otherwise we're only muddying the waters, and it's that kind of confusion that inevitably impacts teams and their ability to deliver.

Let's be clear, say what you mean

Partnership

The good and the bad.

At Difrent we always talk about our desire to deliver in partnership with our clients: to move beyond the pure supplier and client relationship to enable proper collaboration.

One of my main frustrations when I was 'client side' was the number of suppliers we'd work with who said they would partner with us; but once the contract started, after the first few weeks had passed and the new-relationship glow had faded, the teams and the account managers reverted to type. I can't recall how many times I had to have conversations at supplier governance meetings where I was practically begging them to challenge us; to be a critical friend and push for the right thing; to feed back to us about any issues and suggest improvements. It always felt like we were reaching across a gap and never quite making full contact.

As such, that’s one of the areas in Difrent I (and others) are very keen to embody. We try to be true partners; feeding back proactively where there are issues or concerns or where we have suggestions. Trying to foster collaborative ‘one team’ working.

We've obviously had more success with this on some contracts than others, and there's always more we can learn about how to better partner with our clients. However, given we see a lot of complaining about strained partnerships between clients and suppliers, I thought I'd do a bit of a case study: a reflection on (and praise of) one partnership we've been working on recently.

Difrent won a contract with the Planning Inspectorate last year; it was the first completely remote pitch and award we'd been involved with on a multi-million-pound contract.

From the start of the procurement it became really clear that the Planning Inspectorate wanted a partner; that this wasn't just lip service, but something they truly believed in. As part of the procurement process they opened up their GitHub so we could see their code; they opened up their Miro so we could see their service roadmap; they proactively shared their assessment reports with suppliers.

For us this made not only a good impression, but enabled us to develop a more informed and valuable pitch.

Since we put virtual feet in the virtual door that dedication to partnership has remained as true 6 months later as it was then. Outside of our weekly governance calls we’ve had multiple workshops to discuss collaboration and ways of working. We’ve had multiple discussions on knowledge transfer and reflecting on progress and ways to iterate and improve.

Where there have been challenges, we've all worked hard to be proactive, open and honest in talking things through. They've welcomed our suggestions and feedback (and proactively encouraged them) and been equally proactive in giving us feedback and suggestions.

This has helped us adapt and really think about how we do things like knowledge transfer: always challenging (especially remotely), but something we're passionate about getting right. We've all worked so hard on this that it's become one of the core parts of our balanced scorecard, ensuring they as a client can measure the value they're getting from our partnership not just through our outputs on the projects we're working on, but through our contributions to the organisation as a whole. That also helps us analyse and iterate our 'value add' to our partners, and ensure we're delivering on our promises.

I think there is a lot of learning here for other departments/ALBs looking to procure digital services or capability: a good partnership with a supplier needs to start before the contract is signed.

Thanks to Paul Moffat and Stephen Read at the Planning Inspectorate for helping with this blog – demonstrating that partnership in action!

Digital Transformation is still new

We’re punishing those who are less experienced, and we need to stop.

The timeline of Digital Transformation. Courtesy of Rachelle @ https://www.strangedigital.org/

In the last few weeks I've had multiple conversations with clients (both existing and new) who are preparing for, or have recently not passed, their Digital Service Standard assessments, and who are really struggling to understand what is needed from them in order to pass.

These teams have tried to engage with the service standards teams; but given those teams are extremely busy, most can't get any time with their 'link' person until six weeks before their assessment, by which time most teams are quite far down their track, potentially leaving them a lot of (re)work to do before the assessment.

Having sat in on a few of those calls recently, I've been surprised how little time is set aside to help teams prep, and to give them advice and guidance on what to expect at an assessment if they haven't been through one before. There's no time or support for mock assessments for new teams. There may be the offer of one or two of the team observing someone else's assessment if the stars align, but it's not proactively planned in; it's viewed as a nice-to-have. There seems to be an assumption that project teams should know all of this already, and no recognition that a large number of teams don't; this is still all new to them.

"In the old days" we as assessors and transformation leads used to set aside time regularly to meet with teams: talk through the problems they were trying to fix, understand any issues they might be facing, and provide clarity and guidance before the assessment, so that teams could be confident they were ready to move on to the next phase. But when I talk to teams now, so few of them are getting this support. Many teams reach out because the rare bits of guidance they have received haven't been clear, and in some cases have been contradictory, and they don't know who to talk to to get that clarity.

Instead, more and more of my time at the moment, as a supplier, is being set aside to support teams through their assessments: to provide advice and guidance on what to expect, how to prepare and what approach the team needs to take. What an MVP actually is; how to decide when you need an assessment; what elements of the service you need to have ready to 'show' at each stage; what the difference is between Alpha, Beta and Live assessments and why it matters. For so many teams this is still almost a foreign language.

So, how can we better support teams through this journey?

Stop treating it like this is all old hat and that everyone should know everything about it already.

Digital Transformation has been 'a thing' for one generation (if you count from the arrival of the internet as a tool for the masses in 1995). Within the public sector, GDS, the Digital Service Standards and the Digital Academy have existed for less than one generation; less than 10 years, in fact.

By treating it as a thing everyone should know, we make it exclusionary. We make people feel less than us for the simple act of not having the same experience we do.

We talk about working in the open, and many teams do still strive to do that; but digital transformation is still almost seen as a magical art by many, and how to pass what should be a simple thing like a service standard assessment is still viewed as arcane knowledge held by the few. As a community we need to get better at supporting each other along this path, especially those new to the experience.

This isn't just a nice thing to do, it's the fiscally responsible thing to do; by assuming teams already have all this knowledge we're just increasing the likelihood they will fail, and that comes with a cost.

We need to set aside more time to help and guide each other on this journey; so that we can all succeed; that is how we truly add value, and ensure that Digital Transformation delivers and is around to stay for generations to come.

Talking Digital Transformation

It's something that has come up a lot in conversations at the moment: what is Digital Transformation? What does Digital Transformation mean to me? I always joke that it's my TED talk subject, if I had one; so I thought, why not write a blog about it?

What is Digital Transformation?

According to Wikipedia, Digital Transformation "is the adoption of digital technology to transform services or businesses, through replacing non-digital or manual processes with digital processes or replacing older digital technology with newer digital technology."

The Wikipedia definition focuses on three of the main areas of Digital Transformation: technology, data and process, which are the areas most people quote. But it doesn't reference organisational change, which is often recognised as the fourth pillar needed for successful transformation.

If we’re being specific, then I agree with the Wikipedia definition at the project or service level, but when someone says Digital Transformation to me; I automatically start thinking about what that means at the organisational level, before moving onto the other areas.

I've written plenty of blogs previously on the importance of considering your organisational culture when trying to implement change, and how likely it is that your transformation will fail if you don't consider your culture as part of it; but, as we see from the Wikipedia definition, the people side of Digital Transformation is often forgotten.

There’s a good blog here that defines the 4 main challenges organisations face when looking to implement Digital Transformation, which it defines as:

  • Culture.
  • Digital Strategy and Vision.
  • IT infrastructure and digital expertise.
  • Organisational Structure.

Here we see culture is the first and largest challenge organisations face, which is why it's important it's not treated as an afterthought. Why is that? Is our methodology wrong?

So how do we go about delivering Digital Transformation?

The Enterprise project has a good article here on what it views as the 3 important approaches leaders should take when implementing Digital Transformation.

  • Solve the biggest problem first.
  • Collaborate to gain influence.
  • Keep up with information flows.

There's (hopefully) nothing revolutionary here; this is (in my opinion) common sense in terms of approach. But so often, when we start talking about Digital Transformation, we can quickly fall into the trap of talking about frameworks and methodology rather than the how and why of our approach to solving problems. So, are there any particular frameworks we should be using? Does the right framework guarantee success?

There are lots of different frameworks out there; and I can’t document them all; but below are some examples…

This article sums up what it deems the top five Digital Transformation frameworks from the big players, including MIT, DXC, Capgemini, McKinsey, Gartner, Cognizant and PwC. It's a good summary and I won't repeat what it says about each, but it looks at them in the following terms, which I think are key for successful Digital Transformation:

  • customer-centricity
  • opportunity and constraints
  • company culture
  • simplicity

There are obviously a few others out there; and I thought I’d mention a couple:

The first one is this framework from AIMultiple; interestingly, it has culture as the final step, which for me makes it feel like you are 'doing transformation to' the teams rather than engaging them and bringing them into the transformation, which doesn't work well for me.

AIMultiple Digital Transformation Framework
https://research.aimultiple.com/what-is-digital-transformation/#what-is-a-digital-transformation-framework

This second one, from Ionology, has digital culture and strategy as its first building block, with user engagement as its second, given equal weighting to processes, technology and data. It recognises that all of these elements together are needed to deliver Digital Transformation successfully. This one feels much more user-centric to me.

https://www.ionology.com/wp-new/wp-content/uploads/2020/03/Digital-Transformation-Blocks-Equation.jpg

So where do you start?

Each of these frameworks has key elements they consider, in a particular order that they feel works best. But before panicking about which (if any) framework you need to pick; it’s worth remembering that no single framework will work for every business and any business will need to tailor a framework to fit their specific needs. 

How you plan to approach your transformation is more important than the framework you pick, which is why the Enterprise article above about good leadership is, for me, spot on. We should always be asking:

  • What is the problem you’re trying to solve within your organisation by transforming it, and why?
  • Who do you need to engage and collaborate with to enable successful transformation?
  • What is the data you need to understand how best to transform your organisation?

Once you know what you’re trying to achieve and why, you can understand the options open to you; you can then start looking at how you can transform your processes, technology, data and organisational structure; at which point you can then define your strategy and roadmap to deliver. All of the above should be developed in conjunction with your teams and stakeholders so that they are engaged with the changes that are/will be happening.

Any framework you pick should be flexible enough to work with you to support you and your organisation; they are a tool to enable successful Digital Transformation; not the answer to what is Digital Transformation.

So, for me; what does Digital Transformation mean?

As the Enterprise Project states, digital transformation "is the integration of digital technology into all areas of a business, fundamentally changing how you operate and deliver value to customers. It's also a cultural change that requires organisations to continually challenge the status quo, experiment, and get comfortable with failure." I wholeheartedly agree.