
Category: User Centric Design

Becoming Product Led

Recently I was asked how I would go about moving an organisation to being Product Led, when agile and user centric design are equally new to the company, or when agile has not delivered in the way that was expected.

Before diving into the how, I think it’s worth first considering the what and the why.

What do we mean by being ‘product led’?

A product led approach is where your product experience is the central focus of your organisation. Within the public sector we incorporate user centric design into our products to ensure that we deliver real value by:

  • Taking an outside-in perspective (starting with user needs);
  • Rapid, early validation of ideas (testing early and often);
  • Maturing through iteration (based on user feedback); and
  • Disciplined prioritisation (using quantitative and qualitative data) to deliver value.

Is this not just another name for agile?

This is a question that comes up regularly, and in my opinion, no, it’s not. Agile is a delivery methodology; being product led is wider than that. It’s the wrapper that sits above and surrounds the delivery approach you use. It comes before you decide which delivery methodology you will use, and continues long after. It’s your culture and ways of working. The two can often go hand in hand; but if agile is the how, product is the what and the why.

Why is being product led important?

Well, by moving to a product led approach we allow the organisation to link its outputs to its customers’ needs and ensure they align to its organisational capabilities and strategy. It also allows organisations to focus on their customers’ needs and understand their users’ perspectives. By understanding and focusing on user needs, organisations can deliver value faster, making it quicker and easier to learn from what has gone well (and what hasn’t), which in turn makes it cheaper and faster to address any issues or risks. It also makes it easier for organisations to spot opportunities for innovation and growth.

How do you move your organisation to being product led?

First things first: a culture that empowers the asking of questions and the testing of hypotheses is essential for innovation. But to allow that to happen, organisations need senior leaders who understand and support their teams working in this way. Appropriate, lightweight and adaptable governance and funding approval processes are critical to enabling product innovation and empowering delivery teams.

The second key element is having the right data. Good product orientation depends on having access to quality data: what are our current metrics? Where are our current pain points? Do we understand our current costs? Which products/services have the highest demand? This data enables us to make quality decisions and measure our progress and our successes.

Thirdly, we need a clearly articulated strategy/vision for the organisation: what is our USP (Unique Selling Proposition)? What do we want to achieve? What are our goals? What value are we looking to add? What do we want to be different in 5/10 years from now?

To develop that strategy/vision, we need to have a clear understanding about our users and stakeholders. Who are we developing these products for? Who are our stakeholders? How are we engaging with them? What do they need from us?

Finally, once we’ve got the strategy, the vision, an understanding of our user needs and a set of hypotheses we want to test, we need a healthy delivery approach, with skilled teams in place to enable us to test our ideas and deliver that value. As we’ve said previously, to be product centric we need to be able to design services that are based on user needs, so that we can test regularly with our users to ensure we understand, and are meeting, those needs.

What are the signs of a good product led culture?

  • You are regularly engaging with the users; working to understand their needs and iterating your approach and services based on their feedback.
  • Your culture empowers and encourages people to ask questions: “Why are we doing this?”; “Who are we doing this for?”; “Is anyone else already doing this?”; “What will happen if we don’t do this (now)?”; “What have we learnt from our previous failures/successes?”
  • Your teams are working collaboratively, with policy and operations teams working hand in hand with tech/digital teams, to ensure you’re delivering value.
  • You’re considering and testing multiple options at each stage; looking for innovative solutions, and working to understand which options will best meet your users’ needs and add the most value.
  • Linked to the above: you’re testing regularly, being willing to ‘throw away’ what doesn’t work and refining your ideas based on what does.
  • You’re delivering value early and often.
Prioritising the backlog

Which comes first, the Product Manager, or the product culture?

If you don’t have any trained product people, can you begin to move to a product led culture, or must you hire the product people first? This is the chicken and egg question. Many organisations, especially those already using agile delivery methodologies or engaged in digital transformation, may have already sunk a lot of time and money into delivery; pausing that work whilst they change their culture and hire a load of skilled product folk just isn’t going to work. But you can begin to move towards a product led approach without hiring a load of Product Managers. Whilst having experienced product folk can definitely help, you probably have lots of people in the organisation who are already over halfway there and just need some help on that road.

One stumbling block many organisations trip over on their move to a product led approach is the difference between focusing on outcomes rather than outputs or features.

An output is a product or service that you create; an outcome is the problem that you solve with that product. A feature is something a product or service does, whereas a benefit is what customers actually need. If we go straight to developing features, we could be making decisions based on untested assumptions. 

There are 5 steps to ensure you’re delivering outcomes that add value and deliver benefits, rather than focusing on features that simply deliver an output:

  • State the Problem – what are we trying to solve/change?
  • Gather User Data – have we understood the problem correctly?
  • Set Concrete Goals and Define Success Criteria – what would success look like?
  • Develop Hypotheses – how could we best solve this problem?
  • Test Multiple Ideas – does this actually solve the problem?
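
If it helps to see the five steps laid out programmatically, here is a minimal sketch of tracking which steps a team has evidenced so far. This is purely illustrative: the `OutcomeTracker` class and its methods are my own invention, not part of any real tooling.

```python
from dataclasses import dataclass, field

# The five outcome-focused steps, in order
STEPS = [
    "State the Problem",
    "Gather User Data",
    "Set Concrete Goals and Define Success Criteria",
    "Develop Hypotheses",
    "Test Multiple Ideas",
]

@dataclass
class OutcomeTracker:
    """Records which steps a team has completed, so gaps are visible."""
    completed: set = field(default_factory=set)

    def complete(self, step: str) -> None:
        if step not in STEPS:
            raise ValueError(f"Unknown step: {step}")
        self.completed.add(step)

    def remaining(self) -> list:
        # Steps still outstanding, preserving the original order
        return [s for s in STEPS if s not in self.completed]

tracker = OutcomeTracker()
tracker.complete("State the Problem")
tracker.complete("Gather User Data")
print(tracker.remaining())
```

The point of the sketch is the ordering: a team that jumps straight to "Test Multiple Ideas" with three earlier steps outstanding is building features on untested assumptions.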

When you’re trying to identify the right problem to fix, look at existing data from previous field studies, competitive analysis, analytics, and feedback from customer support. Use a mix of quantitative and qualitative data to ensure you have understood your users’ needs and their behaviours. Then analyse the information, spot any gaps, and perform any additional research required to help you verify the hypotheses you have developed when deciding how you could solve the problem your users are facing.

The key element of being product led is understanding the problem you are trying to fix and focusing on the value you will deliver for your users by fixing it. It’s about not assuming you know what your users want, but engaging with your users to understand what they need. It’s about spotting gaps and opportunities to innovate and add value, rather than simply building from or replacing what already exists. It’s about focusing on delivering that value early and often.

Making User Centred Design more inclusive

How do we support people from minority or disadvantaged backgrounds to get a career in User Centred Design?

If you look around for ways to get a career in Digital/Tech, you would probably trip over half a dozen Apprenticeships, Academies or Earn as You Learn schemes, not to mention Graduate Schemes, without even trying. However, all those opportunities would probably be within Software Engineering.

If you want to move into a career in Research, Product or Design, opportunities to do that without a Degree, or years of experience, are sparse.

Paper Prototypes/ Wireframes

When trying to find Design Apprenticeships or entry level schemes ahead of a talk I was giving to some sixth formers last month, I really struggled to find any opportunities that didn’t require a Degree. In 2019 Kainos ran its first Design Academy, but for placements and entry level roles there was still the expectation you’d have a degree in Design; and its Earn as You Learn programme is for people looking for a career as a developer. Hippo are about to run their first Academy for Digital Change Consultants, which will then facilitate graduates moving into Product or Design careers, but it’s only for those with existing work experience looking to change careers, not young adults looking for their first career. FutureGov have previously run Design Academies, but again these have been focused on Graduates. MadeTech’s Academy accepts people without a Degree, but is only for those interested in Software Engineering. Even the Civil Service Apprenticeships Scheme is focused on Software Engineering roles, with no opportunities within Product or Design. The National Apprenticeship Service does have a section for Design apprenticeships, but all the roles are focused on Content Marketing etc. rather than User Centric Design; and within the Digital section, all the opportunities are for Technical Apprenticeships. Google have many Apprenticeship options, but their UX Design one only runs in the US.

After hours of searching I did find several opportunities. The first was with Amazon, who are now running their own User Experience Design and Research Apprenticeship; sadly, however, the criteria specify that candidates must be working towards their Bachelor’s degree, or be an existing Amazon employee. The second was a previous apprentice discussing their UX Apprenticeship with Barclays Bank; however, when I searched for the Apprenticeship with Barclays itself, I could only find technical ones, and none for Design, so if it does still exist, it’s not easy to find! While I could find plenty of Design Internships, they were all like the Amazon one: designed for students currently studying for their Bachelor’s degree.

I finally, FINALLY, found one actual opportunity I could share with the students I was speaking to; so well done AstraZeneca, who seem to have the only real Research and Design Apprenticeship Programme available in the UK. But that was the only opportunity I found at the time of looking.

(EDITED TO ADD: The NHS Business Service Authority have just recruited their very first UCD Apprentices; all being well this programme will continue!)

A group of fresh graduates throwing their academic hats in the air

So, if you’re a budding 17 year old passionate about User Centred Design (UCD), is graduating from University your only real option? And if so, how many of our potential rising star researchers and designers are we losing because they can’t afford to attend University (or don’t want to)? Why are we (unintentionally or not) making Design so elitist?

There is a lot of data to suggest that Design as a career is predominantly white; there are many articles about the intrinsic racism within Graphic Design (as an example), and how racism has manifested itself in UX Design throughout the years. Given most Design roles insist on candidates having a Bachelor’s Degree or equivalent, the fact is that 72.6% of people starting undergraduate study in the 2019 to 2020 academic year were White. This, by default, suggests that most graduates will be white, and therefore White people will be the most likely to be able to apply for entry level roles in Design.

However, we also know that as a group, white students are the least likely to progress to university, and this is in part due to the wide gap in university participation between students who were on Free School Meals and those who weren’t, which is currently at 19.1% and growing. So, not only are most graduates going to be white, they’re also more likely to be from middle or higher class backgrounds. Which could help explain (at least in part) why, as a career, Design has struggled to diversify.

Given the massive demand for Designers within the Public Sector (and elsewhere), surely we need to once and for all sit down and crack the topic of Design Apprenticeships and entry level roles that don’t require a degree? Surely there’s a way we can give a helping hand to those people out there who are interested in user centred design and desperately looking for their way in, but can’t or won’t attend university?

The only way we can make UCD as a career actually representative of the communities we’re meant to be designing for is if we can stop prioritising a Degree over passion and skill. So let’s aim to be more inclusive when we’re thinking about how we recruit the Design Leaders of tomorrow.

After all, inclusive design is the central principle of User Centred Design!

A person in a red sweater holding a baby’s hand

Product vs Service vs Programme?

How we define a product vs a service is a debate that comes up regularly, as proved recently by Randal Whitmore (Deputy Director of New Propositions at the UKHSA) on Twitter.

In fact, it comes up so regularly, I could have sworn I’d blogged about it before; but if I have, it isn’t on here! So, what is the difference and does it matter?

If you search online for ‘Product vs. Service’ you’ll get a very dry (and in my opinion not that helpful) answer that “A product is a tangible item that is put on the market for acquisition, attention, or consumption, while a service is an intangible item, which arises from the output of one or more individuals. … In most cases services are intangible, but products are not always tangible.”

There you go, question answered!

Ok, so let’s say you actually want a useful, understandable response; what’s the answer? The best analogy I have ever found to describe this is one I heard Ben Holliday use once, and I’ve since stolen and reused it any time anyone asks me this question (which is pretty regularly)!

So, let’s talk about going on holiday!

Sunglasses on a beach
Dreaming of a sunny holiday

A service is all about someone delivering the outcome you want to achieve; it’s the holistic wrapper that contains all the end-to-end steps needed to enable you to achieve that desired outcome.

Let’s say you want to go on holiday; you can choose to use a travel agency like Tui, who offer holidays as a service. Should you decide you want a package holiday, you can book and pay for your entire holiday through Tui and they will organise everything for you. Or you may decide you want to do all the organisation yourself, and as such just need to book some flights, going directly to KLM or EasyJet. The services these companies offer are all similar (Tui will let you just book flights, for example) but they all differ in some ways, which is generally where the products that make up the service come in.

Products are the individual components that are part of that holistic service wrapper.

For our example of a package holiday, you can choose your flights, how much luggage you want to take with you, what hotel you want to stay at, whether you want to go on any excursions, etc. These are all products a travel agency offers as part of their wider service, and you can choose which products you wish to use. But it’s not only that: you can also choose how you book your holiday. You can book via the app or their website; you could call them and book over the phone; or you could book in one of their shops (well, ok, not so much nowadays, but for our hypothetical example let’s say you still can).

Let’s say it’s the day before your holiday. A few years ago Tui released a new product: their app, which included lots of new features that customers could choose from. Nowadays you can check in online, download your boarding pass to your phone, choose your seats, request special assistance and check your bags in, all before you get to the airport, via the app.

A white airplane in mid-air
Come Fly Away

We’ve talked about the customer facing products and features that make up the holiday service a travel agency offers, but there is obviously a lot more to it than that. As part of developing each of these products, the travel agencies had to think about how they would all fit together to form the holistic service. There’s also all the back-end integration to think about: to offer their holiday service, Tui need to work with other suppliers (like the airports and hotels, which partner with Tui but are not owned or controlled by them). Should your flight get cancelled or delayed because of bad weather or congestion at the airport, the travel agency will first need to be notified, and then to notify you as their customer and give you options on what to do next.

When they decided to launch the app, or to open up holiday options in a new country, a programme could have been set up to manage this. A programme is one way an organisation may choose to manage multiple work streams or teams that are working to deliver something. Programmes are entirely internal and make no difference to the end user’s experience.

So there you have it:

A service is about the desired (intangible) outcome; it’s holistic and made up of many products etc.

A product is a succinct (tangible) element that delivers value; it is made up of many features. A product can stand alone or sit alongside other products as part of a holistic service.

A feature is a component of a product that adds value as part of the wider product but offers little value when used alone.

A programme is an organisational governance mechanism that can be used to organise and manage teams to deliver an outcome.
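
For readers who think in code, the first three definitions can be sketched as a simple composition model using the holiday example (this is purely illustrative; the class names and example data are my own invention, and a programme is deliberately absent because it’s an internal management construct, not part of the user-facing structure):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Feature:
    """A component of a product; little value when used alone."""
    name: str

@dataclass
class Product:
    """A tangible element that delivers value; made up of features."""
    name: str
    features: List[Feature] = field(default_factory=list)

@dataclass
class Service:
    """The holistic wrapper around the outcome; made up of products."""
    name: str
    products: List[Product] = field(default_factory=list)

    def all_features(self) -> List[str]:
        # The end-to-end service surfaces every feature of its products
        return [f.name for p in self.products for f in p.features]

app = Product("Mobile app", [Feature("Online check-in"), Feature("Seat selection")])
flights = Product("Flights", [Feature("Luggage options")])
holiday = Service("Package holiday", [app, flights])
print(holiday.all_features())
```

The nesting is the point: features only deliver value through the product they belong to, and products deliver the outcome through the service that wraps them.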

Getting work in the Digital Industry

Why working in the public sector might be of interest to you!

One of my favourite things about my role, well, my career to be honest, is the opportunity to do educational outreach activities. In the last few years especially, I’ve been asked a few times to speak to students (both sixth form and university students) about why working in Digital might be of interest to them.

As we’re just coming out of a pandemic, and lots of students have missed out on opportunities to do work experience or listen to guest speakers due to social distancing, I thought it might be useful to make my most recent talk available. This talk was for sixth form students at Salford’s Future Skills Hub.

In the slides below, I discuss things like:

  • Pathways into working in Digital
  • Why the public sector might be interesting for you to work in and why you should consider it
  • Things to consider when thinking about your career
  • Salaries when you are starting out and what your earning potential could be
  • How to connect with people and speak confidently
  • How to interview well

Assessing Innovation

(co-written with Matt Knight)

Some background, for context

Just over a month ago I was approached to see if I could provide some advice on assessments to support phase two of the GovTech Catalyst (GTC) scheme. For those who aren’t aware of the GovTech Catalyst scheme, there’s a blog here that explains how the scheme was designed to connect private sector innovators with public sector sponsors, using Innovate UK’s Small Business Research Initiative (SBRI) to help find promising solutions to some of the hardest public sector challenges.

Person in a lab coat with a stethoscope around their neck looking through a Virtual Reality head set.
Looking for innovation

The Sponsor we were working with (who were one of the public sector sponsors of the scheme) had put two suppliers through to the next phase and allocated funding to see how and where tech innovation could help drive societal improvements in Wales. As part of their spend approval for the next phase, the teams had to pass the equivalent of a Digital Service Standard assessment at the 6 month point in order to get funding to proceed. 

For those who aren’t aware, there used to be a lovely team in GDS who would work with the GTC teams to provide advice and run the Digital Service Standard assessments for the projects. Unfortunately this team was stood down last year, after the recent GTC initiatives started, leaving those teams with no one to talk to about assessments, and no one in place to assess them.

The sponsor had reached out to both GDS and NHS Digital to see if they would be willing to run the assessments or provide advice to the teams, but had no luck, which left them a bit stuck; and that is where I came in. I’ve blogged before about the Digital Service Standards, which led to the Sponsor reaching out to ask whether I’d be willing and able to help them out, or whether I knew any other assessors who might be.

Preparing for the Assessments

As there were two services to assess, one of the first things I did was talk to the wonderful Matt Knight to see if he’d be willing and able to lead one of the assessments. Matt’s done even more assessments than I have, and I knew he would be able to give some really good advice to the product teams to get the best out of them and their work.

Matt and I sat down and discussed how to ensure we were approaching our assessments consistently: how to honour and adhere to the core tenets of the Digital Standards whilst also trying to assess the teams’ innovation and the value for money their services could deliver, in line with the criteria for the GovTech scheme.

What quickly became apparent was that, because this was to support the GTC scheme, the teams doing the work were fully private sector, with little experience of the Digital Service Standards. A normal assessment, with the standard ‘bar’ we’d expect teams to be able to meet, wouldn’t necessarily work well; we’d need to be a little flexible in our approach.

Obviously, no matter what type of assessment you’re doing, the basic framework stays the same (start with user needs, then think about the end-to-end service, then you can talk about the team and design and tech, and along the way you need to ask about the awkward stuff like sustainability, open source, accessibility and metrics) and can be applied to almost anything to come up with a useful result, regardless of sector/background/approach.

As the services were tasked with trying to improve public services in Wales, we also wanted to take account of the newly agreed Welsh Digital Standards, using them alongside the original Digital Standards. The main difference was the parts of the Welsh Standards covering the well-being of people in Wales and promoting the Welsh language (standards 8 and 9); you can read more about the Well-being of Future Generations Act here.

The assessments themselves 

An image of a team mapping out a user journey
User Journey Mapping

The assessments themselves ran well (with thanks to Sam Hall, Coca Rivas and Claire Harrison, my co-assessors). While the service teams were new to the process, they were both fully open and willing to talk about their work: what went well, what didn’t, and what they had learnt along the way. There was some great work done by both the teams we assessed, and it’s clearly a process that everyone involved learned a lot from, both the service teams and the sponsor team; it was great to hear how they’d collaborated to support user research activities. Both panels went away to write up their notes, at which point Matt and I exchanged notes to see if there were any common themes or issues. Interestingly, both assessments had flagged the need for a Service Owner from the sponsor to be more involved, to help the teams identify their success measures.

When we played the recommendations and findings back to the Sponsor, this led to an interesting discussion. Although the sponsor had nominated someone to act as the link for the teams, to answer their questions and provide guidance and steers where they could, the terms of the GTC scheme meant the rules on what steers they could and couldn’t give were quite strict, to avoid violating the terms of the competition. Originally the GTC team within GDS would have helped the sponsors navigate these slightly confusing waters in terms of competition rules and processes. Without an experienced team to turn to for advice, however, sponsors are left in a somewhat uncomfortable and unfamiliar position, although they had clearly done their best (and the recommendations in this blog are general comments on how we can improve how we assess innovation across the board, not specifically aimed at them).

Frustratingly, this meant that even when teams were potentially heading into known dead-ends, while the sponsor could try to provide some guidance and steer them in a different direction, they couldn’t force the teams to pivot or change; the only option would be to pull the funding. While this makes sense from a competition point of view, it makes little to no sense from a public purse point of view, or from a Digital Standards point of view. It leaves sponsors stuck when things might have gone a little off track: rather than being able to get teams to pivot, they are left choosing between potentially throwing away some great work or investing money in projects that may not be able to deliver.

Which then raises the question; how should we be assessing and supporting innovation initiatives? How do we ensure they’re delivering value for the public purse whilst also remaining fair and competitive? How do we ensure we’re not missing out on innovative opportunities because of government bureaucracy and processes? 

In this process, what is the point of a Digital Service Standard assessment? 

If it’s like most other assessment protocols (do not start Matt on his gateway rant), then it’s only there to assess work that has already happened. If so, then it’s not much good here, where teams are so new to the standards and need flexible advice and support on what they could do next.

If it’s to assess whether a service should be released to end users, then it’s useful in central government when looking to roll out and test a larger service; but not so much use when it’s a small service, mainly internal users or a service that’s earlier on in the process aiming to test a proof of concept etc. 

If it’s to look at all of the constituent areas of a service, and provide help and guidance to a multidisciplinary team in how to make it better and what gaps there are (and a bit of clarity from people who haven’t got too close to see clearly), then it’s a lot of use here, and in other places; but we need to ensure the panel has the right mix of experts to be able to assess this. 

While my panel was all fantastic, and we were able to assess the levels of user research the teams had done, their understanding of the problems they were seeking to solve, their ability to integrate with legacy tech solutions and how their teams were working together, none of us had any experience in assessing innovation business cases or judging whether teams had done the right due diligence on their financial funding models. The standards specify that teams should have their budget sorted for the next phase and a roadmap for future development; in my experience this has generally been a fairly easy yes or no. I certainly wouldn’t know a good business accelerator if it came and bopped me on the nose. So while we could take a very high level view on whether we thought a service could deliver some value to users, and whether a roadmap or budget looked reasonable, a complex discussion on funding models and investment options was a little outside our wheelhouse, and not an area we could offer any useful advice or recommendations on.

How can we deliver and assess innovation better going forward? 

If we’re continuing to use schemes like the GTC scheme to sponsor and encourage private sector innovators to work with the public sector to solve important problems affecting our society, then we obviously need a clear way to assess their success. But we also need to ensure we’re setting up these schemes in such a way that the private sector is working with the public sector; and that means we need to be working in partnership; able to advise and guide them where appropriate in order to ensure we’re spending public money wisely. 

There is a lot of great potential out there to use innovative tech to help solve societal issues; but we can’t just throw those problems at the private sector and expect them to do all the hard work. While the private sector can bring innovative and different approaches and expertise, we shouldn’t ignore the wealth of experience and knowledge within the public sector either. We need people within the public sector with the right digital skills, who are able to prioritise and understand the services that are being developed, in order to ensure that the public purse doesn’t pay for stuff that already exists to be endlessly remade.

Assessment can have a role in supporting innovation, as long as we take a generous rather than nitpicking (or macro rather than micro) approach to the service standard. Assessments (and the Standards themselves) are a useful format for structuring conversations about services that involve users (hint: that’s most of them); just the act of starting with user needs (point 1) rather than tech changes the whole conversation.

However, to make this work and add real value, solving a whole problem for users (point 2 of the new UK government standard) is critical, and that involves having someone who can see the entire end-to-end process for any new service and devise and own success measures for it. The best answer to both delivering innovation and assessing it is bringing the private and public sectors together to deliver real value: creating a process that builds capacity, maturity and genuine collaboration within the wider public sector. A space to innovate and grow solutions. True multidisciplinary collaboration, working together to deliver real value.

“Together, We Create”

Big thanks to Matt for helping collaborate on this, if you want to find his blog (well worth a read) you can do so here:

Talking Digital Transformation

It’s something that has come up a lot in conversations at the moment, what is Digital Transformation? What does Digital Transformation mean to me? I always joke that it’s my TED talk subject, if I had one; as such I thought why not write a blog about it?

What is Digital Transformation?

According to Wikipedia, Digital Transformation “is the adoption of digital technology to transform services or businesses, through replacing non-digital or manual processes with digital processes or replacing older digital technology with newer digital technology.”

The Wikipedia definition focuses on three of the main areas of Digital Transformation: technology, data and process, which are the areas most people quote. But it doesn’t reference organisational change, which is often recognised as the fourth pillar needed for successful transformation.

If we’re being specific, then I agree with the Wikipedia definition at the project or service level, but when someone says Digital Transformation to me; I automatically start thinking about what that means at the organisational level, before moving onto the other areas.

I’ve written plenty of blogs previously on the importance of considering your organisational culture when trying to implement change, and how likely your transformation is to fail if you don’t consider your culture as part of it; but, as we see from the Wikipedia definition, the people side of Digital Transformation is often forgotten.

There’s a good blog here that defines the 4 main challenges organisations face when looking to implement Digital Transformation, which it defines as:

  • Culture.
  • Digital Strategy and Vision.
  • IT infrastructure and digital expertise.
  • Organisational Structure.

Here we see that culture is the first and largest challenge organisations face, which is why it’s important it’s not treated as an afterthought. Why is that? Is our methodology wrong?

So how do we go about delivering Digital Transformation?

The Enterprise project has a good article here on what it views as the 3 important approaches leaders should take when implementing Digital Transformation.

  • Solve the biggest problem first.
  • Collaborate to gain influence.
  • Keep up with information flows.

There’s (hopefully) nothing revolutionary here; this is (in my opinion) common sense in terms of approach. But so often, when we start talking about Digital Transformation, we can quickly fall into the trap of talking about frameworks and methodology, rather than the how and why of our approach to solving problems. So, are there any particular frameworks we should be using? Does the right framework guarantee success?

There are lots of different frameworks out there; and I can’t document them all; but below are some examples…

This article sums up what it deems the top Digital Transformation frameworks, which are the big ones, including MIT, DXC, Capgemini, McKinsey, Gartner, Cognizant and PwC. It’s a good summary and I won’t repeat what it says about each, but it looks at them in the following terms, which I think are key for successful Digital Transformation:

  • customer-centricity
  • opportunity and constraints
  • company culture
  • simplicity

There are obviously a few others out there; and I thought I’d mention a couple:

The first one is from AIMultiple; interestingly, it has culture as the final step, which for me makes it feel like you are ‘doing transformation to’ the teams rather than engaging teams and bringing them into the transformation; that doesn’t work well for me.

AIMultiple Digital Transformation Framework
https://research.aimultiple.com/what-is-digital-transformation/#what-is-a-digital-transformation-framework

This second one, from Ionology, has Digital Culture and Strategy as its first building block, with user engagement as its second, given equal weighting to Processes, Technology and Data. It recognises that all of these elements together are needed to deliver Digital Transformation successfully. This one feels much more user centric to me.

https://www.ionology.com/wp-new/wp-content/uploads/2020/03/Digital-Transformation-Blocks-Equation.jpg

So where do you start?

Each of these frameworks has key elements it considers, in a particular order it feels works best. But before panicking about which (if any) framework you need to pick, it’s worth remembering that no single framework will work for every business, and any business will need to tailor a framework to fit its specific needs.

How you plan to approach your transformation is more important than the framework you pick, which is why the Enterprise article above about good leadership is, for me, spot on. We should always be asking:

  • What is the problem you’re trying to solve within your organisation by transforming it, and why?
  • Who do you need to engage and collaborate with to enable successful transformation?
  • What is the data you need to understand how best to transform your organisation?

Once you know what you’re trying to achieve and why, you can understand the options open to you. You can then start looking at how to transform your processes, technology, data and organisational structure, at which point you can define your strategy and roadmap for delivery. All of the above should be developed in conjunction with your teams and stakeholders, so that they are engaged with the changes that are and will be happening.

Any framework you pick should be flexible enough to support you and your organisation; frameworks are a tool to enable successful Digital Transformation, not the answer to what Digital Transformation is.

So, for me; what does Digital Transformation mean?

As the Enterprise Project states; Digital transformation “is the integration of digital technology into all areas of a business, fundamentally changing how you operate and deliver value to customers. It’s also a cultural change that requires organisations to continually challenge the status quo, experiment, and get comfortable with failure.” Which I wholeheartedly agree with.

And this is why we test with users…

A blog on the new National Careers ‘Discover your skills and careers’ Service

As I sit here at ten past ten on a Wednesday night watching social media have a field day with the new National Careers service, I’m yet again reminded of the importance of the Digital Service Standard, especially point one: understand users and their needs. And of why we need to get Ministers and senior leaders to understand its importance.

The first role of any good User Centric designer or Product Manager within the public sector is understanding the problem you’re trying to solve.

In this case, the problem we’re facing is not a small one. Because of COVID-19 we currently have approximately 1.4M people unemployed with many more still facing redundancy due to the ongoing pandemic. ONS data states that between March and August, the number of people claiming benefits rose 120% to 2.7 million.

The Entertainment, Leisure and Hospitality sectors have been decimated, amongst many others. Just this week Cineworld announced 45,000 job losses, and Odeon may soon be following suit. Theatres and live event venues across the country are reporting they are on the brink of collapse.

So, when the Chancellor announced, as part of the summer statement, a whole host of support for people to retrain, it included advice for people to use the new Careers and Skills advice service to get ideas on new career options.

A service to help people understand new career options right now is a great idea; it absolutely should meet a real user need.

A screenshot of the national careers service skills assessment

Unfortunately, you only have to look at the headlines to see how well the new service has been received. The service is currently such a laughing stock that no-one is taking it seriously; which is a massive shame, because it’s trying to solve a very real problem.

A number of my friends and acquaintances have now taken the quiz (as has half of twitter apparently) and it was suggested I have a look. So I did. (As an aside, it recommended I retrain in the hospitality industry, all who know me know how terrible this would be for all involved, last week I managed to forget to cook 50% of our dinner, and I am clinically unable to make a good cup of coffee, never mind clean or tidy anything!)

It has good intentions, and in a number of cases it may not be too far off the mark; the team behind the service have done a write-up here* of how they developed it, and what they set out to achieve. Unfortunately, while the service seems simple to understand and accessible to use, what it’s missing is any level of context or practicality that would help it address the problem it’s being used for.

*EDIT: Which has sadly now been taken down, which is a massive shame, because they did good work, but sadly I suspect under political pressure to get something out there quickly. We’ve all been there, it’s a horrid position to be in.

While they have tested with users with accessibility needs, the focus seems to have been on whether those users can use the digital service, not on whether the service actually meets their needs.

My friend with severe mobility and hearing issues was advised to retrain as a builder. Another friend with physical impairments (and a profound phobia of blood) was advised they were best suited to a role as a paramedic. A friend with ASD who also has severe anxiety and an aversion to people they don’t know was advised to become a beautician. Another friend who is a single parent was given three career options that all required evening and weekend work. At no point does this service ask whether you have any medical conditions or caring needs that would limit the work you could do. While you can argue that that level of detail falls under the remit of a jobs coach, it can understandably be seen as insensitive and demoralising to recommend careers to people that they are physically unable to do.

Equally unhelpful is the fact that the service, which has been specifically recommended to people made redundant from the worst-hit industries, is recommending those same decimated industries to work in, with no recognition of the current jobs market.

My partner, who was actually made redundant from her creative role due to COVID-19 (and is the target audience for this service, according to the Chancellor), was advised to seek a role in the creative industries: an industry that doesn’t currently exist. A quick look on social media proves she isn’t alone.

The service doesn’t actually collect enough (well, any) data about the career someone is in, nor does it seem to have any interface to the current jobs market to understand whether the careers it’s recommending are actually viable.

Unfortunately, the service is too generic; and while it might help school or college students trying to choose their future career paths in a ‘normal’ job market (and I honestly suspect that’s who it was actually developed for!), it’s not meeting the fundamental problem we are facing at the moment, i.e. helping people understand their career options in the current market.

If you’ve worked within Digital in the public sector, you’ve had to deal with Ministers and Directors who don’t really understand the value of user research, or why we need to test things properly before we roll them out nationally. The current debacle with the careers website is possibly a perfect example of why you need to actually test your service with a wide range of users regularly, not just rely on assumptions and user personas; and of why it’s important to test and iterate the service with real users multiple times before it gets launched. It highlights the need to get Ministers to understand that rushing a service out quickly isn’t always the right answer.

We all need to understand users and their needs. Just because a service is accessible doesn’t mean it solves the problem users are facing.

Notes from some Digital Service Standard Assessors on the Beta Assessment

The Beta Assessment is probably the one I get the most questions about; primarily, “When do we actually go for our Beta Assessment, and what does it involve?”

Firstly what is an Assessment? Why do we assess products and services?

If you’ve never been to a Digital Service Standard Assessment it can be daunting; so I thought it might be useful to pull together some notes from a group of assessors, to show what we are looking for when we assess a service. 

Claire Harrison (Chief Architect at Homes England and leading Tech Assessor) and Gavin Elliot (Head of Design at DWP and a leading Design Assessor, you can find his blog here) helped me pull together some thoughts about what a good assessment looks like, and what we are specifically looking for when it comes to a Beta Assessment. 

I always describe a good assessment as the team telling the assessment panel a story. So, what we want to hear is:

  • What was the problem you were trying to solve?
  • Who are you solving this problem for? (who are your users?)
  • Why do you think this is a problem that needs solving? (What research have you done? Tell us about the users journey)
  • How did you decide to solve it and what options did you consider? (What analysis have you done?) 
  • How did you prove the option you chose was the right one? (How did you test this?)

One of the great things about the Service Manual is that it explains what each delivery phase should look like, and what the assessment team are considering at each assessment.

So what are we looking for at a Beta Assessment?

By the time it comes to your Beta Assessment, you should have been running your service for a little while with a restricted number of users in a Private Beta. You should have real data gathered from real users who were invited to use your service, and you should have iterated the service several times by now, given all the things you have learnt.

Before you are ready to move into Public Beta and roll your service out Nationally there are several things we want to check during an assessment. 

You need to prove you have considered the whole service for your users and have provided a joined up experience across all channels.

  • We don’t want to just hear about the ‘digital’ experience; we want to understand how you have/will provide a consistent and joined up experience across all channels.
  • Are there any paper or telephony elements to your service? How have you ensured that those users have received a consistent experience?
  • What changes have you made to the back end processes and how has this changed the user experience for any staff using the service?
  • Were there any policy or legislative constraints you had to deal with to ensure a joined up experience?
  • Has the scope of your MVP changed at all so far in Beta given the feedback you have received from users? 
  • Are there any changes you plan to implement in Public Beta?

As a Lead Assessor this is where I always find that teams who have suffered with empowerment or organisational silos may struggle.

If the team is only empowered to look at the digital service, and has struggled to make any changes to the paper, telephony or face-to-face channels due to siloed working between Digital and Ops in their Department (as an example), the digital product will offer a very different experience from the rest of the service.

As part of that discussion we will also want to understand how you have supported users who need help getting online; and what assisted digital support you are providing.

At previous assessments you should have had a plan for the support you intended to provide; you should now be able to talk through how you are putting that into action. This could be telephony support or a web chat function, but we want to ensure the support being offered is, or will be, consistent with the wider service experience and meets your users’ needs. We also want to understand how it’s being funded, and how you plan to publish the accessibility information for your service.

We also expect that by this point you have run an accessibility audit and have carried out regular accessibility testing. It’s worth noting that if you don’t have anyone in house who is trained in running accessibility audits (we’re lucky in Difrent, as we have a DAC assessor in house), you will need to factor in the time it takes to get an audit booked and run, well before you think about your Beta Assessment.

Similarly, by the time you go for your Beta Assessment we would generally expect a Welsh language version of your service to be available. Again, this needs to be planned well in advance, as it can take time and is not (or shouldn’t be) a last-minute job! In my experience, it’s something a lot of teams forget to prioritise and plan for.

And finally, assuming you are planning to put your service on GOV.UK, you’ll need to have agreed a number of things with the GOV.UK team at GDS before going into public beta.

Again, while it shouldn’t take long to get these things sorted with the GOV.UK team, they can sometimes have backlogs and as such it’s worth making sure you’ve planned in enough time to get this sorted. 

The other things we will want to hear about are how you’ve ensured your service is scalable and secure. How have you dealt with any technical constraints? 

The architecture and technology – Claire

From an architecture perspective, at the Beta phase I’m still interested in the design of the service, but I also focus on its implementation, and the provisions in place to support the sustainability of the service. My mantra is ‘end-to-end, top-to-bottom service architecture’!

An obvious consideration in both the design and deployment of a service is security: how the solution conforms to industry, government and legal standards, and how security is baked into a good technical design. With data, I want to understand the characteristics and lifecycle of the data: are the data identifiable, how are they collected, where are they stored and hosted, who has access to them, are they encrypted, and if so, when, where and how? I find it encouraging that in recent years there has been a shift towards thinking not only about how to prevent security breaches but also about how to recover from them.

Security is sometimes cited as a reason not to code in the open, but in actual fact this is hardly ever the case. As services are assessed on this, there needs to be a very good reason why code can’t be open. After all, a key principle of GDS is reuse, in both directions: for example, making use of common government platforms, and also publishing code for it to be used by others.

Government platforms such as Pay and Notify can help with some of a technologist’s decisions and should be used as the default, as should open standards and open source technologies. When this isn’t the case, I’m really interested in the selection and evaluation of the tools, systems, products and technologies that form part of the service design. This might include integration and interoperability, constraints in the technology space, vendor lock-in, route to procurement, total cost of ownership, alignment with internal and external skills, etc.

Some useful advice would be to think about the technology choices as a collective, rather than piecemeal as and when a particular tool or technology is needed. Yesterday I gave a peer review of a solution under development where one tool had been deployed in isolation, not as part of an evaluation of the full technology stack. This meant there were integration problems as new technologies were added to the stack.

The way that a service evolves is really important too along with the measures in place to support its growth. Cloud based solutions help take care of some of the more traditional scalability and capacity issues and I’m interested in understanding the designs around these, as well as any other mitigations in place to help assure availability of a service. As part of the Beta assessment, the team will need to show the plan to deal with the event of the service being taken temporarily offline – detail such as strategies for dealing with incidents that impact availability, and the strategy to recover from downtime and how these have been tested.

Although a GDS Beta assessment focuses on a specific service, it goes without saying that a good technologist will be mindful of how the service they’ve architected impacts the enterprise architecture, and vice versa. For example, if a new service is built with microservices and introduces an increased volume and velocity of data, does the network need to be strengthened to meet the increase in communications traversing it?

Legacy technology (as well as legacy ‘commercials’ and ways of working) is always on my mind. Obviously during an assessment a team can show how they address legacy in the scope of that particular service, be it some form of integration with legacy or applying the strangler pattern; but organisations really need to put as much effort into dealing with legacy as they put into new digital services. Furthermore, they need to think about how to avoid creating the ‘legacy systems of the future’ by ensuring the sustainability of their service, be it from a technical, financial or resource perspective. I appreciate this isn’t always easy! However, I do believe that GDS should, and will, put much more scrutiny on organisations’ plans to address legacy issues.

One final point from me is that teams should embrace an assessment. Clearly the focus is on passing an assessment but regardless of the outcome there’s lots of value in gaining that feedback. It’s far better to get constructive feedback during the assessment stages rather than having to deal with disappointed stakeholders further down the line, and probably having to spend more time and money to strengthen or redesign the technical architecture.

How do you decide when to go for your Beta Assessment?

Many services (for both good and bad reasons) have struggled with the MVP concept; as such, the journey to get their MVP rolled out nationally has taken a long time, and contained more features and functionality than teams might have initially imagined.

This can make it very hard to decide when you should go for an Assessment to move from Private to Public Beta. If your service is going to be rolled out to millions of people; or has a large number of user groups with very different needs; it can be hard to decide what functionality is needed in Private Beta vs. Public Beta or what can be saved until Live and rolled out as additional functionality. 

The other thing to consider is: what does your rollout plan actually look like? Are you able to go national with the service once you’ve tested with a few hundred people from each user group? Or, as is more common with large services like NHS Jobs, where you are replacing an older service, does the service need to be rolled out in a very set way? If so, you might need to keep inviting users in until full rollout is almost complete, making it hard to judge when the right time for your Beta Assessment is.

There is no right or wrong answer here, the main thing to consider is that you will need to understand all of the above before you can roll your service out nationally, and be able to tell that story to the panel successfully. 

This is because, theoretically, most of the heavy lifting is done in Private Beta; once you have rolled your service out into Public Beta, the main things left to test are whether your service scales and works as you anticipated. Admittedly, this (combined with confusion about the scope of an MVP) is why most services never actually bother with their Live Assessment. For most services, once you’re in Public Beta the hard work has been done; there’s nothing more to do, so why bother with a Live Assessment? But that’s an entirely different blog!

Reviewing the service together.


Having pride in our diversity and learning to be inclusive

June is Pride Month, when members of the LGBTQ+ community and their allies come together in different ways to celebrate, remember and reflect. Now that June is over, I wanted to reflect on the things I learnt this year.

Pride and Trans Pride Flags.

This June was a Pride Month like no other: because of COVID-19, lockdown meant that the usual Pride marches were cancelled and moved online.

June was also the month that #BlackLivesMatter came to the forefront of Western consciousness, following the unforgivable killing of George Floyd in the US, amongst sadly so many others around the globe; with marches and rallies in the US, the UK and elsewhere calling for an end to police brutality and discrimination against Black people.

And finally, June (yep, still Pride Month) was when JKR yet again decided to use her platform to gatekeep women’s spaces and to decry the acceptance of trans women as women. (I’m not linking to her article, because I won’t give it airspace, but there are MANY fantastic pieces that explain why this stance is harmful; here’s just one. The TL;DR version is: Trans Women are Women.)

As such, this month, more than any other June that we have seen in a long time, has been one in which the conversations about diversity and inclusion have been so important.

I was asked this month why diversity and inclusion are important to me.

As the very wise Fareeha Usman, founder of Being women, said “Discrimination can only be tackled if we first tackle our own insecurities.”

Working within and alongside the public sector, we develop policies, products and services for the public: for citizens, for society. We cannot develop things for people if we cannot empathise with them, if we cannot understand where they are coming from and the problems and barriers they are facing. The people we are building for come from diverse backgrounds. If our teams all look and sound the same, and have the same life experiences, then we will never be able to deliver things that meet the diverse needs of our users.

Lots of hands coming together with a heart over the top

The Lesbians Who Tech (and allies) held their annual Pride summit from the 22nd to the 26th of June, and this year there was a clear focus on #BlackLivesMatter and #TransWomenAreWomen, with a whole host of fantastic speakers discussing actions we can all take to be more inclusive. I was also lucky enough to be asked to speak at a D&I panel* on the 24th, held by @SR2, to attend the Dynamo North East event on the 25th, and to attend several other virtual Pride events.

Key things I learned:

  • Locational geo-clusters can be a blocker to diversity and reinforce racial discrimination – @LorraineBardeen
  • When attending a meeting or workshop, or invited to sit on a panel, it’s our responsibility to check who else is ‘in the room’ and ask whether we are needed there, or whether there is someone from a different group whose voice needs amplifying more than ours. – @JasmineMcElry
  • When awarding contracts we need to look at companies’ track records on diversity, pay, etc., and make sure we are not unconsciously biased against companies whose make-up does not match our own. – @SenatorElizabethWarren
  • It is our job to educate ourselves, and not to ask anyone else to educate us; as leaders, our role is to admit we don’t know everything, that we are still learning, and to actively listen to others. – @TiffanyDawnson

COVID-19, if nothing else, has given us the opportunity to think about the society we want to see coming out of this pandemic. We have all embraced tools like remote working to keep working; now is the time to consider whether those same tools can help us be more inclusive in our workforce, and our society, going forward.

Removing the dependence on geographical hiring would enable us to include people from wider ethnic communities, as well as disabled people who have often found themselves excluded from office jobs by the commute, or people with caring responsibilities for whom the standard 9-5 office job doesn’t work.

A fantastic session on ageism, led by Nic Palmarini, Director of NICA, stated that “We need to reimagine a new society that is more inclusive”. This, for me, sums up the conversations I have seen, heard and been lucky enough to be part of this month; and I am proud to be part of a company, an industry and a community that is trying its hardest to do just that.

Difrent's Diversity stats
The Diversity stats from Difrent

*If you fancy catching up on the panel, details are here: https://zoom.us/rec/share/7uV5L-rezkhIZZXT8FjFVKQIAZTCeaa82yJI-Pdby0whYlngi4VRx3mii2Gvb-zR Password: 1H+a15=#

The people getting left behind

Why, in the era of remote working, we need to stop thinking about ‘digital services’ as a separate thing, and just think about ‘services’.

Last night, when chatting to @RachelleMoose about whether digital is a privilege (which she’s blogged about here), I was reminded of a conversation a few weeks ago with @JanetHughes about the work DEFRA were doing, and their remit as part of the response to the current pandemic (which, it turns out, covers not just the obvious things like food and water supplies, but also what we do about zoos and aquariums during a lockdown!)

A giraffe

This in turn got me thinking about the consequences of lockdown that we might never have really considered before the COVID-19 pandemic hit, and the impact a lack of digital access has on people’s ability to access public services.

There are many critical services we offer every day, vital to people’s lives, that we never previously imagined as ‘digital’ services, which are now being forced to rely on digital as a means of delivery. Not only are those services themselves struggling to adapt, but we are also at risk of forgetting those people for whom digital isn’t an easy option.

All ‘digital’ services have to prove they have considered digital inclusion. Back in 2014 it was found that approximately 20% of Britons lacked basic digital literacy skills, and the Digital Literacy Strategy aimed to have everyone who could be digitally literate digitally able by 2020. However, it was believed that 10% of the population would never be able to get online, and the Assisted Digital paper published in 2013 set out how government would enable equal access, to ensure digitally excluded people were still able to access services. A report by the ONS last year backs this assumption up, showing that in 2019 10% of the population were still digitally excluded.

However, as the effects of lockdown begin to be understood, we need to think about whether our assisted digital support goes far enough, whether we are really approaching the development of public services holistically, how we ensure they are future-proof, and whether we are truly including everyone.

There have been lots of really interesting articles and blogs about the impact of digital (or the lack of access to digital) on children’s education, with bodies like Ofsted expressing concerns that the lockdown will widen the education gap between children from disadvantaged backgrounds and children from more affluent homes; only 5% of the children classified as ‘in need’ who were expected to still be attending school have been turning up.

An empty school room

According to the IPPR, around a million children do not have access to a device suitable for online lessons. The DfE came out last month to say they were offering free laptops and routers to families in need; however, a recent survey showed that while over a quarter of teachers in private schools were interacting with their pupils online daily, fewer than 5% of those in state schools were. One academy chain in the North West is still having to print home learning packs and arrange for families to physically pick up and drop off school work.

The Good Things Foundation has similarly shared its concerns about the isolating effects of lockdown, and the digital divide that is being created, not just for families with children, but for people with disabilities, elderly or vulnerable people, and households in poverty. Almost 2 million homes have no internet access, and 26 million rely on pay-as-you-go data to get online. There has been a lot of concern raised about people in homes with domestic violence who have no access to phones or the internet to get help. Many companies are doing what they can to help vulnerable people stay connected or receive support, but it has highlighted that our current approach to designing services is possibly not as fit for the future as we thought.

The current pandemic has highlighted how vital it is for those of us working in or with the public sector to understand users and their needs, and to ensure everyone can access services. The Digital Service Standards were designed with ‘digital’ services in mind; six months ago it was never considered that children’s education or people’s healthcare would need to be assessed against those same standards.

The standards themselves say that the criteria for assessing products or services are applicable if either of the following applies:

  • getting assessed is a condition of your Cabinet Office spend approval
  • it’s a transactional service that’s new or being rebuilt – your spend approval will say whether what you’re doing counts as a rebuild

The key phrase here for me is ‘transactional service’, i.e. the service allows:

  • an exchange of information, money, permission, goods or services
  • submitting of personal information that results in a change to a government record

While we may never have considered education a transactional service before now, as we consider ‘the new normal’, we as service designers and leaders in the transformation space need to consider which of our key services are transactional, how we provide a joined-up experience across all channels, and what holistic service design really means. We need to move away from thinking about ‘digital’ and ‘non-digital’ services, and can no longer ‘wait’ to assess new services; instead we need to step back and consider how we could offer ANY critical service remotely, should we need to do so.

A child using a tablet

Digital can no longer be the thing that defines those with privilege; COVID-19 has proved that, now more than ever, it is an everyday essential, and we must adapt our policies and approach to service design to reflect that. As such, I think it’s time we reassess whether the Digital Service Standards should be applied to more services than they currently are; which services we consider to be ‘digital’; and whether that should even be a differentiator anymore. In a world where all services need to be able to operate remotely, we need to approach how we offer our services differently if we don’t want to keep leaving people behind.

Matt Knight has also recently blogged on the same subject, so I’m linking to his blog here, as it is spot on!