

And this is why we test with users…

A blog on the National Careers Service’s new ‘Discover your skills and careers’ service

As I sit here at ten past ten on a Wednesday night, watching social media have a field day with the new National Careers Service, I’m yet again reminded of the importance of the Digital Service Standard, especially Standard Number One – understand users and their needs – and why we need to get Ministers and senior leaders to understand it too.

The first role of any good User Centric designer or Product Manager within the public sector is understanding the problem you’re trying to solve.

In this case, the problem we’re facing is not a small one. Because of COVID-19 we currently have approximately 1.4M people unemployed, with many more still facing redundancy as the pandemic continues. ONS data states that between March and August, the number of people claiming benefits rose 120% to 2.7 million.

The Entertainment, Leisure and Hospitality sectors have been decimated, amongst many others. Just this week Cineworld announced 45,000 job losses, and Odeon may soon follow suit. Theatres and live event venues across the country are reporting they are on the brink of collapse.

So, when the Chancellor announced a whole host of support for people to retrain as part of the summer statement, it included advice for people to use the new Careers and Skills advice service to get ideas on new career options.

A service to help people understand new career options right now is a great idea; it should absolutely meet a user need.

A screenshot of the national careers service skills assessment

Unfortunately, you only have to look at the headlines to see how well the new service has been received. The service is currently such a laughing stock that no one is taking it seriously, which is a massive shame, because it’s trying to solve a very real problem.

A number of my friends and acquaintances have now taken the quiz (as has half of Twitter, apparently) and it was suggested I have a look. So I did. (As an aside, it recommended I retrain in the hospitality industry; all who know me know how terrible this would be for all involved, as last week I managed to forget to cook 50% of our dinner, and I am clinically unable to make a good cup of coffee, never mind clean or tidy anything!)

It has good intentions, and in a number of cases it may not be too far off the mark; the team behind the service have done a write-up here* of how they developed it and what they set out to achieve. Unfortunately, while the service seems simple to understand and accessible to use, it is missing any level of context or practicality that would help it solve the problem it’s being used for.

*EDIT: The write-up has sadly now been taken down, which is a massive shame, because they did good work – but I suspect they were under political pressure to get something out there quickly. We’ve all been there; it’s a horrid position to be in.

While they have tested with users with accessibility needs, the focus seems to have been on whether those users can use the digital service, not on whether the service actually meets their needs.

My friend with severe mobility and hearing issues was advised to retrain as a builder. Another friend with physical impairments (and a profound phobia of blood) was advised they were best suited to a role as a paramedic. A friend with ASD, who also has severe anxiety and an aversion to people they don’t know, was advised to become a beautician. Another friend, a single parent, was given three career options that all required evening and weekend work. At no point does the service ask whether you have any medical conditions or caring responsibilities that would limit the work you could do. While you can argue that that level of detail falls under the remit of a jobs coach, it is understandably seen as insensitive and demoralising to recommend careers to people that they are physically unable to do.

Equally unhelpful is the fact that the service – which has been specifically recommended to people made redundant from the worst-hit industries – is recommending those same decimated industries as places to work, with no recognition of the current jobs market.

My partner, who was actually made redundant from her creative role due to COVID-19 (and so is exactly the target audience for this service, according to the Chancellor), was advised to seek a role in the creative industries – an industry that doesn’t currently exist – and a quick look on social media shows she isn’t alone.

The service doesn’t actually collect enough (well, any) data about the career someone is currently in, nor does it seem to have any interface to the current jobs market to understand whether the careers it’s recommending are actually viable.

Unfortunately, the service is too generic. While it might help school and college students trying to choose their future career paths in a ‘normal’ jobs market (and I honestly suspect that’s who it was actually developed for!), it’s not meeting the fundamental problem we are facing at the moment: helping people understand their career options in the current market.

If you’ve worked within Digital in the public sector, you’ve had to deal with Ministers and Directors who don’t really understand the value of user research, or why we need to test things properly before we roll them out nationally. The current debacle with the careers website is possibly a perfect example of why you need to test your service with a wide range of users regularly, not just rely on assumptions and user personas, and why it’s important to test and iterate the service with real users multiple times before it gets launched. It highlights the need for us to get Ministers to understand that rushing a service out quickly isn’t always the right answer.

We all need to understand users and their needs. Just because a service is accessible doesn’t mean it solves the problem users are facing.

Notes from some Digital Service Standard Assessors on the Beta Assessment

The Beta Assessment is probably the one I get the most questions about; primarily, “when do we actually go for our Beta Assessment, and what does it involve?”

Firstly, what is an Assessment, and why do we assess products and services?

If you’ve never been to a Digital Service Standard Assessment it can be daunting, so I thought it might be useful to pull together some notes from a group of assessors, to show what we are looking for when we assess a service.

Claire Harrison (Chief Architect at Homes England and leading Tech Assessor) and Gavin Elliot (Head of Design at DWP and a leading Design Assessor, you can find his blog here) helped me pull together some thoughts about what a good assessment looks like, and what we are specifically looking for when it comes to a Beta Assessment. 

I always describe a good assessment as the team telling the assessment panel a story. So, what we want to hear is:

  • What was the problem you were trying to solve?
  • Who are you solving this problem for? (who are your users?)
  • Why do you think this is a problem that needs solving? (What research have you done? Tell us about the users’ journey)
  • How did you decide to solve it and what options did you consider? (What analysis have you done?) 
  • How did you prove the option you chose was the right one? (How did you test this?)

One of the great things about the Service Manual is that it explains what each delivery phase should look like, and what the assessment team are considering at each assessment.

So what are we looking for at a Beta Assessment?

By the time it comes to your Beta Assessment, you should have been running your service for a little while with a restricted number of users in a Private Beta. You should have real data gathered from real users who were invited to use your service, and you should have iterated your service several times by now, given all the things you have learnt.

Before you are ready to move into Public Beta and roll your service out Nationally there are several things we want to check during an assessment. 

You need to prove you have considered the whole service for your users and have provided a joined up experience across all channels.

  • We don’t want to just hear about the ‘digital’ experience; we want to understand how you have provided (or will provide) a consistent and joined up experience across all channels.
  • Are there any paper or telephony elements to your service? How have you ensured that those users have received a consistent experience?
  • What changes have you made to the back end processes and how has this changed the user experience for any staff using the service?
  • Were there any policy or legislative constraints you had to deal with to ensure a joined up experience?
  • Has the scope of your MVP changed at all so far in Beta given the feedback you have received from users? 
  • Are there any changes you plan to implement in Public Beta?

As a Lead Assessor, this is where I always find that teams who have suffered from a lack of empowerment, or from organisational silos, may struggle.

If the team are only empowered to look at the digital service, and have struggled to make any changes to the paper, telephony or face-to-face channels due to siloed working between Digital and Ops in their Department (as an example), the digital product will offer a very different experience to the rest of the service.

As part of that discussion we will also want to understand how you have supported users who need help getting online, and what assisted digital support you are providing.

At previous assessments you should have had a plan for the support you intended to provide; you should now be able to talk through how you are putting that into action. This could be telephony support or a web chat function, but we want to ensure the support being offered is (or will be) consistent with the wider service experience and meets your users’ needs. We also want to understand how it’s being funded, and how you plan to publish the accessibility information for your service.

We also expect that by this point you have run an accessibility audit and have carried out regular accessibility testing. It’s worth noting that if you don’t have anyone in-house who is trained to run accessibility audits (we’re lucky in Difrent, as we have a DAC assessor in-house), you will need to factor in the time it takes to get an audit booked and run, well before you think about your Beta Assessment.

Similarly, by the time you go for your Beta Assessment we would generally expect a Welsh language version of your service to be available. Again, this needs to be planned well in advance, as it can take time to get and is not (or shouldn’t be) a last-minute job! It’s something that, in my experience, a lot of teams forget to prioritise and plan for.

And finally, assuming you are planning to put your service on GOV.UK, there are several things you’ll need to have agreed with the GOV.UK team at GDS before going into public beta.

Again, while it shouldn’t take long to get these things sorted with the GOV.UK team, they can sometimes have backlogs, so it’s worth making sure you’ve planned in enough time.

The other things we will want to hear about are how you’ve ensured your service is scalable and secure, and how you have dealt with any technical constraints.

The architecture and technology – Claire

From an architecture perspective, at the Beta phase I’m still interested in the design of the service, but I also focus on its implementation and the provisions in place to support the sustainability of the service. My mantra is ‘end-to-end, top-to-bottom service architecture’!

An obvious consideration in both the design and deployment of a service is security – how the solution conforms to industry, Government and legal standards, and how security is baked into a good technical design. With data, I want to understand its characteristics and lifecycle: are the data identifiable, how are they collected, where are they stored and hosted, who has access to them, and are they encrypted – if so, when, where and how? I find it encouraging that in recent years there has been a shift in thinking not only about how to prevent security breaches but also how to recover from them.
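As a purely illustrative sketch of those ‘when, where and how’ questions (the library is real, but the inline key generation is a placeholder – a real service would fetch its keys from a key management service), field-level encryption at rest might look like this:

```python
# Illustrative only: encrypting an identifiable field before it is stored.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # placeholder: in production, fetch from a key management service
fernet = Fernet(key)

# Encrypt at the point of collection, so identifiable data is never stored in the clear.
token = fernet.encrypt("QQ123456C".encode())  # a made-up National Insurance number

# Decrypt only where access is justified and audited.
plaintext = fernet.decrypt(token).decode()
assert plaintext == "QQ123456C"
```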

Security is sometimes cited as a reason not to code in the open, but in actual fact this is hardly ever the case. As services are assessed on this, there needs to be a very good reason why code can’t be open. After all, a key principle of GDS is reuse – in both directions – for example making use of common government platforms, and publishing your own code for it to be used by others.

Government platforms such as Pay and Notify can help with some of a technologist’s decisions and should be used as the default, as should open standards and open source technologies. When this isn’t the case, I’m really interested in the selection and evaluation of the tools, systems, products and technologies that form part of the service design. This might include integration and interoperability, constraints in the technology space, vendor lock-in, route to procurement, total cost of ownership, alignment with internal and external skills, and so on.
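To give a flavour of why these platforms are the sensible default, here is a minimal sketch of sending an email through GOV.UK Notify with its official Python client – the API key, template ID and personalisation values below are placeholders, not real credentials:

```python
# A minimal sketch of reusing a common platform (GOV.UK Notify) rather than
# building your own email-sending capability from scratch.
# Requires: pip install notifications-python-client
from notifications_python_client.notifications import NotificationsAPIClient

# Placeholder key: in a real service this comes from secure configuration.
notifications_client = NotificationsAPIClient("your-notify-api-key")

response = notifications_client.send_email_notification(
    email_address="applicant@example.com",   # placeholder recipient
    template_id="your-template-id",          # template created in the Notify admin UI
    personalisation={"first_name": "Sam"},   # fills the placeholders in the template
)
```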

Some useful advice: think about your technology choices as a whole, rather than piecemeal, as and when a particular tool or technology is needed. Yesterday I gave a peer review of a solution under development where one tool had been deployed in isolation, not as part of an evaluation of the full technology stack; this meant there were integration problems as new technologies were added to the stack.

The way that a service evolves is really important too, along with the measures in place to support its growth. Cloud-based solutions take care of some of the more traditional scalability and capacity issues, and I’m interested in understanding the designs around these, as well as any other mitigations in place to help assure the availability of the service. As part of the Beta assessment, the team will need to show their plan for the event of the service being taken temporarily offline: strategies for dealing with incidents that impact availability, how they will recover from downtime, and how these have been tested.
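As one small, concrete slice of that availability story, here is a minimal sketch (the endpoint name and the single database check are my assumptions, not a prescribed design) of a health-check endpoint that monitoring can poll to detect incidents early:

```python
# A minimal Flask health-check sketch: monitoring polls /healthcheck and
# alerts (or removes the instance from the load balancer) on a 503.
# Requires: pip install flask
from flask import Flask, jsonify

app = Flask(__name__)

def check_database() -> bool:
    """Hypothetical dependency check; replace with a real ping to your datastore."""
    try:
        # e.g. db.session.execute("SELECT 1")
        return True
    except Exception:
        return False

@app.route("/healthcheck")
def healthcheck():
    checks = {"database": check_database()}
    healthy = all(checks.values())
    status_code = 200 if healthy else 503
    return jsonify(status="ok" if healthy else "degraded", checks=checks), status_code
```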

Although a GDS Beta assessment focuses on a specific service, it goes without saying that a good technologist will be mindful of how the service they’ve architected impacts the enterprise architecture, and vice versa. For example, if a new service is built with microservices and introduces an increased volume and velocity of data, does the network need to be strengthened to cope with the increase in communications traversing it?

Legacy technology (as well as legacy ‘Commercials’ and ways of working) is always on my mind. Obviously, during an assessment a team can show how they address legacy in the scope of that particular service, be it some form of integration with legacy or applying the strangler pattern (sketched below), but organisations really need to put as much effort into dealing with legacy as they do into new digital services. Furthermore, they need to think about how to avoid creating the ‘legacy systems of the future’ by ensuring the sustainability of their service, be it from a technical, financial or resourcing perspective. I appreciate this isn’t always easy! However, I do believe that GDS should and will put much more scrutiny on organisations’ plans to address legacy issues.
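For anyone unfamiliar with the strangler pattern mentioned above, here is a deliberately simplified sketch (the hostnames and route prefixes are invented for illustration): traffic is routed path by path to the new service as each slice is migrated, until the legacy system receives nothing and can be switched off.

```python
# A toy illustration of the strangler pattern: a thin routing layer sends
# migrated paths to the new service and everything else to the legacy system.
MIGRATED_PREFIXES = ["/applications", "/payments"]  # slices the new service now owns

NEW_SERVICE = "https://new-service.internal"   # hypothetical hostname
LEGACY_SERVICE = "https://legacy.internal"     # hypothetical hostname

def upstream_for(path: str) -> str:
    """Choose the backend that should handle an incoming request path."""
    if any(path.startswith(prefix) for prefix in MIGRATED_PREFIXES):
        return NEW_SERVICE
    return LEGACY_SERVICE

# As more slices migrate, prefixes move into MIGRATED_PREFIXES until the
# legacy system receives no traffic at all.
assert upstream_for("/payments/123") == NEW_SERVICE
assert upstream_for("/reports/2020") == LEGACY_SERVICE
```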

One final point from me: teams should embrace an assessment. Clearly the focus is on passing, but regardless of the outcome there’s lots of value in gaining that feedback. It’s far better to get constructive feedback during the assessment stages than to deal with disappointed stakeholders further down the line, and probably have to spend more time and money to strengthen or redesign the technical architecture.

How do you decide when to go for your Beta Assessment?

Many services (for both good and bad reasons) have struggled with the MVP concept, and as such the journey to get their MVP rolled out nationally has taken a long time, and contained more features and functionality than teams might have initially imagined.

This can make it very hard to decide when you should go for an assessment to move from Private to Public Beta. If your service is going to be rolled out to millions of people, or has a large number of user groups with very different needs, it can be hard to decide what functionality is needed in Private Beta vs. Public Beta, or what can be saved until Live and rolled out as additional functionality.

The other thing to consider is: what does your rollout plan actually look like? Are you able to go national with the service once you’ve tested with a few hundred people from each user group? Or, as is more common with large services like NHS Jobs, where you are replacing an older service, does the service need to be rolled out in a very set way? If so, you might need to keep inviting users in until full rollout is almost complete, making it hard to judge when the right time for your Beta Assessment is.

There is no right or wrong answer here; the main thing to consider is that you will need to understand all of the above before you can roll your service out nationally, and be able to tell that story to the panel successfully.

This is because, theoretically, most of the heavy lifting is done in Private Beta; once you have rolled out your service into Public Beta, the main things left to test are whether your service scales and works as you anticipated. Admittedly this (combined with confusion about the scope of an MVP) is why most services never actually bother with their Live Assessment. For most services, once you’re in Public Beta the hard work has been done; there’s nothing more to do, so why bother with a Live Assessment? But that’s an entirely different blog!

Reviewing the service together.

 

Delivering Digital Government 2019

This week Claire Harrison (Head of Architecture at CQC) and I had the opportunity to attend the Delivering Digital Government event run by Worth Systems in The Hague.

The event was focused on how digital has transformed governments across the world, sharing best practices and lessons learned. With speakers from the founding of GDS, like Lord Maude, as well as speakers from the Netherlands, it was a great opportunity to meet others working on solving problems for users in the government space beyond the UK.

A lot of the talks, especially those by the GDS alumni, covered things I had heard before, but I actually found that reassuring: over five years later, I am still doing the right things and approaching problems in the right way.

It was especially interesting to hear from both Lord Maude and others about the work they have been doing with foreign governments, for example in Canada, Peru and Hawaii. The map Andrew Greenway (previously of GDS, now of Public Digital) shared of the digital government movement was fantastic to see, and really made me realise how big what we are trying to achieve around the world is.

@ad_greenway sharing a map of the Digital Government transformations happening around the world

The talks from some of the Dutch speakers were really interesting. I loved hearing about the approach the council in The Hague is taking to digital innovation, and their soon-to-be-published digital strategy. One of the pilots the city is running particularly intrigued me: in an effort to reduce traffic, they put sensors onto parking spaces in key shopping streets and onto all disabled parking bays in the city. This gave them real-time information on the use of the parking spaces and where the available spaces were, and successfully decreased traffic from people driving around searching for spaces. They are now looking at how to scale the pilot and manage the infrastructure and sensor data for a ‘smart’ city, working with local businesses to enable new services to be offered.

The draft digital strategy for the city of The Hague

We also heard about the work the Netherlands has been doing to pilot other innovative digital services, like a new service that allows residents in an area to submit planning ideas to improve their neighbourhoods. The first trial received over 50 suggestions, of which 4 have been chosen to take forward. We heard about the support that was given to enable everyone to take part, and it was nice to hear about the 78-year-old resident whose suggestion came 5th.

It was also great to hear from Matthij from Novum, a digital innovation lab in the Netherlands, who talked about his own personal journey into digital transformation, learning from failures and ensuring that you prepare for failure from the start. He also told us about some fascinating research they have been doing into the use of smart speakers, especially with the elderly, to enable better engagement with government services for those who need assistive technologies.

An image of an older lady talking to an AI robot, courtesy of Novum

Realising that 30% of eligible claimants for the Dutch state pension supplement were not claiming it, they believed this was potentially down to the complexity of the form, and hypothesised that smart speakers might be one way to solve the problem. Recognising that it was no good to make assumptions and design a solution without properly understanding the problem their users were facing, they ran a small sample test with elderly users to see whether they could use smart speakers to check the date of their next pension payment (one of the largest contributors to inbound calls to the Sociale Verzekeringsbank). They found not only that elderly users could use the smart speakers, but that the introduction of smart speakers into their homes decreased loneliness dramatically.

There were other good sessions, including James Stewart from GDS and Public Digital on technology within digital, and an interesting panel session at the end. Every session was good, and I learnt or heard something new at each one. My only grumble from the day was the lack of diversity in the speakers, which the organisers themselves put their hands up and admitted before they were called out on it. A quick call on Twitter and the ever-amazing Joanne Rewcastle from DWP shared a list of amazing female speakers, so hopefully that will help with the next event.

One key thing I took away from the day is that the challenges are the same everywhere, but so is the message: involve users from the start. On practical steps everyone could start tomorrow, Matthij talked about making sure you interview five end users, and about simple prototypes you could develop to engage your users.

This slide from Lord Maude summed up three of the main things any organisation needs to succeed in delivering Digital Transformation

Lord Maude talked about the importance of a strong mandate; Novum talked about having a good understanding, right at the start, of the problem you are trying to fix. The digital strategy from The Hague highlights that they want everyone to be able to participate, and to deliver a personal service to their citizens. As Andrew Greenway said, the key thing is to “start with user needs”.

The second key message from the day was that, as Lord Maude put it, “Just do it!” A digital strategy delivers nothing; the strategy should be delivery. Instead of spending months developing a digital strategy, “you just have to start” by doing something. This in turn will help you develop your strategy once you understand the problems you are trying to solve, the people you will need, and the set-up and way of doing things that works best in your organisation. This was a message reinforced by every speaker throughout the day.

@jystewart sharing a statement from Ivana Osores from Interbank… “You have to just start”

The third key message was the importance of good leadership, good teams and good people. Talk in the open about the failures you’ve made and what you have learned. Build strong, multidisciplinary and diverse teams; as Andrew Greenway said, start with teams, not apps or documents. In the round-table discussion on building capability we spent a lot of time discussing the best ways to do it, and the fact that in order to attract good people, keep them, and go on to develop good things, you need strong leadership that is bought into the culture you need to deliver.

I left the day with a number of good contacts, some great conversations, and feeling reinvigorated and reassured. Speaking to Worth, I know they are aiming to run another event next year, with an even more diverse international cohort and an equal number of female speakers, and I for one will definitely be signing up.

Lord Maude, myself and Claire Harrison at the social gathering after the event

Round and round we go.

In other words Agile isn’t linear so stop making it look like it is.

Most people within the public sector who work in digital transformation have seen the GDS version of the Agile lifecycle, which aims to demonstrate that developing services starts with user needs, and that projects move from Discovery to Live, with iterations at each stage of the lifecycle.

The problem with this image of Agile is that it still makes the development of Products and Services seem linear, which it very rarely is. Most Products and Services I know, certainly the big complex ones, will need several cracks at a Discovery. They move into Alpha and then back to Discovery. They may get to Beta, stop, and then start again. The more we move to a Service Design mentality and approach problems holistically, the more complex we realise they are, and this means developing Products and Services that meet user needs is very rarely as simple and straightforward as the GDS lifecycle makes it appear.

And this is fine: one of the core principles of Agile is failing fast – stopping things rather than carrying on regardless. We iterate our Products and Services because we realise there is more to learn. More to Discover.

The problem is that organisations new to Agile and the GDS way of working see the above image, and its more linear portrayal seems familiar and understandable to them, because they are generally used to Waterfall projects, which are linear. So when something doesn’t move from Alpha to Beta, when it needs to go back into Discovery, they see that as a failure of the team, of the project. Sometimes it is, but not always; sometimes the team have done exactly what they were meant to do. They realised the problem identified at the start wasn’t the right problem to fix, because they tested their assumptions and learned from their research. This is what we want them to do.

The second problem with the image put forward in the GDS lifecycle is that it doesn’t demonstrate how additional features are added. The principle of Agile is getting the smallest usable bit of your Product or Service out and being used by users as soon as you can – the minimum viable product (MVP) – and this is right. But once you have your MVP live, what then? The Service Manual talks about continuing to iterate in Live, but if your Product or Service is large or complex, your MVP might only serve one of your user groups, and now you need to develop the rest. So what do you do? Do you go back into Discovery for the next user segment? Ideally, if you need to, yes – but the GDS lifecycle doesn’t show that.

As such, organisations new to Agile don’t factor that into their business cases, it’s not within the expectations of their stakeholders, and this is how projects end up with bloated scopes and get stuck forever in Discovery or Alpha, because the project is too big to deliver.

With public services being developed to the Digital Service Standard set by GDS, we need a version of the lifecycle that breaks that linear mindset and helps everyone understand that within an Agile project you will go around the lifecycle, and back and forwards, several times before you are done.

Agile is not a sprint, a race, or a marathon, it’s a game of snakes and ladders. You can get off, go back to the start or go back a phase or two if you need to. You win when all your user needs are met, but as user needs can change over time, you have to keep your eye on the board, and you only really stop playing once you decommission your Product or Service!


Speak Agile To Me:

I have blogged about some of these elsewhere, but here is a quick glossary of terms that you might hear when talking about Agile or Digital Transformation.

Agile: A change methodology, focusing on delivering value as early as possible, iterating and testing regularly.

Waterfall: A change methodology, focusing on a linear lifecycle, delivering a project based on requirements gathered upfront.

Scrum: A type of Agile, based on daily communication and the flexible iteration of plans that are carried out in short timeboxes of work.

Kanban: A type of Agile, based on visualising the flow of work and limiting the amount of work in progress.

The Agile Lifecycle: Similar to other change methodology lifecycles, the agile lifecycle is the stages a project has to go through. Unlike other lifecycles, agile is not a linear process, and products or services may go around the agile lifecycle several times before they are decommissioned.

Discovery: The first stage of the agile lifecycle, all about understanding who your users are, what they need, and the problem you are trying to fix. Developing assumptions and hypotheses. Identifying an MVP that you think will fix the problem you have identified. Prioritising your user needs and turning them into epic user stories. Akin to the requirements-gathering stage in Waterfall.

Alpha: The design and development stage. Building prototypes of your service and testing them with your users. Breaking user needs and epics into user stories and prioritising them. Identifying risks and issues, and understanding the architecture and infrastructure you will need prior to build. Akin to the design and implementation stage in Waterfall.

Beta: The build and test stage. Building a working version of your service; ensuring your service is accessible, secure and scalable; improving the service based on user feedback; and measuring the impact of your service or product on the problem you were trying to fix. Can feature Private and Public Beta phases. Akin to the testing and development stage in Waterfall.

Private Beta: Testing with a restricted number of users; a limited test. Users can be invited to use the service, or limited by geographical region etc.

Public Beta: A product still in the test phase but open to a wider audience; users are no longer invited in, but should be aware the product is still in test.

Live: Once you know your service meets all the user needs identified within your MVP, you are sure it is accessible, secure and scalable, and you have a clear plan to keep iterating and supporting it, then you can go live. Akin to the maintenance stage in Waterfall.

MVP: The Minimum Viable Product, the smallest releasable product with just enough features to meet user needs, and to provide feedback for future product development.

User Needs: The things your users need, evidenced by user research and testing. Akin to business requirements in Waterfall and other methodologies.

GDS: The Government Digital Service, part of the Cabinet Office, leading digital transformation for Government and setting the Digital Service Standard that all Government Departments must meet when developing digital products and services.

The Digital Service Standard: https://www.gov.uk/service-manual/service-standard – the 18 standards all government digital services should meet when developing products and services.

Service Design: Looking at your Product or Service holistically, keeping it user focused while ensuring it aligns with your organisation strategy.

User Centric Design (UCD): The principles of user centric design are very simple: you keep the users (both internal and external) at the heart of everything you do. This means involving actual users throughout the design and development process, rather than using ‘proxy’ users (people acting like users). It recognises that different users (and user groups) have different needs, and that the best way to design services that meet those needs is to keep engaging with the users.