Assessing Innovation

(co-written with Matt Knight)

Some background, for context

Just over a month ago I was approached to see if I could provide some advice on assessments to support phase two of the GovTech Catalyst (GTC) scheme. For those who aren’t aware of the GovTech Catalyst scheme, there’s a blog here that explains how it was designed to connect private sector innovators with public sector sponsors, using Innovate UK’s Small Business Research Initiative (SBRI) to help find promising solutions to some of the hardest public sector challenges.

Person in a lab coat with a stethoscope around their neck looking through a virtual reality headset.
Looking for innovation

The Sponsor we were working with (one of the public sector sponsors of the scheme) had put two suppliers through to the next phase and allocated funding to see how and where tech innovation could help drive societal improvements in Wales. As part of their spend approval for the next phase, the teams had to pass the equivalent of a Digital Service Standard assessment at the six-month point in order to get funding to proceed.

For those who aren’t aware, there used to be a lovely team in GDS who would work with the GTC teams to provide advice and run the Digital Service Standard assessments for the projects. Unfortunately this team was stood down last year, after the recent GTC initiatives had started, leaving the teams with no one to talk to about assessments, and no one in place to assess them.

The sponsor had reached out to both GDS and NHS Digital to see if they would be willing to run the assessments or provide advice to the teams, but had no luck, which left them a bit stuck; which is where I came in. I’ve blogged before about the Digital Service Standards, which led to the Sponsor reaching out to ask whether I’d be willing and able to help them out, or whether I knew any other assessors who might be.

Preparing for the Assessments

As there were two services to assess, one of the first things I did was talk to the wonderful Matt Knight to see if he’d be willing and able to lead one of the assessments. Matt’s done even more assessments than me, and I knew he would be able to give some really good advice to the product teams to get the best out of them and their work.

Matt and I sat down and discussed how to ensure we were approaching our assessments consistently: how to honour and adhere to the core tenets of the Digital Standards whilst also assessing the teams’ innovation and the value for money their services could deliver, in line with the criteria for the GovTech scheme.

What quickly became apparent was that, because this was to support the GTC scheme, the teams doing the work were fully private sector, with little experience of the Digital Service Standards. A normal assessment, with the standard ‘bar’ we’d expect teams to be able to meet, wouldn’t necessarily work well; we’d need to be a little flexible in our approach.

Obviously, no matter what type of assessment you’re doing, the basic framework stays the same (start with user needs, then think about the end-to-end service, then talk about the team and design and tech, and along the way ask about the awkward stuff like sustainability, open source, accessibility and metrics) and can be applied to almost anything to produce a useful result, regardless of sector, background or approach.

As the services were tasked with trying to improve public services in Wales, we also wanted to take account of the newly agreed Welsh Digital Standards, using them alongside the original Digital Standards. The main difference was the parts of the Welsh Standards that cover ensuring the well-being of people in Wales and promoting the Welsh language (standards 8 and 9); you can read more about the Well-being of Future Generations Act here.

The assessments themselves 

An image of a team mapping out a user journey
User Journey Mapping

The assessments themselves ran well (with thanks to Sam Hall, Coca Rivas and Claire Harrison, my co-assessors). While the service teams were new to the process, they were both fully open and willing to talk about their work: what went well, what went less well, and what they had learnt along the way. There was some great work done by both the teams we assessed, and it’s clearly a process that everyone involved learned a lot from, both the service teams and the sponsor team; it was great to hear how they’d collaborated to support user research activities. Both panels went away to write up their notes, at which point Matt and I exchanged notes to look for common themes or issues; interestingly, both assessments had flagged the need for a Service Owner from the sponsor to be more involved, to help the teams identify their success measures.

When we played the recommendations and findings back to the Sponsor, this led to an interesting discussion. Although the sponsor had nominated someone to act as a link for the teams, to answer their questions and provide guidance and steers where they could, the terms of the GTC scheme meant the rules on what steers they could and couldn’t give were quite strict, to avoid violating the terms of the competition. Originally the GTC team within GDS would have helped sponsors navigate these slightly confusing waters of competition rules and processes. Without an experienced team to turn to for advice, sponsors are left in a somewhat uncomfortable and unfamiliar position, although they had clearly done their best (and the recommendations in this blog are general comments on how we can improve how we assess innovation across the board, not specifically aimed at them).

Frustratingly, this meant that even when teams were potentially heading into known dead-ends, the sponsor could try to provide some guidance and steer them in a different direction, but couldn’t force the teams to pivot or change; the only option would be to pull the funding. While this makes sense from a competition point of view, it makes little to no sense from a public purse point of view, or from a Digital Standards point of view. It leaves sponsors stuck when things have gone a little off track: rather than being able to get teams to pivot, they are left choosing between potentially throwing away some great work, or investing money in projects that may not be able to deliver.

Which then raises the question: how should we be assessing and supporting innovation initiatives? How do we ensure they’re delivering value for the public purse whilst also remaining fair and competitive? How do we ensure we’re not missing out on innovative opportunities because of government bureaucracy and processes?

In this process, what is the point of a Digital Service Standard assessment? 

If it’s like most other assessment protocols (do not start Matt on his gateway rant), then it’s only to assess work that has already happened. If so, then it’s not much good here, where teams are so new to the standards and need flexible advice and support on what they could do next.

If it’s to assess whether a service should be released to end users, then it’s useful in central government when looking to roll out and test a larger service; but not much use for a small service, one with mainly internal users, or a service earlier in the process aiming to test a proof of concept.

If it’s to look at all of the constituent areas of a service, and provide help and guidance to a multidisciplinary team on how to make it better and what gaps there are (and a bit of clarity from people who haven’t got too close to see clearly), then it’s a lot of use here and in other places; but we need to ensure the panel has the right mix of experts to assess this.

While my panel was fantastic, and we were able to assess the level of user research the team had done, their understanding of the problems they were seeking to solve, their ability to integrate with legacy tech solutions and how their team was working together, none of us had any experience in assessing innovation business cases, or in judging whether teams had done the right due diligence on their financial funding models. The standards specify that teams should have their budget sorted for the next phase and a roadmap for future development; in my experience this has generally been a fairly easy yes or no, and I certainly wouldn’t know a good business accelerator if it came and bopped me on the nose. So while we could take a very high-level view on whether a service could deliver some value to users, and whether a roadmap or budget looked reasonable, a complex discussion on funding models and investment options was a little outside our wheelhouse, so not an area we could offer any useful advice or recommendations on.

How can we deliver and assess innovation better going forward? 

If we’re continuing to use schemes like the GTC scheme to sponsor and encourage private sector innovators to work with the public sector on important problems affecting our society, then we obviously need a clear way to assess their success. But we also need to set these schemes up in such a way that the private sector is genuinely working with the public sector; that means working in partnership, able to advise and guide them where appropriate, to ensure we’re spending public money wisely.

There is a lot of great potential out there to use innovative tech to help solve societal issues; but we can’t just throw those problems at the private sector and expect them to do all the hard work. While the private sector can bring innovative and different approaches and expertise, we shouldn’t ignore the wealth of experience and knowledge within the public sector either. We need people within the public sector with the right digital skills, who are able to prioritise and understand the services being developed, in order to ensure the public purse doesn’t pay for things that already exist to be endlessly remade.

Assessment can have a role in supporting innovation, as long as we take a generous rather than nitpicking (or macro rather than micro) approach to the service standard. Assessments (and the Standards themselves) are a useful format for structuring conversations about services that involve users (hint: that’s most of them); just the act of starting with user needs (point 1) rather than tech changes the whole conversation.

However, to make this work and add real value, solving a whole problem for users (point 2 of the new UK government standard) is critical, and that involves having someone who can see the entire end-to-end process for any new service, and devise and own success measures for it. The best answer to both delivering innovation and assessing it is bringing the private and public sector together to deliver real value; creating a process that builds capacity, maturity and genuine collaboration within the wider public sector. A space to innovate and grow solutions. True multidisciplinary collaboration, working together to deliver real value.

“Together, We Create”

Big thanks to Matt for collaborating on this; if you want to find his blog (well worth a read), you can do so here:

Cost vs. Quality

A debate as old as time, and a loop that goes around and around; or so it seems in the Public Sector commercial space.

Every few years, often every couple of spend control cycles, the debate of cost vs. quality rears its head again, with commercial weighting flip-flopping between quality as the most important factor and cost (or lowest cost) as the highest priority.

When quality is the most important factor in the commercial space, government departments will prioritise the outputs they want to achieve, weighting their commercial scores towards the areas that indicate quality: things like ‘value add’, ‘delivering quality’, ‘culture’ and ‘delivering in partnership’. We see more output-focused contracts coming onto the market, with organisations clear on the vision they want to achieve and the problems they need to solve, looking for the supplier that can best help them achieve that.

When reducing costs becomes the highest priority, the commercial weighting moves to ‘value for money’. Contracts are more likely to be fixed price, and are often thinly veiled requests for suppliers to act as body shops rather than partners, with commercial tenders scoring day-rate cards rather than requesting the cost of overall delivery of outcomes.

Unfortunately, a lot of the time when the priority switches to cost over quality, we end up with a lot of projects not being delivered, outcomes being missed, and user needs not being met. In order to cut more and more costs, offshoring resource can become the only way to deliver results cheaply, with departmental project teams working out of sync with their offshore delivery partners, making co-design and delivery much harder to do, and making it almost impossible to achieve the required quality. This goes in a cycle, with departments to-ing and fro-ing between “offshore as much as possible to cut costs” and “the only way to deliver quality is for everyone to be collocated in the office 100% of the time”, with full collocation of teams inevitably driving the costs up again.

So, does that mean that in order to get quality we have to have high costs? Surely there is a sweet spot we’re all looking for, where cost and quality align; but why does it seem so hard to achieve within the public sector, and what do we need to look at to achieve it?

When the government commercial function (and GDS) shook up the public sector digital world nearly a decade ago, they introduced things like the Digital Marketplace and implemented the spend control pipeline, with the aim of moving departments away from the large SIs that won 90% of government contracts. These suppliers often charged a fortune and rarely seemed to deliver what was actually needed. (This blog gives the details of what they intended, back in 2014.)

Lots of SME suppliers began to enter the market, winning contracts and changing how contracts were delivered. As competition increased, costs decreased, quality partnerships formed between new suppliers and government departments, and the quality of delivery increased as new options, solutions and ways of working were explored.

However, this left departments managing lots of individual contracts, which grew increasingly complex and time consuming to manage. In order to reduce the number of contracts they had to manage, the scale of the contracts began to increase, with more and more multimillion-pound contracts emerging.

As the size and value of the contracts increased, SMEs began to struggle to win them, as they couldn’t stand up the teams needed quickly, nor could they demonstrate experience of delivering contracts of that scale. This became a bit of a self-fulfilling prophecy: the larger SIs continued to win the larger contracts, as they were the only ones able to provide evidence they could staff and deliver them, and their costs remained high.

This left SMEs facing three options:

  • Decide not to try for the larger contracts, reducing the amount of competition (potentially increasing costs and decreasing quality in the long run);
  • Form partnership agreements with a number of other SMEs or a larger supplier (again reducing the amount of competition) in order to stand up the teams needed and enable delivery of larger contracts. However, a consortium of suppliers not used to working together could complicate delivery, which could in turn decrease the quality or speed of delivery if not carefully managed; as such, not all contracts allowed consortium or partnership bids, due to the perceived complexity they could bring;
  • Or aim to grow enough to be able to win and deliver the larger contracts. As SMEs grew, however, they would often have to either increase their costs in order to run a larger organisation that could still deliver the same quality as before, or keep their costs low and likely see their quality decrease.

Throughout the pandemic, the focus has been on delivery, and there’s been a healthy mix of both small and large contracts coming out, meaning lots of competition. While costs have always been a factor, the pandemic allowed both departments and suppliers to remove much of the costly admin and bureaucratic approval processes in favour of lightweight approaches to bringing on suppliers and managing teams’ outputs, encouraging innovation in delivery and cost. With lockdowns ensuring co-location was now out of the question, many suppliers were able to reduce their rates to support the pandemic response, with both departments and suppliers agreeing that the priority was delivering quality products and services to meet organisations’ and users’ urgent needs. The removal of co-location as a prerequisite also opened up the market, with more suppliers bidding for work and more individuals applying for more roles, which increased competition and inevitably improved the quality of the outputs being produced. This led to a lot of innovation being delivered throughout the pandemic, which has benefited us all.

As we move out of the pandemic and into the next spending review round, the signs are that the focus is about to swing back to cost as the highest priority, with larger contracts coming out looking for cheaper day rates so that departments can balance their own budgets. But as the economy bounces back and departments begin to insist that teams return to the office, most suppliers will want to increase their rates to pre-pandemic levels. If we’re not careful, the focus on cost reduction will decrease the quality and innovation that has been delivered throughout the pandemic, and could cost taxpayers more in the long run. Look at DWP’s first attempt to deliver Universal Credit for how badly things can go wrong when cost is the highest priority and the commercial team runs the procurement process with minimal input from Delivery, with cost driving the commercial and delivery decisions more than quality.

To find the sweet spot between cost and quality we need to create the best environment for innovation and competition. Allowing flexibility in where teams can be based will support this, as will supporting and encouraging SMEs and medium-sized suppliers to bid for and win contracts by varying contract sizes and values, focusing on outputs over body shopping, and looking for the value suppliers can add in terms of knowledge transfer and partnership rather than simply prioritising whoever is cheapest.

It’s important we all work together to get the balance between cost and quality right, and ensure we remain focused on delivering the right things in the right way.

Seesaw

And this is why we test with users…

A blog on the new National Careers ‘Discover your skills and careers’ Service

As I sit here at ten past ten on a Wednesday night, watching social media have a field day with the new National Careers Service, I’m yet again reminded of the importance of the Digital Service Standard, especially standard number one: understand users and their needs. And why we need to get ministers and senior leaders to understand their importance.

The first role of any good User Centric designer or Product Manager within the public sector is understanding the problem you’re trying to solve.

In this case, the problem we’re facing is not a small one. Because of COVID-19 we currently have approximately 1.4M people unemployed with many more still facing redundancy due to the ongoing pandemic. ONS data states that between March and August, the number of people claiming benefits rose 120% to 2.7 million.

The entertainment, leisure and hospitality sectors have been decimated, amongst many others. Just this week we’ve had Cineworld announce 45,000 job losses, and Odeon may soon be following suit. Theatres and live event venues across the country are reporting they are on the brink of collapse.

So, as part of the summer statement, the Chancellor announced a whole host of support for people to retrain; it included advice for people to use the new careers and skills advice service to get ideas on new career options.

A service to help people understand new career options right now is a great idea; it absolutely should meet a user need.

A screenshot of the national careers service skills assessment

Unfortunately, you only have to look at the headlines to see how well the new service has been received. The service is currently such a laughing stock that no-one is taking it seriously; which is a massive shame, because it’s trying to solve a very real problem.

A number of my friends and acquaintances have now taken the quiz (as has half of twitter apparently) and it was suggested I have a look. So I did. (As an aside, it recommended I retrain in the hospitality industry, all who know me know how terrible this would be for all involved, last week I managed to forget to cook 50% of our dinner, and I am clinically unable to make a good cup of coffee, never mind clean or tidy anything!)

It has good intentions, and in a number of cases it may not be too far off the mark; the team behind the service have done a write-up here* of how they developed it and what they set out to achieve. Unfortunately, while the service seems to be simple to understand and accessible to use, what it seems to be missing is any level of context or practicality that would help it solve the problem it’s being used for.

*EDIT: Which has sadly now been taken down, which is a massive shame, because they did good work, but sadly I suspect under political pressure to get something out there quickly. We’ve all been there, it’s a horrid position to be in.

While they have tested with users with accessibility needs, the focus seems to have been on whether those users can use the digital service, not on whether the service actually meets their needs.

My friend with severe mobility and hearing issues was advised to retrain as a builder. Another friend with physical impairments (and a profound phobia of blood) was advised they were best suited to a role as a paramedic. A friend with ASD who also has severe anxiety and an aversion to people they don’t know was advised to become a beautician. Another friend who is a single parent was given three career options that all required evening and weekend work. At no point does this service ask whether you have any medical conditions or caring needs that would limit the work you could do. While you can argue that that level of detail falls under the remit of a jobs coach, it can understandably be seen as insensitive and demoralising to recommend careers to people that they are physically unable to do.

Equally unhelpful is the fact that the service, which has been specifically recommended to people made redundant from the worst-hit industries, is recommending those same decimated industries to work in, with no recognition of the current jobs market.

My partner, who was actually made redundant from her creative role due to COVID-19 (and is the target audience for this service, according to the Chancellor), was advised to seek a role in the creative industries: an industry that doesn’t currently exist. A quick look on social media proves she isn’t alone.

The service doesn’t actually collect enough (well, any) data about the career someone is in, nor does it seem to have any interface to the current jobs market to understand whether the careers it’s recommending are actually viable.

Unfortunately, the service is too generic, and while it would possibly help school or college students trying to choose their future career paths in a ‘normal’ job market (and I honestly suspect that’s who it was actually developed for!), it’s not meeting the fundamental problem we are facing at the moment, i.e. helping people understand their career options in the current market.

If you’ve worked within digital in the public sector, you’ve had to deal with ministers and directors who don’t really understand the value of user research, or why we need to test things properly before we roll them out nationally. The current debacle with the careers website is possibly a perfect example of why you need to make sure you actually test your service with a wide range of users regularly, not just rely on assumptions and user personas, and why it’s important to test and iterate the service with real users multiple times before it gets launched. It highlights the need to get ministers to understand that rushing a service out quickly isn’t always the right answer.

We all need to understand users and their needs. Just because a service is accessible doesn’t mean it solves the problem users are facing.

The people getting left behind

Why, in the era of remote working, we need to stop thinking about ‘digital services’ as a separate thing, and just think about ‘services’.

Last night, chatting to @RachelleMoose about whether digital is a privilege (which she’s blogged about here), I was reminded of a conversation from a few weeks ago with @JanetHughes about the work DEFRA were doing, and their remit as part of the response to the current pandemic (which, it turns out, is not just the obvious things like food and water supplies, but also questions like what we do about zoos and aquariums during a lockdown?!)

A giraffe

This in turn got me thinking about the consequences of lockdown that we might never really have considered before the COVID-19 pandemic hit, and the impact a lack of digital access has on people’s ability to access public services.

There are many critical services we offer every day, vital to people’s lives, that we never previously imagined as ‘digital’ services but which are now being forced to rely on digital as a means of delivery. Not only are those services struggling to adapt, but we are also at risk of forgetting the people for whom digital isn’t an easy option.

All ‘digital’ services have to prove they have considered digital inclusion. Back in 2014 it was found that approximately 20% of Britons lacked basic digital literacy skills, and the Digital Literacy Strategy aimed to have everyone who could be, digitally able by 2020. However, it was believed that 10% of the population would never be able to get online, and the Assisted Digital paper published in 2013 set out how government would enable equal access to ensure digitally excluded people were still able to access services. A report by the ONS last year backs this assumption up, showing that in 2019, 10% of the population were still digitally excluded.

However, as the effects of lockdown begin to be considered, we need to think about whether our assisted digital support goes far enough, whether we are really approaching how we develop public services holistically, how we ensure they are future-proof, and whether we are truly including everyone.

There have been lots of really interesting articles and blogs about the impact of digital (or the lack of access to digital) on children’s education, with bodies like Ofsted expressing concerns that the lockdown will widen the education gap between children from disadvantaged backgrounds and children from more affluent homes; only 5% of the children classified as ‘in need’ who were expected to still be attending school have been turning up.

An empty school room

According to the IPPR, around a million children do not have access to a device suitable for online lessons. The DfE came out last month to say they were offering free laptops and routers to families in need; however, a recent survey showed that while over a quarter of teachers in private schools were having daily online interaction with their pupils, less than 5% of those in state schools were. One academy chain in the North West is still having to print home learning packs and arrange for families to physically pick up and drop off school work.

The Good Things Foundation has similarly shared its concerns about the isolating effects of lockdown, and the digital divide being created, not just for families with children, but for people with disabilities, elderly or vulnerable people, and households in poverty. Almost 2 million homes have no internet access, and 26 million people rely on pay-as-you-go data to get online. There has been a lot of concern raised about people in homes with domestic violence who have no access to phones or the internet to get help. Many companies are doing what they can to help vulnerable people stay connected or receive support, but it has highlighted that our current approach to designing services is possibly not as fit for the future as we thought.

The current pandemic has highlighted how vital it is for those of us working in or with the public sector to understand users and their needs, but also to ensure everyone can access services. The Digital Service Standards were designed with ‘digital’ services in mind; six months ago it was never considered that children’s education, or people’s healthcare, would need to be considered and assessed against those same standards.

The standards themselves say that the criteria for assessing products or services are applicable if either of the following apply:

  • getting assessed is a condition of your Cabinet Office spend approval
  • it’s a transactional service that’s new or being rebuilt – your spend approval will say whether what you’re doing counts as a rebuild

The key phrase here for me is ‘transactional service’, i.e. the service allows:

  • an exchange of information, money, permission, goods or services
  • submitting of personal information that results in a change to a government record

While we may never have considered education as a transactional service before now, as we consider ‘the new normal’, we as service designers and leaders in the transformation space need to consider which of our key services are transactional, how we provide a joined-up experience across all channels, and what holistic service design really means. We need to move away from thinking about ‘digital and non-digital services’, and can no longer ‘wait’ to assess new services; instead we need to step back and consider how we can offer ANY critical service remotely going forward, should we need to do so.

A child using a tablet

Digital can no longer be the thing that defines those with privilege. COVID-19 has proved that, now more than ever, it is an everyday essential, and we must adapt our policies and approach to service design to reflect that. As such, I think it’s time we reassess whether the Digital Service Standards should be applied to more services than they currently are, which services we consider to be ‘digital’, and whether that should even be a differentiator anymore. In a world where all services need to be able to operate remotely, we need to approach how we offer our services differently if we don’t want to keep leaving people behind.

Matt Knight has also recently blogged on the same subject, so linking to his blog here as it is spot on!

#GovernmentIsOpen

Why we need to bring user centric design into our communications in the public sector.

Having been involved in the hiring of many Content and Interaction Designers in the last few years, we’ve always preferred candidates from within the public sector, because they tend to have the same specialisms as we in the Digital, Data and Technology (DDaT) profession have, looking down our noses a little at applicants from the private sector who can seem a bit of a ‘jack of all trades of design’: doing some social media, some UX and some content design.

A Neon sign showing 0 likes.

We want people who understand user centric design, who design services based on user needs. We want content designers used to working in multidisciplinary teams designing and developing services. We want Content Designers who are used to designing what ‘we’ class as content, which, having spoken to people interested in applying for our roles, is quite often different from, or at least a narrower definition than, what the wider industry classes as content. A search for content design jobs online shows the breadth of jobs that can fall under that category.

But in the last year or so I’ve begun to look at those we have left behind with this approach, those we have excluded, and where this has left us, especially in terms of both recruitment and our engagement with our users.

The government design community is constantly growing and expanding, with the number of roles on offer quickly outstripping the number of candidates available. We are all constantly stealing candidates from each other, and those departments and agencies that can’t afford to pay as much are left relying on contractors because they can’t hire people.

Digital is seen as a channel for contact, and within the public sector we are moving our products and services online. However, social media is generally not considered part of that transformation. It is not a transactional service, and is therefore generally not considered within the remit of the digital design teams. The content we put out on social media is seen as the same as what we put out to the press: it is a tool for giving out information. As such, the people on our social media teams tend to be comms professionals, or people with a background in journalism or marketing.

People looking at their phones

Interestingly Social Media teams are not generally included within the Government design community, and until a conversation 18 months ago with Joanne Rewcastle at DWP Digital I’d never really thought about that. The DDaT roles are based around the roles first needed by Gov.uk and expanded on from there as part of the work by GDS. As such these are the roles needed to design and develop transactional services. Which makes sense.

However, it means we are not thinking about what our users need from our social media. We are not designing the content we put on social media in the same way as the content we put on our digital services, or even our websites.

It also means that when it comes to recruitment, we do not look favourably on people with a social media or wider comms background, as they are not, by the DDaT definition, Content Designers. Unfortunately it is currently quite hard for people working in social media or wider comms to move into the content design space, as they tend not to have the experience of working in multidisciplinary teams, or on transactional, user-needs-driven services, that we are looking for.

With our digital services we have to ensure they are accessible. Our content designers and interaction designers are experts in making sure our content is accessible and understandable by everyone. But in my experience we haven’t been making sure our social media teams are experts in that as well.

A keyboard with an accessibility symbol

It was from Content Design and Accessibility expert colleagues that I learned the rule of #CapitalisingYourHashtags, so that they can be better understood by accessibility software. The same goes for images and emojis: are we all making sure we’re using them in such a way that screen readers and accessibility software can understand them? If our users are using social media, if that is a service we offer, then do we not have the same responsibility to make sure that service is as usable and accessible as any other service we offer? Even if it isn’t ‘transactional’.
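
Going back to the hashtag rule: as a purely illustrative sketch (this helper is hypothetical, not from any official guidance), capitalising each word before joining them is all it takes for a screen reader to announce the words separately:

```python
def accessible_hashtag(phrase: str) -> str:
    """Capitalise each word so screen readers can announce them separately,
    e.g. #CapitalisingYourHashtags rather than #capitalisingyourhashtags."""
    return "#" + "".join(word.capitalize() for word in phrase.split())

print(accessible_hashtag("capitalising your hashtags"))  # -> #CapitalisingYourHashtags
```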

Our social media colleagues are generally great at helping us think about how to design messages to engage audiences on different channels; they understand the demographics of the users on the different platforms and which messages work best with which users where. They often have a wealth of data and evidence regarding our users that could benefit product development teams. When we’re considering as product teams how to engage our users, that seems to me a great time to engage with our social media colleagues. Equally, product teams, through user research sessions and user needs analysis, collect a lot of evidence and data that could benefit our social media colleagues. Unfortunately I’ve seen very few places pulling those skills together well.

Full credit to DWP Digital’s social media team here, where the team reached out and joined up with the content design community even though they were not officially part of it according to the DDaT professions, to ensure they were considering user needs in how they used social media. That team worked incredibly hard to build people’s awareness of how to use social media, to ensure content was accessible and usable.

A mix of laptops and smartphones on a desk

A few other departments have done similar, and I think that is a good thing. But I also think we need to look again at social media across the public sector. It’s not just a marketing tool anymore. In the age of the internet a good social media presence can make or break a company. Nothing is ever really gone from the internet, and that tweet or Facebook post from 5 years ago can come back to bite you on the bum.

So why are more places not using the principles of user design in our social media, or recognising the hard work of those people who are pushing for accessibility and user design in social media as much as those who are designing good content for a website or transactional service?

We need to recognise that the people within our social media teams and our content design teams have more in common than not, and that when we are recruiting we can gain a lot from people who come from both sides of that bridge.

Delivering Digital Government 2019

This week Claire Harrison (Head of Architecture at CQC) and I had the opportunity to attend the Delivering Digital Government event run by Worth Systems in The Hague.

The event was focused on how digital has transformed governments across the world, sharing best practices and lessons learned. With speakers from the founding of GDS, like Lord Maude, as well as speakers from the Netherlands, it was a great opportunity to meet others working on solving problems for users in the government space beyond the UK.

A lot of the talks, especially by the GDS alumni, covered things I had heard before, but I actually found that reassuring: over 5 years later, I am still doing the right things, and approaching problems in the right way.

It was especially interesting to hear from both Lord Maude, and others, about the work they have been doing with foreign governments, for example in Canada, Peru and Hawaii. The map Andrew Greenway, previously of GDS and now of Public Digital, shared of the digital government movement was fantastic to see, and really made me realise how big what we are trying to achieve around the world really is.

@ad_greenway sharing a map of the Digital Government transformations happening around the world

The talks from some of the Dutch speakers were really interesting. I loved hearing about the approach the council in The Hague are taking to digital innovations, and their soon-to-be-published digital strategy. One of the pilots the city is running particularly intrigued me: in an effort to reduce traffic, they put sensors onto parking spaces in key shopping streets and all disabled parking bays in the city. This gave them real-time information on the use of the parking spaces and where spaces were available, and successfully decreased traffic from people driving around searching for spaces. They are now looking at how to scale the pilot and manage the infrastructure and sensor data for a ‘smart’ city, working with local businesses to enable new services to be offered.

The draft digital strategy for the city of The Hague

We also heard about the work the Netherlands has been doing to pilot other innovative digital services, like a new service that allows residents in an area to submit planning ideas to improve their neighbourhoods; the first trial received over 50 suggestions, of which 4 have been chosen to take forward. We heard about the support given to enable everyone to take part, and it was nice to hear about the 78-year-old resident whose suggestion came 5th.

It was also great to hear from Matthij from Novum, a digital innovation lab in the Netherlands, who talked about his own personal journey into digital transformation, learning from failures and ensuring that you prepare for failure from the start. He also told us about some fascinating research they have been doing into the use of smart speakers, especially with the elderly, to enable better engagement with government services for those who need assistive technologies.

An image of an older lady talking to an AI robot, courtesy of Novum

Realising that 30% of eligible claimants for the Dutch state pension supplement were not claiming it, they believed this was potentially down to the complexity of the form, and hypothesised that smart speakers might be one way to solve the problem. Recognising that it was no good making assumptions and designing a solution without properly understanding the problem their users were facing, they ran a small sample test with elderly users to see whether they could use smart speakers to check the date of their next pension payment (one of the largest contributors to inbound calls to the Sociale Verzekeringsbank). They found that not only could elderly users use the smart speakers, but that introducing smart speakers into their homes decreased loneliness dramatically.

There were other good sessions, including James Stewart from GDS & Public Digital on technology within digital, and an interesting panel session at the end. Every session was good, and I learnt or heard something new at each one. My only grumble from the day was the lack of diversity in the speakers, which the organisers themselves put their hands up and admitted before they were called out on it. A quick call on Twitter and the ever-amazing Joanne Rewcastle from DWP shared a list of amazing female speakers, so hopefully that will help with the next event.

One key thing I took away from the day is that the challenges are the same everywhere, but the message is also the same: involve users from the start. On practical steps everyone could start tomorrow, Matthij talked about ensuring you interview five end users, and about simple prototypes you could develop to engage your users.

This slide from Lord Maude summed up three of the main things any organisation needs to succeed in delivering digital transformation

Lord Maude talked about the importance of a strong mandate; Novum talked about having a good understanding, at the start, of the problem you are trying to fix. The digital strategy from The Hague highlights that they want everyone to be able to participate, and to deliver a personal service to their citizens. As Andrew Greenway said, the key thing is to “start with user needs”.

The second key message from the day was that, as Lord Maude put it… “Just do it!” A digital strategy delivers nothing; the strategy should be delivery. Instead of spending months developing a digital strategy, “you just have to start” by doing something; this in turn will help you develop your strategy once you understand the problems you are trying to solve, the people you will need, and the set-up and ways of working that suit your organisation best. This was a message reinforced by every speaker throughout the day.

@jystewart sharing a statement from Ivana Osores from Interbank… “You have to just start”

The third key message was the importance of good leadership, good teams and good people. Talk in the open about the failures you’ve made and what you have learned. Build strong multidisciplinary and diverse teams. As Andrew Greenway said, start with teams, not apps or documents. In the round-table discussion on building capability we spent a lot of time discussing the best ways to build capability, and the fact that in order to get good people, keep them, and go on to develop good things, you need strong leadership that is bought into the culture you need to deliver.

I left the day with a number of good contacts, had some great conversations, and felt reinvigorated and reassured. Speaking to Worth I know they are aiming to run another event next year, with both an even more diverse international cohort and an equal number of female speakers, and I for one will definitely be signing up again for the next event.

Lord Maude, myself and Claire Harrison at the social gathering after the event

Service Standards for the whole service

How the service standards have evolved over time….

Gov.uk has recently published the new Service Standards for government and public sector agencies to use when developing public facing transactional services.

I’ve previously blogged about why the Service Standards are important in helping us develop services that meet user needs, as such I’ve been following their iteration with interest.

The service standards are a labour of love that have been changed and iterated a couple of times over the last 6 years. The initial Digital by Default Service Standard, developed in 2013 by the Government Digital Service, came fully into force in April 2014 for use by all transactional digital products being developed within government; it was a list of 26 standards all product teams had to meet to be able to deliver digital products to the public. The focus was on creating digital services so good that people preferred to use them, driving up digital completion rates and decreasing costs by moving to digital services. It included making plans for the phasing out of alternative channels, and encouraged that any non-digital sections of the service should only be kept where legally required.

A number of fantastic products and services were developed during this time, leading the digital revolution in government and vastly improving users’ experience of interacting with government. However, these products and services were predominantly dubbed ‘shiny front ends’: they had to integrate with clunky back-end services, and often featured drop-out points from the digital service (like the need for wet signatures) that were difficult to change. This meant the ‘cost per transaction’ was actually very difficult to calculate; and yet standard 23 insisted all services must publish their cost per transaction as one of the 4 minimum key performance indicators required for the performance platform.

The second iteration of the digital service standard was developed in 2015. It reduced the number of standards services had to meet to 18, and was intended to be more service-focused rather than product-focused, with standard number 10 giving some clarity on how to ‘test the service end to end’. It grouped the standards together into themes to help the flow of service standard assessments, and clarified and emphasised a number of the points to help teams develop services that met user needs. While standard 16 still specified you needed a plan for reducing your cost per transaction, it also advised you to calculate how cost-effective your non-transactional user journeys were, and to include the ‘total cost’, which covered things like printing, staff costs, and fixtures and fittings.
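
As a back-of-the-envelope illustration of that ‘total cost’ view (every figure and cost category below is invented for the example), cost per transaction is simply the sum of channel and overhead costs divided by the number of transactions in the same period:

```python
# Illustrative only: all figures and categories here are made up.
costs = {
    "staff": 250_000,                 # operational staff handling the service
    "printing": 15_000,               # paper channel costs
    "fixtures_and_fittings": 35_000,  # overheads apportioned to the service
    "hosting": 40_000,                # running the digital channel
}
transactions = 180_000                # completed transactions in the same period

cost_per_transaction = sum(costs.values()) / transactions
print(f"£{cost_per_transaction:.2f} per transaction")  # £1.89 per transaction
```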

However, as service design as a methodology began to evolve, the standards were criticised for still being too focused on the digital element of the service. Standard 14 still stated that ‘everyone must be encouraged to use the digital service’. There were also a lot of questions about how the non-digital elements of a service could be assessed, and a feeling that the standards didn’t reflect how large or complicated some services could be.

Paper and Digital

The newest version of the Service standard has been in development since 2017, a lot of thought and work has gone into the new standard, and a number of good blogs have been written about the process the team have gone through to update them. As a member of some of the early conversations and workshops about the new standards I’ve been eagerly awaiting their arrival.

While the standards still specifically focus on public-facing transactional services, they have been designed for full end-to-end services, covering all channels users might use to engage with a service. There are now 14 standards, but the focus is much wider than ‘digital’, as highlighted by the fact that the word ‘Digital’ has been removed from the title!

Standard number 2 highlights this new holistic focus, acknowledging the problems users face with fragmented services. It is complemented by standard number 3, which specifies that you must provide a joined-up experience that meets all user needs across all channels. While the requirement to measure your cost per transaction and digital take-up is still there for central government departments, it’s no longer the focus; instead, the focus of standard 10 is now on identifying metrics that will indicate how well the service is solving the problem it’s meant to solve.

For all the changes, one thing has remained the same throughout; the first standard, upon which the principles of transformation in the public sector are built: understand the needs of your users.

Apparently the new standards are being rolled out for products and services entering Discovery after the 30th of June 2019, and I for one am looking forward to using them.

Launch!

What is the role of Business Analysts within agile?

Analysing the role of the Business Analyst.

When we were looking at the Digital, Data and Technology (DDaT) roles and capabilities back in 2016, one of the roles we really struggled with was the role of the Business Analyst (BA).

Not because we didn’t agree that it was a role (because we absolutely did), but because we struggled to define the scope of the role in comparison to things like Product Management, Design or User Research.

It’s one of the questions that, three years later, still comes around regularly. Who is responsible for defining the requirements? What is the role of the BA?

Who holds the requirements?

The role of the BA in an agile team

One of the problems we had back when we were defining the BA role as part of the DDaT professions was that the Government Digital Service didn’t have BAs in their teams. Similarly, the original Scrum Guide only has 3 roles in an agile team: the Product Owner, development team members and the Scrum Master.

Traditionally, the BA acted as the link between the business units and IT, helping to discover the requirements and the solution to address them; when multidisciplinary teams brought those members together, the role of the BA became less clear.

This has meant that when adopting agile, different government departments implemented the role in slightly different ways. The biggest trap I have seen teams fall into is the BA getting stuck with all the admin tasks for the project.

Roman Pichler argued that for BAs moving into Scrum there were two options: becoming the Product Owner, or becoming a ‘team member’.

While I agree that Business Analysts are a key part of a multidisciplinary team, I disagree with the assumption that everyone on an agile team who isn’t the Scrum Master or the PO is simply ‘a team member’. I think the Business Analyst is a critical role (especially for Product Managers!) that brings unique skills to the team.

So, first things first let’s look at what requirements are in the agile space.

Certainly, within digital government at least, we use a user centric design approach. We are developing products and services that fix the problems that our users are facing. We are identifying user needs and testing and iterating those throughout the product development lifecycle. A lot of the time this conversation about ‘user needs’ has replaced the more traditional conversations about ‘requirements’. Which is good in some ways, but has also led to a bit of confusion about what Business Analysts do if it’s not gathering requirements. Who owns the requirements now?

Are user researchers responsible for gathering the requirements from external users (user needs), whereas Business Analysts are responsible for gathering requirements based on what the business needs (sometimes called business user needs)?

That line of conversation worries me, because it suggests that we don’t need to carry out user research on internal staff, it forgets that internal staff are users too.

So, what is the role of the BA?

In my experience, the conversation about who is responsible for gathering requirements is symptomatic of the limitations of the English language, and of our obsession with ‘ownership’.

User researchers primarily focus on gathering the more qualitative data: why users behave the way they do, how things are making them feel, probing their views and opinions to understand what it is they actually need. They will help understand who the users are and verify what the users need. They will work with the team to test design assumptions and help ensure the options being developed meet user needs.

Business Analysts primarily focus on gathering the more quantitative data about the process, both the ‘as is’ process and the future process we are designing. They work to understand how many users are or will be affected, what the cost and time impacts of the identified problem are, and what value could be gained through implementing any of the options the team are considering.

They help identify the stakeholders that will need to be engaged, and how to best engage with them. They will turn the user needs into user stories that the team can develop and identify the metrics and success criteria for the team to help them measure value.

Where you have a Product or Service that is live and being used by real users, the BA will work with Performance Analysts to understand the feedback data coming from the users.

User Researchers and Business Analysts will often work closely together: while BAs use tools like process mapping to identify pain points, user researchers will work with them to map the emotions users are experiencing, so that we can fully understand the impact of our current processes and the value we can release by fixing the problem. When user researchers are planning research sessions, they will often work with BAs to get the data on where best to test, in terms of locations or user groups.

Good Product Managers will use both the Quantitative and Qualitative data to help them pick which options to test. Designers will help both the user researchers and business analysts look at the data and design prototypes etc. to test with users.

For me, each role is clear in its scope, and in the need to work together to identify the right problems users are facing and the best ways to test them; it’s not about which individual owns the requirements, because the answer is that the team does.

The Day Data went Viral

This week the UK Government and Parliament petitions website has been getting a lot of attention, both good and not so good. This site has been a great example of how the Digital Service Standards work to ensure what we deliver in the public sector meets user needs.

On the 20th of February a petition was created on the petitions website to revoke Article 50 and remain within the EU; on the 21st of March the petition went viral, and as of writing this blog it has 5,536,580 5,608,428 5,714,965 signatures (the count kept climbing as I wrote). This is the biggest petition ever started since the site’s launch. Not only that, it is now the most supported petition in the world, ever.

Screenshot of the petitions website

The first version of the site was developed in 2010, after the election, originally intended to replace the Number 10 petitions site, which had a subtly different purpose. The new version of the Parliamentary petitions site was then launched in 2015, as an easy way for users to make sure their concerns were heard by government and parliament. The original version of the service was developed by Pete Herlihy and Mark O’Neill back in the very early days of digital government, before the Digital Service Standard was born.

The site was built using open source code, meaning anyone can access the source code used to build it, and the data behind each petition is published openly, making it easy to interrogate. A number of organisations, like Unboxed, have developed tools to help map the signatories of petitions based on the data available.

Screenshot of the unboxed website
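As a minimal sketch of what that openness makes possible, here is how you could pull a petition’s published data yourself. This assumes the petitions site’s public JSON endpoint and uses the Revoke Article 50 petition’s ID as an example; the URL and field names are my reading of the published data format, not guaranteed:

```python
# A minimal sketch: pulling a petition's published data from the
# petitions site's public JSON endpoint. The URL and field names are
# assumptions based on the data format published at the time.
import json
import urllib.request

PETITION_URL = "https://petition.parliament.uk/petitions/241584.json"

with urllib.request.urlopen(PETITION_URL) as response:
    petition = json.load(response)

attributes = petition["data"]["attributes"]
print(attributes["action"])  # the petition's title
print(f"{attributes['signature_count']:,} signatures")
```

This is the same kind of interrogation the mapping tools above do at scale: fetch the open data, then slice it however the question demands.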

Within the Government’s Digital Service Standards, using open source code has always been one of the standards some departments have really struggled with; it’s digital standard number 8, and is often a bit contentious. But looking at the accusations being levelled at the Revoke Article 50 petition, that people outside of the UK are unfairly signing it, that people are creating fake emails to sign it, etc., it shows why open code and open data are so important. While the petitions committee won’t comment in detail about the security measures they use, by examining the code you can see the validation the designers built into the site to try and ensure it was being used securely and fairly.

britorbot data analysis

Speaking of security measures, that’s digital service standard number 7, making sure the service has the right security levels. The petitions site apparently uses both automated and manual techniques to spot bots, disposable email addresses and other fraudulent activity. This works alongside digital standard number 15, using tools for analysis that collect performance data, to monitor signing patterns etc. Analysing the data, 96% of signatories have been within the UK (what the committee would expect from a petition like this).

tweet from the Petitions Committee from 22nd March
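That 96% figure is the kind of thing anyone can sanity-check for themselves. A rough sketch, assuming the signatures_by_country breakdown exposed in the petition’s JSON data (the field names here are my best recollection of the published format, not guaranteed):

```python
# A rough sketch of reproducing the "96% within the UK" figure from a
# petition's open data. signatures_by_country, code and signature_count
# are assumed field names based on the published data format.
import json
import urllib.request

PETITION_URL = "https://petition.parliament.uk/petitions/241584.json"

with urllib.request.urlopen(PETITION_URL) as response:
    attributes = json.load(response)["data"]["attributes"]

total = attributes["signature_count"]
uk_signatures = sum(
    country["signature_count"]
    for country in attributes["signatures_by_country"]
    if country["code"] == "GB"  # ISO 3166 code for the United Kingdom
)
print(f"UK signatures: {uk_signatures / total:.1%} of {total:,}")
```

None of this tells you whether any individual signature is genuine, of course; but it does mean the committee’s claims can be checked against the same data everyone else can see.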

Another key service standard is building a service that can be iterated and improved on a frequent basis (digital standard number 5), which meant that when the petition went viral, the team were able to spot that the site wasn’t coping with the frankly huge amount of traffic heading its way, and quickly doubled the capacity of the service within a handful of hours.

tweet from Pete Herlihy (product manager – petitions website)

This also calls out the importance of testing your service end to end (standard number 10) and ensuring it’s scalable; and if and when it goes down (as the petitions website did a number of times, given the large amount of traffic that hit it), you need to have a plan for what to do (standard number 11), which for the poor Petitions team meant some very polite, apologetic messages being shared over social media while the team worked hard and fast to get the service back online.

tweet from the Petitions Committee from 21st March

The staggering volume of traffic to the site, and the meteoric speed at which the petition went viral, show that at its heart, the team who developed the service had followed Digital Service Standard number 1: understand your users’ needs.

In today’s culture of social media, people have high expectations of their online interactions with services and departments; we live in a time of near-instant news, entertainment and information, and an expectation of having the world available at our fingertips at the click of a button. People want and need to feel that their voice is being heard, and the petitions website tapped into that need, delivering effectively under unprecedented conditions.

Interestingly, when the site was first developed, Mark himself admitted they didn’t know if anyone would use it. There was a lot of concern that 100,000 signatures was too high a figure to trigger a debate; but within the first 100 days, six petitions had already reached the threshold and become eligible for a debate in the Commons. Pete wrote a great blog back in 2011 summing up what those first 100 days looked like.

It’s an example of great form design and, following digital service standard number 12, it is simple and intuitive to use. The hard work of the team who developed the service, and of those maintaining and iterating it today, has not been recognised or celebrated enough. In my opinion this service has proven over the last few days that it is a success, and that the principles behind the Digital Service Standards that provided its design foundations are still relevant and adding value today.

tweet from Mark O’Neill (part of the original team)

Speak Agile To Me

I have blogged about some of these elsewhere, but here’s a quick glossary of terms you might hear when talking Agile or Digital Transformation.

Agile: A change methodology, focusing on delivering value as early as possible, iterating and testing regularly.

Waterfall: A change methodology, focusing on a linear lifecycle, delivering a project based on requirements gathered upfront.

Scrum: A type of Agile, based on daily communication and the flexible iteration of plans, carried out in short, fixed-length timeboxes of work (sprints).

Kanban: A type of Agile, based on visualising the flow of work and limiting the amount of work in progress to improve throughput.

The Agile Lifecycle: Similar to other change methodology lifecycles, the agile lifecycle is the set of stages a project goes through. Unlike other lifecycles, agile is not a linear process, and products or services may go around the agile lifecycle several times before they are decommissioned.

Discovery: The first stage of the agile lifecycle, all about understanding who your users are, what they need and the problem you are trying to fix. Developing assumptions and hypotheses. Identifying an MVP that you think will fix the problem you have identified. Prioritising your user needs and turning them into epic user stories. Akin to the requirements-gathering stage in Waterfall.

Alpha: The design and development stage. Building prototypes of your service and testing them with your users. Breaking user needs and epics into user stories and prioritising them. Identifying risks and issues, and understanding the architecture and infrastructure you will need prior to build. Akin to the design and implementation stage in Waterfall.

Beta: The build and test stage. Building a working version of your service. Ensuring your service is accessible, secure and scalable. Improving the service based on user feedback, and measuring the impact of your service or product on the problem you were trying to fix. Can include Private and Public Beta phases. Akin to the testing and development stage in Waterfall.

Private Beta: Testing with a restricted number of users; a limited test. Users can be invited to use the service, or limited by geographical region etc.

Public Beta: A product still in its test phase but open to a wider audience; users are no longer invited in, but should be aware the product is still being tested.

Live: Once you know your service meets all the user needs identified within your MVP, you are sure it is accessible, secure and scalable, and you have a clear plan to keep iterating and supporting it, then you can go live. Akin to the maintenance stage in Waterfall.

MVP: The Minimum Viable Product, the smallest releasable product with just enough features to meet user needs, and to provide feedback for future product development.

User Needs: The things your users need, evidenced by user research and testing. Akin to business requirements in Waterfall and other methodologies.

GDS: Government Digital Service, part of the Cabinet Office, leading digital transformation for government and setting the Digital Service Standard that all government departments must meet when developing digital products and services.

The Digital Service Standards: https://www.gov.uk/service-manual/service-standard; the 18 standards all government digital services should meet when developing products and services.

Service Design: Looking at your Product or Service holistically, keeping it user-focused while ensuring it aligns with your organisation’s strategy.

User Centric Design (UCD): The principles of user centric design are very simple: keep the users (both internal and external) at the heart of everything you do. This means involving users in the design process; rather than using ‘proxy’ users (people acting like users), you involve actual users throughout the design and development process, recognising that different users (and user groups) have different needs, and that the best way to design services that meet those needs is to keep engaging with the users.