It’s mental health awareness week, and I’ve seen lots of things about “I’m always here, reach out to me if you need help”, or “show you care, support those around you by reaching out to them and seeing how they are doing”.
And I wholeheartedly love both these sentiments. We should be talking about mental health more and more. As a society we’re having more conversations about what we can do to support each other and be kind to ourselves, and it’s great.
My problem is that the thought of reaching out to anyone, whether to say I need help or to offer support, fills me with dread.
I’ve talked before on social media about my Imposter Syndrome and social anxiety, and Gavin Elliot does one of the best blogs out there about what Imposter Syndrome is. It makes you feel like you add no worth. So reaching out to ask for help, or offering help to others, is very very hard to do, because it is imposing on others’ time. Butting into their life uninvited. Interrupting them. Giving them the opportunity to see you.
I’ve been told in the past that people don’t assume I ever need help because I seem confident. That I can seem imposing to approach. Yet I always try to help others when I am approached, and I will share what’s going on in my life and the things that are bothering me if I am asked about them.
But the ability to reach out first? To drop someone a message out of the blue? On a bad day I can find that simple act almost impossible.
That fear is something I’ve worked hard to overcome over the years, and I will now try and force myself to reach out, both to ask for help and to ask if I can help, without waiting to be given a direct opening. But even on the best day it still takes effort for me to do so; it does not come naturally to me.
It’s something I oddly find a little easier in a professional setting, as I know what my role is and my responsibilities within that, but outside of that scope then it becomes much harder for me to reach out first.
And the thing is, even when you’re doing well, and have been doing well for a while, it’s easy for your confidence to take a hit, and for you to take a backwards step. For things you thought you had overcome to rear their head.
And that is ok.
There will be times when you’re doing well and can do the things you find hard. And times when you can’t.
However you manage your mental health, the first step is knowing yourself: knowing what you find hard and what things can set you back, and owning that knowledge. But it’s also important to recognise the things that can help you do the things you find hard. That good days and bad days exist.
And I just want to say I hear you. I’m here should you ever want to talk. Whether you can reach out to me or not, I want you to know you are not alone.
If you’re struggling with your mental health, Mind can be a good place to start if you need some help.
How the service standards have evolved over time…
Gov.uk has recently published the new Service Standards for government and public sector agencies to use when developing public facing transactional services.
I’ve previously blogged about why the Service Standards are important in helping us develop services that meet user needs, as such I’ve been following their iteration with interest.
The service standards are a labour of love that have been changed and iterated a couple of times over the last 6 years. The initial Digital by Default Service Standard, developed in 2013 by the Government Digital Service, came fully into force in April 2014 for use by all transactional digital products being developed within Government; it was a list of 26 standards all Product teams had to meet to be able to deliver digital products to the public. The focus was on creating digital services so good that people preferred to use them, driving up digital completion rates and decreasing costs by moving to digital services. It included making plans for the phasing out of alternative channels, and encouraged teams to keep non-digital sections of the service only where legally required.
A number of fantastic products and services were developed during this time, leading the digital revolution in government and vastly improving users’ experience of interacting with government. However, these products and services were predominantly dubbed ‘shiny front ends’. They had to integrate with clunky back end services, and often featured drop out points from the digital service (like the need for wet signatures) that were difficult to change. This meant the ‘cost per transaction’ was actually very difficult to calculate; and yet standard 23 insisted all services must publish their cost per transaction as one of the 4 minimum key performance indicators required for the performance platform.
The second iteration of the digital service standard was developed in 2015. It reduced the number of standards services had to meet to 18, and was intended to be more Service focused rather than Product focused, with standard number 10 giving some clarity on how to ‘test the service end to end’. It grouped the standards together into themes to help the flow of the service standard assessments, and it clarified and emphasised a number of the points to help teams develop services that met user needs. While standard 16 still specified you needed a plan for reducing your cost per transaction, it also advised you to calculate how cost effective your non transactional user journeys were, and to include the ‘total cost’, which covered things like printing, staff costs and fixtures and fittings.
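That ‘total cost’ idea boils down to simple arithmetic: add up the costs across every channel, not just the digital one, and divide by completed transactions. A minimal sketch, where every figure and channel name is a made-up illustration rather than an official formula or real service data:

```python
# A sketch of a 'total cost' per-transaction calculation - all figures
# and channel names here are hypothetical, not real service data.

def cost_per_transaction(channel_costs: dict, completed_transactions: int) -> float:
    """Total cost across all channels divided by completed transactions."""
    if completed_transactions <= 0:
        raise ValueError("need at least one completed transaction")
    return sum(channel_costs.values()) / completed_transactions

costs = {
    "digital_running": 120_000.0,      # hosting, support, iteration
    "staff": 450_000.0,                # contact centre and casework staff
    "printing_and_post": 35_000.0,     # the paper channel
    "fixtures_and_fittings": 15_000.0,
}

print(f"£{cost_per_transaction(costs, 250_000):.2f} per transaction")
```

The point of including the non-digital lines is exactly the one the standard made: leaving out staff, printing and estates costs makes a ‘shiny front end’ look far cheaper per transaction than the service really is.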
However, as Service design as a methodology began to evolve, the standards were criticised for still being too focused on the digital element of the service. Standard 14 still stated that ‘everyone must be encouraged to use the digital service’. There were also a lot of questions about how the non digital elements of a service could be assessed, and a feeling that the standards didn’t cover how large or complicated some services could be.
The newest version of the Service standard has been in development since 2017, a lot of thought and work has gone into the new standard, and a number of good blogs have been written about the process the team have gone through to update them. As a member of some of the early conversations and workshops about the new standards I’ve been eagerly awaiting their arrival.
While the standards still specifically focus on public facing transactional services, they have deliberately been designed for full end to end services, covering all channels users might use to engage with a service. There are now 14 standards, but the focus is now much wider than ‘Digital’, as is highlighted by the fact the word Digital has been removed from the title!
Standard number 2 highlights this new holistic focus, acknowledging the problems users face with fragmented services. This is now complemented by standard number 3, which specifies that you must provide a joined up experience that meets all user needs across all channels. While the requirement to measure your cost per transaction and digital take up is still there for central government departments, it’s no longer the focus; instead the focus of standard 10 is now on identifying metrics that will indicate how well the service is solving the problem it’s meant to solve.
For all the changes, one thing has remained the same throughout: the first standard, upon which the principles of transformation in the public sector are built. Understand the needs of your users.
Apparently the new standards are being rolled out for Products and Services entering Discovery after the 30th of June 2019, and I for one am looking forward to using them.
Are the roles of Scrum Master and Delivery Manager the same?
Continuing on my recent musings on the different roles within Agile multidisciplinary teams, today’s blog focuses on the role of the Delivery Manager, or the Scrum Master, and whether these roles are really the same thing.
This is a conversation that came up a few weeks ago at the #ProductPeople community meetup in Birmingham, and something that causes quite a bit of frustration from those people I’ve talked to in the Delivery Manager community.
The role of the Scrum Master is that of facilitator within the multidisciplinary team. It is a role particular to Scrum, and they are the ‘expert’ on how to make Scrum work. Often described as a ‘servant leader’, they help everyone in the team understand the theory and practices of Scrum as a way of working together.
Within digital government the role has been widened out to include other responsibilities, and often mixed with the role of the Delivery Manager. Emily Webber did a fantastic blog a few years ago on the role of the Delivery Manager, and as she puts it, while the roles are often used interchangeably, they really shouldn’t be.
But why not? What makes them different?
As said above, the Scrum Master focuses on the ‘how’ of Scrum as a methodology. They are the expert in the room on how best to utilise Scrum to deliver. They are more akin to an agile coach, guiding the team, and are often the person best versed in the most up to date practices and ways of working.
But for me, the Delivery Manager focuses more on the ‘what’ and ‘when’. While the Product Manager (or Owner) focuses on ‘why’ the team are doing what they are doing, the problems they are trying to solve and the vision they are trying to deliver, the Delivery Manager is looking at what could block the team from being able to deliver: what the right make up of the team needs to be in terms of roles and capabilities, what governance processes the team has to meet in order to stay on track, and when delivery will happen.
As the Digital, Data and Technology capability framework says, at the most basic level the Delivery Manager is accountable for the delivery of products and services. They are very much a doer paired with the Product Manager’s visionary thinker. They make sure things actually happen. They hold the Product Manager and the team to account and keep them on track.
They are the heart of the team, responsible for maintaining the health and happiness of the team; they understand who from the team will be available and when, making sure people are able to work well together, identifying conflicts and ensuring the team stay motivated and happy in order to enable delivery.
When you look at the role as explained in the capability framework it looks very straightforward: build and motivate teams, manage risks and issues, ensure delivery. OK, great. But then you get to the bit that merges the Scrum Master role and the Delivery Manager role, and this is where a lot of individuals I know struggle: “coach team members and others, facilitate continuous improvement and apply the most appropriate agile and lean tools and techniques for their environment”.
This is actually quite a big task. Staying on top of the most appropriate agile and lean tools and techniques requires a lot of self learning, which is fantastic, but it also requires quite a bit of time away from the team you are meant to be supporting.
Most Delivery Managers that I know (certainly within CQC, and others I have talked to across Government) are involved with (if not directly responsible for) the business cases for their Products and Services. Unblocking issues, progressing funding requests, requesting resources: all of this takes up a lot of a Delivery Manager’s time. When you are also meant to be running the daily stand-ups, managing team retrospectives, monitoring team velocity and organising show and tells, you can find your days are very full.
More and more Delivery Managers that I know are finding they just don’t have time for the ‘people centric’ part that is meant to be at the heart of their role, as Projects and Programmes utilise them more and more as Project Managers who are also Scrum Masters. Our Delivery Managers feel pulled in two directions, and our teams suffer because of it.
When organisations find they are struggling to deliver, often at the heart of that is the fact that they have not properly recognised the role of the Delivery Manager. This is a fundamental issue, especially when organisations are new to agile ways of working. Embedding ‘how’ to be agile takes up just as much time as understanding ‘what’ you can deliver and ‘when’.
Perhaps in mature agile organisations bringing those roles together makes more sense, but for now I think we need to go back to letting our Delivery Managers focus on making sure we can deliver, and our Scrum Masters focus on helping us use the right techniques to be able to deliver well.
I’ve been working with Developers of different flavours for almost a decade now, and in that time I’ve worked with some amazing Devs, and some frustrating ones; the same as any role it depends on the person.
I’ve also encountered a lot of stereotypes about Developers, primarily that they’re all introverts who like to work on their own, which is as true as saying all Product Managers must be extraverts.
In the last couple of years I’ve also been lucky enough to take part in recruiting and interviewing Developers, and as such I’ve found it fascinating to discuss the role of the Product Manager and the role of Developers, how we can work better together, to support each other and get the best out of each other.
I’ve found it very positive to see the role of the Developer begin to be more central within user centric design, and to have more Developers proactively taking part in user research and design sessions. The days when meeting user needs was solely the domain of the User Researcher and the Product Manager, and Developers only cared about producing code, felt like ones we had, at least within Government Digital circles, left behind.
As such, it almost felt like having a bucket of cold water tipped over my head to be told recently that Developers shouldn’t be overly involved in user research, and should be focusing on producing code.
As a Product Manager I don’t want Developers who just produce code, I’ve seen in the past the dangerous waters that can lead to. If you don’t understand why users are doing things, what their needs are, the problems we are trying to fix for our users, then how could you, as my technical expert, challenge me? How can you understand the options and give me advice on how best to tackle the problems we are trying to solve? How can you ensure the code you are writing actually meets the requirements if you don’t understand why it’s needed?
The best Developers I’ve worked with have proactively worked with user researchers to suggest things to test, using tools like A/B testing to explore the options and determine the best solutions for the problems we’re trying to fix, and using feedback from users to iterate, learn and improve.
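To make that concrete, an A/B test usually comes down to comparing completion rates between two variants and asking whether the difference is more than noise. A minimal sketch using a standard two-proportion z-test; all the figures are made up for illustration, not data from a real service:

```python
# Minimal one-sided two-proportion z-test for an A/B experiment.
# Figures are illustrative only.
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """One-sided p-value for H1: variant B converts better than A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)       # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 1 - NormalDist().cdf(z)

# Hypothetical: 480/4000 completions for variant A vs 540/4000 for B
p = ab_test_p_value(480, 4000, 540, 4000)
print(f"p = {p:.4f}")  # small p suggests B genuinely performs better
```

This is the kind of evidence a Developer who understands the user need can bring to the table: not just “the code works”, but “this option measurably helps users complete the task”.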
I recently did a google search for the ‘role of developers in user centred design’ and was saddened to see there wasn’t much out there, other than a few scholarly articles citing the importance of getting Software Engineers and Developers to integrate user centric design into their approach.
So maybe this is where we are going wrong. Maybe we are not talking enough about how important it is that user centric design isn’t just the domain of the designers and user researchers; that the principles of UCD are just as important in the development stage as in the discovery phase.
As the Government Digital Service famously said, ‘user research is a team sport’, and we need to make sure everyone gets the chance to play.
Our numbers are growing, but what is the role, and what value does it add?
When I first took up the role of ‘Head of Product Management’ back in October 2016, I was one of the first in Government to have the title, and within a few months there was a very small band of 5 of us, who were responsible for looking after the Product Management professionals within our own Government Departments. We were professional leaders, tasked with building capability and skills, and building communities of practice. The original job description we created for a Head of Product was very different to what I do now.
In my first 12 months of the role I focused on the people, working with the others across government to develop a capability framework, training and development plan and a career pathway that Product Managers could use to develop a proper career as a Product Manager within Government.
A lot of our time was spent debating the skills Product Managers needed, and what value Product Managers brought to projects and teams. It was, upon reflection, a very inward focused role; which, given the maturity of the profession at that time, made sense. But several years later user needs have changed, and I think it’s a good time to reflect on the value we Heads of Product now add within our work, to make sure everyone understands the work some of us are now doing and why, and to discuss the difference between what we were doing then and what we are doing now, and whether everyone understands and agrees with that difference.
This change in the focus of the Head of Product role came up at our last Head of Product catch up. Those of us who have been in role for a few years have all separately found that our focus isn’t purely on developing the community and the professional skills and capabilities of Product Managers anymore.
Instead we are now focussing on Product strategies, on aiding prioritisation of portfolios, and on working with senior leaders to break problems down, understanding the value we are trying to gain, or the outcomes we are trying to achieve, through the Products and Services we are developing. We’re running roadmap workshops across directorates, debating Target Operating Models and strategic alignment.
Most departments are now hiring ‘Heads of Product’ or ‘Deputy Directors of Product’ to be part of their senior leadership teams within Digital, and personally I think that is the right move.
As organisations mature in their agile ways of working, the role of prioritisation has become ever more important, and as Product Management professionals, the ability to weigh up data and evidence to make decisions about priorities is our bread and butter. As organisational budgets continue to be constrained we all need to get better at focusing on outputs and understanding the value we are looking to deliver through our projects and programmes, ensuring we are meeting user needs whilst spending public money wisely. Determining priorities and ensuring we are delivering value for users are the fundamental objectives of the Product Manager role, and as such it makes sense to utilise those skills at an organisational level.
We are, in fact, much closer than we have ever been to our counterparts in the private sector, determining returns on investment and the like. Yes, as Heads of Product we still work with Product Managers and teams to help them ensure they are meeting the standards and delivering value, and we still look at the resource demands of teams and make calls on which person within our professional community might be best placed to work on which Product; and in some departments the community is so big that they still need someone to focus on leading it. But for the most part, our communities and our people have grown along with us, and most don’t need the level of support from us as community managers that they did before.
Most of the communities across government are now self-sustaining; events like #ProductPeople are being set up and run by members of the community. And while we as Heads of Product are still here to help champion Product Management, and to support the people in our communities, the role of the Head of Product Management as a community lead has adapted and grown into what our organisations need now: someone who can use those Product Management skills at an organisational level.
As such, I think it’s time we look again at the Digital, Data and Technology capability framework for Product Management, talk to the community, and review and iterate the job description for the Head of Product role we initially developed. We need to understand the role of the community lead and the need for it, whilst also recognising the value of Product Management and the skills Heads of Product can bring to our senior leadership and our organisations.
The role of the BA in an agile team

When we were first defining the professions within Digital, Data and Technology, the Business Analyst role was one of the hardest to pin down. Not because we didn’t agree that it was a role (because we absolutely did) but because we struggled to define the scope of the role in comparison to roles like Product Management, Design or User Research.

It’s one of the questions that, three years later, still comes around regularly. Who is responsible for defining the requirements? What is the role of the BA?
One of the problems we had back when we were defining the BA role as part of the DDaT professions was that the Government Digital Service didn’t have BAs in their teams. Similarly, the original Scrum Guide defines only 3 roles in an agile team: the Product Owner, Development Team members and the Scrum Master.
Traditionally, the BA acted as the link between the business units and IT, helping to discover the requirements and the solution to address them; when multidisciplinary teams brought those members together, the role of the BA became less clear.
This has meant that when adopting agile, different government departments implemented the role in slightly different ways. The biggest trap I have seen teams fall into was the BA getting stuck with all the admin tasks for the project.
Roman Pichler argued that for those BAs moving into Scrum there were two options: becoming the Product Owner, or becoming a ‘team member’.
While I agree that Business Analysts are a key member of a multidisciplinary team, I disagree with the assumption that everyone on an agile team who isn’t the Scrum Master or the PO is simply ‘a team member’. I think the Business Analyst is a critical role (especially for Product Managers!) that brings unique skills to the team.
So, first things first let’s look at what requirements are in the agile space.
Certainly, within digital government at least, we use a user centric design approach. We are developing products and services that fix the problems that our users are facing. We are identifying user needs and testing and iterating those throughout the product development lifecycle. A lot of the time this conversation about ‘user needs’ has replaced the more traditional conversations about ‘requirements’. Which is good in some ways, but has also led to a bit of confusion about what Business Analysts do if it’s not gathering requirements. Who owns the requirements now?
Are user researchers responsible for gathering the requirements from external users (user needs), whereas Business Analysts are responsible for gathering requirements based on what the business needs (sometimes called business user needs)?
That line of conversation worries me, because it suggests that we don’t need to carry out user research on internal staff, it forgets that internal staff are users too.
So, what is the role of the BA?
In my experience the conversation about who is responsible for gathering requirements is symptomatic of the limitations of the English language, and our obsession with ‘ownership’.
User researchers primarily focus on gathering the more qualitative data: why users behave the way they do, how things are making them feel; probing their views and opinions to understand what it is they actually need. They will help understand who the users are and verify what the users need. They will work with the team to test design assumptions and help ensure the options being developed meet user needs.
Business Analysts primarily focus on gathering the more quantitative data about the process, both the ‘as is’ process and the future process we are designing. They work to understand how many users are or will be affected, what the cost and time impacts of the identified problem are, and what value could be gained through the implementation of any of the options the team are considering.
They help identify the stakeholders that will need to be engaged, and how to best engage with them. They will turn the user needs into user stories that the team can develop and identify the metrics and success criteria for the team to help them measure value.
Where you have a Product or Service that is live and being used by real users, the BA will work with Performance Analysts to understand the feedback data coming from the users.
User Researchers and Business Analysts will often work closely together. While BAs use tools like process mapping to identify pain points, user researchers will work with them to map the emotions users are experiencing, so that we can fully understand the impact of our current processes and the value we can release by fixing the problem. When User Researchers are planning research sessions, they will often work with BAs to get the data on where best to test in terms of locations or user groups.
Good Product Managers will use both the Quantitative and Qualitative data to help them pick which options to test. Designers will help both the user researchers and business analysts look at the data and design prototypes etc. to test with users.
For me, each role is clear in its scope, and their need to work together to identify the right problems users are facing and the best way to test those; and it’s not about what individual owns the requirements, because the answer is the team do.
On the 20th of February a petition was created on the petitions website to revoke Article 50 and remain within the EU. On the 21st of March the petition went viral, and as of writing this blog it has 5,714,965 signatures. This is the biggest petition to have been started since the site’s launch. Not only that, it is now the most supported petition in the world, ever.
The first version of the site was developed in 2010 after the election, originally intended to replace the Number 10 petitions site, which had a subtly different purpose. The new version of the Parliamentary petitions site was then launched in 2015, as an easy way for users to make sure their concerns were heard by the government and parliament. The original version of the service was developed by Pete Herlihy and Mark O’Neill back in the very early days of Digital Government, before the Digital Service Standard was born.
The site was built using open source code, meaning anyone can access the source code used to build the site, and its data is easy to interrogate. A number of organisations, like Unboxed, have developed tools to help map signatories of petitions based on the data available.
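As a small illustration of interrogating that data: the petitions site publishes each petition’s data as JSON, so a few lines of Python can summarise where signatures come from. The payload below is a trimmed, made-up sample in roughly the shape that feed provides (an assumption for this sketch), with illustrative figures rather than real ones:

```python
# Trimmed, hypothetical payload roughly in the shape of the petitions
# site's JSON feed - the figures below are made up for this example.
sample = {
    "data": {
        "attributes": {
            "signature_count": 1_000_000,
            "signatures_by_country": [
                {"name": "United Kingdom", "signature_count": 960_000},
                {"name": "France", "signature_count": 25_000},
                {"name": "Spain", "signature_count": 15_000},
            ],
        }
    }
}

def uk_share(petition: dict) -> float:
    """Percentage of a petition's signatures that come from the UK."""
    attrs = petition["data"]["attributes"]
    uk = next(c["signature_count"]
              for c in attrs["signatures_by_country"]
              if c["name"] == "United Kingdom")
    return 100 * uk / attrs["signature_count"]

print(f"{uk_share(sample):.0f}% of signatures are from the UK")
```

It’s exactly this kind of simple analysis of the open data that lets the team (and anyone else) monitor signing patterns.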
Speaking of security measures, that’s digital service standard number 7: making sure the service has the right security levels. The petitions site apparently uses both automated and manual techniques to spot bots, disposable email addresses and other fraudulent activity. This works with digital standard number 15, using tools for analysis that collect performance data, to monitor signing patterns and the like. Analysing the data, 96% of signatories have been within the UK (what the committee would expect from a petition like this).
Another key service standard is building a service that can be iterated and improved on a frequent basis (digital standard number 5), which meant that when the petition went viral, the team were able to spot that the site wasn’t coping with the frankly huge amount of traffic headed its way, and quickly doubled the capacity of the service within a handful of hours.
This also calls out the importance of testing your service end to end (standard number 10) and ensuring it’s scalable; and if and when it goes down (as the petitions website did a number of times given the large amount of traffic that hit it), you need to have a plan for what to do (standard number 11), which for the poor Petitions team meant some very polite apologetic messages being shared over social media while the team worked hard and fast to get the service back online.
The staggering volume of traffic to the site, and the meteoric speed with which the petition went viral, show that at its heart, the team who developed the service had followed Digital Service Standard number 1: understand your users’ needs.
In today’s culture of social media, people have high expectations of their online interactions with services and departments. We live in a time of near instant news, entertainment and information, and an expectation of having the world available at our fingertips with the click of a button. People want and need to feel that their voice is being heard, and the petitions website tapped into that need, delivering it effectively under unprecedented conditions.
Interestingly when the site was first developed, Mark himself admitted they didn’t know if anyone would use it. There was a lot of concern from people that 100,000 signatures was too high a figure to trigger a debate; but within the first 100 days six petitions had already reached the threshold and become eligible for a debate in the Commons. Pete wrote a great blog back in 2011 summing up what those first 100 days looked like.
It’s an example of great form design, and following digital service standard number 12, it is simple and intuitive to use. This has not been recognised or celebrated enough over the last few days, both the hard work of the team who developed the service and those maintaining and iterating it today. In my opinion this service has proven over the last few days that it is a success, and that the principles behind the Digital Service Standards that provided the design foundations for the site are still relevant and adding value today.
In other words, Agile isn’t linear, so stop making it look like it is.
Most people within the public sector who work in Digital transformation have seen the GDS version of the Agile lifecycle:
Which aims to demonstrate that developing services starts with user needs, and that projects will move from Discovery to Live, with iterations at each stage of the lifecycle.
The problem with this image of Agile is that it still makes the development of Products and Services seem linear, which it very rarely is. Most Products and Services I know, certainly the big complex ones, will need several cracks at a Discovery. They move into Alpha and then back to Discovery. They may get to Beta, stop and then start again. The more we move to a Service Design mentality, and approach problems holistically, the more complex we realise they are, and this means developing Products and Services that meet user needs is very rarely as simple and straightforward as the GDS Lifecycle makes it appear.
And this is fine, one of the core principles of Agile is failing fast. Stopping things rather than carrying on regardless. We iterate our Products and Services because we realise there is more to learn. More to Discover.
The problem is, especially in organisations new to Agile and the GDS way of working, that they see the above image, and its more linear portrayal seems familiar and understandable to them, because they are generally used to Waterfall projects, which are linear. So when something doesn’t move from Alpha to Beta, when it needs to go back into Discovery, they see that as a failure of the team, of the Project. Sometimes it is, but not always. Sometimes the team have done exactly what they were meant to do: they realised the problem identified at the start wasn’t the right problem to fix, because they tested assumptions and learned from their research. This is what we want them to do.
The second problem with the image put forward in the GDS lifecycle is that it doesn’t demonstrate how additional features are added. The principle of Agile is getting the smallest usable bit of your Product or Service out and being used by users as soon as you can, the minimum viable product (MVP), and this is right. But once you have your MVP live, what then? The Service Manual talks about continuing to iterate in Live, but if your Product or Service is large or complex, then your MVP might just be for one of your user groups, and now you need to develop the rest. So what do you do? Do you go back into Discovery for the next user segment? Ideally, if you need to, yes; but the GDS lifecycle doesn’t show that.
As such, again for those organisations new to Agile, they don’t factor that into their business cases, it’s not within the expectations of the stakeholders, and this is where Projects end up with bloated scopes and get stuck forever in Discovery or Alpha because the Project is too big to deliver.
With Public Services being developed to the Digital Service Standard set by GDS, we need a version of the lifecycle that breaks that linear mindset and helps everyone understand that within an Agile project you will go around and around the lifecycle, and back and forwards, several times before you are done.
Agile is not a sprint, a race, or a marathon, it’s a game of snakes and ladders. You can get off, go back to the start or go back a phase or two if you need to. You win when all your user needs are met, but as user needs can change over time, you have to keep your eye on the board, and you only really stop playing once you decommission your Product or Service!
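To make the snakes-and-ladders point concrete, the lifecycle can be sketched as a set of allowed phase transitions rather than a straight line. This is purely an illustrative sketch, not anything GDS publishes: the phase names come from the Service Manual, but the transition map itself is my own assumption about which moves count as legitimate.

```python
# Illustrative sketch: the Agile lifecycle as allowed phase transitions.
# Unlike a linear Waterfall plan, most phases can legitimately move
# "backwards" (e.g. Alpha back to Discovery) as well as forwards.
ALLOWED_TRANSITIONS = {
    "Discovery": {"Alpha", "Stopped"},           # stopping here is failing fast, not failure
    "Alpha": {"Beta", "Discovery", "Stopped"},   # back to Discovery can be a success
    "Beta": {"Live", "Discovery", "Alpha", "Stopped"},
    "Live": {"Discovery", "Decommissioned"},     # new user group? back round the board
}

def is_valid_move(current: str, proposed: str) -> bool:
    """Check whether a phase change is a legitimate lifecycle move."""
    return proposed in ALLOWED_TRANSITIONS.get(current, set())
```

So `is_valid_move("Alpha", "Discovery")` is true, while jumping straight from Discovery to Live is not: the game only really ends at decommissioning.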
One thing that comes up time and again from senior managers is “My focus or priority is improving Productivity”.
Early on in my career I worked on a number of ‘productivity improvement initiatives’, and sometimes they would seem to make some improvements, sometimes they wouldn’t, but they never really solved the issues, and there always seemed to be more to do to ‘improve productivity’, so I began to wonder whether these projects were really adding any value.
Nowadays when I talk to a manager to understand why they are prioritising any productivity initiative, the answer is usually something like, “we can be more productive, it’ll help us cut costs and deliver a better service.” If I then ask how they plan to do this, the answer is generally “by improving our processes/ transforming our service/ getting more staff”.
Anyone who has ever been responsible for transforming processes or services acknowledges that when you introduce a change or a new way of working, productivity drops, at least for a short time. This is a truth pretty much universally acknowledged. In fact, let’s repeat that, just to be clear: when you introduce any new change to processes, technology or ways of working, productivity will drop. This can be by as much as 30% while people learn the new processes, or get up to speed with any changes. The drop generally doesn’t last for too long, but it will happen.
Next, when you hire more staff, your existing high performing staff are often the ones tasked with training them and building capability. So, for those staff at least, productivity will drop while they help support and build the capability of your new staff. There’s evidence that it can take new staff 6-12 months to fully get to the same capability and productivity levels as existing staff, and that the productivity of the staff responsible for training and coaching them will be impacted as well.
This is all fine, as long as it is acknowledged upfront. If we tell people that we know their productivity will drop while they get up to speed with these new changes, or help train new staff, then we are helping to reassure them, and managing our own expectations.
But the biggest mistake I often see is a reluctance by senior managers to face that truth. We refuse to change people’s targets, we expect them to still meet the previous demands, and what happens?
First morale drops, and then productivity drops; morale drops further because people are struggling to meet unrealistic targets, and then they leave or go off sick, and productivity drops further still. In the end the change we’ve introduced fails, or we have fewer staff than we started with, and we’re actually achieving less than we did before, not more. It is almost Oedipal in its obviousness; you can see it coming a mile off. So why do Managers do it?
For me this conversation comes down to Managers rather than Leaders, and a failure to look at the actual problem we are trying to fix.
When I work with organisations to understand the outcomes they are trying to reach, or the problems they are trying to fix, productivity is often mentioned, especially in operational delivery spaces.
But when you work with both the leadership team and the people delivering on the front line, you find productivity itself is never really the problem. It’s the IT, or the processes, or the checks and balances we’ve put in place creating multiple handoffs, or generally a mixture of all of the above. So, we’re back to changing processes or introducing new tools or ways of working, but we’ve already said that doesn’t help, right? No.
Changing the tools or processes is absolutely the right thing to do, BUT we have to really understand why we are doing it. Equally, investing in people and training them is absolutely the right thing to do, but we can’t make it all about upping productivity, because that forgets the people who are at the heart of getting things done, treating them as resources rather than individuals with thoughts and feelings. By telling people we’re trying to improve productivity we make it sound like they are not already being productive. By imposing change on them simply to improve productivity we are treating them like cogs in a factory and demotivating them before we even start.
Instead we need to talk to them to really understand the problems they are facing, and what blockers or issues they are having to work around to actually do their job. We need to consider their views and ideas and involve them in the process of making any changes. By empowering our people to talk to us about the issues they are facing, and consulting them, we are hopefully getting them motivated and invested in any changes we make; rather than making the change happen to them, we are doing it with them.
We also need to look wider than one particular function or area, it’s likely that what looks like a productivity issue in one function, is actually a more systemic issue. By just trying to improve productivity in one area we are not considering the design of the whole service, but instead working ourselves into a silo.
While we may have assumptions about what changes will work, we have to accept we may need to try out different options to really improve things, and we have to acknowledge this out loud. Again, it’s important to manage everyone’s expectations so that people don’t feel disempowered, and like the change is happening to them.
And finally we have to reassure people that we know that for a short period of time productivity will drop and that is ok. And you know what? It is ok. Yes, we all have targets to hit, but if for 2 or 3 months every case worker out there processes 13 cases rather than 15, that is acceptable, because within 12 months’ time, if we as change leaders have done our job right, they’ll be processing over 20 cases rather than 15. And they’ll feel listened to, they’ll feel supported, and in the end productivity will go up.
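To put rough numbers on that trade-off, here is a back-of-the-envelope sketch using the hypothetical caseworker figures above: 15 cases a month as the baseline, dipping to 13 for three months during the change, then rising to 20 once it beds in. The exact figures and the three-month dip are assumptions for illustration only.

```python
# Back-of-the-envelope comparison using the hypothetical figures above.
# Assumes a 3-month dip to 13 cases/month, then 20 cases/month after.
months_of_dip = 3
baseline = 15      # cases per month if we change nothing
during_dip = 13    # cases per month while people adjust to the change
after_change = 20  # cases per month once the change beds in

no_change_year = baseline * 12
with_change_year = during_dip * months_of_dip + after_change * (12 - months_of_dip)

print(no_change_year)    # 180 cases in a year without the change
print(with_change_year)  # 219 cases in a year, despite the short-term dip
```

Even with the dip, the year as a whole comes out ahead, which is why a temporary drop in targets is a price worth paying.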
But it’ll be a by-product of successful change. We’ll have taken people with us, we’ll have learned from the work we have done, and probably given ourselves a nice backlog of things we can do to keep improving, so that productivity can keep rising; but more importantly our organisation will hopefully be a better place for everyone in it.
So please, next time you hear a senior manager say “My focus is improving Productivity” just ask them how? And if you are that Senior Manager, ask yourself, “Why?” What is the problem you are really trying to solve? Is it really just about productivity?
We need to lead people, not simply manage productivity.
I have blogged about some of these elsewhere, but here is a quick glossary of terms that you might hear when talking about Agile or Digital Transformation.
Agile: A change methodology, focusing on delivering value as early as possible, iterating and testing regularly.
Waterfall: A change methodology, focusing on a linear lifecycle, delivering a project based on requirements gathered upfront.
Scrum: A type of Agile, based on daily communication and the flexible iteration of plans that are carried out in short timeboxes of work.
Kanban: A type of Agile, based on visualising the flow of work and limiting the amount of work in progress to improve throughput.
The Agile Lifecycle: Similar to other change methodology lifecycles, the agile lifecycle is the stages a project has to go through. Unlike other lifecycles, agile is not a linear process, and products or services may go around the agile lifecycle several times before they are decommissioned.
Discovery: The first stage of the agile lifecycle, all about understanding who your users are, what they need and the problem you are trying to fix. Developing assumptions and hypotheses. Identifying an MVP that you think will fix the problem you have identified. Prioritising your user needs and turning them into epic user stories. Akin to the requirements gathering stage in Waterfall.
Alpha: The design and development stage. Building prototypes of your service and testing them with your users. Breaking user needs and Epics into user stories and prioritising them. Identifying risks and issues, and understanding the architecture and infrastructure you will need prior to build. Akin to the design and implementation stage in Waterfall.
Beta: The build and test stage. Building a working version of your service. Ensuring your service is accessible, secure and scalable. Improving the service based on user feedback, and measuring the impact of your service or product on the problem you were trying to fix. Can feature Private and Public Beta. Akin to the testing and development stage in Waterfall.
Private Beta: Testing with a restricted number of users. A limited test. Users can be invited to use the service, or limited by geographical region etc.
Public Beta: A product still in the test phase but open to a wider audience; users are no longer invited in, but should be aware the product is still in test.
Live: Once you know your service meets all the user needs identified within your MVP, you are sure it is accessible, secure and scalable, and you have a clear plan to keep iterating and supporting it then you can go live. Akin to the Maintenance stage in Waterfall.
MVP: The Minimum Viable Product, the smallest releasable product with just enough features to meet user needs, and to provide feedback for future product development.
User Needs: The things your users need, evidenced by user research and testing. Akin to business requirements in Waterfall and other methodologies.
GDS: Government Digital Service, part of the Cabinet Office, leading digital transformation for Government and setting the Digital Service Standard that all Government Departments must meet when developing digital products and services.
Service Design: Looking at your Product or Service holistically, keeping it user focused while ensuring it aligns with your organisation’s strategy.
User Centric Design (UCD): The principles of user centric design are very simple: keep the users (both internal and external) at the heart of everything you do. This means involving users in the design process; rather than using ‘proxy’ users (people acting like users), you involve actual users throughout the design and development process. Recognise that different users (and user groups) have different needs, and that the best way to design services that meet those needs is to keep engaging with the users.