

Assessing Innovation

(co-written with Matt Knight)

Some background, for context

Just over a month ago I was approached to see if I could provide some advice on assessments to support phase two of the GovTech Catalyst (GTC) scheme. For those who aren’t aware of the GovTech Catalyst scheme, there’s a blog here that explains how the scheme was designed to connect private sector innovators with public sector sponsors, using Innovate UK’s Small Business Research Initiative (SBRI) to help find promising solutions to some of the hardest public sector challenges.

[Image: a person in a lab coat with a stethoscope around their neck, looking through a virtual reality headset; captioned “Looking for innovation”]

The Sponsor we were working with (one of the public sector sponsors of the scheme) had put two suppliers through to the next phase and allocated funding to see how and where tech innovation could help drive societal improvements in Wales. As part of their spend approval for the next phase, the teams had to pass the equivalent of a Digital Service Standard assessment at the six month point in order to get funding to proceed.

For those who aren’t aware, there used to be a lovely team in GDS who would work with the GTC teams to provide advice and run the Digital Service Standard assessments for the projects. Unfortunately this team was stood down last year, after the recent GTC initiatives had started, leaving the teams with no one to talk to about assessments and no one in place to assess them.

The sponsor had reached out to both GDS and NHS Digital to see if they would be willing to run the assessments or provide advice to the teams, but had no luck, which left them a bit stuck; which is where I came in. I’ve blogged before about the Digital Service Standards, which led to the Sponsor asking whether I’d be willing and able to help them out, or whether I knew any other assessors who might be.

Preparing for the Assessments

As there were two services to assess, one of the first things I did was talk to the wonderful Matt Knight to see if he’d be willing and able to lead one of the assessments. Matt’s done even more assessments than me, and I knew he would be able to give some really good advice to the product teams to get the best out of them and their work.

Matt and I sat and had a discussion on how to ensure we were approaching our assessments consistently: how to honour and adhere to the core tenets of the Digital Standards whilst also assessing the teams’ innovation and the value for money their services could deliver, in line with the criteria for the GovTech scheme.

What quickly became apparent was that, because this was to support the GTC scheme, the teams doing the work were fully private sector with little experience of the Digital Service Standards. A normal assessment, with the standard ‘bar’ we’d expect teams to be able to meet, wouldn’t necessarily work well; we’d need to be a little flexible in our approach.

Obviously, no matter what type of assessment you’re doing, the basic framework stays the same (start with user needs, then think about the end-to-end service, then you can talk about the team and design and tech, and along the way you need to ask about the awkward stuff like sustainability, open source, accessibility and metrics) and can be applied to almost anything to come up with a useful result, regardless of sector, background or approach.

As the services were tasked with trying to improve public services in Wales, we also wanted to take account of the newly agreed Welsh Digital Standards, using them alongside the original Digital Standards. The main difference was the parts of the Welsh Standards covering the well-being of people in Wales and promoting the Welsh language (standards 8 and 9); you can read more about the Well-being of Future Generations Act here.

The assessments themselves 

[Image: a team mapping out a user journey; captioned “User journey mapping”]

The assessments themselves ran well (with thanks to Sam Hall, Coca Rivas and Claire Harrison, my co-assessors). While the service teams were new to the process, they were both fully open and willing to talk about their work, what went well and not so well, and what they had learnt along the way. There was some great work done by both the teams we assessed, and it’s clearly a process that everyone involved learned a lot from, both the service teams and the sponsor team, and it was great to hear how they’d collaborated to support user research activities. Both panels went away to write up their notes, at which point Matt and I compared them to see if there were any common themes or issues; interestingly, both assessments had flagged the need for a Service Owner from the sponsor to be more involved, in order to help the team identify success measures.

When we played the recommendations and findings back to the Sponsor, this led to an interesting discussion. The sponsor had nominated someone to act as the link for the teams, to answer their questions and to try and provide some guidance and steer where they could. But because of the terms of the GTC scheme, the rules on what steers they could and couldn’t give were quite strict, to avoid violating the terms of the competition. Originally the GTC team within GDS would have helped sponsors navigate these slightly confusing waters in terms of competition rules and processes. Without an experienced team to turn to for advice, sponsors are left in a somewhat uncomfortable and unfamiliar position, although this sponsor had clearly done their best (and the recommendations in this blog are general comments on how we can improve how we assess innovation across the board, not specifically aimed at them).

Frustratingly this meant that even when teams were potentially heading into known dead-ends, while the sponsor could try to provide some guidance and steer them in a different direction, they couldn’t force the teams to pivot or change; the only option would be to pull the funding. While this makes sense from a competition point of view, it makes little to no sense from a public purse point of view, or from a Digital Standards point of view. It leaves sponsors stuck when things might have gone a little off track: rather than being able to get teams to pivot, they are left choosing between potentially throwing away some great work, or investing money in projects that may not be able to deliver.

Which then raises the question: how should we be assessing and supporting innovation initiatives? How do we ensure they’re delivering value for the public purse whilst also remaining fair and competitive? How do we ensure we’re not missing out on innovative opportunities because of government bureaucracy and processes?

In this process, what is the point of a Digital Service Standard assessment? 

If it’s like most other assessment protocols (do not start Matt on his gateway rant), then it’s only to assess work that has already happened. If so, then it’s not much good here, when teams are so new to the standards and need flexible advice and support on what they could do next etc.   

If it’s to assess whether a service should be released to end users, then it’s useful in central government when looking to roll out and test a larger service; but not so much use when it’s a small service with mainly internal users, or a service that’s earlier in the process and aiming to test a proof of concept.

If it’s to look at all of the constituent areas of a service, and provide help and guidance to a multidisciplinary team in how to make it better and what gaps there are (and a bit of clarity from people who haven’t got too close to see clearly), then it’s a lot of use here, and in other places; but we need to ensure the panel has the right mix of experts to be able to assess this. 

While my panel was fantastic, and we were able to assess the levels of user research the teams had done, their understanding of the problems they were seeking to solve, their ability to integrate with legacy tech solutions and how their team was working together, none of us had any experience in assessing innovation business cases or in judging whether teams had done the right due diligence on their financial funding models. The standards specify that teams should have their budget sorted for the next phase and a roadmap for future development; in my experience this has generally been a fairly easy yes or no. I certainly wouldn’t know a good business accelerator if it came and bopped me on the nose. So while we could take a very high level call on whether we thought a service could deliver some value to users, and whether a roadmap or budget looked reasonable, a complex discussion on funding models and investment options was a little outside our wheelhouse, and not an area we could offer any useful advice or recommendations on.

How can we deliver and assess innovation better going forward? 

If we’re continuing to use schemes like the GTC scheme to sponsor and encourage private sector innovators to work with the public sector to solve important problems affecting our society, then we obviously need a clear way to assess their success. But we also need to ensure we’re setting up these schemes in such a way that the private sector is working with the public sector; and that means we need to be working in partnership; able to advise and guide them where appropriate in order to ensure we’re spending public money wisely. 

There is a lot of great potential out there to use innovative tech to help solve societal issues, but we can’t just throw those problems at the private sector and expect them to do all the hard work. While the private sector can bring innovative and different approaches and expertise, we shouldn’t ignore the wealth of experience and knowledge within the public sector either. We need people within the public sector with the right digital skills, who are able to prioritise and understand the services that are being developed, in order to ensure that the public purse doesn’t pay for stuff that already exists to be endlessly remade.

Assessment can have a role in supporting innovation, as long as we take a generous rather than nitpicking (or macro rather than micro) approach to the service standard. Assessments (and the Standards themselves) are a useful format for structuring conversations about services that involve users (hint: that’s most of them); just the act of starting with user needs (point 1) rather than tech changes the whole conversation.

However, to make this work and add real value, solving a whole problem for users (point 2 of the new UK government standard) is critical, and that involves having someone who can see the entire end-to-end process for any new service and devise and own success measures for it. The best answer to both delivering innovation, and assessing it, is bringing the private and public sector together to deliver real value; creating a process that builds capacity, maturity and genuine collaboration within the wider public sector. A space to innovate and grow solutions. True multidisciplinary collaboration, working together to deliver real value.

“Together, We Create”

Big thanks to Matt for collaborating on this. If you want to find his blog (well worth a read) you can do so here:

Cost vs. Quality

A debate as old as time, and a loop that goes around and around; or so it seems in the Public Sector commercial space.

Every few years, often every couple of spend control cycles, the debate of cost vs. quality rears its head again, with commercial weighting flip-flopping between quality as the most important factor and cost (or lowest cost) as the highest priority.

When quality is the most important factor in the commercial space, Government Departments will prioritise the outputs they want to achieve and weight their commercial scores towards the areas that indicate quality: things like ‘Value Add’, ‘Delivering Quality’, ‘Culture’, ‘Delivering in Partnership’ etc. We will see more output-focused contracts coming out on to the market, with organisations clear on the vision they want to achieve and the problems they need to solve, and looking for the supplier that can best help them achieve that.

When reducing costs becomes the highest priority, the commercial weighting moves to ‘Value for Money’. Contracts are more likely to be fixed price and are often thinly veiled requests for suppliers to act as body shops rather than partners with commercial tenders scoring day rate cards rather than requesting the cost for overall delivery of outcomes.

Unfortunately, a lot of the time, when the priority switches to cost over quality, we end up with a lot of projects not being delivered, outcomes being missed, and user needs not being met. In order to cut more and more costs, offshoring resource can become the only way to deliver the results cheaply, with the departmental project teams working out of sync with their offshore delivery partners; this makes co-design and delivery much harder to do, and makes it almost impossible to achieve the required quality. This goes in a cycle, with Departments toing and froing between “offshore as much as possible to cut costs” and “the only way to deliver quality is for everyone to be co-located in the office 100% of the time”. Full co-location of the teams inevitably drives up the costs again.

So, does that mean that in order to get quality we have to have high costs? Surely there is a sweet spot we’re all looking for, where cost and quality align; but why does it seem so hard to achieve within the Public Sector, and what do we need to be looking at to achieve it?

When the government commercial function (and GDS) shook up the public sector digital world nearly a decade ago, they introduced things like the Digital Marketplace and implemented the Spend Control pipeline, with the aim of moving departments away from the large SIs that won 90% of government contracts. These suppliers often charged a fortune and rarely seemed to deliver what was actually needed. (This blog gives the details on what they intended, back in 2014.)

Lots of SME suppliers began to enter the market, win contracts and change how contracts were delivered. As competition increased, costs decreased, with quality partnerships forming between new suppliers and government departments; and the quality of delivery increased as new options, solutions and ways of working were explored.

However, this left Departments managing lots of individual contracts, which grew increasingly complex and time consuming to manage. In order to try and reduce the number of contracts they had to manage, the scale of the contracts began to increase, with more and more multimillion pound contracts emerging.

As the size and value of the contracts increased, SMEs began to struggle to win them, as they couldn’t stand up the teams needed quickly, nor could they demonstrate they had the experience of delivering contracts at that scale. This became a bit of a self-fulfilling prophecy: the larger SIs continued to win the larger contracts as they were the only ones able to provide evidence they could staff and deliver them, and their costs remained high.

This left the SMEs facing three options:

  • Decide not to try for the larger contracts, reducing the amount of competition (and potentially increasing costs and decreasing quality in the long run);
  • Form partnership agreements with a number of other SMEs or a larger supplier (again reducing the amount of competition) in order to be able to stand up the teams needed and enable delivery of larger contracts. However, a consortium of suppliers not used to working together could complicate delivery, which could in turn decrease the quality or speed of delivery if not carefully managed; as such, not all contracts allowed consortium or partnership bids, due to the perceived complexity they could bring.
  • Or aim to grow to allow them to win and deliver the larger contracts. As SMEs grew, however, they would often have to either increase their costs in order to run a larger organisation that could still deliver the same quality as before, or keep their costs low but likely see their quality decrease.

Throughout the pandemic, the focus has been on delivery, and there’s been a healthy mix of both small and large contracts coming out, meaning lots of competition. While costs have always been a factor, the pandemic allowed both departments and suppliers to remove much of the costly admin and bureaucratic approval processes in favour of lightweight approaches to bringing on suppliers and managing team outputs, encouraging innovation in delivery and cost. With lockdowns ensuring co-location was now out of the question, many suppliers were able to reduce their rates to support the pandemic response, with both departments and suppliers agreeing that the priority was delivering quality products and services to meet organisations’ and users’ urgent needs. The removal of co-location as a prerequisite also opened up the market to more suppliers bidding for work, and more individuals applying for more roles, which increased competition and inevitably improved the quality of the outputs being produced. This in fact led to a lot of innovation being delivered throughout the pandemic, which has benefited us all.

As we move out of the pandemic and into the next spending review round, the signs are that the focus is about to swing back to costs as the highest priority, with larger contracts coming out that are looking for cheaper day rates in order to allow departments to balance their own budgets. But as the economy bounces back and departments begin to insist again that teams return to the office, most suppliers will want to increase their rates to pre-pandemic levels. If we’re not careful, the focus on cost reduction will decrease the quality and innovation that has been delivered throughout the pandemic, and could cost the taxpayer more in the long run. Look at DWP’s first attempt to deliver Universal Credit for how badly things can go wrong when cost is the highest priority and the Commercial team runs the procurement process with minimum input from Delivery, with cost driving the commercial and delivery decisions more than quality.

To find the sweet spot between cost and quality we need to create the best environment for innovation and competition: allowing flexibility on where teams can be based; supporting and encouraging SMEs and medium-sized suppliers to bid for and win contracts by varying contract sizes and values; focusing on outputs over body shopping; and looking for what value suppliers can add in terms of knowledge transfer and partnership, rather than simply prioritising who is the cheapest.

It’s important we all work together to get the balance between cost and quality right, and ensure we remain focused on delivering the right things in the right way.

[Image: a seesaw]

Is it time for Flexible Working to actually become flexible?

Does the Public Sector need to embrace hybrid working or risk losing its workforce?

The majority of job adverts within the Public Sector (and beyond) feature the phrase “We offer flexible working” as a benefit. However, this flexible working is limited in how flexible it can be; generally it’s telling you they don’t mind what hours you work, as long as you work the core hours and get your work done. What they don’t mean is: we don’t mind where you work, as long as you can attend core meetings face to face and get your work done.

Home working, hell, geographically diverse (not London) working, has always been a bone of contention within the Public Sector. In the couple of years before the pandemic there was a push to get more staff out of London and establish offices ‘in the regions’, but this has always been met with some resistance, as Ministers themselves are firmly London based, and if your work required any kind of interaction with a Minister then you’d need to be in London at least part time.

[Image: street sign for Downing St]

There has long been a view among managers in the Public Sector that staff (especially Operational ones, from my experience) couldn’t be trusted to work at home full time; that it would be impossible to monitor their work and ensure things got done on time. Obviously, given the Public Sector is there to spend public money, keeping staff within your eyesight so you could ensure they were not wasting money was the most important thing. That was never the vocalised reason though; instead it was concerns about staff accessing or taking home users’ personal data or commercially sensitive information, a fear that staff would not (or could not) keep data secure if it left the office. This attitude slowly dispersed as you moved up the ranks, proving this was more about hierarchy and a command and control culture based on a pervasive lack of trust of staff.

The pandemic has meant that, for the first time, all (or at least most) office staff have been not only allowed but required to work from home. It finally stopped the traditional slog to the office and forced managers to trust that their staff could in fact get the work done perfectly well when not in the office; and those same staff proved they could deliver from home just as well as from the office.

But now, as the pandemic ebbs, the question has come around: do staff really need to return to the office? Most Departments so far seem to be taking the sensible approach, talking about phased returns to the office and the use of hybrid working. But one Minister has already stated that “People who have been working from home aren’t paying their commuting costs… they have had a de facto pay rise… if people aren’t going into work, they don’t deserve the terms and conditions they get if they are going into work.”

Not only is this ridiculous at a time when public sector pay has been effectively frozen for years, as the Retail Price Index has continued to increase faster than public sector pay, but it also ignores the needs of those people who can’t go into the office for health reasons, and the issues departments themselves have faced for years when it comes to their offices.

Departments have long struggled with overcrowding, with at least two (often more) staff members to every desk. Due to this oversubscription, most offices moved to hot-desking, but that comes with its own problems as team leaders and office managers try to keep track of who is sitting where on what day. Desk allocation has long been the thorn in every office manager’s and team leader’s side. Not only do you have more people than desks, but a number of staff have health constraints that limit where they can sit. For every person who needs a window desk due to migraines, you’ve got a person who needs the thermostat at a specific setting (often sat next to someone who needs a completely different setting for their own health condition), or needs a desk nearer the bathrooms. Office planning is a complex nightmare of logistics and expense.

[Image: crowds]

The other problem teams face when organisations insist that everyone comes into the office is that you’re automatically excluding those who can’t. People who have a disability that means they are unable to travel into an office daily are at worst excluded from jobs that insist on it, or at best they are the one home worker in a team of office workers, generally left out of decisions and conversations, creating feelings of isolation and exclusion.

Disabled people have for years been crying out for more home working, only to be told it wasn’t feasible. Now that the pandemic has proved it is indeed workable, if employers don’t use this as a time to examine properly how to enable and support home workers, they face at best an exodus of staff who want (or need) to have home working as a proper option, and at worst more and more legal challenges from staff who feel they are being treated unfairly and excluded from work the pandemic has proved they can do just as well from home.

We need to properly consider what the future ways of working look like, and how we can proactively be inclusive to everyone, whether they choose to work from an office, from home or a mix thereof (which seems to be the preferred method of most people according to the million LinkedIn surveys I’ve seen floating around). A recent study by YouGov has found that over 75% of people want the option of Hybrid working; with most people wanting the flexibility to spend 2-3 days working from home and 2-3 days working in the office.

As Sammy Rubin, CEO and founder of YuLife has stated “Workplaces now need to give employees more tools to help them benefit from the new expectations they now have from their employers following the pandemic. Perks and benefits have to be made more accessible and tailored to individual employees’ needs, while also benefiting both remote staff as well as those coming into the office in an era increasingly characterised by a hybrid working model.” Allowing people to work from home isn’t enough, we need to proactively be thinking how we can best support and include those working from home in meetings in the same way we include those working in the office.

[Image: a virtual meeting]

While the public sector has always struggled with losing staff to the private sector over money, it has always prided itself on offering better ways of working and a better work-life balance. However, many private sector companies are using this opportunity to look at their own ways of working: either moving away from offices entirely to save costs and investing properly in home working, or engaging and consulting with their staff to support a move to hybrid working; some are even using this as an opportunity to consider moving to four day weeks. They’ve recognised that this benefits not only them and their staff, but also the environment, at a time when climate change is becoming one of the hottest topics (pun intended), by reducing the number of commuters.

If the public sector insists on a full return to the office, then it risks losing even more staff to the private sector, as people begin to prioritise their quality of life and realise the private sector doesn’t just offer more money, it can also offer better ways of working. The Public Sector has much bigger issues to deal with (like climate change!) than focusing on who is working where, and Ministers need to be looking at the bigger picture. As Dave Penman from the FDA union has said, “What should matter to ministers is whether public services are being delivered effectively, not where individual civil servants are sitting on a particular day.”

All it takes is a little trust, and a degree of flexibility.

Service Owner vs. Programme Manager vs. Product Lead

What’s the difference? Does the name matter?

Over a year ago, following an interesting chat with David Roberts at NHSBSA, I got to thinking about the role of the Service Owner; and why the role wasn’t working in the way we intended back in the dawn of the Service Manual. This in turn (as most things do for me) led to a blog in order to try and capture my thoughts in the vague hope they might be useful/interesting for anyone reading them.

Ironically, for what was a random think-piece, it has consistently been my most popular blog, getting at least a dozen reads every day since I wrote it. Which got me thinking again: what is it about that blog that resonates with people? The fact is, the role of the Service Owner is no better or more consistently understood today than it was then. Confusion over the Service Owner’s role and responsibilities is still one of the most common things I get asked about. What’s the difference between a Service Owner and a Service Manager (is there one)? How and why is the role different to that of the Product Lead? What is the difference between a Service Manager and a Programme Manager? Is the Service Owner different to the SRO? What do all these different role titles mean?

[Image: “What’s in a name? A lot.” via AE2S Communications]

Every department and agency within the Public Sector seems to have implemented the role of the Service Owner differently, which makes it very hard for those in the role (or considering applying for it) to understand what they should be doing or what their responsibilities are. This is probably why, as a community of practice within DDaT, it certainly used to be one of the hardest communities to bring together, as everyone in it was doing such different roles.

Some clients I’ve been working with use the roles of Service Owner and Lead Product Manager interchangeably; some have Service Owners who sit in Ops and Service Managers who sit in Digital (or vice versa); some have Service Managers sitting alongside Programme Managers, or Service Owners alongside Programme Directors, all desperately trying not to stand on each other’s toes.

So what is the difference?

The obvious place to look for clarity is surely the Service Manual, or the DDaT capability framework. The Service Manual specifies that the Service Owner is to be: “the decision-making authority to deliver on all aspects of a project. Who also:

  • has overall responsibility for developing, operating and continually improving your service
  • represents the service during service assessments
  • makes sure the necessary project and approval processes are followed
  • identifies and mitigates risks to your project
  • encourages the maximum possible take-up of your digital service
  • has responsibility for your service’s assisted digital support”

When the DDaT capability framework was first written, the Service Manager was more akin to a Product person, and originally sat as a senior role within that capability framework; yet they were also responsible for the end-to-end service (which was a very big ask for anyone below the SCS working as an SRO). But the role often got confused with that of the IT Service Manager, and (as previously discussed in last year’s blog) the responsibilities and titles got changed to create the role of Service Owner instead.

Interestingly, in the Service Manual the reference to the Service Owner being the person who has responsibility for the end-to-end service has now been removed, focusing instead on them being the person responsible for delivering the project. While I imagine this is because it’s very hard for any one person (below SCS level) to have responsibility for an end-to-end service in the Public Sector, given the size of the Products and Services the Public Sector delivers, it does mean the new role description in the Service Manual seems to bring the role of Service Owner closer to that of the Programme Manager.

However, in contrast to the description in the Service manual, the DDaT capability framework does still specify that the role of the Service Owner is “accountable for the quality of their service, and you will be expected to adopt a portfolio view, managing end-to-end services that include multiple products and channels.” Obviously the onus here has changed from being responsible for the end to end service to managing the end to end service. But even that is a clear difference to being responsible for delivering a project as the manual describes it.

Some elements of the new Service Owner role description in the Manual do still align to the traditional responsibilities of Product people (mainly things like considering assisted digital support and ensuring you can maximise take-up of your service); but the Service Manual has now removed those responsibilities from the Product Manager within a team. The Product Manager now seems to be intended to focus much more solely on user needs and user stories, rather than the longer term uptake and running of the service. But again, confusingly, the capability framework for Product Management still has the expectation that Product people will be responsible for ensuring maximum take-up of the service.

It seems that in trying to clarify the role of the Service Owner, the Service Manual and the capability framework disagree on exactly what the responsibilities of the role are; and rather than clarifying the difference between Product people and Service Owners, the waters have instead been muddied even more. Nor have they made it any clearer what, if any, difference there is between the role of the Service Owner and the Programme Manager.

The Project Delivery Capability framework states that “there are many other roles that are needed to successfully deliver projects. These roles are not included in our framework but you will find information on them within the frameworks of other professions, such as, Digital, Data & Technology framework”. Frustratingly, it doesn’t give any clarity on how and when roles like SRO or Programme Manager might overlap with roles within the DDaT framework, nor how they could work best together. Both the Service Owner role and the Programme Manager role state responsibility for things like stakeholder management, business case development and alignment, risk management and governance adherence. Admittedly the language is slightly different, but the core themes are the same.

So is the assumption that you don’t need both a Programme Manager and a Service Owner? Is it an either/or that has never been clearly specified? If you’re using PRINCE2 you get a Programme Manager, and if you’re Agile it’s a Service Owner? I would hope not, mainly because we all know that in reality most Public Sector digital programmes are a blend of methodologies and never that clear cut. So are we not being clear enough about what the role of the Service Owner is? Does it really matter if we don’t have that clarity?

Evidence has shown that when teams aren’t clear on the roles and responsibilities of their teammates, and especially of those people responsible for making key decisions, bottlenecks begin to occur. Teams struggle to know who should be signing off what. Hierarchy and governance become essential to achieving any progress, but inevitably delays occur while approvals are sought, which simply slows down delivery.

So can we get some clarity?

At the start of the year DEFRA advertised a Service Owner role which (I thought) clearly articulated the responsibilities of the role, and made it clear how it would sit alongside and support Product teams and work with Programme professionals to ensure effective delivery of services that met user needs. Sadly this clarity of role seems few and far between.

I would love, when travel allows, to see a workshop mapping out the roles of Service Owner, SRO, Programme Manager, Product Lead and so on: looking at what their responsibilities are, providing clarity on where there is any overlap, and working out how this could be managed better, so that we can get to the point where we have consistency in these roles and a better understanding of how they can work together without duplication or confusion over the value they all add.

For now, at least, it’s each organisation’s responsibility to ensure they are being clear on what the responsibilities are for these roles and the people working in them. We need to stop pretending the confusion doesn’t exist and do our best to provide clarity to our teams and our people; otherwise we’re only muddying the waters, and it’s that kind of confusion that inevitably impacts teams and their ability to deliver.

Let’s be clear, say what you mean

How to be a Product Advocate

Why you need a Product Person in your team.

Since joining Kainos a few weeks ago, I’ve had a number of conversations internally and with clients about the relationship between Delivery and Product; and why I as a Product Person moved over to Delivery.

[Image: “Products at the heart of delivery”]

My answer to that question was that, having spent over 10 years as a Product Person, seeing the growth of Product as a ‘thing’ within the Public Sector, and helping Product grow and mature (developing the community, ways of working, career pathway and so on), I realised that what was missing was Product thinking at a senior level. Most senior leaders within the Programme delivery or Transformation space come from a traditional delivery background (if not an operational one), and while many of them do now understand the value of user-centred design and user needs, they don’t understand the benefit of a product-centric approach or what value Product thinking brings.

The expansion of Product people in the Public Sector has predominantly been driven by GDS and the Digital Service Standards, with most organisations now knowing they need a ‘Product Manager’ in order to pass their Service Standard assessment. However, almost 10 years later, most organisations are still not prioritising the hiring and capability development of their Product people. In May I worked with four different teams, each working to the Digital Standards and needing to pass an assessment, and in none of those teams was the role of the Product Manager working in the way we intended when we created the DDaT Product Management capability framework.

Most organisations (understandably) feel the role of the Product Manager should be an internal one, rather than one provided by a supplier; but nine times out of ten the person they have allocated to the role has no experience in it, has never worked on a product or service developed to the digital standards (never mind having been through an assessment), and is regularly not budgeted or allocated to the project full time, often being split across too many teams, or split between the Product Manager role and their previous job in Ops or Policy or wherever they have come from. More often than not they’re actually a Subject Matter Expert, not a Product Manager (which I’ve blogged about before).

As a supplier; this makes delivery so much harder. When the right Product person isn’t allocated to a project, we can quickly see a whole crop of issues emerge.

So what are the signs that Product isn’t being properly represented within a team:

  • Overall vision and strategy are unclear or not shared widely; teams aren’t clear on what they’re trying to achieve or why. This can be because the Product person is not able to clearly articulate the problem the team are there to solve, or because the outcomes the team are there to deliver aren’t clearly defined.
  • The roadmap doesn’t exist, is unstable or does not go beyond the immediate future, or the scope of the project keeps expanding; often a sign that prioritisation isn’t being looked at regularly or is happening behind closed doors, making planning hard to do.
  • Success measures are unclear or undefined, because the team doesn’t understand what they’re trying to achieve; this often leads to the wrong work getting prioritised, outcomes not getting delivered or user needs not being met.
  • Work regularly comes in over budget or doesn’t meet the business case; or the team keeps completing Discoveries and then going back to the start, or struggling to get funding to progress. This can be a sign the team aren’t clear what problem they are trying to solve, or that the value the work delivers isn’t (or can’t be) clearly articulated by the Product person.
  • Delivery is late or velocity is slow. This can be a sign the team aren’t getting access to their Product person in a timely manner, causing bottlenecks in stories being agreed or signed off; or that the Product person is not empowered to make decisions and is constantly waiting for sign-off from more senior stakeholders.
  • Roll-out is delayed or messy, with operational teams frustrated or unclear on project progress; a sign that the team doesn’t have someone owning the roadmap who understands what functionality will be available when, and ensuring any dependencies are clearly understood and being monitored, or a sign that there isn’t someone engaging with or communicating progress to wider stakeholders.

More often than not, as a supplier I’ve had to argue that we need to provide a Product person to work alongside teams, to coach and support their internal Product people in the skills and responsibilities a Product person needs to enable successful delivery. Where clients have been adamant they don’t want Product people from a supplier (often for budgetary reasons), we’ve then had to look at how we sneak someone in the door, usually by adding a Business Analyst or Delivery Manager to the team who also has Product skills, because otherwise our ability to deliver will be negatively impacted.

When budgets are tight, the role of Product person is often the first thing project managers try to cut or reduce; prioritising the technical or project delivery skills over Product ones. As such, teams (and organisations) need to understand the skills a good product person brings; and the cost of not having someone within a team who has those skills.

  • Their role is to focus on and clarify to the team (and business) the problem the team are trying to fix.
  • Ensure a balance between user needs; business requirements and technical constraints/options.
  • Quantifying and understanding the ROI/ value a project will deliver; and ensuring that can be tracked and measured through clear success measures and metrics.
  • Being able to translate complex problems into roadmaps for delivery. Prioritising work and controlling the scope of a product or service to ensure it can be delivered in a timely and cost effective manner, with a proper roll-out plan that can be clearly communicated to the wider organisation.

As an assessor, I have seen more projects fail their assessments at Alpha (or even occasionally Beta) because they lack a clear understanding of the problem they’re trying to solve or of their success measures, than because they’ve used the wrong technical stack. This can be very costly, and often means hundreds of thousands (if not millions) of pounds being written off or wasted due to delays and rework; much more costly than investing in having properly qualified or experienced Product people working within teams.

While Product and Delivery are often seen as very different skill sets, I recognised a few years ago the value in having more people who understand and can advocate for both the value Product thinking brings to delivery, and the ways delivery can work better with Product. People who can not only understand but also champion both, in order to ensure we’re delivering the right things in the right ways to meet our clients’ and their users’ needs.

Which is why I made the active decision to hop the fence and try to bring the professions closer together, building understanding in both teams and senior leaders of the need for Product and Delivery skills to be invested in and present within teams in order to support and enable good delivery. I was really glad to see, when I joined Kainos, that we’re already talking about how to bring our Product and Delivery communities closer together and act as advocates for each other; it was in fact a chat with the Kainos Head of Product, Charlene McDonald, that inspired this blog.

Having someone with the title of Product Manager or Owner isn’t enough; we need people who are experienced in Product thinking and skilled in Product Management, but that isn’t all we need. We need to stop seeing the role of Product person as a label you can give to anyone in the team in order to pass an assessment, and start understanding why the role and the skills it brings are important. We need senior leaders, project managers and delivery teams who understand what value Product brings, why Product is important, and what it could cost the team and their organisation if those Product skills are not included and budgeted for properly right from the start. We need senior leaders to understand why it’s important to invest in their Product people, giving them the time and support they need to do their job properly, rather than spreading them thin across teams with minimal training or empowerment.

We need more Product advocates.

Partnership

The good and the bad.

At Difrent we always talk about our desire to deliver in partnership with our clients; to move beyond the pure supplier and client relationship to enable proper collaboration.

One of my main frustrations when I was ‘client side’ was the number of suppliers we’d work with who said they would partner with us, but then, once the contract started and the first few weeks had passed and the new relationship glow had faded, the teams and the account managers reverted to type. I can’t recall how many times I had to have conversations at the supplier governance meetings where I was practically begging them to challenge us; to be a critical friend and push for the right thing; to feed back to us about any issues and suggest improvements. It always felt like we were reaching across a gap and never quite making full contact.

As such, that’s one of the areas at Difrent that I (and others) are very keen to embody. We try to be true partners, feeding back proactively where there are issues, concerns or suggestions, and trying to foster collaborative ‘one team’ working.

We’ve obviously had more success with this on some contracts vs others. There’s always more we can learn about how to better partner with our clients; however; given we see a lot of complaining about strained partnerships between clients and suppliers; I thought I’d do a bit of a case study/ reflection and praise of one partnership we’ve been working on recently.

Difrent won a contract with the Planning Inspectorate last year, and it was the first completely remote pitch and award we’d been involved with on a multi million pound contract.

From the start of the procurement it became really clear that the Planning Inspectorate wanted a partner; this wasn’t just lip service, but something they truly believed in. As part of the procurement process they opened up their GitHub so we could see their code; they opened up their Miro so we could see their service roadmap; and they proactively shared their assessment reports with suppliers.

For us this not only made a good impression, but enabled us to develop a more informed and valuable pitch.

Since we put virtual feet in the virtual door that dedication to partnership has remained as true 6 months later as it was then. Outside of our weekly governance calls we’ve had multiple workshops to discuss collaboration and ways of working. We’ve had multiple discussions on knowledge transfer and reflecting on progress and ways to iterate and improve.

Where there have been challenges we’ve all worked hard to be proactive, open and honest in talking things through. They’ve welcomed our suggestions and feedback (and proactively encouraged them) and been equally proactive in giving us feedback and suggestions.

This has helped us adapt and really think about how we do things like knowledge transfer, which is always challenging (especially remotely) but something we’re passionate about getting right. We’ve all worked so hard on this that it has become one of the core parts of our balanced scorecard, ensuring they as a client can measure the value they’re getting from our partnership not just through our outputs on the projects we’re working on, but through our contributions to the organisation as a whole. This is also really helpful for us, helping us analyse and iterate our ‘value add’ to our partners and ensure we’re delivering on our promises.

I think there is a lot of learning here for other Departments and ALBs looking to procure digital services or capability: a good partnership with a supplier needs to start before the contract is signed.

Thanks to Paul Moffat and Stephen Read at the Planning Inspectorate for helping with this blog – demonstrating that partnership in action!

Digital Transformation is still new

We’re punishing those who are less experienced, and we need to stop.

[Image: the timeline of Digital Transformation, courtesy of Rachelle at https://www.strangedigital.org/]

In the last few weeks I’ve had multiple conversations with clients (both existing and new) who are preparing for, or have recently not passed, their Digital Service Standard assessments, and who are really struggling to understand what is needed from them in order to pass.

These teams have tried to engage with the service standards teams, but given those teams are extremely busy, most can’t get any time with their ‘link’ person until six weeks before their assessment; by which time most teams are quite far down their track, potentially leaving them a lot of (re)work to try and do before their assessment.

Having sat in on a few of those calls recently, I’ve been surprised how little time is set aside to help the teams prep, and to give them advice and guidance on what to expect at an assessment if they haven’t been through one before. There’s no time or support for mock assessments for new teams. There may be the offer of one or two of the team getting to observe someone else’s assessment if the stars align, but it’s not proactively planned in; instead it’s viewed as a nice to have. There seems to be an assumption that project teams should know all of this already, and no recognition that for a large number of teams this is still all new.

“In the old days” we as assessors and transformation leads used to set aside time regularly to meet with teams: talk through the problems they were trying to fix, understand any issues they might be facing, and provide clarity and guidance before the assessment, so that teams could be confident they were ready to move on to the next phase. But when I talk to teams now, so few of them are getting this support. Many teams reach out because the rare bits of guidance they have received haven’t been clear, and in some cases have been contradictory, and they don’t know who to talk to to get that clarity.

Instead, more and more of my time at the moment, as a supplier, is being set aside to support teams through their assessment. To provide advice and guidance on what to expect, how to prepare and what approach the team needs to take. What an MVP actually is; how to decide when you need an assessment; what elements of the service you need to have ready to ‘show’ at each stage; and what the difference is between Alpha, Beta and Live assessments and why it matters. For so many teams this is still new, almost like a foreign language.

So, how can we better support teams through this journey?

Stop treating it like this is all old hat and that everyone should know everything about it already.

Digital Transformation has been ‘a thing’ for one generation (if you count from the arrival of the internet as a tool for the masses in 1995). Within the public sector, GDS, the Digital Service Standards and the Digital Academy have existed for less than one generation; less than 10 years, in fact.

By treating it as a thing everyone should know, we make it exclusionary. We make people feel less than us for the simple act of not having the same experience we do.

We talk about working in the open, and many teams do still strive to do that; but digital transformation is still almost seen as a magical art by many, and how to pass what should be a simple thing like a service standard assessment is still viewed as arcane knowledge held by the few. As a community we need to get better at supporting each other, and especially those new to this experience, along this path.

This isn’t just a nice thing to do, it’s the fiscally responsible thing to do; by assuming teams already have all this knowledge we’re just increasing the likelihood they will fail, and that comes with a cost.

We need to set aside more time to help and guide each other on this journey; so that we can all succeed; that is how we truly add value, and ensure that Digital Transformation delivers and is around to stay for generations to come.

Agile Delivery in a Waterfall procurement world

One of the things that has really become apparent since moving ‘supplier side’ is how much the procurement processes used by the public sector to tender work don’t facilitate agile delivery.

The process of bidding for work, certainly as an SME, is an industry in itself.

This month alone we’ve seen multiple Invitations to Tender (ITTs) on the Digital Marketplace for Discoveries and the like, as many departments try to spend their budget before the end of the financial year.

The ITTs will mention user research and ask how suppliers will work to understand user needs or hire proper user researchers. But they will then state they only have four weeks or £60K to carry out the Discovery. While they specify the need for user research, no user recruitment has been carried out to let the supplier hit the ground running, and it’s not possible for it to be carried out before the project starts (unless as a supplier you’re willing to do that for free; and even if you are, you’ve got less than a week to onboard your team, do any reading you need to do and complete user recruitment, which just isn’t feasible); and we regularly see requests for prototypes within that time as well.

This isn’t to say that short Discoveries are impossible; if anything COVID-19 has proved they are possible. But in those cases the outcomes we were trying to deliver were understood by all, the problems we were trying to solve were very clear, and there was a fairly clear understanding of the user groups we’d need to work with to carry out any research; all of this enabled the teams to move at pace.

But we all know the normal commercial rules were relaxed to support delivery of the urgent COVID-19 related services. Generally it’s rare for an ITT to clarify the problem the organisation is trying to solve, or the outcomes they are looking to achieve. Instead they tend to solely focus on delivering a Discovery or Alpha etc. The outcome is stated as completing the work in the timeframe in order to move to the next stage; not as a problem to solve with clear goals and scope.

We spend a lot of time submitting questions trying to get clarity on what outcomes the organisations are looking for, and sometimes it certainly feels like organisations are looking for someone to deliver them a Discovery solely because the GDS/Digital Service Standard says they need to do one. This means, if we’re not careful, halfway through the Discovery phase we’re still struggling to get stakeholders to agree the scope of the work, and why we really do need to talk to that group of users over there that they’ve never spoken to before.

[Image: the GDS lifecycle]

The GDS lifecycle, and how it currently ties into procurement and funding (badly), means that organisations are reluctant to go back into Discovery or Alpha when they need to, because of how they have procured suppliers. If as a supplier you deliver a Discovery that finds there is no need to move into Alpha (because there are no user needs), or midway through an Alpha you find the option you prioritised for your MVP no longer meets the needs as anticipated, clients still tend to view that money as ‘lost’ or ‘wasted’, rather than accepting the value in failing fast and stopping or changing to do something that can add value. Even when clients do accept that, sometimes the procurement rules that brought you on to deliver a specific outcome mean your team can’t pivot onto another piece of work, as that needs to be a new contract. Either scenario could mean that as a supplier you lose the contract you spent so much time getting, because you did ‘the right thing’.

We regularly pick up work midway through the lifecycle; sometimes that’s because the previous supplier didn’t work out, sometimes it’s because they were only brought in to complete the Discovery or Alpha and, when it comes to re-tender, another supplier is now cheaper. That’s part and parcel of being a supplier; but I know from being ‘client side’ for so long how hard that can make it to manage corporate knowledge.

Equally, as a supplier, we rarely see things come out for procurement in Live, because there is an assumption that by Live most of the work is done; and yet if you follow the intent of the GDS lifecycle, rather than how it’s often interpreted, there should still be plenty of feature development, research and so on happening in Live.

This in turn is part of the reason we see so many services stuck in Public Beta. Services have been developed by or with suppliers who were only contracted to provide support until Beta. There is rarely funding available for further development in Live, but the knowledge and experience the suppliers provided has exited stage left, so it’s tricky for internal teams to pick up the work to move it into Live and continue development.

Most contracts specify ‘knowledge transfer’ (although sometimes it’s classed as a value add, when it really should be a fundamental requirement) but few are clear on what they are looking for. When we talk to clients about how they would like to manage that, or how we can get the balance right between delivering tangible outcomes and transferring knowledge, knowledge transfer is regularly de-scoped or de-prioritised. It ends up being seen as less important than getting a product or service ‘out there’; but once the service is out there, the funding for the supplier stops and the time to do any proper knowledge transfer is minimal at best; and if not carefully managed, suppliers can end up handing over a load of documentation and code without completing the peer working/ lunch and learns/ co-working workshops we’d wanted to happen.

Some departments and organisations have got much better at getting their commercial teams working hand in hand with their delivery teams; we can always see those ITTs a mile off, and it’s a pleasure to see them, as it makes it much easier for us as suppliers to provide a good response.

None of this is insurmountable, but we (suppliers, commercial/procuring managers and delivery leads alike) need to get better at working together to look at how we procure and bid for work; being clear on the outcomes we’re trying to achieve, and properly valuing ‘the value add’.

And this is why we test with users…

A blog on the new National Careers ‘Discover your skills and careers’ Service

As I sit here at ten past ten on a Wednesday night watching social media have a field day with the new National Careers service, I’m yet again reminded of the importance of the Digital Service Standard, especially Standard Number One – Understand users and their needs. And why we need to get Ministers and senior leaders to understand their importance.

The first role of any good User Centric designer or Product Manager within the public sector is understanding the problem you’re trying to solve.

In this case, the problem we’re facing is not a small one. Because of COVID-19 we currently have approximately 1.4M people unemployed with many more still facing redundancy due to the ongoing pandemic. ONS data states that between March and August, the number of people claiming benefits rose 120% to 2.7 million.

The Entertainment, Leisure and Hospitality sectors have been decimated, amongst many others. Just this week we’ve had Cineworld announce 45,000 job losses, and Odeon may soon be following suit. Theatres and live event venues across the country are reporting they are on the brink of collapse.

So, when the Chancellor announced a whole host of support for people to retrain as part of the summer statement, it included advice for people to use the new Careers and Skills advice service to get ideas on new career options.

A service to help people understand new career options right now is a great idea; it absolutely should meet a user need.

A screenshot of the national careers service skills assessment

Unfortunately, you only have to look at the headlines to see how well the new service has been received. The service is currently such a laughing stock that no-one is taking it seriously; which is a massive shame, because it’s trying to solve a very real problem.

A number of my friends and acquaintances have now taken the quiz (as has half of twitter apparently) and it was suggested I have a look. So I did. (As an aside, it recommended I retrain in the hospitality industry, all who know me know how terrible this would be for all involved, last week I managed to forget to cook 50% of our dinner, and I am clinically unable to make a good cup of coffee, never mind clean or tidy anything!)

It has good intentions, and in a number of cases it may not be too far off the mark; the team behind the service have done a write up here* of how they developed it and what they set out to achieve. Unfortunately, while the service seems to be simple to understand and accessible to use, it seems to be missing any level of context or practicality that would help it address the problem it’s being used for.

*EDIT: Which has sadly now been taken down, which is a massive shame, because they did good work, but sadly I suspect under political pressure to get something out there quickly. We’ve all been there, it’s a horrid position to be in.

While they have tested with users with accessibility needs, the focus seems to have been on whether they can use the digital service, not on whether the service actually meets their needs.

My friend with severe mobility and hearing issues was advised to retrain as a builder. Another friend with physical impairments (and a profound phobia of blood) was advised they were best suited to a role as a paramedic. A friend with ASD who also has severe anxiety and an aversion to people they don’t know was advised to become a beautician. Another friend who is a single parent was given three career options that all required evening and weekend work. At no point does this service ask whether you have any medical conditions or caring responsibilities that would limit the work you could do. While you can argue that that level of detail falls under the remit of a jobs coach, it can understandably be seen as insensitive and demoralising to recommend careers to people that they are physically unable to do.

Equally unhelpful is the fact that the service, which has been specifically recommended to people made redundant from the worst hit industries, is recommending those same decimated industries as places to work, with no recognition of the current jobs market.

My partner, who was actually made redundant from her creative role due to COVID-19 (and is the target audience for this service, according to the Chancellor), was advised to seek a role in the creative industries; an industry that doesn’t currently exist; and a quick look on social media proves she isn’t alone.

The service doesn’t actually collect enough (well, any) data about the career someone is currently in, nor does it seem to have any interface to the current jobs market to understand whether the careers it’s recommending are actually viable.
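To make that last point concrete, here’s a minimal, purely illustrative sketch (not the National Careers Service’s actual code, and the data and threshold are invented) of the kind of sanity check a quiz like this could run: before surfacing a recommended career, look it up against live labour-market data and drop suggestions where there simply aren’t enough vacancies right now.

```python
# Illustrative sketch only. VACANCY_COUNTS stands in for a live labour-market
# data feed (e.g. vacancy counts by sector); the threshold is arbitrary.

VACANCY_COUNTS = {
    "hospitality": 1200,
    "creative industries": 150,
    "software development": 9800,
}

MIN_VIABLE_VACANCIES = 500  # made-up cut-off for "a realistic jobs market"


def viable_recommendations(suggested_careers):
    """Filter quiz suggestions down to careers with enough live vacancies."""
    viable = []
    for career in suggested_careers:
        vacancies = VACANCY_COUNTS.get(career.lower(), 0)
        if vacancies >= MIN_VIABLE_VACANCIES:
            viable.append(career)
    return viable


if __name__ == "__main__":
    quiz_output = ["Creative industries", "Software development"]
    print(viable_recommendations(quiz_output))
    # -> ['Software development']; the creative-industries suggestion is
    #    dropped because the (made-up) market data shows too few vacancies.
```

Even a crude check like this, fed by real vacancy data, would stop the service recommending someone retrain into an industry that has just shed most of its jobs.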

Unfortunately, the service is too generic, and while it would possibly help school or college students who are trying to choose their future career paths in a ‘normal’ jobs market (and I honestly suspect that’s who it was actually developed for!), it’s not meeting the fundamental problem we are facing at the moment; i.e. helping people understand their career options in the current market.

If you’ve worked within Digital in the public sector you’ve had to deal with Ministers and Directors who don’t really understand the value of user research, or why we need to test things properly before we roll them out nationally. The current debacle with the careers website is possibly a perfect example of why you need to test your service with a wide range of users regularly, not just rely on assumptions and user personas; and why it’s important to test and iterate the service with real users multiple times before it gets launched. It highlights the need for us to get Ministers to understand that rushing a service out quickly isn’t always the right answer.

We all need to understand users and their needs. Just because a service is accessible doesn’t mean it solves the problem users are facing.

Do Civil Servants dream of woolly sheep?

The frustration of job descriptions and their lack of clarity.

One of the biggest and most regularly occurring complaints about the Civil Service (and the public sector as a whole) is its mismanagement of commercial contracts.

There are regularly headlines in the papers accusing Government Departments and the Civil Servants working in them of wasting public money, and there has been a drive over the last few years to improve commercial experience, especially within the Senior Civil Service.

When, a few years ago, my mentor at the time suggested leaving the public sector for a short while to gain some more commercial experience before going for any Director level roles, this seemed like a very smart idea. I would obviously need to provide evidence of my commercial experience to get any further promotions, and surely managing a couple of £500K–£1M contracts would not be enough, right?

Recently I’ve been working with my new mentor, focusing specifically on gaining more commercial knowledge, and last month he set me an exercise to look at some Director and above roles within the Digital and Transformation arena to see what level of commercial experience they were asking for, so that I could measure my current level of experience against what is being asked for.

You can therefore imagine my surprise when this month we got together to compare four senior level roles (two at Director level and two at Director General) and found that the amount of commercial experience requested in the job descriptions was decidedly woolly.

I really shouldn’t have been surprised; the Civil Service is famous for its woolly language, and policy and strategy documents are rarely written in simple English after all.

But rather than job specifications with specific language asking for “experience of successfully managing multiple multi-million pound contracts”, what is instead called for (if mentioned specifically at all) is “commercial acumen” or “a commercial mindset”, with no real definition of what level of acumen or experience is needed.

The Digital Infrastructure Director role at DCMS does mention commercial knowledge as part of the person specification, which it defines as “a commercial mindset, with experience in complex programmes and market facing delivery”.

And this one from the MoD, for an Executive Director Service Delivery and Operations, calls for “Excellent commercial acumen with the ability to navigate complex governance arrangements in a highly scrutinised and regulated environment”.

Finally we have the recently published Government CDO role, which clearly mentions commercial responsibilities in the role description, but doesn’t actually demand any commercial experience in the person specification.

At which point, my question is: what level of commercial acumen or experience do you actually want? What is a commercial mindset, and how do you know if you have it? Why are we being so woolly about defining what is a fundamentally critical part of these roles?

How much is enough?

Recent DoS framework opportunities we have bid for or considered at Difrent have required suppliers to have experience of things like “a minimum of 2 two million pound plus level contracts” (as an example) to be able to bid for them.

That’s helpful: as Delivery Director I know exactly how many multi-million pound contracts we’ve delivered successfully, and can immediately decide whether as a company it’s worth us putting time or effort into the bid submission. But as a person, I don’t have the same level of information needed to make a similar decision on a personal level.

The flip side of the argument is that data suggests women especially won’t apply for roles that are “too specific” or have a long shopping list of demands, because women feel they need to meet 75% of the person specification to apply. I agree with that wholeheartedly, but there’s a big difference between being far too specific and listing 12+ essential criteria for a role, and being so unspecific that you’ve become decidedly generic.

Especially when, as multiple studies have shown, job titles in the public digital sector are often meaningless. Very rarely in the public sector does a job actually do what it says on the tin. What a Service Manager is in one Department can be very different in another.

If I’m applying for an Infrastructure role I would expect the person specification to ask for Infrastructure experience. If I’m applying for a comms role, I expect to be asked for some level of comms experience; and I would expect some hint as to how much experience is enough.

So why, when we are looking at Senior/ Director level roles in the Civil Service, are we not helping candidates understand what level of commercial experience is ‘enough’? Private sector job adverts for similar level roles tend to be much more specific about the amount of contract level experience and knowledge needed, so why is the public sector being so woolly in its language?

Woolly enough for you?

*If you don’t get the blog title, I’m sorry; it is very geeky, and a terrible Philip K. Dick reference. But it amused me.