Project overview

Through research, we identified that the assessment process and guidance for digital services are inconsistent at all stages. This is because guidance around the process is confusing.

For example:

  • there's no single place to find out about discovery peer reviews
  • guidance lives in multiple places across the intranet and Teams folders
  • guidance is often contradictory or out of date
  • people are unaware of what assurance is available or whether an assessment is needed

We also identified ways to improve the efficiency of the service for the internal team who manage it.

Our users are split into 3 groups

Service delivery teams, who:

  • explore the problem
  • build a service or product
  • book an assessment or peer review
  • share their work with assessors

Service assessors, who:

  • review the work the team have done during an assessment
  • provide advice and support on how the team is performing against the Service Standard
  • identify areas for improvement

Service assessment team, who:

  • book assessments and peer reviews
  • recruit assessors
  • prepare service teams
  • arrange pre- and post-assessment calls
  • share and publish the report
  • report to senior leadership and spend control on assessment results

We also identified a range of secondary users, which include portfolio leadership, heads of profession, and other government departments.

Ultimately, by improving how services are assessed, and raising awareness of the Service Standard, we will see an improvement in the quality of the services we develop.

What we’ve done so far

We’re currently in beta phase, building out the service ready to test with the service assessment team, assessors and service teams in April 2024.

What we did in discovery and alpha

We spoke to 68 people from all 3 primary user groups during research. This was to learn about the assessment and peer review experience in DfE. We tested guidance content for teams to find out about and book discovery peer reviews.

We considered the role of assessors, to understand:

  • their responsibilities at an assessment or peer review
  • how they write and submit the assessment report
  • how they prepare for assessment

Our research included time with the service assessment team to look at how they:

  • book a service assessment or peer review
  • manage and administer assessments

We also spoke to people involved in assessments from across government, including:

  • 9 government departments other than DfE
  • digital colleagues at OneTeamGov events
  • service and delivery teams at Services Week 2023
  • assessors from across government at panel meet-ups
  • CDDO (Central Digital and Data Office)

This helped us understand shared pain points with assessments across government. We also shared our work on the service assessment service to understand whether it could benefit other departments, too.

Next steps

We will continue to build out the service to test with users in our private beta phase. This will include how teams start their assessment journey and find out about what’s involved.


Get involved

If you are interested in contributing to this service, or using it in your own organisation, contact the DesignOps team.

You can follow the progress of the service through the service assessment design histories.

