Catherine Obradovic, a GDS assessor, talks about what to look for when assessing support for digital services.
Part one of this blog gave some tips for self-certification assessors on how to assess assisted digital user research.
Here’s what we look for when we assess a service team’s approach to designing support for their digital service.
Unlike the on-screen part of the service, the support isn't demonstrated in an assessment. So the service team needs to describe how the support will meet user needs and the Digital Service Standard.
Tips to consider during assessments
1) Does the proposed support meet user needs identified through user research?
It sounds obvious for a Digital Service Standard assessment, but service teams don't always talk about how their plan to support users is based on the user needs they've identified. Good, specific user research should show what type of support a service team needs to put in place, whether that is face-to-face support, telephone, web chat, a combination of those or something different.
2) Does the proposed support meet user needs, not government needs?
Government already provides offline support to people as part of some transactional services. The difference with assisted digital support is that government has committed to developing support that is tailored to user needs for specific services and of a consistently good standard across multiple services and departments.
So it's fine for service teams to say that they are planning to use existing support providers to deliver assisted digital support, but the team must also say what changes will be made to bring that support in line with user needs and the Digital Service Standard.
3) Does the proposed support meet user needs, not user preferences?
As assessors, we don’t need to know what users would like or whether users prefer a particular provider or route. Service teams should talk about developing appropriate support to meet the specific needs of their users in completing the redesigned digital service.
4) Does the support for the end-to-end user journey meet the Digital Service Standard?
Service teams should demonstrate that they are developing good assisted digital support by talking about the user journey from end to end.
For example, teams should talk about how users will become aware of the service and the support available (providing only a phone number online won't reach users who are offline), as well as showing how the support is easy for users to access, at the times and places they need it. They should have thought about how every step in the digital service will be handled for or by people who aren't online, for example digital confirmations and notifications, or verifying identity.
The service team should understand the context of the service and say how they plan to make the user experience joined up and consistent across other relevant government services. The team should provide details of how the support and wider service helps users develop skills to use the online service independently. And we want to hear how they will collect feedback and measure the support, so that they can make ongoing improvements to the whole digital service.
5) Has the proposed support been tested, and will it work for users from end to end?
In the beta assessment, the service team should talk about how they plan to test support in public beta. Testing should identify areas for improvement across the end-to-end user journey and show whether the proposed support will work for potential users. The team should talk about these findings in the live assessment.
Service teams must explain why they plan to test and provide certain assisted digital routes and, importantly, why they are ruling out others. They need to say why the support is appropriate for their users. For example, users who don't have the relevant skills or access to a computer can't use telephone talk-through support, so it's unlikely that this route alone will be sufficient for most services. Some services might not be able to take information on behalf of the user over the telephone, for example if the service is very long or complex, or for legal reasons. If the only way to meet user needs is through face-to-face support, service teams must provide this.
6) Is the support sustainable?
Service teams may find that a portion of their users say they would ask a third party for help with completing the service. This often includes charities, friends and family, or paid-for intermediaries. This support is not sustainable: charities rely on donations and may have long waiting lists for appointments, friends and family may not always be there, and not everyone can afford to pay an intermediary. It's not sufficient for service teams to say that they are pointing users in the direction of 'free' (to government) help.
So teams need to demonstrate how they will provide sustainable support to those users if they choose not to, or are no longer able to, use a third party. This could include funding or entering a formal agreement with organisations that are already providing support for the existing service, or pairing up with a similar government agency which has appropriate support.
Context is everything
These are tips, but there are no right answers. Assessments give service teams the opportunity to explain the specific context of their service and why they have made certain decisions in designing their support. As an assessor, I just need to be convinced that the service team has considered the needs of all their users - not just those who can use the digital service independently - and that there is appropriate support in place for those who can't.
We’ve been running briefings for self-certification assessors in government departments, and assessors across all disciplines have said they found it useful. If you’re a self-certification assessor and you’d like a briefing on how to assess support for users, please get in touch.
Follow Catherine on Twitter and don't forget to sign up for email alerts.