Public Health England

Evaluating digital health products

Pill glass

How might we enable people to understand the effectiveness of the digital health products that they create? 

Highlights

10,000s of site views

The evaluation guide I designed became a hugely popular part of the Public Health England site, attracting thousands of users each month. The guide is live and helps people evaluate their digital health products more effectively.

24 Usability testing sessions

During the Alpha and Beta phases of the project I conducted multiple rounds of usability testing to make the evaluation service as simple and effective as possible.

7 Pilot Projects

During the pilots I delivered workshops to many digital health teams to support them in evaluating their health products. Those teams were then able to go on to teach others about evaluation.

13 Evaluation training workshops

I carried out evaluation training workshops across Public Health England, NHS Digital and a variety of start-ups.

Challenge      

There are now over 318,000 health apps available on the top app stores worldwide – with more than 200 apps being added each day (IQVIA, 2017). Public Health England recognised that across the health sector lots of digital health products are being developed, but we have little knowledge of whether they achieve their intended health outcomes. This made it difficult to improve the efficacy of digital health products, and difficult to fund and celebrate those that were effective.

I worked as the lead service and interaction designer in an agile, multidisciplinary team to design a service that supports digital teams to better evaluate their products.

Personas

We developed 3 personas to understand the user needs of people who would use the Public Health England Evaluation Service. There were 3 main stakeholder groups: founders of start-ups, who knew how to create compelling digital health products but didn't necessarily have a health evaluation background; product managers, who worked within central and local government delivering digital products; and public health consultants, who would be involved in evaluation but didn't necessarily have a background in digital. We understood that each of these groups would have different needs and ways of interacting with the service.

Founders of start ups persona
Public health consultants persona
Product managers persona

We identified that teams may not have the expertise to carry out evaluation. As well as this, they may not have the support and the organisational culture they need to evaluate their products. To address this, we designed an evaluation service which includes an evaluation guide, training and a community to share knowledge and ask questions.

Solution

We developed a solution that would support teams inside and outside of government to better evaluate their digital health products. The evaluation service included: easy-to-run training, an accessible evaluation guide, and an evaluation methods library. We tested the solution extensively with teams working on digital health products, who used the service to understand the effectiveness of their products. The service passed its GDS assessment and is live on GOV.UK.

Image showing NHS Digital team taking part in evaluation training

Easy to Run Training

In collaboration with evaluation experts I created an evaluation training package, so that an evaluation novice can:

  • learn about evaluation and carry out evaluation activities
  • be better informed to commission evaluation

Many teams participated in the evaluation training, and their feedback allowed us to iterate on it. This made the training understandable to users and ensured they learn the evaluation basics.

Image of the new PHE Evaluation Guide we developed

An Accessible Evaluation Guide    

The guide gives step-by-step instructions on how to carry out evaluation. I designed and coded a prototype to test the guide. A content designer and an evaluation expert were brought on board to create the guide’s content. We carried out 6 rounds of usability testing to make the guide intuitive for people who are new to evaluation. Our accessibility audit and testing ensured that people with a variety of access needs can use it.

Information Architecture of Guide

The image above shows how the evaluation guide worked. Because there is a lot of content on the guide, I spent time understanding how people would move through the site. We carried out usability testing to hone this.

Evaluation Methods Library

How I developed the evaluation methods library

Evaluation methods are one of the most important parts of the guide. I prototyped multiple versions of the evaluation methods library. I then spent time with the team gaining feedback on which design worked best.

Menu Design

How I developed the Evaluation Guide menu

I developed multiple options for the evaluation menu design, and tested them all with users.

Service Blueprint

Service blueprint of the whole service

This shows how the different parts of the evaluation service can be used at different points in a team's journey. We advise that a team creates a logic model early on, in order to understand their intended health impact and collect the right data.

Team

  • Product Manager: Hasan Ali
  • Content Designer: Flora Death
  • Service and Interaction Designer: Charlotte Fountaine
  • Service Owner: Kassandra Karpathakis
  • Delivery Managers: Rosmari Mitova, Bobak Sadaat
  • Evaluation Expert: Henry Potts
  • User Researchers: Anya Zeitlin, Claire Rackstraw

Impact    

The evaluation guide is available to all on GOV.UK. The training has been run with 15 teams across the health sector, with a positive response. This was one of the first digital products to be delivered at Public Health England Digital, trailblazing the use of design in the department.    

I spoke at Service Lab about the project.


Project reception

"

This project has shown innovation in it's wide perspecive, by looking out and up not down and in. It's also helped influence policy-led design, through a more rounded view of evaluating performance.

Ian Roddis

Related work

Check out some of the other projects I've been working on.