Insights

User Centred Design and Measuring the Success of Digital

Intro

We made the walk up to The Core for the Showcase: User Centred Design and Measuring the Success of Digital. The line-up consisted of three experts working at DWP Digital. Many thanks to Newcastle City Council and DWP for putting this event on for Digital Leaders Week.

How we Engage with Users and Iterate Services Based on User Needs

The event was kicked off by Jill Pate (Lead User Researcher), who discussed how services can be improved based on user needs. Jill began by outlining the sheer amount of information a user is presented with when viewing their state pension on paper, as it was before a digital version existed: three batches of paperwork, each full of legal jargon and mostly irrelevant information.

Prior to creating a digital solution, several rounds of discovery had to be undertaken to grasp the problem, and it was interesting to hear about the various discovery methods used to gather information from users. We were told about the process users had to go through to find out about their pension forecast, including a back-and-forth between HMRC and DWP (Department for Work and Pensions).

This is where ‘Hypothesis-Driven Design’, described as ‘experiment-led design’, was put to use. Each experiment that was tested and reviewed produced valuable insights that could be acted on. Jill made a great point about experiments that don’t work: if there are issues with the experimental solution you generated, don’t worry, because if nothing else you are continuously learning.

Empathy in Accessibility

The next talk was presented by Craig Abbott (Senior Interaction Designer). This talk centred around accessibility, which aligned with our recently published articles on the subject. Craig began the presentation by outlining some crucial statistics about the prevalence of disability.

One of the stand-out points for me was that accessibility should be implemented from the start, not as an afterthought; this ensures the product is fit for purpose for all audiences alike. It was mentioned that companies try to crowbar accessibility into their sites without doing the research. More often than not this only picks off the low-hanging fruit, meaning major accessibility issues are never taken into account.

So how do we know which issues need addressing? Craig discussed ways to uncover them using appropriate research methods. Sitting down with people with disabilities in their home environment helps put them at ease and lets you see first-hand the issues they face. Unplugging your mouse and navigating via keyboard can help, as can using screen readers, among many other techniques.

There are tools that can analyse the accessibility of a site and surface its issues. However, although these may be good at catching obvious problems such as poor contrast, they typically miss most of the rest: Craig showed that the top automated checker found only 40% of the issues on a test site purposely built to be badly inaccessible. It pays to have an accessible site. Major companies in America are being sued for not being accessible, but above all else, people with disabilities could make up the majority of your traffic and should be treated in a welcoming manner.
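To illustrate the kind of check these automated tools run, colour contrast is one of the few issues with a precise, machine-checkable definition: WCAG 2.x defines a contrast ratio from the relative luminance of the foreground and background colours. A minimal sketch in Python (the formulas are from the WCAG spec; the function names are our own):

```python
def _channel(c):
    # Convert an 8-bit sRGB channel to linear light (WCAG 2.x definition).
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    # Weighted sum of the linearised R, G and B channels.
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    # Ratio of the lighter to the darker luminance, each offset by 0.05.
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background gives the maximum possible ratio, 21:1;
# WCAG AA requires at least 4.5:1 for normal-size text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Checks like this are easy to automate, which is exactly why contrast is among the issues the checkers do catch; most other accessibility problems need the human research methods Craig described.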

Measuring the Success of Digital

Sarah Windass (Performance Analyst) rounded off the talks by discussing the importance of continuously analysing your site. This ties in with user research: analysis builds on the findings of user research, and vice versa.

If your site is going through an update, it is strongly advised to test with users while in the Beta stage. The analysis taken from the old site can provide context as to what worked and what didn’t, helping reveal improvements that will elevate the site experience. Sarah emphasised that analytics are more than just numbers; it’s important to remember that there are people behind those numbers.

One intriguing point was that the real world can heavily influence what the analysis shows. Sarah showed Google Analytics for a DWP site, which normally has a steady stream of traffic, but the graph then showed a sudden spike in web traffic over the space of a day or two. Typically a spike means there is an issue or something that needs flagging, but in this case it was because the BBC had published an article about a change in pensions.

Closing thoughts

In all, it was a great event which raised some excellent talking points. Interestingly, we have touched on some of these points in past articles, but it was good to gain additional information about them and about other points we may not necessarily have thought of ourselves. Thanks to everybody involved for hosting the event.
