The information in this blogpost may now be out of date. See the current GOV.UK content and publishing guidance.
In the GOV.UK content team, we keep an eye on how all our content pages are performing using the Performance platform.
This produces charts and graphics which let us tell at a glance whether something might need our attention.
Alarm bells started to ring recently when we saw a set of graphics which suggested that user engagement in the student finance guide - one of our most popular pages - had plummeted.
What was wrong? Was there an error or broken link on the page? Were we missing some vital information?
We turned to the analytics and customer insight to find out more. And we found an interesting story about how our pages need to reflect the fact that user need isn’t a constant thing - it can change over time.
We checked the Mainstream content tool, which confirmed that the drop was recent.
To be counted as ‘engaged’ with the guide format, a user needs to spend at least 7 seconds on a page or click a link within the main body of the page, and this wasn’t happening.
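The engagement rule described above can be sketched as a simple check (the field names are hypothetical, not GOV.UK’s actual analytics implementation):

```python
# Sketch of the 'engaged' rule: at least 7 seconds on the page,
# or a click on a link within the main body of the page.

def is_engaged(seconds_on_page: float, clicked_in_body: bool) -> bool:
    """Return True if a visit counts as 'engaged' under the guide-format rule."""
    return seconds_on_page >= 7 or clicked_in_body

# Example visits
print(is_engaged(12.0, False))  # stayed long enough
print(is_engaged(2.0, True))    # clicked a body link quickly
print(is_engaged(2.0, False))   # jumped straight off, e.g. to log in
```

A user heading straight to the login page fails both conditions, which is exactly the pattern we were seeing.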
Was this a bad thing?
To find out what was going on we compared periods before and after the drop, and found that the change in engagement was due to a change in user-need.
Most students had finished applying for student finance, which required high engagement with the site. They were now coming purely to log in to their account, which meant they were jumping quickly from the guide to the login page.
Data shows change in user need
The Google Analytics report showed changes in visits from student finance-related keywords that supported this theory:

- traffic for 'student finance login' had increased by 550%, and pages per visit for this keyword had gone down by 23%
- people who arrived from the keyword ‘student finance’ were visiting fewer pages
- there had been a drop in traffic for most of the other student finance keywords
Change content to suit seasonal change in user-need?
The logical fix would be to optimise the login page for login-related keywords. But many users were searching with the generic ‘student finance’ keyword when they wanted to log in to their account. The guide is currently optimised for this keyword; changing that would mean shifting the focus of the generic guide to meet a niche, seasonal need.
Solution: optimise for user-journey, not individual page
We decided to leave the guide overview focused on the generic ‘student finance’ need, and instead make it easier for people to jump from the guide to the login page by moving the login page higher in the related links section of the guide (from 4th to 1st).
Did the change help our users?
Moving the ‘login’ link up in the related links section of the guide sent more traffic over. Clicks on the related link from the Student finance guide to the login page increased by 91% in the week after the change, sending 10K more visits over (from 11K to 21K).
Our tracking doesn’t count people who immediately click on a related link as ‘engaged’, because they haven’t landed on the right content item.
But what about the drop in engagement?
We’re OK with the drop in engagement in this instance. In fact, engagement for the Student finance guide has dropped even further because we’re getting people to the right page quicker.
So it’s simpler, clearer, faster because we’re responding directly to user-needs. This case study highlights the importance of keeping an eye on metrics as a guide to user need - it’s the change in engagement that gave us insight.
We continually review the engagement metrics and are always looking for ways to improve them - and of course we use them in the context of all the other data available. As demonstrated in this example, the important factor is how you interpret and use data to make decisions - just having information available in the first place is only step 1.