
Book a practical driving test: how changing the wording on one button increased clickthroughs by 600%

Categories: User insights

In this guest post, Martin Richardson, Digital Services Manager at the Driver and Vehicle Standards Agency (DVSA), writes about how testing different wording options for their user feedback route helped him understand which version worked best with users.

Hello. I’m the service manager for the ‘book a practical driving test’ transaction, working for the Driver and Vehicle Standards Agency (DVSA).

It’s really important for us to get feedback on user satisfaction once people have finished booking their test with us.

We get this feedback through a ‘Done page’ on GOV.UK - a standard ‘mini-survey’ page which captures a satisfaction rating as well as suggestions for service improvements.

Our problem, however, was that once users landed on the final page within the transaction itself (which has dynamic content), they had completed what they set out to do.

So - what was the most effective way of encouraging users to go back onto GOV.UK and give feedback?

The experiment

We worked with GDS to understand how we might build the final page of our transaction in a way that maximised clickthroughs to the Done page on GOV.UK.

We decided to test four different forms of button text - ‘Done’, ‘Finish’, ‘Finished’ and ‘Give feedback’ - over four successive releases. (In future we aim to use multivariate testing, which will enable such tests to be conducted simultaneously.)

For each, we measured the proportion of users clicking through to the Done page, and the proportion subsequently submitting the mini-survey.
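These two proportions multiply into a single 'net response' rate - the share of all users who both click through and submit the survey. A minimal sketch of that arithmetic, using invented counts (the post does not publish the raw figures behind each release):

```python
# Hypothetical counts for illustration only - not the real DVSA data.
releases = {
    "Done":          {"completed": 10_000, "clicked": 900,   "submitted": 450},
    "Finish":        {"completed": 10_000, "clicked": 6_200, "submitted": 3_100},
    "Finished":      {"completed": 10_000, "clicked": 6_000, "submitted": 3_000},
    "Give feedback": {"completed": 10_000, "clicked": 1_000, "submitted": 500},
}

results = {}
for wording, n in releases.items():
    clickthrough = n["clicked"] / n["completed"]        # final page -> Done page
    submit_given_click = n["submitted"] / n["clicked"]  # Done page -> survey
    results[wording] = clickthrough * submit_given_click  # net response
    print(f"{wording:>13}: clickthrough {clickthrough:.0%}, "
          f"net response {results[wording]:.0%}")
```

With these made-up numbers, the submission rate among those who click through stays at 50% across all four wordings, so the net response is driven almost entirely by the clickthrough rate - which mirrors the pattern the post describes.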

The results

First of all, we looked at the proportion of users that progressed from the final transaction page to the Done page.

There were surprisingly big differences according to the button text used - with the most effective wording over 600% more successful than the least effective:

[Table: clickthrough rate by button wording ('Done', 'Finish', 'Finished', 'Give feedback') - the figures are not recoverable from this copy]
Of all users who clicked through to the Done page, around half submitted the survey – and this proportion did not vary much with different button text. Of all who responded, about 23% included suggestions for improvements or comments.
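Because each wording ran on a separate release rather than side by side, traffic volume matters when judging whether a difference this large is real rather than noise. A two-proportion z-test is one standard check; here is a stdlib-only sketch with invented counts (none of these figures come from the post):

```python
from math import erf, sqrt

def two_proportion_z(clicks_a: int, n_a: int, clicks_b: int, n_b: int):
    """Two-sided z-test for a difference between two clickthrough rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 'Finished' vs 'Give feedback' releases, 10,000 users each.
z, p = two_proportion_z(6_000, 10_000, 1_000, 10_000)
print(f"z = {z:.1f}, p = {p:.3g}")
```

At volumes of thousands of users per release, even much smaller gaps than the one reported here would be statistically significant; the main caveat with sequential releases is that other factors (seasonality, traffic mix) can shift between tests, which is why the post mentions moving to multivariate testing.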

What did we learn?

The wording on buttons can make a huge difference to response levels:

  • when we directly asked for feedback (e.g. “Give feedback”), users tended not to respond! This gave the lowest net response, of just 5%

  • the most effective text was ‘Finish’ and ‘Finished’, where almost 1 in 3 users responded

What next?

After discussing the results with GDS, we agreed to stick with ‘Finished’ for this service. (In fact, GDS say that, based on these findings, they may adopt this as standard for all services.)

This approach may not fit so well for less ‘linear’ services. For example, on the ‘Change Practical Test’ service, where multiple transaction flows are possible, we will be using the ‘Sign out’ button as a trigger to direct users to a Done page.

Benefits of this approach

  • GDS are able to share the data, which not only provides the basis for the mandatory user satisfaction KPI, but also yields some great user insight from the comments submitted.

Incidentally - we’re now getting around 1,200 responses per day, with over 93% giving a positive rating.

Sharing and comments


  1. Comment by Tom Ruppel posted on

    Isn't there something slightly misleading about "Finished" if it then dumps the user in another activity? I thought onscreen text should be straightforward, telling the user what will happen when they click somewhere, not hoodwinking them into entering a survey they might not want/have time to complete. Almost like a fake "close" button on a phishing ad.

    I'm probably misunderstanding the context - a screenshot or two would be really helpful here.

    • Replies to Tom Ruppel

      Comment by Martin Richardson posted on

      Hi Tom – at this stage the user has finished interacting with the service, and the follow-on action of completing the survey is of course optional. We found that the proportion of people arriving at the Done page who gave feedback didn't change much with the different wording, which it probably would have done if people had very different expectations about the call to action.

  2. Comment by Jennifer posted on

    Does clicking on Finished also sign a user out of the service?