How we’re designing GOV.UK Chat

3 mobile phone screens displaying GOV.UK Chat. A user has asked ‘Can I claim capital allowances for a house?’ and GOV.UK Chat has provided an answer, plus 2 links to check it on GOV.UK.

GOV.UK is the digital home of government information and services. More than 11 million people visit us every week and, according to the most recent YouGov polling, GOV.UK is the most-recognised digital service in the UK. Users can find the information and services they need in a number of ways: some people use the menu or homepage elements and browse through pages, some use our internal search and others come from external search engines like Google.

But we know we can do more to make interacting with GOV.UK easier. In our strategy for growth, we’ve made it a priority to explore whether emerging technologies can enhance the experience for our users. Earlier this month, we blogged about the latest stage of our experiments with a generative AI chatbot, GOV.UK Chat.

In this post, we explain in more detail how we’re designing GOV.UK Chat, why our approach to this element of the work is so important, and the reasons behind some of the decisions we’ve made so far.

What it looks like

Key elements of the GOV.UK Chat visual identity

One of the first things you’ll see is that GOV.UK Chat looks different from the GOV.UK website. We use an avatar, describe the tool as ‘experimental’, animate certain elements and include an emoji in the welcome message. All of this is very deliberate, and it helps us to communicate that GOV.UK Chat is:

  • an experimental product that we ourselves are still learning about
  • a more informal, conversational experience
  • a chat product, so you can expect it to do the kinds of things chat products do

As we mentioned earlier, GOV.UK is a very well-recognised brand, with a visual identity – including language, the typeface, the crown logo, colours and design components – that maintains trust and assures users they’re seeing the official source of government information. To arrive at a design that communicated the nature of GOV.UK Chat while protecting the GOV.UK brand, we had to try different approaches. We asked ourselves questions like:

  • how could we communicate the experimental nature of GOV.UK Chat?
  • will users notice that this is different to GOV.UK?
  • how different should we make it?
  • how would this make users feel?

We iterated the design over a few months, with multiple rounds of user testing. We even included some deliberately provocative designs – for example, using different logos or styles for displaying questions and answers – to understand which elements were important to users. Some of the things we found out with this approach were that:

  • GOV.UK’s crown logo indicated to users that the tool was trustworthy
  • boxes made the experience feel more like a chat and drew users’ attention to the answers
  • users told us the colour yellow made GOV.UK Chat feel more friendly

We also experimented with many different avatars. We wanted to avoid relying on visual references to magic and robots, which can obscure how this technology works, and instead explored giving it a friendly face and explaining very clearly how it works.

All of this iteration and testing led us to GOV.UK Chat’s current design. We’ll continue iterating as we learn more about user behaviour, expectations and emerging mental models when using generative AI chatbots.

Check this answer

Any AI product built on the current generation of models carries the risk of hallucination. This is when the system generates a response that contains incorrect information, but presents it as fact. On GOV.UK Chat, one of the ways we’re minimising the impact of hallucination on users is through our ‘check this answer’ component.

In this feature, we give users links to official GOV.UK guidance, which they can use to ‘fact check’ their AI-generated answer before acting on it. These links are to the pages that GOV.UK Chat has used to create the answer, so it should be clear if there are any major differences.
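To make the mechanism concrete, here is a minimal TypeScript sketch of how a generated answer could carry the GOV.UK pages it was created from and render them as ‘check this answer’ links. The data shapes, function names and example content are illustrative assumptions for this post, not GOV.UK Chat’s actual implementation.

```typescript
// Hypothetical shape of a generated answer that keeps track of its sources.
// These names are illustrative, not GOV.UK Chat's real data model.
interface SourcePage {
  title: string;
  url: string; // the GOV.UK page the answer drew on
}

interface ChatAnswer {
  message: string;       // the AI-generated answer text
  sources: SourcePage[]; // pages used to create the answer
}

// Render a 'check this answer' block: the answer followed by links
// back to the GOV.UK guidance it was generated from.
function renderCheckThisAnswer(answer: ChatAnswer): string {
  const links = answer.sources
    .map((source) => `- ${source.title}: ${source.url}`)
    .join("\n");
  return `${answer.message}\n\nCheck this answer on GOV.UK:\n${links}`;
}

// Example usage with made-up content:
const example: ChatAnswer = {
  message: "You may be able to claim capital allowances on business assets...",
  sources: [
    { title: "Claim capital allowances", url: "https://www.gov.uk/capital-allowances" },
  ],
};
console.log(renderCheckThisAnswer(example));
```

Keeping the source pages attached to the answer itself, rather than looking them up separately, is what makes it straightforward to show users exactly which guidance the answer was built from.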

Another benefit of this feature is that it helps users to continue their journey on GOV.UK. For example, if they need to use a service or do more reading around a certain topic, these links can quickly point them in the right direction.

Some of the routes we explored for how checking answers works

This pattern has evolved a lot since the early days of GOV.UK Chat. In the very beginning, we included a list of sources as a way for the team to see which pages GOV.UK Chat had deemed relevant. When we began testing, we found that these sources were important to users too. For example, when they wanted to delve deeper into a topic or wanted reassurance, we’d see users following these links to GOV.UK. It became clear that these pages were a marker of credibility and trustworthiness.

Overall, we know that this pattern makes it easier for users to check answers and improves the experience of using GOV.UK Chat. It prevents dead ends, links journeys up across channels and, importantly, gives users control. We’re excited to continue exploring improvements to this pattern.

Managing trust and expectations

Part of the onboarding journey into GOV.UK Chat

To use AI to meet users’ needs, we need to help them place the right level of trust in our products. This is because the technology is new, complicated, and not always reliable. With GOV.UK Chat, we’ve done this in a few different ways.

First, we needed to make sure that users both saw and understood warnings about the potential for inaccurate information in GOV.UK Chat’s answers. Our earlier versions included some concise content on the start page, then a dedicated ‘before you start’ page, but neither was effective enough. After multiple rounds of testing and iteration, we now display these warnings through a series of timed in-chat messages when a user first accesses the tool, a process known as ‘onboarding’. This allows us to progressively disclose important warnings and context, meaning users can read each message before the next one appears. We’ve also introduced a meaningful decision point – the choice of clicking ‘I understand’ or ‘tell me more’. This extra step benefits users by giving them more control over their onboarding.
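As an illustration of this progressive disclosure approach, the TypeScript sketch below reveals each onboarding message in turn, pausing between them so users can read one before the next appears, and then offers the decision point. The message wording, timing and function names are assumptions made for the example, not GOV.UK Chat’s real code.

```typescript
// Illustrative onboarding messages; the real wording and number of
// messages in GOV.UK Chat may differ.
const ONBOARDING_MESSAGES = [
  "Welcome to GOV.UK Chat. This is an experimental tool.",
  "Answers are generated by AI and may contain inaccurate information.",
  "Use the links provided to check answers against GOV.UK guidance.",
];

const DELAY_BETWEEN_MESSAGES_MS = 2000; // assumed pacing, not the real value

function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function runOnboarding(
  showMessage: (text: string) => void,
  showChoices: (choices: string[]) => void,
): Promise<void> {
  // Progressively disclose each warning, pausing so users can read
  // one message before the next appears.
  for (const message of ONBOARDING_MESSAGES) {
    showMessage(message);
    await sleep(DELAY_BETWEEN_MESSAGES_MS);
  }
  // A meaningful decision point before the user can start chatting.
  showChoices(["I understand", "Tell me more"]);
}
```

The key design choice here is pacing: because each warning appears on its own, users are far more likely to read it than if everything were presented at once on a single start page.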

Our research also showed there were some misunderstandings among users about how generative AI works. For example, some users believed that:

  • answers would only be inaccurate because of their own mistakes when asking questions
  • answers could not be inaccurate, because the information came from GOV.UK
  • if there were any inaccuracies, they would be able to spot them

To tackle this, we’ve taken an educational approach to explaining the risks of generative AI. We explain the concept of hallucination and the use of training data during the onboarding journey, as well as highlighting that incorrect information could still be presented confidently. Importantly, we talk about these risks as inherent and universal limitations of generative AI to prevent users from interpreting them as specific to GOV.UK Chat.

In past testing, we often found that some users overestimated GOV.UK Chat’s accuracy because of its connection to the trusted GOV.UK brand. Following these improvements, just under 80% of research participants understood that GOV.UK Chat can contain inaccurate information. This was a lot higher than in previous testing, when we asked users about their understanding in a slightly different way. Testing with users has also shown that, despite the challenges, GOV.UK Chat has the potential to both help reduce the burden of navigation and summarise complex information on GOV.UK.

Next steps

We’re really excited to see how the private beta goes and how the insights can help us make the product even better. We also know that AI is a fast-moving technology and the kinds of things we can do with it will continue to evolve, so we want to continue exploring new possibilities and design patterns.

Thanks to everyone who has contributed to this work, including the entire AI team, other government departments, and all of our research participants.

If you work in the public sector and have any insights into how to work with AI, you can contribute to our GitHub discussion.

Subscribe to Inside GOV.UK to find out more about our work.
