How we create user-centred content


Good content brings your business objectives and customer needs together.  It should resonate with customers and build a connection with them.

But when each day brings new deadlines and delivery moves so quickly, how do we ensure the content works for the brand and product, as well as the customer?

There are a number of options, depending on the scope and size of the project.  But here are just a few of the ways we test and optimise our content here at RSA Digital.

A/B testing

In its simplest form, A/B testing means putting two versions of copy and design up against each other to determine which delivers a better outcome.   

This could take the form of a question in a quote journey, how we present product options, or even a whole page on our website.    

The benefit is that you can get something live very quickly this way.  The downside is that, depending on the volume of traffic, you may have to run the test for quite a while before you can tell whether you've achieved the desired outcome.
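One way to judge when a test has run long enough is a two-proportion z-test on the conversion counts for each variant.  This is a general statistical technique, not a description of RSA Digital's own tooling, and the visitor numbers below are purely illustrative:

```python
import math

def ab_significance(conv_a, total_a, conv_b, total_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conv_a / total_a
    p_b = conv_b / total_b
    # Pooled conversion rate under the null hypothesis of no difference
    p = (conv_a + conv_b) / (total_a + total_b)
    se = math.sqrt(p * (1 - p) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, p_value

# Illustrative numbers only: 120 of 2,400 visitors converted on version A,
# 156 of 2,500 on version B
p_a, p_b, p_value = ab_significance(120, 2400, 156, 2500)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  p-value: {p_value:.3f}")
```

With these made-up figures the p-value lands just above the conventional 0.05 threshold, which is exactly the "keep the test running" situation described above: B looks better, but there isn't yet enough traffic to be confident.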

A/B testing doesn’t always produce the result you were expecting, but that’s OK.  In digital you can ‘fail fast’, and that’s how we learn and improve what we’re doing.

We’ve A/B tested email content too – sending one article to some customers and another to the rest, to see which gets the higher click-through and view rates.  This is great for finding out what type of content your customers find engaging and useful.

Scenario testing

If content sits in a particular part of a customer journey, for example an email or a confirmation page, how can you ensure what you’re telling the customer is always correct? 

We find journey mapping and scenario testing help with this.  On a piece of paper, map out the journey flow.  Is the user coming into your content from somewhere else?  What has the previous communication told them – is your message consistent with what they have just seen?  What does their onward journey look like?  This is where working in silos is really dangerous – if you change a web journey, you need to consider whether it impacts on the broader customer experience.

We make sure that we understand the end-to-end customer journey to test our copy – this means working with customer documentation teams, marketing teams, and call centre teams.  We’ll often ask these colleagues to validate our copy.

Peer reviews

Sometimes you have been working on a project for so long that you can’t see the wood for the trees.  Taking your content to someone who isn’t on the same project and seeing whether it makes sense to them can be a real eye-opener.

This can be anyone else in your team, or even someone outside of the business (but not on the project) – and if they have little knowledge of your subject matter, all the better.  If they ask you what something means, it’s a sure sign you need to rewrite that sentence to make it clearer.  If they don’t understand it, you can be sure that your customer won’t.

User testing

User testing can be done in a number of ways.  It can be as simple as putting your copy in front of someone on screen and watching how they respond and interact.  Do they do what you want them to do next?  Do they ask questions?  It’s also wise not to check desktop use alone.  Give the user a mobile phone or tablet to see how they interact with it, because many of your customers will be on small devices.

If we need more than one person to test, sometimes we’ll go to a local coffee shop or set up camp in the staff canteen and approach civilians with a small incentive, such as a voucher or free coffee.  Then we’ll put a prototype web journey in front of them, and see how they use it.  The benefit of this is that you get to check whether people outside of your industry understand your content.   It’s best to set the tone and explain that you didn’t create the copy/journey, and that you’re simply testing it.  You’ll get a more honest response.

Don’t be offended if they question the copy or criticise the journey.  That’s exactly why you’re testing. 

Take observational notes that you can take back to the office and learn from.  Some feedback will simply be a point of view – ‘I don’t like that word’, for example.  But fundamental issues, such as a lack of understanding, mean that you need to revisit your copy and then test again.

If we’re designing a large customer journey, such as a purchase or servicing journey, we’ll go to a lab.  A facilitator will sit with the user, while we observe from behind a window. 

Again, taking observational notes on each user and how they react to your copy is very enlightening.  But the benefit of this kind of testing is that you can iterate the copy as you go along. 

For example, person one didn’t understand the word ‘underwriter’.  For person two you amended your wireframe to say ‘insurance provider’, and the user understood.  There will be instances where you need to watch a few users to understand whether the copy is the problem.  Sometimes it’s not the copy, it’s how it’s presented or the context, so it’s important not to make too many knee-jerk changes on the day.  Watch a few users; if a problem starts recurring, it’s a sure sign you need to revisit your work.

Whatever method you employ to test your copy, it’s vital to make sure you act on the results.  Stay open-minded and don’t take any criticism as a negative.  It’s an opportunity for continuous improvement, and that can only be a good thing.