If your website refresh is rubbish, check your processes

For most organisations, the website is a critical user touchpoint. As a result, it gets a lot of attention, and investment is ring-fenced to ensure the website can be updated and refreshed throughout the year. Yet some 80-90% of the changes funded by that investment… achieve nothing. In fact, according to Stefan Thomke, author of Experimentation Works: The Surprising Power of Business Experimentation, among others, as many as 10% will have a negative impact on users.

So, why aren’t you changing your process?

At the heart of this is a simple question: how baked into your process are data and insights?

The first point to make is that data should be the prompt for the change in the first place. Whether it’s insights from aggregated data like web analytics or behavioural analytics tools, or more qualitative, individual feedback from session replays, customer service or on-site surveys, there’s plenty of evidence that data-driven changes outperform opinion-led ones. We estimate from our internal testing hub that a change is about twice as likely to elicit an impact when it’s prompted by a data-driven piece of insight.

To put that into clearer perspective, 20-25% of opinion-based website changes lead to a statistically significant result (i.e. a result we believe to be valid, and not due to chance or luck). That’s compared to almost 50% of data-driven website changes. And that doesn’t have to mean a positive change: a negative change can also be valuable, because it at least gives us an opportunity to learn.
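To make “statistically significant” concrete, here’s a minimal Python sketch of a two-proportion z-test: a standard way of checking whether a variation’s conversion rate differs from the original’s by more than chance would explain. The numbers are purely illustrative, not results from our testing hub.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a / n_a: conversions and visitors for the original (control).
    conv_b / n_b: conversions and visitors for the variation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that there is no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative numbers: 10,000 visitors per version,
# 400 conversions on the original vs 460 on the variation.
z, p = two_proportion_z_test(400, 10_000, 460, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05, so we'd call this significant
```

In this made-up example the variation converts at 4.6% against the original’s 4.0%, and the p-value comes out below 0.05, so we’d treat it as a valid result rather than luck.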

Secondly, data should be used to validate the change that was made. Of course, we can always fall back on a quick “before and after”. What was the data like last week, before the change? What does it look like this week, now we’ve made the change? Kudos for trying, but there are so many factors influencing last week and this week that the comparison is close to futile.

What every organisation should be looking to bake into non-critical website changes (i.e. anything that isn’t simply fixing a bug or error) is testing the original versus the variation. That might mean using different URLs and setting up redirects for half the traffic, using a CMS plug-in or, ideally, a dedicated website testing tool. This allows you to clearly establish a) whether there was a difference in behaviour, and b) whether you learned more about your prospective customers.
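As a sketch of what that 50/50 split might look like under the hood (a hypothetical illustration, not any particular tool’s API), here’s a simple server-side assignment in Python. Hashing the visitor ID together with the experiment name makes the assignment deterministic, so a returning visitor always sees the same version.

```python
import hashlib

VARIATION_SHARE = 0.5  # send half of all traffic to the variation

def assign_bucket(visitor_id: str, experiment: str) -> str:
    """Deterministically assign a visitor to 'original' or 'variation'.

    Hashing the visitor ID together with the experiment name means the
    same visitor always lands in the same bucket, with no state stored.
    Both argument names are illustrative, not a specific tool's API.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    fraction = int(digest[:8], 16) / 0xFFFFFFFF  # map the hash onto [0, 1]
    return "variation" if fraction < VARIATION_SHARE else "original"

# In a request handler you might then redirect accordingly, e.g.:
# if assign_bucket(visitor_id, "homepage-refresh") == "variation":
#     redirect to the variation URL; otherwise serve the original
```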

Learning is the key piece here. There are so many facets of a business which are fixed and far harder to test changes in – editorial content, leaflets and brochures, linear out-of-home (OOH) and TV – that being able to apply what we learn from web experimentation is invaluable. And that can be as simple as the final call-to-action (CTA). Take a look at the most recent piece of non-digital or non-dynamic marketing collateral you’ve created. Was the CTA chosen as a result of insights, was it used because it looked good, or was it simply what you always do?

Likewise, think about the last time your SEO team or agency asked you to tweak some page content in order to improve your search engine rankings. Did that change in language have an impact on your users and how likely they were to convert?

And, finally, data can also be used to continue that cycle of insights.

Did a particular change have a bigger impact for one group of users than another? If something didn’t work, can we work out why? Can we vary the treatment based on what we know and can learn about your website users – whether that’s the marketing channel they came through or their current intent level?
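As a minimal illustration of that kind of segmented read-out (the data and column names here are invented for the example), the short pandas sketch below compares conversion rates for the original and the variation by marketing channel:

```python
import pandas as pd

# Invented experiment export: one row per channel-and-version pairing,
# with visitor and conversion counts. Column names are illustrative.
df = pd.DataFrame({
    "channel":     ["paid", "paid", "email", "email", "organic", "organic"],
    "version":     ["original", "variation"] * 3,
    "visitors":    [5200, 5100, 1900, 2000, 3100, 3000],
    "conversions": [208, 255, 95, 98, 124, 105],
})
df["conv_rate"] = df["conversions"] / df["visitors"]

# Compare original vs variation within each channel.
by_channel = df.pivot(index="channel", columns="version", values="conv_rate")
by_channel["lift"] = by_channel["variation"] / by_channel["original"] - 1
print(by_channel.round(3))
```

In these made-up numbers the variation wins comfortably with paid traffic but loses with organic – exactly the kind of per-segment finding that prompts the next round of data-driven changes.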

If you want to ensure the changes to your website are good ones, there are a few critical questions to ask yourself:

1. Is investment in your website planned into this year’s budget?
2. Do you have a way of understanding which of those investments will have the impact you want?
3. If not, what can you change about your processes to get to that point?

Oliver Walker is managing director of Hookflash