Change can be exciting for your business. You’re hoping that a website redesign, or updates to a few key elements, will boost your results like never before.
This is why it’s disappointing when A/B test results are not what you expected.
Let’s say you’ve made a couple of changes that you strongly feel will improve the user experience, only to get the equivalent of a “meh” in the test results.
“But we made these changes based on feedback and hypotheses derived from baseline data,” you say, wringing your hands. “How can this be?”
If you have put appropriate time and effort into developing hypotheses from data, then it’s worth taking a second look to figure out why you’re getting a lukewarm response. Sometimes, with further investigation and testing, you find that your ideas were great, but what you were coming up against was something else entirely.
Something that tends to be built in to the human condition…
Change aversion and A/B test results
Think of a time you were uncomfortable with a change. Sometimes it’s a simple change in routine like getting more exercise, or it might be a big change, like moving to a new country. In any case, there’s always some level of uncertainty and discomfort, which can lead to change aversion.
This resistance is thought to be at the root of why humans often struggle to make meaningful life changes: our brains have to work harder to process the unfamiliar, and that extra effort can be quite disconcerting.
“When you’re learning something new, your prefrontal cortex must work very hard as you experiment with unfamiliar ideas. Since your brain uses 25% of your energy, no wonder you feel tired and your head hurts when learning!” – Forbes
Despite all of this, we obviously do make changes from time to time. It’s just that implementing the change may require an adjustment period in the short-term. If you developed a new exercise routine or moved to a new country, you probably got used to it after a while, right? It becomes the new norm.
When you’re looking with surprise at A/B test results which you were sure would reflect positive sentiment for the change, just remember that early results will often skew negative. This is the adjustment period, the time when users’ brains are working harder to process the unfamiliar. Some people naturally cope better with change than others, so you may well see mixed results.
Give your A/B tests time
Change aversion usually smooths out over time. That gym routine you now follow religiously was once completely foreign, right? This is a great reason to make sure that your A/B tests are conducted over a period of time that is long enough for a fair assessment.
You will usually find that initial resistance to change drops away given time. In fact, not running tests for long enough is a key mistake that companies often make.
This raises the question: how long should you run an A/B test? There’s a lot of different advice out there. Some say that a week should be the minimum, as long as you can gather statistically significant results in that time. We tend to run tests for at least three weeks so that we’re able to see a good range of traffic and a healthy sample size, but a company with huge amounts of traffic might get statistically significant results in a week.
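A rough way to decide how long a test needs to run is to work backwards from the sample size required to detect the lift you care about. The sketch below uses the standard normal-approximation formula for comparing two conversion rates; the function name and the example numbers (a 3% baseline rate, a 10% relative lift) are illustrative assumptions, not figures from this article.

```python
import math
from statistics import NormalDist

def sample_size_per_variant(baseline_rate, relative_lift,
                            alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a relative
    lift in conversion rate (two-sided two-proportion z-test)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. 1.96 for alpha=0.05
    z_beta = NormalDist().inv_cdf(power)           # e.g. 0.84 for 80% power
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * pooled * (1 - pooled))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Hypothetical example: 3% baseline conversion, hoping to detect a
# 10% relative improvement (3.0% -> 3.3%).
n = sample_size_per_variant(0.03, 0.10)
daily_visitors = 2000  # assumed traffic split across both variants
days_needed = math.ceil(2 * n / daily_visitors)
```

With numbers like these, each variant needs tens of thousands of visitors, which is why a low-traffic site may need several weeks even when a high-traffic site could finish in days. Note that this says nothing about change aversion: even after reaching significance, early negative reactions argue for letting the test run through the adjustment period.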
Remember your testing when going live
Here’s a scenario that has played out a number of times before. A company A/B tests changes, such as a new layout or functionality, picking a segment of “ideal” users to test the changed version on.
After a while testing and gathering feedback from the test group, they find that results are positive for the changes overall. Their idea is a winner and they’re ready to roll it out to their wider audience.
When the changes are in place for everyone, negative feedback starts coming in. People report that they just “don’t like” the changes, and the company receives many messages about it. All of this feedback can cause the company to question its decision to make changes. Does everyone really hate the new design?
This exact situation happened to Netflix when they rolled out a redesign in 2011. There was an outcry among users and among media commentators. Did Netflix back down and revert to the old design?
No, they didn’t. They’d done extensive testing ahead of the redesign and had a confident baseline showing that overall, the new version was popular among test subjects. What they were finding on the full rollout was that people grumbled at first due to their aversion to change. The new design was better, but people were simply used to the old one. After a while, those people got used to the new design too, and their A/B test results were proven correct.
The lesson? If you’re confident that you did conduct extensive testing, don’t be tempted to do an about-face on changes too soon! Allow time for that natural aversion to change to subside.
How to mitigate aversion to change
You now know to expect some resistance to the changes you make, so here are a few things you can do to mitigate that reaction:
Change incrementally and communicate well
Most often, it’s the big, radical changes that users find to be really jarring. If you rip the current web design out from underneath them and immediately implement a huge change, you’re likely to face stronger resistance (unless your website was in a terrible state already).
If you’re able to, this is a strong case for making incremental changes, testing as you go, and thoroughly communicating the rationale behind what you’re doing.
Give users plenty of warning, with messages such as “we’re introducing a new website design soon to improve your experience.” Explain the value of what you’re doing: “our cleaner checkout process will save you time.”
Provide instructions and support
There are a lot of articles written about the importance of onboarding, especially in the SaaS space. No matter what sort of business you run, you can look at any changes as creating a need for user onboarding.
This means providing very clear instructions about how any changes work. Some websites tackle this by having a “walk through” pop up when the user logs in for the first time after a change. You might offer other means of support such as instructional videos or pop-up messages.
Another thing some sites are able to offer (depending on what sort of site and the nature of the changes), is to allow users to toggle between old and new versions for a period of time. This helps to ease them into the changes.
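One simple way to implement that toggle is to honour a stored user preference for a limited transition window, after which everyone moves to the new design. This is a minimal sketch under assumed names; the cut-off date and function names are hypothetical, not from the article.

```python
from datetime import date

# Hypothetical end of the transition window, after which the old
# design is retired for everyone.
TOGGLE_CLOSES = date(2025, 1, 31)

def layout_for(prefers_old: bool, today: date) -> str:
    """Return which layout to serve: the user's stored preference
    wins only while the transition window is still open."""
    if prefers_old and today <= TOGGLE_CLOSES:
        return "old"
    return "new"
```

The design choice here is that the toggle is temporary by construction: users who need more time get it, but the codebase never has to support two layouts indefinitely.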
Provide feedback channels
Lastly, provide users with feedback channels so that they’re able to let you know what they think, or to make specific requests themselves. To make this as effective as possible, it’s important to communicate with them and let them know how you’re acting on their feedback.
Give clear instructions on how users can provide feedback. For example, direct them to a contact form, use a chat function, or use a tool which pops up requesting feedback, such as Qualaroo.
As a bonus, any feedback that you get may provide you with starting points for future testing.
Human resistance to change is something that should be considered carefully when designing and implementing an A/B testing program. You can expect to encounter some level of aversion, so factor in enough time to get test subjects used to the change.
When tests are stopped early and declared a failure, sometimes the real issue is that the test needed to continue for longer. Aversion to change will usually smooth out over time.
Lastly, where possible, test and make changes incrementally, giving plenty of warning and providing clear rationale for the change. This helps to ease your users into something new, without them feeling that everything familiar was just blown up. It’s like that new gym routine – when you’re able to ease into it, it’s more likely to stick.