Why Button Color Isn’t the Best A/B Test

Human beings are fascinated by color.

“Red cars are faster,” “yellow makes people happy,” and “green means go” are just some of the common stories we tell about color, and they shape how people choose and perceive it.

When it comes to conversion optimization, unsurprisingly there’s often an obsession with button color on websites. The idea is that perhaps we can choose an optimal color that promotes more conversions.

From our perspective, button color may not be the ideal A/B test. Focusing on color alone is an overly simplistic view, and changing it rarely produces significant uplifts in your results.

What’s the deal with button color testing? Let’s take a closer look:


Why do people test button color?

Button color testing is something that has had wide publicity, so it’s one thing that many companies turn to when they’re looking to improve conversions. The idea is that the button must not be attractive enough, so by adjusting the color, they hope to entice more clicks.

“How can we make this more attractive to click on?” is a valid and fundamental question. Another important question is whether color even comes into play when we consider a button’s attractiveness.


Are there “universal” best colors?

If you dive into the many often-cited tests pitting one button color against another, you’ll find plenty of opposing arguments. Someone argues strongly for a red button because they saw an uplift in conversions when testing it, while someone else argues just as strongly for the famed “BOB” (Big Orange Button), or even blue!

Many will spend a lot of time diving into color psychology, which is indeed an interesting field. The problem is that, as researchers have found, while there are assumptions made linking certain colors to associated emotions, this is not universal. Our reaction to different colors is likely due to factors such as personal experience, cultural differences, and the way we were raised.

The bottom line is that there are no concrete answers when it comes to evoking desired emotions with the use of color. That is not to say that the use of color psychology is a complete waste of time, but that anyone looking at it should go in willing to accept that there are no absolutes.

For almost any common button color, you will find a study somewhere that suggests it is the “best” color. The answer is that there is no “universal” best color for a button at all.

Is button color important?

In the scheme of A/B testing, button color is definitely a factor. However, it’s not really about saying “red is better.” In fact, while we can definitely find data to suggest button color is important, we would argue that it’s not THE most important thing.

If button color is the only thing you change (the only fair way to test it), then in our experience that change is so small that any resulting uplift is also quite small. We prefer to focus on bigger changes that can yield bigger uplifts. Button color testing might get you some kind of result, but testing other factors can make a more significant impact.
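To see why small uplifts are hard to prove, consider how a color-only result would be evaluated. Here is a minimal sketch of a two-proportion z-test in Python (using scipy); the visitor and conversion counts are made up for illustration:

```python
# A hypothetical check: did a color-only change produce a significant uplift?
# The numbers below are invented for illustration.
from math import sqrt
from scipy.stats import norm

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * norm.sf(abs(z))  # survival function = 1 - CDF
    return p_a, p_b, z, p_value

# 2.0% vs. 2.2% conversion, with 25,000 visitors per variant
p_a, p_b, z, p = two_proportion_z_test(500, 25_000, 550, 25_000)
print(f"control {p_a:.2%}, variant {p_b:.2%}, z = {z:.2f}, p = {p:.3f}")
```

With these hypothetical numbers, a lift from 2.0% to 2.2% across 50,000 total visitors still comes out around p ≈ 0.12, short of the conventional 5% significance threshold – exactly the trap that small color-only tests tend to fall into.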

We’d also be wary of the methodology used in some tests. There are several examples out there of tests that proclaimed a button color to be a clear winner, but when you look closely, you notice that button color wasn’t the only thing changed in the test.

For example, Wider Funnel conducted a test on two versions of a landing page and afterward declared that orange buttons “work hard” for conversions.

Screenshots in the original post show the control page alongside the winning variation.
The problem with declaring that “orange buttons are better” is that the button is clearly not the only change on the variation page. There isn’t even a button on the control version! The test showed a significant uplift, but multiple changes were made, and any of them might account for the result.
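By contrast, a clean test changes exactly one element and randomizes visitors between variants. A common way to do that is deterministic bucketing on a visitor ID, so each visitor always sees the same version. A minimal sketch (the experiment name and variant labels are illustrative, not taken from the Wider Funnel test):

```python
import hashlib

# Illustrative variants; only the button color differs between them.
VARIANTS = ("control", "orange_button")

def assign_variant(visitor_id: str, experiment: str) -> str:
    """Deterministically bucket a visitor so they always see the same variant."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# Everything else on the page stays identical; assignment is stable across visits.
print(assign_variant("visitor-42", "button-color-test"))
```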

What else should we consider?

If button color has some influence but isn’t the best A/B test, what else should we be considering? Here are some other factors that come into the testing equation:

Button size

It comes back to that question: how do we make this more attractive to click on? Sometimes the size of your button plays a role in its attractiveness.

For example, you’ve probably visited a website where the buttons were so small they were difficult to spot. This is definitely worth testing, along with the space around the button that helps it stand out.

Consider also where your traffic is coming from and the sorts of devices your visitors are using. Sometimes buttons don’t render well on mobile screens, making them difficult to tap. Apple provides a good example, shown below:

(Image: Apple’s example comparing button sizes on a mobile screen.)

In this example, the button size on the left can make all the difference in whether someone moves forward with a transaction or leaves in frustration. Color may not make any difference whatsoever when issues like this are present.
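If you want to catch sizing problems before you even run a test, you can lint your page for undersized tap targets. A minimal sketch in Python; the data shape and the `undersized_buttons` helper are hypothetical, but the 44x44-point minimum comes from Apple’s Human Interface Guidelines:

```python
# Apple's Human Interface Guidelines recommend tap targets of at least 44x44 points.
MIN_TAP_POINTS = 44

def undersized_buttons(buttons: list[dict]) -> list[str]:
    """Return the ids of buttons smaller than the recommended tap target."""
    return [b["id"] for b in buttons
            if b["width_pt"] < MIN_TAP_POINTS or b["height_pt"] < MIN_TAP_POINTS]

# Hypothetical page inventory for illustration
page = [
    {"id": "buy-now", "width_pt": 38, "height_pt": 24},      # too small to tap reliably
    {"id": "add-to-cart", "width_pt": 160, "height_pt": 48},
]
print(undersized_buttons(page))  # ['buy-now']
```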

Button placement

You want to make it as easy as possible for your website visitors to interact with your buttons, so placing them where people expect to find them is important, too. How do you know what your visitors expect? Through testing.

For example, buttons placed where users have to scroll to find them might not get the clicks you would like, whereas shifting the button to a more prominent position on the page might help. Website visitors tend to want obvious, easy navigation, and button placement has a role to play in that.

The complexity of the content on your page may also play a role in appropriate button placement – again, this is something to test. A Kissmetrics study found that more complex pages benefited from the button being placed further down the page, so that people could read the content first, while simple, shorter content fared better with the button above the fold.

Button CTA

What is it that you are actually asking people to do? Is it clear to them why they should be clicking? One thing that can make a button unattractive is if there is ambiguity about what it actually does.

Testing your CTA, and/or the content that immediately precedes it, can be a good way to ensure your message is clear. For example, a generic “click here” might not yield great results, especially if it is unclear why the visitor should be clicking. On the other hand, a CTA such as “add to cart” is quite clear.

Why is visual hierarchy important?

An important distinction to make is that button color plays a role in a wider discussion about visual hierarchy on web pages. Some conversion discussions have people thinking that a button color change could solve their issues, but realistically, it tends to be more complicated than that.

“Visual hierarchy” refers to how elements are arranged on the page to confer importance. It influences the order in which we perceive what we see. In other words, the order of the elements on your page can change the message as to what is important.

Color is just one factor in a fairly long list of elements that can influence visual hierarchy. Other factors include size, shape, whitespace, weight, case, style, contrast, alignment, pattern, and saturation – and there are several more we could add to this list.

When we talk about color, contrast is a particularly important consideration for your buttons. Widely-heralded tests might tell you that your button should be orange, but what if your primary branding color is orange? That button is not going to stand out in the visual hierarchy.
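One way to sanity-check contrast before testing is to compute the WCAG contrast ratio between your button color and the background it sits on. A minimal sketch in Python, using the standard WCAG 2.0 relative-luminance formula (the hex values are made up):

```python
def relative_luminance(hex_color: str) -> float:
    """WCAG 2.0 relative luminance of an sRGB color like '#ff6600'."""
    channels = []
    for i in (0, 2, 4):
        c = int(hex_color.lstrip("#")[i:i + 2], 16) / 255
        # Linearize each sRGB channel per the WCAG 2.0 definition
        channels.append(c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = channels
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio from 1:1 to 21:1; WCAG AA asks for 4.5:1 for normal text."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Hypothetical colors: an orange button on an orange-branded background...
print(round(contrast_ratio("#ff6600", "#ff8833"), 2))
# ...versus the same button on a white background.
print(round(contrast_ratio("#ff6600", "#ffffff"), 2))
```

With these hypothetical values, the orange-on-orange pairing scores close to 1:1, while orange-on-white scores near 3:1 – a quick, objective way to see which candidate will actually stand out.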

Of course, we must also always come back to understanding the preferences of your particular audience. What works for one website does not automatically work for another that has a different target audience. In this way, A/B testing is very much context-dependent.

Peep Laja of Conversion XL put it well:

“…‘green vs orange’ is not the essence of A/B testing. It’s about understanding the target audience. Doing research and analysis can be tedious and it’s definitely hard work, but it’s something you need to do.

In order to give your conversions a serious lift, you need to do conversion research. You need to do the heavy lifting.

Serious gains in conversions don’t come from psychological trickery, but from analyzing what your customers really need, the language that resonates with them and how they want to buy it. It’s about relevancy and perceived value of the total offer.”


Final thoughts

When we look at A/B testing, we like to look at things that can provide a good “bang for your buck,” or uplifts that are significant enough to warrant the time spent on testing. Button color isn’t really one of those things.

Sure, color can play a role, but we argue that other factors tend to be more significant. A button that’s too small probably isn’t getting clicked, and neither is one so crowded that it’s difficult to make out. There are any number of other things to test that might provide better uplifts in results.

We suggest that button color shouldn’t be a top choice for A/B testing; instead, focus on the other factors that shape visual hierarchy. Next time someone claims that a certain color is “best” for buttons, remember that there is no such thing as a universal fix!

Testing.Agency has a new name: Team Croco.

With this new brand, we focus even more on CO-operating on your CRO strategy. (Get it => CRO-CO )

Of course you can expect the same speed and quality as before, but we also commit to supporting you across the rest of your testing program: planning & prioritising, design, development, implementation, and reporting.

Our goal is for you to run a consistent A/B testing program with minimal effort.

Want to know more?

Request a free demo and consultation!
