Need a Hypothesis? Generating Good Test Ideas

How do you come up with good A/B test ideas?

Do you look over your website and take a guess? Perhaps you do a Google search and look for test ideas that others are using? Many people take these approaches, and come up with a “test checklist” to work their way through.

The problem with these approaches is that you will probably find test ideas that, while they’re being trumpeted loudly by others, aren’t necessarily a great test for you (see why we believe button color is one of those!).

The most important reason those methods are inefficient is that they’re just guesses and anecdotes from other people. They’re not based on any solid data from your own website; they’re more of a “throw things at the wall and see what sticks” method.

If you’re going to test in this fashion, you need to prepare yourself for a significant failure rate. Wouldn’t you rather set yourself up with a clear process that helps you prioritize areas for testing?

To do this, you need to base your test ideas on data you already have. This doesn’t completely eliminate guesswork, but it means that when you devise a test, you’re basing it on data and educated guesses, not something you plucked from the internet or thin air.

How can you find that data and generate good test ideas?


Generating test ideas

Good test ideas come from clear data. There are multiple possible data sources you can use, helping you to eliminate guesswork, or those tests that happen “because the boss thinks this will work.” Here’s where you can look:

Your website analytics

At a glance, Google Analytics can seem quite overwhelming. There are many potential data points, and it can be difficult to know where to begin. To narrow it down, think about the key actions you’d like website visitors to take. Where are those happening, or not happening?

Your aim is to identify pages on your website that have the best potential for improvement through testing. This avoids simply testing random pages or testing pages that already perform very well and won’t see a significant uplift.

The pages you should prioritize for testing are those with high traffic but poor overall performance (people aren’t doing what you want them to). These might include landing pages and specific steps in your funnel where visitors are dropping off.

Some of the Google Analytics reports that can help you discover these, along with examples of what you might find, are listed below:

  • Top landing pages: landing pages with a lot of traffic but high bounce rates.
  • User flows: where users go after their initial entry; look for common drop-off points.
  • Conversions per browser: do some browsers perform much better than others? Could there be something in how your website renders in particular browsers?
  • Conversions and bounce rate per device: which devices get more conversions, and which need improvement?
  • Site speed: pages with high traffic and slow loading speeds.
  • New vs. returning visitors (per channel): on its own this is not especially useful for testing, but it becomes more useful when broken down by marketing channel. Pay particular attention to the channels that get more returning visitors.
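
As a rough illustration, here is one way you might sift an exported landing-page report for high-traffic, high-bounce pages. This is only a sketch: the file name and column names are hypothetical and will depend on how you export the report from Google Analytics.

```python
# A minimal sketch, assuming a landing-page report exported from Google
# Analytics as CSV. The file and column names below are hypothetical.
import pandas as pd

report = pd.read_csv("landing_pages.csv")  # columns: Landing Page, Sessions, Bounce Rate

# Focus on pages that get plenty of traffic but lose most visitors straight away.
candidates = report[
    (report["Sessions"] >= 1_000) &
    (report["Bounce Rate"] >= 0.60)
].sort_values("Sessions", ascending=False)

# High-traffic, high-bounce pages are the most promising candidates for testing.
print(candidates[["Landing Page", "Sessions", "Bounce Rate"]].head(10))
```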

Feedback from usability testing

One simple (and often relatively quick) way of finding useful test ideas is to get quality feedback from your website visitors. This means evaluating your website by testing it with users who are representative of your “ideal” user.

You will want to ask questions about the user experience, get users to complete key tasks (like finding a specific product and checking out), and find out what they have to say about things they like or dislike.

The feedback you gather, particularly on anything they find difficult or dislike, can be very valuable for devising tests that look for better conversions. There are several services out there that can help you run usability testing if you don’t have the time or are not sure where to begin with setup. Check out sites like Hotjar (which covers several of the features we mention in this post).

Feedback from on-page surveys

These follow similar lines to usability testing, except that you usually ask just one question on a page, giving yourself a better chance that a visitor will answer it. Devise the question so that it closely aligns with the key goals for the page.

As an example, let’s say you had the question pop up as an exit survey on the page. You might ask something like: “Can you tell us why you’re not signing up today?”

Of course, not every visitor will answer your question, but the hope is that enough will so that you get a reasonable amount of data. Just be careful to ensure your question is relatively simple to understand and answer.
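
Once answers start coming in, you still need to turn free-text responses into themes you can act on. Below is a simple, hypothetical sketch of grouping answers by keyword; the file name, column name, and theme keywords are only examples, not a prescribed setup.

```python
# A simple sketch of grouping exit-survey answers into rough themes.
# The CSV file, "answer" column, and keyword lists are hypothetical examples.
from collections import Counter

import pandas as pd

THEMES = {
    "price": ["expensive", "price", "cost", "cheaper"],
    "trust": ["not sure", "reviews", "secure", "guarantee"],
    "missing info": ["couldn't find", "unclear", "how does", "shipping"],
    "just browsing": ["just looking", "browsing", "maybe later"],
}

responses = pd.read_csv("exit_survey.csv")  # assumes one free-text "answer" column

theme_counts = Counter()
for answer in responses["answer"].dropna().str.lower():
    for theme, keywords in THEMES.items():
        if any(keyword in answer for keyword in keywords):
            theme_counts[theme] += 1

# The most common theme suggests what to address (and test) first.
print(theme_counts.most_common())
```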

Data from previous A/B tests

You can expect that somewhere around three out of ten of your A/B tests will be “winners” – demonstrating a significant uplift. However, those that were losers don’t have to be written off as a waste of time – there is plenty to be learned from a losing test.

Put the data from those tests to use. A/B testing is an iterative process and should be repeated to be effective. For example, where a test was too close to call, analyze the results, tweak the treatment, and devise a new hypothesis. This helps you narrow down your customers’ preferences and gives you more ideas for further testing.
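
If you still have the raw numbers from a past test, a quick significance check can tell you whether it was genuinely too close to call. The sketch below uses a two-proportion z-test from statsmodels; the conversion and visitor counts are made-up examples.

```python
# A minimal sketch of re-checking a past A/B test, assuming you still have the
# raw conversion and visitor counts per variant. The numbers are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

conversions = [310, 342]       # [control, treatment]
visitors = [10_000, 10_050]

stat, p_value = proportions_ztest(count=conversions, nobs=visitors)

rate_a, rate_b = (c / n for c, n in zip(conversions, visitors))
print(f"Control: {rate_a:.2%}, treatment: {rate_b:.2%}, p-value: {p_value:.3f}")

if p_value >= 0.05:
    print("Too close to call: tweak the treatment and devise a new hypothesis.")
else:
    print("A clear result: note what changed and build the next test on it.")
```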

Mouse tracking analysis

Mouse tracking is useful for recording and analyzing what people do on your site with their mouse – what are they paying attention to?

There are a few different tools available to install that can help you with this. As a general rule, areas that show up in red indicate a lot of activity, while blue means little to none.

You can learn from data such as:

  • Where people click the most frequently
  • Whether the most important information is visible to most of your users
  • How far (or if) people scroll down the page.

This sort of data allows you to think about elements such as where features are placed on the page, or what people pay the most attention to.
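
If your mouse-tracking tool lets you export raw click coordinates, you can also aggregate them yourself. The sketch below bins clicks for one page into a coarse grid to show where activity clusters; the CSV layout, column names, and page dimensions are hypothetical.

```python
# A rough sketch of aggregating exported click coordinates into a coarse grid.
# The CSV layout, column names, and page dimensions are hypothetical.
import numpy as np
import pandas as pd

clicks = pd.read_csv("clicks_export.csv")        # columns: page, x, y (pixels)
pricing = clicks[clicks["page"] == "/pricing"]

# Bin clicks into a 10 x 10 grid over a 1280 x 3000 px page.
heat, _, _ = np.histogram2d(
    pricing["y"], pricing["x"],
    bins=[10, 10], range=[[0, 3000], [0, 1280]],
)

# Bands near the top correspond to above-the-fold content; sparse bands further
# down suggest few visitors scroll (or click) that far.
for band, row in enumerate(heat):
    print(f"band {band}: {int(row.sum())} clicks")
```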


Heat maps and user session replays

Mouse tracking is one form of heat map – there are also those that track eye movement to analyze what captures the attention of the user. The point is that you can capture details that you won’t know from simple analytics reports. The data is very visual and easy to understand.

Session replays are another good way to collect feedback directly from the user. You can use software that records user sessions, showing you exactly where they went, what they did, and what the outcome was. These are a great way to discover which parts of your website users seem to have the most trouble with. Do you notice common patterns with people getting stuck?

Chat transcripts and customer service issues

If you have a chat app enabled for your website, most of the time when customers reach out, it’s about some kind of question, frustration or confusion, right? This makes those chat transcripts a valuable source of raw data for potential testing ideas. What do you get a lot of questions about?

The same can be said for any logs you keep of customer service issues. If someone is bothering to get in touch, it’s often because they have a pressing need, or they’ve had negative experiences they’d like to relay. Look for any comments that can point to issues you can test, for example:

  • “I couldn’t find …”
  • “X is really hard to use.”
  • “I don’t understand …”
  • “Are these suitable for …”

Even general queries can provide a testing opportunity. Perhaps it’s time to update descriptions so that they answer more of the questions you get? How many people might abandon the website because they can’t find an answer?
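
If you can export those transcripts and logs as plain text, even a simple keyword tally can surface the most common pain points. The sketch below is hypothetical: the folder name and the list of signal phrases are examples you would adapt to your own data.

```python
# A simple sketch of counting recurring complaint phrases in chat transcripts.
# The folder name and signal phrases are hypothetical examples.
from collections import Counter
from pathlib import Path

SIGNAL_PHRASES = [
    "couldn't find",
    "can't find",
    "hard to use",
    "don't understand",
    "doesn't work",
    "how do i",
]

counts = Counter()
for transcript in Path("chat_exports").glob("*.txt"):
    text = transcript.read_text(encoding="utf-8").lower()
    for phrase in SIGNAL_PHRASES:
        counts[phrase] += text.count(phrase)

# The most frequent complaints point at the pages and flows worth testing first.
for phrase, count in counts.most_common():
    print(f"{phrase!r}: {count} mentions")
```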


Final thoughts

Successful A/B testing doesn’t happen off the cuff; it is based on research and solid data to generate good test ideas.

Rather than taking a guess or running tests because someone on the internet had success with the same kind, look to the various sources of data at your disposal to formulate likely hypotheses.

You can always expect that some tests will not be successful, but basing tests on data can greatly improve your success rate.

