
A-B Split Test – Avoid these common mistakes!

Running A/B tests is indispensable if you want to make good decisions. But if you make mistakes, your decisions can have worse consequences than deciding based on intuition alone. So it’s important to do A/B tests the right way, without mistakes.

If you are not willing to implement A/B tests properly, you are better off not doing them at all, because you have only two options:

  • A) Do A/B tests and interpret the results correctly, without mistakes
  • B) Skip A/B tests completely and rely on your common sense or intuition, which will also mislead you.

In this guide I will use the following example for the sake of simplicity, but these mistakes generalize to any other kind of A/B test in your business. Suppose you have a webshop and want to check which buy button makes you more money: would a red button or a green button have a better click-through rate (CTR)? Suppose you measure a 15% CTR on the red button and a 10% CTR on the green button.

#1 mistake: Comparing past to present

Never compare the conversion rate of the past to the conversion rate of the present. If you had a red button in January and changed it to green in February, do not compare the CTR results as if they were an A/B test. The observed change in conversion might come from a shift in the visitor-type distribution or from an ordinary seasonal effect. If you make this mistake, your results will be totally useless.
What to do? If you have two variations, show the different buttons to visitors randomly (or alternately), but it is also very important that any single visitor always sees only one variant, as in the sketch below.
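For example, hashing the visitor ID gives an assignment that is random across visitors but stable for each individual visitor. Here is a minimal Python sketch; the visitor ID format and the variant names are just assumptions for illustration:

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Return "red" or "green" deterministically for a given visitor."""
    # Hashing makes the assignment look random across visitors...
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    # ...but it is always the same for any single visitor.
    return "red" if int(digest, 16) % 2 == 0 else "green"

# The same visitor always sees the same button:
assert assign_variant("visitor-42") == assign_variant("visitor-42")
```

Because the assignment is a pure function of the ID, it survives page reloads and needs no database lookup.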


#2 mistake: Checking only the conversion rates

If you only watch the raw conversion rates, for example 15% CTR for the red button and 10% CTR for the green button, you are making a mistake. The difference might be pure coincidence, and you can’t be sure whether red or green is really better. The core purpose of A/B tests is that they can eliminate this accidental difference and show you how likely it is that the change is real.
What to do? Always examine the certainty (significance level) of your A/B tests and also the confidence intervals.
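A minimal sketch of such a significance check: a standard two-proportion z-test on the example’s 15% vs. 10% CTRs. The visitor counts (1,000 per variant) are assumed numbers, not from the article:

```python
from math import sqrt
from scipy.stats import norm

clicks_red, visitors_red = 150, 1000      # 15% CTR (assumed counts)
clicks_green, visitors_green = 100, 1000  # 10% CTR (assumed counts)

p_red = clicks_red / visitors_red
p_green = clicks_green / visitors_green
p_pooled = (clicks_red + clicks_green) / (visitors_red + visitors_green)

# Pooled standard error under the null hypothesis "no difference".
se = sqrt(p_pooled * (1 - p_pooled) * (1 / visitors_red + 1 / visitors_green))
z = (p_red - p_green) / se
p_value = 2 * norm.sf(abs(z))  # two-sided test

print(f"z = {z:.2f}, p-value = {p_value:.4f}")  # p < 0.05 -> unlikely to be coincidence
```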

#3 mistake: Looking only at the p-value (certainty)

Almost all A/B test calculators (except ours) only show you whether the difference between case A and case B is significant or not. If it is (usually at 95%), they show you a big green checkmark. Do not be satisfied with this! In our example, “significant” only means that the red button is likely somewhat better. If you observe 15% vs. 10% conversion rates and the result is significant, it does not mean that the red button is better by 50%! Maybe the red one is only 0.001% better than the green button.
What to do? Calculate the so-called confidence interval, which tells you exactly what you want to know: the most likely range of the difference between the conversion rates of the variants.
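A sketch of that calculation, continuing with the same assumed counts (1,000 visitors per variant): a 95% confidence interval for the difference between the two CTRs.

```python
from math import sqrt
from scipy.stats import norm

clicks_red, visitors_red = 150, 1000      # assumed counts, as before
clicks_green, visitors_green = 100, 1000

p_red = clicks_red / visitors_red
p_green = clicks_green / visitors_green

# The unpooled standard error is the appropriate one for the interval.
se = sqrt(p_red * (1 - p_red) / visitors_red
          + p_green * (1 - p_green) / visitors_green)
z_crit = norm.ppf(0.975)  # 1.96 for a 95% interval
diff = p_red - p_green
low, high = diff - z_crit * se, diff + z_crit * se

print(f"difference = {diff:.3f}, 95% CI = [{low:.3f}, {high:.3f}]")
# With these numbers the interval is roughly [0.021, 0.079]: red is better,
# but maybe only by 2 percentage points, far less dramatic than 15% vs. 10%.
```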

#4 mistake: Not testing for a full time period

It’s a mistake to run your experiment for only 5 days, from Monday to Friday, because this range doesn’t include weekends, and weekend purchasers might behave totally differently. Also note that testing in January only gives you results for January. If your visitors behave differently in February, you will have no data for that.
What to do? Always run your experiment for a full cycle that is meaningful in your business.
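One way to plan this in advance is to estimate the required sample size and round the run time up to whole weeks, so every day of the week is covered equally. A rough sketch using the standard two-proportion sample-size formula; the baseline CTR, the lift to detect, and the daily traffic are all assumed numbers:

```python
from math import ceil, sqrt
from scipy.stats import norm

p1, p2 = 0.10, 0.15          # baseline CTR and the CTR we want to detect
alpha, power = 0.05, 0.80    # conventional significance level and power
daily_visitors = 400         # assumed total traffic per day (both variants)

z_a = norm.ppf(1 - alpha / 2)
z_b = norm.ppf(power)
p_bar = (p1 + p2) / 2

# Required visitors per variant:
n = ceil(((z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2)
         / (p1 - p2) ** 2)

days = ceil(2 * n / daily_visitors)
weeks = ceil(days / 7)  # round up so weekends are always included
print(f"{n} visitors per variant -> run for at least {weeks} full week(s)")
```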

#5 mistake: Not turning your results into decisions

If you run an experiment, it’s great. If you are able to interpret the results of the A/B test, it’s even better. But if you don’t use your data and don’t turn your results into decisions, the whole thing is pointless and a waste of time. A/B tests exist to help you make decisions.
What to do? It is often a good idea to pre-establish the decisions you will make if case A or case B wins. For example, you may declare before running the A/B test that you will change the button from green to red if, and only if, red is the winner and is reliably better by at least 20%.
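Such a pre-registered rule can even be written down as code before the test starts. A sketch using the article’s 20% threshold; the confidence-interval input is assumed to come from a calculation like the one under mistake #3:

```python
def decide(ci_low_relative_lift: float) -> str:
    """Apply the rule agreed on in advance: switch to red only if even the
    pessimistic end of the confidence interval shows a >= 20% relative lift."""
    if ci_low_relative_lift >= 0.20:
        return "switch the button to red"
    return "keep the green button"

# Example: the 95% CI for red's relative lift over green is [0.25, 0.60],
# so even the worst plausible case clears the 20% threshold.
print(decide(ci_low_relative_lift=0.25))
```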

#6 mistake: Running an A/B test only once

It is not enough to plan, implement, and run an A/B test and interpret the results, even if you do it the right way and make a decision based on them. It’s not enough because your business, your visitors, and their behavior are continuously changing. If you found that the red button has a much better conversion rate than the green one, you only know that this was true in the past. If the target audience changes, future visitors might prefer the green one.
What to do? Repeat your A/B tests periodically to check whether the differences in conversion rates still persist, because they might change as time passes.

#7 mistake: Forgetting to test what really matters

You may test the color of buttons, but what really matters might be the position, the caption, or the size of the button!
What to do? Use your common sense and test the things that can really cause big changes in conversions; do not waste your time comparing “buy now” to “buy Now”, for example.

#8 mistake: Disliking and ignoring the results

You surely have a preconception or intuition about what the results will be, because you like to think of yourself as clever and able to predict the outcome as an expert in your field. If the results of an A/B test contradict your prior feelings, this can cause cognitive dissonance, and as a consequence many people simply ignore the results completely. You might think that the green button is better (it’s more natural, it’s more beautiful, users are not afraid to click it) and want to support this presumption with an A/B test. If the outcome shows that the red button is better by 50%, you should forget your preconception.
What to do? Believe your results. Even if you think red is ugly, you have to change your button from green to red, because the visual design of your website doesn’t matter as much as conversion rates and money.

#9 mistake: Not understanding what 95% means

95% certainty (or a 0.05 p-value) doesn’t mean that you can be sure about the result of an A/B test. It means that one variation is likely at least somewhat better than the other. It’s usual to run an experiment until you reach 95% confidence, but take into consideration that this also means there is a 5% chance of a false positive. So if you run 100 different A/B tests, approximately 5 of them will tell you the wrong result, as the quick calculation below shows.
What to do? Keep in mind that you can never be absolutely sure about the results. If the decision based on your A/B test is very important and has serious consequences, then you may want 99% or 99.9% certainty.
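The arithmetic behind that “5 out of 100” figure, assuming 100 independent tests in which no real difference exists:

```python
n_tests, alpha = 100, 0.05

# Each test has a 5% chance of declaring a difference that isn't there.
expected_false_positives = n_tests * alpha
prob_at_least_one = 1 - (1 - alpha) ** n_tests

print(f"expected false positives: {expected_false_positives:.0f}")  # ~5
print(f"chance of at least one:   {prob_at_least_one:.4f}")         # ~0.9941
```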

#10 mistake: Tracking the wrong conversion KPI

If you optimize your website and the color of the buttons so that you reach a very good CTR on the “add to cart” button, you may still be wrong. The add-to-cart click-through rate is important, but you should measure purchases instead. With a red button, it’s likely that more items will be placed in carts, but purchases may not follow. And even if you optimize your site for more purchases, it may increase small purchases per visit while the total purchase amount per month decreases. So if you use A/B tests, it is very important to define the correct conversion.
What to do? Keep in mind that improving one conversion rate can cause other conversion rates to drop.

#+1 mistake: Forgetting to segment

Knowing that the CTR of the red button is better than the green button’s is not the whole picture. Maybe males prefer green and females prefer red. Maybe impulsive purchasers prefer red but wealthy, wise, returning purchasers prefer green. Maybe young visitors prefer red and older ones prefer green.
What to do? Gather as much data as you can and keep running A/B tests until you are able to segment the results, as in the sketch below.
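A sketch of what segmented results can look like with pandas. The toy data and its column names (segment, variant, clicked) are invented for illustration; they mimic the “males prefer green, females prefer red” scenario above:

```python
import pandas as pd

# Invented outcomes: 50 visitors per (segment, variant) cell.
df = pd.DataFrame({
    "segment": ["male"] * 100 + ["female"] * 100,
    "variant": (["red"] * 50 + ["green"] * 50) * 2,
    "clicked": [1] * 5 + [0] * 45       # males on red:      10% CTR
             + [1] * 10 + [0] * 40      # males on green:    20% CTR
             + [1] * 12 + [0] * 38      # females on red:    24% CTR
             + [1] * 4 + [0] * 46,      # females on green:   8% CTR
})

# CTR per (segment, variant): the overall winner can lose inside a segment.
ctr = df.groupby(["segment", "variant"])["clicked"].mean().unstack()
print(ctr)
```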

The A/B test calculator that I suggest:

A/B test calculator!
