Have you ever read an article about a conversion increase? It is not uncommon to hear about three-figure increases, which sounds absolutely fantastic. With three-figure conversion increases a lot of people should be getting very rich, but of course the truth is usually a bit more complex.

In my posts I have stressed the importance of reaching statistical significance in your tests before you draw conclusions, but guess what: that is not always enough. You also need to look at the number of conversions you get to understand your conversion increase.

First, let me briefly explain the significance concept. This is what Wikipedia says:

In statistics, a result is called statistically significant if it is unlikely to have occurred by chance alone, according to a pre-determined threshold probability, the significance level. The phrase “test of significance” was coined by Ronald Fisher: “Critical tests of this kind may be called tests of significance, and when such tests are available we may discover whether a second sample is or is not significantly different from the first.”

In plain English, statistical significance tells you how reliable your results are. So if you have 95% statistical significance, it should be read as: if you ran this test 100 times, 95 of those times you would get the announced result.
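To make this less abstract, here is a minimal sketch of the kind of calculation an AB testing tool runs behind the scenes: a two-proportion z-test that turns visitor and conversion counts into a confidence figure. The visitor numbers below are made up for illustration; real tools may use slightly different formulas.

```python
from math import sqrt, erf

def significance(conv_a, n_a, conv_b, n_b):
    """Two-sided confidence that versions A and B really differ."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 0.0
    z = abs(p_a - p_b) / se                           # z-statistic
    p_value = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))  # two-sided p-value
    return 1 - p_value                                # reported "confidence"

# Hypothetical numbers: 259 vs 291 conversions out of 1000 visitors each.
# The confidence lands below 95%, so no winner should be declared yet.
print(significance(259, 1000, 291, 1000))
```

With these made-up numbers the confidence comes out around 89%, which looks healthy but is not enough to call the test.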

Now, let me show with an example how statistical significance is important but not enough.

So, we wanted to try a new AB testing tool. We installed the code on the site, I read the manuals, and I went ahead with a so-called comma test. Actually, it was a dot test. A dot test is when you put a dot somewhere on the site and then run it as an AB test. Since the change is effectively invisible to visitors, you should get the same conversion rate on both versions.
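A dot test is really an A/A test: both versions are identical, so any "winner" is pure noise. A quick simulation shows why this matters when you track many goal metrics, as in the test below. All numbers here (a 27% rate, 1000 visitors, 100 metrics) are made up for illustration.

```python
import random
from math import sqrt, erf

random.seed(1)

def confidence(conv_a, n_a, conv_b, n_b):
    """Two-sided confidence from a two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pool * (1 - pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 0.0
    z = abs(p_a - p_b) / se
    return erf(z / sqrt(2))

RATE, VISITORS, METRICS = 0.27, 1000, 100  # identical true rate on A and B

false_alarms = sum(
    confidence(sum(random.random() < RATE for _ in range(VISITORS)), VISITORS,
               sum(random.random() < RATE for _ in range(VISITORS)), VISITORS) > 0.95
    for _ in range(METRICS)
)
# At the 95% level we expect around 5 of the 100 identical comparisons
# to look "significant" by chance alone.
print(false_alarms)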

Below you can see the results of the test. Notice the fairly large number of visitors on each version. I had several goal metrics: the Continue button, RefiForm2, RefiForm3, RefiForm4 and the thank-you page. As you can see, Variation 1, which is the B version with the dot, had a 29.1% conversion rate on the Continue button while the Original had 25.9%. The thank-you page also showed 1.0% versus 0.6% for the Original.

Both metrics point towards this test increasing conversion, but of course you know better. Notice the confidence level: you need to reach five dots to be significant. Neither of these two metrics is significant, although the Continue button shows a fairly healthy confidence of 76%.

Now notice RefiForm3. It shows significance at 99%. Does this mean we should open the champagne and present the results at a conference? Of course not. Even though the result is significant, if we look a bit deeper we can see that the winning version has only 22 conversions, which is an extremely low number to draw conclusions from. As a rule of thumb I usually look for 150 conversions, or preferably around 300.
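One way to see why 22 conversions is too few: the confidence interval around the observed conversion rate is enormous. Here is a sketch using the Wilson score interval; the visitor counts are made-up examples chosen to give roughly the same ~1.1% rate in both cases.

```python
from math import sqrt

def wilson_interval(conversions, visitors, z=1.96):
    """95% Wilson score interval for a conversion rate."""
    p = conversions / visitors
    denom = 1 + z**2 / visitors
    centre = (p + z**2 / (2 * visitors)) / denom
    margin = (z * sqrt(p * (1 - p) / visitors
                       + z**2 / (4 * visitors**2))) / denom
    return centre - margin, centre + margin

# 22 conversions from 2000 visitors: roughly 0.7% - 1.7%,
# so the true rate could plausibly be less than half or half
# again as much as what you measured.
print(wilson_interval(22, 2000))

# 300 conversions at about the same rate: a far tighter interval.
print(wilson_interval(300, 27000))
```

With only a couple of dozen conversions the interval is so wide that a "99% significant" winner can still be a mirage.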

Further, I have to consider what we are testing. Obviously there is no logical reason why a dot should increase conversion. I also look at how the test has performed over time. The graph above is a bit short, but if you let the test run for a while you get a good illustration of how it has performed.

When interpreting test results you have to consider several factors. Significance is a very important metric, but not the only one. You have to combine it with the number of conversions, and sometimes with a graph of how the test has performed over time.

I hope that the next time you come across a successful AB test showing an incredible increase in conversion, you will question the metrics. In your own tests, please do not be trigger happy, even if your stakeholders push you for fast results. These things need time, or there is a big risk of drawing the wrong conclusions.

To conclude you should:

  • Always look for significance in your testing.
  • Look at the number of conversions: are they too low?
  • Consider how conversion has changed over time.
  • Understand whether the test makes sense.
  • And finally, always consider the confidence interval of the conversion rate.
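The first two points of the checklist above can even be sketched as a single sanity check. The thresholds (95% confidence, 150 conversions) follow the rules of thumb from this post; the function name and example numbers are hypothetical.

```python
def trust_result(confidence, conversions, min_conf=0.95, min_conv=150):
    """Trust a test result only if it clears both bars:
    statistical significance AND enough conversions."""
    return confidence >= min_conf and conversions >= min_conv

# 99% significant but only 22 conversions: do not trust it.
print(trust_result(0.99, 22))    # False

# 99% significant with 310 conversions: worth taking seriously.
print(trust_result(0.99, 310))   # True
```

It is deliberately strict: a result that fails either check goes back into the "keep testing" pile.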

If you have any questions do not hesitate to ask.


