What's the one thing you wish you knew when you started A/B Testing?
That you need at least 100 conversions per variation before you start considering the results of your tests.
The margin of error on smaller numbers is too risky to justify putting your money into a hypothesis based on the test results.
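To make the margin-of-error point concrete, here is a rough sketch using a normal-approximation 95% confidence interval; the conversion counts and the 10% rate are purely illustrative assumptions, not numbers from this thread:

```python
import math

def margin_of_error(conversions, visitors, z=1.96):
    """Approximate 95% margin of error for a conversion rate,
    using the normal approximation (illustrative only)."""
    p = conversions / visitors
    return z * math.sqrt(p * (1 - p) / visitors)

# Hypothetical numbers: same 10% conversion rate, different sample sizes.
for conversions, visitors in [(20, 200), (100, 1000)]:
    moe = margin_of_error(conversions, visitors)
    print(f"{conversions} conversions / {visitors} visitors: 10% ± {moe:.1%}")

# 20 conversions:  roughly 10% ± 4.2%  -> a "win" could easily be noise
# 100 conversions: roughly 10% ± 1.9%  -> a much tighter interval
```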
Actually, that is quite counterintuitive, since most people believe traffic is the key factor for A/B tests.
True, smaller numbers aren't reliable. But when you have lots of traffic, you can reach a good enough number of conversions quite rapidly, say in a day or two.
Should you stop the test there, you could still be getting illusory results, because you didn't test for long enough to have a representative sample and to account for seasonality.
That's why you see the recommendation to test for full weeks at a time, and for a duration of 1-2 business cycles.
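A back-of-the-envelope sketch of that duration rule; the daily traffic, baseline conversion rate, and 100-conversion target below are hypothetical assumptions used only to show how the raw estimate gets rounded up to full weeks:

```python
import math

def test_duration_days(daily_visitors_per_variation, baseline_rate,
                       min_conversions=100):
    """Days needed for each variation to reach a minimum conversion count,
    then rounded up to full weeks to cover weekday/weekend seasonality."""
    expected_daily_conversions = daily_visitors_per_variation * baseline_rate
    raw_days = min_conversions / expected_daily_conversions
    full_weeks = math.ceil(raw_days / 7)
    return raw_days, full_weeks * 7

# Hypothetical: 500 visitors/day per variation, 3% baseline conversion rate.
raw, rounded = test_duration_days(500, 0.03)
print(f"Raw estimate: {raw:.1f} days; run for {rounded} days (full weeks).")
# Raw estimate: 6.7 days; run for 7 days (full weeks).
```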
What's your take on that?
What else would you use ;)
Also, Optimizely can be a bit funky. Take a look at this: http://i.imgur.com/C5FNC09.png . There was a "traffic spike", but actually nothing happened. Thankfully, we assign attributes based on the A/B test, and we didn't notice any change. Have a backup plan in case Optimizely goes bad (Woorpa is our backup tool).
100% agree: have a backup plan and ALWAYS cross-check your data by sending it to your analytics tool too.