Published on March 4, 2014
Growth Hacking Meetup - Episode 4 - 9 A/B Testing Use Cases
But first… some A/B Testing mistakes startups make ALL the time
1. A/B tests are ended too early. Statistical significance (a large enough sample size, etc.) is what tells you version A is actually better than version B.
As an optimizer, your job is to figure out the truth. You have to put your ego aside.
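To check whether a test has actually reached significance before calling it, you can run a two-proportion z-test on the raw counts. A minimal pure-Python sketch; the visitor and conversion numbers are made-up assumptions, not figures from this deck:

```python
# Two-proportion z-test: is B's lift over A statistically significant?
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Return (relative lift, two-sided p-value) from conversions vs. visitors."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, p_value

lift, p = ab_significance(conv_a=200, n_a=5000, conv_b=245, n_b=5000)
print(f"lift = {lift:+.1%}, p-value = {p:.3f}")  # significant if p < 0.05
```

If the p-value is still above your threshold, the honest move is to keep the test running, not to declare a winner.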
2. Tests are not run for full weeks. Conversion rates can vary greatly depending on the day of the week. Always test a full week at a time.
3. External factors are overlooked. Is it Christmas? External factors definitely affect your test results. When in doubt, run a follow-up test.
4. Random ideas are tested without a hypothesis. You’re wasting precious time and traffic. Never do that.
By the way, testing button colors (“green vs. orange”) is not the essence of A/B testing. The essence is understanding the target audience.
Use your traffic on high-impact stuff. Test data-driven hypotheses.
5. They give up after the first test fails. Run a follow-up test, learn from it, and improve your hypotheses.
“I have not failed 10,000 times. I have successfully found 10,000 ways that will not work.” - Thomas Alva Edison, inventor
6. They ignore small gains. Only about 1 out of 8 A/B tests drives a significant change, so small wins matter.
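Small gains are worth keeping because successive winning tests compound multiplicatively. A quick illustration; the 5% figure is an assumption for the example:

```python
# Combine a series of relative lifts into one overall lift.
def compounded_lift(lifts):
    total = 1.0
    for lift in lifts:
        total *= 1 + lift
    return total - 1

# Twelve 5% wins compound to roughly an 80% overall lift.
print(f"{compounded_lift([0.05] * 12):.1%}")
```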
7. They’re not running tests at all times. Every single day without a test is a wasted day. Testing is learning.
Now, for the second part… 9 Surprising A/B Test Results to Stop You Making Assumptions
Test 1: Which Copy Increased Trial Sign-Ups?
Result: In this test, Version B increased sign-ups by 38% – a big rise – even though you would probably have picked Version A as the better design.
Why does Version B work? Simply because the copy is better. Lesson: landing pages don’t have to be pretty to be effective.
Test 2: Which Landing Page Got 24% More Lead Generation Form Submissions? Picture vs No picture
Result: Surprisingly, Version A was the page that got the 24% increase in submissions, simply by removing the image from the page.
This is a great example of why you should confirm your assumptions with quantitative testing.
Test 3: Which Radically Redesigned Form Increased B2B Leads By 368.5%?
Result: Version A is an obvious winner, but it’s not just the big red button that makes the difference. Version A keeps things really tight and uses grouping to visually shrink the impact of the form.
When designing your landing page, don’t overestimate your user’s tolerance, goodwill, and patience.
Test 4: Does Matching Headline & Body Copy to Your Initial Ad Copy Really Matter?
Result: Version B looks as if it should be better – the headline copy is snappier, the sub-head clearer – but in tests Version A increased leads by 115%.
Why? Version A was designed to tie in with the ads that drive users to the page. Lesson: making the elements of the sales funnel work together increases efficacy.
Test 5: Which Landing Page Increased Mortgage Application Leads by 64%?
Result: Video can be a very effective tool for communicating lots of information in a compact form. But the presence of video couldn’t save Version B; Version A increased leads by 64%.
Test 6: Does adding testimonials to your Homepage increase Sales?
Result: It would seem that this would have very little effect, but in practice this small change increased sales by 34% – a big margin. Why is this? Having ‘social proof’, even in this basic form, humanizes the conversion experience.
Test 7: Does Social Proof Increase Conversions? ‘Big Brand Prices’ vs. Consumer Ratings
Result: ‘Social proof’ is a powerful tool that can have a demonstrable effect on conversion outcomes. But in this case, customers were just looking for the cheapest offer. Asking them to consider an additional piece of information – customer satisfaction – made them back away from the conversion.
Test 8: Does an Email Security Seal Help or Hinder Lead Generation Form Completions?
Result: Actually, users were put off by seeing the seal in this context, assuming that they were about to pay for something.
Test 9: Which Page Got an 89.8% Lift in Free Account Sign Ups?
Result: Version B wins: it has three bullet points instead of words in speech bubbles. Removing a distraction from the page and reducing the risk of users navigating away worked better.
Whether you are right or wrong: first, make sure you get the basics right before you start testing; second, always be testing, because unless you test, you can never be absolutely sure.
You are welcome to join the adventure: firstname.lastname@example.org