A/B testing or split testing is a growth marketer's best friend. There is no better way to find out what works, optimize your conversions, or grow your lead flow. It ends any argument. Simply put the idea to the test and find out what works.
Splitbee is a simple alternative for website analytics, A/B testing, and conversion optimization.
You can call webhooks when a specific action occurs, which opens up a lot of integration possibilities for your MarTech stack.
For A/B testing and traffic analytics, this replaces Google Analytics, Google Tag Manager, and Google Optimize. If you are using AdWords, Splitbee would not replace your AdWords analytics: Google Analytics lets you connect your Google AdWords account and import spend and campaign information. You can track responses in Splitbee using UTM parameters, but you can't see your AdWords campaign metrics.
To determine how long you need to run a landing page test for valid results, use a handy sample size calculator.
How absolutely certain do you want to be that the difference in conversion rate between page A and page B is real? That is, it was not caused by random events like a cat falling asleep on a keyboard and accidentally submitting your web form.
Let me suggest keeping it simple. There's no need to get into all the statistical theory here. Because we are measuring marketing conversion rates, not the death rates of a new medicine, the risk of being wrong doesn't come with intolerable consequences and is outweighed by getting answers quickly.
I use the "good enough" method. Here's my rule of thumb for setting tolerance:
Confidence level (I'm x% sure the results are real):
- 99%: Super conservative, takes longer.
- 95%: Good enough. Sweet spot.
- 90%: Fast and loose, quick results.
Minimum detectable effect (how much relative change in the conversion rate matters):
- 10%: Small lift. Even worth it?
- 20%: Good enough. Sweet spot.
- 25%: Big lift.
In other words, at the end of the test you will be able to say: I'm 95% sure that there was a 20% lift in conversion rate (or there wasn't).
*Note that "lift" means a relative change. For example, a 20% lift on a 10% conversion rate gives 12%. (It is not 30%; you do not add 20 percentage points.)
There are a variety of sample size calculators you can use to determine how much traffic you need to each landing page. Here are three that I like:
Or you can use this quick chart for an estimate:
Keep in mind that this is the traffic needed for EACH variation of your landing page. If you are testing your existing page against one other page, multiply the traffic by 2.
If your total sample size needs to be 1,000 visitors and your daily traffic is 100 visitors, then you need to run your experiment for 1,000 / 100 = 10 days.
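If you'd like to sanity-check those calculators yourself, the standard two-proportion sample size formula is easy to script. Here's a minimal Python sketch (function names and the 80% power default are my assumptions; most online calculators use the same power default) that takes a baseline conversion rate, a relative lift, and a confidence level, and returns the visitors needed per variation plus how long to run the test:

```python
import math
from statistics import NormalDist

def sample_size_per_variation(baseline, lift, confidence=0.95, power=0.80):
    """Visitors needed for EACH variation to detect a relative lift."""
    p1 = baseline
    p2 = baseline * (1 + lift)  # lift is relative, not additive
    z_alpha = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

def days_to_run(per_variation, daily_traffic, variations=2):
    """Days needed when your daily traffic is split across variations."""
    return math.ceil(per_variation * variations / daily_traffic)

# 10% baseline, 20% lift, 95% confidence:
n = sample_size_per_variation(0.10, 0.20)
print(n, "visitors per variation,", days_to_run(n, daily_traffic=500), "days")
```

Note how sensitive the answer is to the lift you want to detect: halving the minimum detectable effect roughly quadruples the traffic you need.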
Many A/B testing tools will do the calculations and let you know when you have statistically valid results based on your tolerance assumptions. It is good to know ahead of time how long to expect a test to take.
If you already have done a test and are wondering if the results are significant or to what level of confidence you can have in your results, you can use this A/B Testing Significance calculator.
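If you'd rather not paste numbers into a web calculator, the same check is a few lines of Python. This is a sketch of a standard two-tailed, two-proportion z-test, the usual math behind those calculators, not the specific tool linked above:

```python
import math
from statistics import NormalDist

def ab_significance(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-tailed z-test for the difference between two conversion rates.
    Returns (z_score, p_value); p below 0.05 means ~95% confidence."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Page A: 100 of 1,000 converted; Page B: 150 of 1,000 converted
z, p = ab_significance(100, 1000, 150, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If the p-value comes in under 0.05, you can claim 95% confidence that the difference between the two pages is real.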
Have you ever run an A/B test on your landing page that didn't make a difference in conversion rate? Wasting time and money on things that make no difference is frustrating.
I used to think that any information is worth the effort. Finding out what doesn't move the needle is just as important as finding out what works, right?
Nope. I wasn't finding out what doesn't work. I was finding out what doesn't matter.
It is more productive to focus on what will drive change.
After testing hundreds of landing pages over the years, here are the elements that have made the most impact on conversion.
Your headline is the most important component of copy. A common rule of thumb is that 80% of people will read the headline but only 20% will read the rest of the copy. In case you missed it, Scrappy MarTech #10 "Conquer Copywriting" is all about tools and insights on copywriting.
You must inspire action. Your call-to-action is critical to your landing page conversion and can have the most impact. There are three elements of your call-to-action that can be tested that make a huge difference.
Are you using the standard "Submit" for your call-to-action button? Boring! Try copy that inspires action and shows the value of your product:
Your call-to-action button should stand out on your landing page. Test different colors, shapes, sizes. The best buttons tend to be contrasting colors to the rest of the page - think orange button on dark blue page.
The location of your call-to-action is an excellent test. Above the fold has always given me the best results. But in addition to that, there are a lot of different design options that could move the needle on conversion. If you use a hero image with a person, try putting the call-to-action button in the line of sight - people will follow the eyes.
Hero images can be a large part of your landing page design. Testing different hero images (or no hero image) can quickly tell you if you are hitting the mark with your audience. The best tests are hero images that are completely different from each other, and different from everyone else.
Check your mobile traffic and see what your percent of mobile traffic is and if there is a significant difference in conversion rate compared to desktop traffic. Many times mobile versions of landing pages need to be tested and designed separately from desktop versions to optimize conversion.
You may see a major lift in conversion rates if you apply a sticky call to action or navigation bar to your landing page. Also test your landing page without having any navigation bar at all, and only a call to action button. This generally will boost your conversions.
Anything above the fold (both desktop and mobile) is critical to your landing page. Test different elements being above the fold and below the fold to see if you get a lift. I have found the simpler above the fold, the better.
If you are stuck testing and seeing minimal impact between your "A" page and your "B" page, you are not taking big enough risks. Test something absolutely outrageous. I love it when a page wins or loses by a huge margin. It means I really learned something.
Here's a fun game called Guess the Test. You are presented with two landing pages and you guess which one performed better. It's a great way to learn what has worked for others and may give you ideas on your own landing page.
Looking to do good while also improving your business? Try testing different ways to show your core values and how you are helping others right on your landing page.
See if you get a lift in conversion or customer retention.
The key is to make the giving tied to a relevant cause to your company and customers.
Ideas to try:
Also check out the 404 For Good article, on how to elevate your 404 error page.
William Sealy Gosset was an English statistician who pioneered small-sample experimental design and analysis. He published under the pen name "Student" and developed Student's t-distribution, which is incredibly helpful in situations where the sample size is small.
Gosset was also a chemist and brewer who served as Head Brewer of Guinness and Head Experimental Brewer of Guinness. He developed new statistical methods for experiments on barley and brewing.
Guinness really is good for you.
There is something extremely satisfying about designing a landing page, putting it in the wild, and measuring the impact.
For me, it doesn't get any better than combining creativity and math.
I hope you have fun with your A/B testing. Let me know how it goes.
See you next week!
Happy Scrappy Marketing,