A/B testing involves showing one segment of your community one version of an email, advertisement, web page, site-wide call-out or other element while showing another segment a different version. Keep your tests specific, with a single desired change in behavior in mind: if you change multiple elements of your design at once, you may not be able to tell which element produced the effect. Also check statistical significance, which weighs the number of people tested against the measured change in performance. If your statistical significance is low but your percentage change is high, you might mistakenly adopt a change that did not actually hold across your overall sample.
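One common way to split a community into two groups like this is deterministic bucketing: hash each user's id so assignment is stable (the same person always sees the same version) and roughly even. A minimal sketch in Python, where the function name and id format are illustrative rather than taken from any particular email platform:

```python
import hashlib

def assign_variant(user_id, variants=("A", "B")):
    # Hash the user id so each user is assigned consistently:
    # re-running the assignment never flips anyone between versions.
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

Because the bucket comes from a hash rather than a random draw, you can recompute any user's group later when tallying opens and clicks, without storing the assignment.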
About statistical significance:
- The significance of your data, assuming you are testing a specific variation, is driven by two primary factors: sample size and sample duration. If your sample is too small or covers too short a time frame, your test will not reflect the typical behavior of your mailing list.
- In general, we recommend a sample size of at least 250 users or 25% of your list, whichever is larger.
- For the email time frame, you will see the majority of your opens and click-thrus within the first few hours, so it is generally safe to send to the remainder of your list after 1-2 hours. Keep in mind that if you are using send-time optimization, your email platform will not send all emails at once. If your e-blast is not time-sensitive, try sampling your users over a few days before sending the final email, or schedule the remaining sends to go out once you hit a certain number of opens or click-thrus.
A/B Testing Email Subject Lines for Open Rates:
How many tests should I run at a time (A/B/C/D)? We recommend no more than two or three variations, depending on the size of your sample. If you have 10,000+ users on your mailing list, you can reach statistically significant rates for multiple tests within the first hour or so after your mailing is sent.
Don’t assume that the mailing with the highest open rate is always the winner. If your subject lines are misleading compared to the content within your mailing, you might see a high open rate but a low click-thru rate.
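To make that concrete, here is a small sketch (with hypothetical numbers) that picks the winning variant by click-thru rate, computed here as clicks per open, rather than by open rate alone:

```python
def pick_winner(variants):
    # variants maps name -> (sent, opens, clicks).
    # Rank by click-thru rate, not open rate: a misleading subject
    # line can inflate opens while clicks stay flat.
    def ctr(stats):
        sent, opens, clicks = stats
        return clicks / opens if opens else 0.0
    return max(variants, key=lambda name: ctr(variants[name]))

results = {
    "A": (1000, 400, 20),   # 40% open rate, but only 5% click-thru
    "B": (1000, 250, 30),   # 25% open rate, but 12% click-thru
}
```

Here variant A "wins" on opens, but B is the better subject line because its readers actually click through.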
Example Subject Line Tests:
Scenario 1: A t-shirt company is planning a new line next month centered around viral videos from the last few years and wants to get customer opinions.
- Test A: New Double Rainbow All The Way Tees
- Test B: What Is Your Favorite Viral Video?
- Test C: 5% Off Your Top Viral Video Tee
Scenario 2: A yoga studio is running a new promotion for current members: sign up for $99/month and attend unlimited classes (6-month minimum).
- Test A: Take The Stress Out Of Your Class Schedule With Unlimited Monthly Visits
- Test B: Get Unlimited Sessions For $99/Month
- Test C: Not Getting Enough Zen In Your Life? New Monthly Promotion
A/B Testing Email Content for Click-thru Rates:
Click-thru rates measure the performance of your email after your users have opened it. Typically, a higher click-thru rate means better performance, but clicks only mean something if you’re converting users once they reach your site. Testing variations in email layout, headlines, placement of calls to action, or call-to-action labels (ex: ‘Buy Now’ versus ‘Click Here’) can all lead to improvements in click-thru rate.
The best way to improve the performance of your mailings is to test continuously over time, carrying winning strategies from previous tests forward to move the needle gradually upward. Remember to test only specific elements and to measure a large enough sample over a long enough period to get reliable results. Have more questions about A/B testing your website, landing pages, ad text or other email elements? Contact our team today!