Keep it simple, but be brave
By definition, A/B/N tests need to be kept simple (see the earlier discussion of when to use multivariate testing). Remember that you are testing a single variable; A/B testing can't tell you how multiple variables will interact. This doesn't mean you can't be brave in your optimization, as long as you lock each test participant into a single treatment.
Test one whole landing page against another. Test writing styles and tones of voice. If you're feeling really brave, test your overall contact strategy. Test your frequency and cadence. Test different types and levels of personalization. In short, subject lines for emails or call-to-action buttons for landing pages are a good start, but they just scratch the surface of what you might want to test.
One common issue marketers deal with is sample size. There are a lot of great calculators out there, and if you use Concentri, you can let it handle the math. Assuming your audience (and/or test window) more than covers the sample size needed for statistical significance, consider testing the smallest cells you can. I'm always amazed at the number of marketers who run an A/B test on an email and split the entire audience 50/50. If the test yields a significant winner, the half that received the losing treatment represents lost opportunity.
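If you want to see the math a calculator is doing, the standard two-proportion sample-size formula can be sketched in a few lines using only the Python standard library. The baseline rate (a 2% conversion rate) and the lift worth detecting (2% to 2.5%) below are illustrative assumptions, not recommendations:

```python
from math import sqrt, ceil
from statistics import NormalDist

def sample_size_per_arm(p1: float, p2: float,
                        alpha: float = 0.05, power: float = 0.80) -> int:
    """Minimum participants per treatment for a two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired power
    p_bar = (p1 + p2) / 2                          # pooled rate under H0
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Detecting a 2% -> 2.5% lift at 95% confidence and 80% power:
n = sample_size_per_arm(0.02, 0.025)
print(n)  # participants needed in each cell
```

Note how the required cell size shrinks as the effect you want to detect grows, which is why small cells are often enough for bold, high-contrast tests.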
This is especially true for email, where you may be able to test small cells ahead of the main "send," or for other optimizations that allow some testing before the main flight. Small cells also make room for an A/B/N test with three or more treatments while the current champion is still presented to most customers.
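One common way to implement both ideas, locking each participant into a single treatment and reserving small challenger cells while the champion goes to most customers, is deterministic bucketing on a stable user ID. A minimal sketch; the experiment name, user IDs, and 90/5/5 split are illustrative assumptions:

```python
import hashlib

def assign_treatment(user_id: str, experiment: str,
                     treatments: list[str], weights: list[float]) -> str:
    """Map the same user + experiment to the same treatment every time."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) / 16 ** len(digest)  # uniform value in [0, 1)
    cumulative = 0.0
    for treatment, weight in zip(treatments, weights):
        cumulative += weight
        if bucket < cumulative:
            return treatment
    return treatments[-1]  # guard against floating-point rounding

# 90% champion, 5% each challenger: a small-cell A/B/N split.
print(assign_treatment("user-123", "subject-line-v2",
                       ["champion", "variant_a", "variant_b"],
                       [0.90, 0.05, 0.05]))
```

Because assignment is a pure function of the user and experiment, no lookup table is needed, and a returning visitor always sees the treatment they were first given.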
Plus, if you are able to test small samples (percentage-wise), it becomes much easier to be brave with your tests, knowing that if results don't improve over the current champion, you haven't made a commitment that will have a negative impact on your business.
Leverage your results
Is it scientifically valid to assume that results of an A/B test for one landing page will necessarily apply to another for a different product or at a different time? Not necessarily. How about different subject lines on different emails with different audiences? Maybe not. So what?
At least leverage what you have learned from other tests as a starting point. If you learned that a "Learn More" call-to-action got more clicks than "Buy Online," then start with "Learn More" as your new champion. Does certain copy get better click-through on your banner? Start there on the next campaign. Try never to test in a vacuum; always carry your insights into the next campaign. In large organizations, this may also mean sharing your insights with others who are testing, so that you all benefit from the hard-earned results of your optimization.