Split Testing Leads To 300% Better Conversion Rate

Scott Harvey

Split testing, also called A/B testing, is an extraordinarily powerful way to find out what works, what doesn’t, and what works better in website marketing. Surprisingly, it’s very much under-utilized by marketers. (I know, you’re a doctor, and you never thought you’d have to be a marketer, but the truth is that you do. Your perfect prospect is being approached (pummeled?) by your competition every day. You’d better be marketing to her.)

I’ve written about this before at Split Testing/Sex (how can you possibly resist clicking on that?!), but it’s an important topic, so I thought I’d take a deeper look at it…

In the olden days before the Internet, a split test could take weeks, or months.

For example, you mailed a solicitation with one headline to 10% of your mailing list or your patient list, and the same solicitation, just with a different headline, to another 10%. You saw which one worked better and used it for the remaining 80% of your list.

This worked just fine, but you had to wait for the printing and mailing and responses, all of which could take a long time.

Because of the time lag, it was hard to react and make timely changes.

Today, you can know the results of a change in your headline or another component of an offer, or of your AdWords ad, in just days, or even in a few hours.

So why is split testing underutilized in website marketing, when it can lead to such dramatic increases in whatever you’re measuring, whether it’s clicks, conversions, appointments, consults, or profits?

Two reasons, I think.

First, although it’s simple in concept, it still isn’t well-understood by a lot of physicians, or by a lot of marketing professionals, for that matter.

Second, people are afraid that it’s too complex to implement.

Taking the second point first, it’s really not that difficult. There are plenty of tools for doing it, including some free ones, and the free “Google Analytics” tools work just fine for most people and most tests.

Becoming a master of Google Analytics isn’t easy, but almost anyone can plow his or her way through enough of it to do some split testing, and if you can’t, there are plenty of people on Upwork or Freelancer who can do it for you, very inexpensively.

Also, there are packages such as “Optimizely” and “Visual Website Optimizer” that make running split tests easy.  These solutions cost a little bit ($50-$70 a month), whereas Google Analytics is free.  But the cost is nothing compared to the increased profits you can get if you just do it.  If you’re like me, the easier and quicker you can make this kind of stuff, the more likely it is that you’ll do it.

 

So How Does Split Testing Work, Anyway?

How does a split test lead to a 300% increase in conversions? I’ll talk more about this particular experiment later, but the quick version is that changing the position of the opt-in box made the difference.

The opt-in box is where you get your visitor to sign up to receive your marketing and promotional material.

You get them to opt in by offering something of value for free, like a “Patient’s Guide to Cosmetic Surgery”, a comparison of all the different injectables available now, or “How to Keep Your Skin Looking Fresh and Young.”

And after she opts in, you don’t beat her to death with relentless pitches and offers of “10% Off Botox this Month.”  You send her valuable, useful information, and once in a while you make an offer.  The idea is to keep yourself “Top of Mind” in her head, so that when she’s ready to think about a procedure, you’re the one who comes to mind– the one she calls.

But I digress.  Back to split testing.

I guess I should point out that you don’t often get huge performance improvements like that with a single change, as wonderful as that would be.

But you don’t need that kind of huge improvement to make a real difference.

Most people don’t realize that every time you get an increase in clicks or conversions, that increase compounds on top of the increases you got earlier. This is important. Pay attention here:

If a better-performing headline gets you a 17% increase, then adding a graphic (or trying a different one) gives you a 12% bump, then a different “Get More Information” button improves conversions 11%, and then simplifying your opt-in form adds another 9%, all of a sudden your conversions are 58.5% better than they were (1.17 × 1.12 × 1.11 × 1.09 ≈ 1.585).
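If you want to check that arithmetic yourself, here’s a tiny Python sketch that compounds those four hypothetical lifts:

```python
# Compound a series of split-test wins (the hypothetical lifts above).
lifts = [0.17, 0.12, 0.11, 0.09]  # headline, graphic, button, opt-in form

overall = 1.0
for lift in lifts:
    overall *= 1 + lift

print(f"Combined improvement: {overall - 1:.1%}")  # Combined improvement: 58.5%
```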

Pretty soon, you’re talking real money, as they say—and these are the kinds of improvements anyone can get by split testing.

 

What can you split test?

Anything. Here are just a few things you might test:

  1. The wording of your headline
  2. The color of your headline
  3. The font of your headline. And the size.
  4. The size, color, design, placement, and wording of your opt-in
  5. The size, position, color, shape, and wording of your “Submit” or “More Information” buttons
  6. Your price
  7. Your images
  8. Your trust seals
  9. All the elements of your AdWords ads that drive traffic to your site
  10. The subject lines of your emails

The truth is that you should never stop split testing. You should always be testing some element against a different version, trying to find a version that improves results. That becomes your new standard, or “Control,” and then you test a new version against that one. And so on.
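By the way, if you’ve ever wondered how a testing tool decides who sees the control and who sees the challenger, it’s conceptually simple. Here’s a minimal sketch (the function name and visitor ID are made up for illustration; this isn’t any particular tool’s API):

```python
import hashlib

def assign_version(visitor_id: str, test_name: str = "headline-test") -> str:
    """Deterministically split visitors between control and challenger.

    Hashing the visitor ID, rather than flipping a coin on every page
    view, means a returning visitor always sees the same version, so
    your conversion counts stay clean.
    """
    digest = hashlib.md5(f"{test_name}:{visitor_id}".encode()).hexdigest()
    return "challenger" if int(digest, 16) % 2 == 0 else "control"

print(assign_version("visitor-42"))  # same visitor, same version, every time
```

Tools like Optimizely and Google Analytics handle this bucketing (and all the record-keeping) for you; you never have to write it yourself.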

 

So Where Do You Start?

Before you start testing single elements, you work on the page as a whole. Try several completely different, or at least moderately different, layouts of your page and find one that works well. Then start on the pieces.

Of course you start with your headline, since that’s where viewers either bounce away because you didn’t capture their attention, or read further because you did.

One thing that’s important to remember is that you should only change one thing at a time, or you won’t know which change made the difference. (Yes, you Taguchi fans, I know multivariate methods let you test several changes at once, but really? I do this for a living and I don’t even have time to implement Taguchi testing!)

Another thing to remember is that you will be fooled. Often, the version that you just know is superior will get trounced by the one you run against it, sometimes against all logic. Happens to us all the time.

 

When Do You Know You Have a Winner?

This may be another area that stops people from doing split testing: knowing when you have a winner delves a little into the mysterious area of statistics, which is pretty scary for lots of people, even college-educated, fellowship-trained physicians. 🙂 And me.

I’ll explain a little, then give you some shortcuts that make it all much easier.

For example:

Your control page gets 974 visitors and generates 5 phone calls. Your “challenge” page gets 961 visits and generates 7 phone calls. The challenge page is clearly better, right? Seven phone calls is 40% better than five phone calls.

Nope. Statistically, according to one writer, there is only a 45% chance that the challenge page will actually perform better over time. Its apparent superiority at this point is 55% likely to be just due to chance.

Without getting deep into the statistics of it all, the point is that you don’t have enough data yet to make a decision.

One “rule” is that you need at least 30 actions before you can be reasonably confident that the long-term results will be about the same as what you see after just the first 30 actions.

That’s a little loosey-goosey, but useful as a first approximation if you want to keep it simple.

In the example above, if you had 30 conversions and 42 conversions (the same 40% gap), you could be 89% certain that the challenge page would be the long-term winner. Most experts think you shouldn’t name a winner until you’re 95% confident, so we’re still not quite there.

So when do you know—without cracking open your high school statistics textbook or calling for help?

 

Here’s the easy way:

There’s a free A/B Test Calculator at www.splittestcalculator.com. It uses the two-sided Chi-Square statistic with Yates Correction.

I have no idea what that means, but the calculator is easy to understand and use. In our example above, you would just enter 974 and 5, and 961 and 7. You’re told that your confidence level is 62.77%, a long way from 95%. [I’m not a statistics geek, and I don’t know why this tool gives 62.77% confidence while the writer above says the results are 55% likely to be due to chance alone. Whatever; both numbers indicate that with only this much information, it’s about a crapshoot which version is better.]

In this case, you wouldn’t be 95% confident until your challenger had 62 conversions and your control had 44 conversions. Sixty-two is still roughly the same 40% better than forty-four, but now you have enough data that the result is statistically significant.
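If you’d rather run the numbers yourself than use the website, here’s a minimal Python sketch that runs the same kind of chi-square test on our example. (I don’t know exactly how the calculator arrives at 62.77%, and the figures below don’t match it either, but every version of the number tells the same story: nowhere near 95% yet.)

```python
# Chi-square test on the example above: 5 calls from 974 visits (control)
# vs. 7 calls from 961 visits (challenger). SciPy applies the Yates
# correction to 2x2 tables when correction=True.
from scipy.stats import chi2_contingency

table = [
    [5, 974 - 5],   # control: calls, non-calls
    [7, 961 - 7],   # challenger: calls, non-calls
]

for correction in (True, False):
    chi2, p_value, dof, expected = chi2_contingency(table, correction=correction)
    label = "with Yates correction" if correction else "without correction"
    print(f"{label}: confidence = {1 - p_value:.0%}")

# Prints roughly 25% with the correction and 45% without; the uncorrected
# figure lines up with the "45% chance" quoted earlier. Either way, keep
# collecting data before you name a winner.
```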

There’s a lot more to know about split testing, but now you know how to measure your results, and how to assess whether the difference is meaningful.

 

Back to our 300% headline.

This one was a little counter-intuitive.

One of the “rules” of web design is that you have your opt-in box “above the fold” so that it’s right there, and your viewer doesn’t have to scroll down to see it.

However, in this particular case, with a rather complex landing page, moving the opt-in well down the page resulted in a 309% improvement in conversions. The supposition was that moving it down gave people more of a chance to get the information they needed before they were asked to make a decision.

Might make sense to try that sort of change on a complex page where you’re asking someone to trust you with her face or some other body part.

Interesting stuff.  And that’s why we test, right?

What can you test – today?

Scott