What’s the best strategy for online testing: A/B or multivariate? I promised myself that if I got this question from a client twice in one week, I’d post a blog on it. I was originally planning to blog about my daughter’s fascination with watching any Elmo content on my (her) iPad rather than on TV, and the interesting parallel to device preference in content consumption among older demographics, but that one is now on the back burner.
OK, so the testing question, posed by two different clients in the same week and asked slightly differently, does warrant a response today. For the sake of my clients’ sanity and mine, I like to answer their questions, regardless of topic, as simply as possible. The simple answer in this case is A/B. The more complicated answer is that it depends on many variables, but because in nine out of ten cases where this question comes up the right course of action is A/B, I can safely suggest starting there. Why is that?
In A/B testing, you pit two or more distinct versions of a creative against one another; in multivariate testing, you test a single creative with multiple combinations of its elements. If a client has never done any testing, or doesn’t know much about their customers online, A/B testing will give them the fastest and cleanest results.
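To make that contrast concrete, here’s a minimal sketch in Python. The element names and option counts are hypothetical choices of mine for illustration, not anything from a real test, but they show how the variant count multiplies in a multivariate test while an A/B test stays at a handful of versions.

```python
from itertools import product

# A/B test: a few distinct creative versions competing head to head
ab_variants = ["control", "challenger_1", "challenger_2"]

# Multivariate test: every combination of element options within one creative
# (hypothetical elements and options, purely for illustration)
headlines = ["headline_a", "headline_b", "headline_c"]
buttons = ["button_red", "button_green"]
images = ["hero_photo", "product_shot"]
mvt_variants = list(product(headlines, buttons, images))

print(len(ab_variants))   # 3 versions to split traffic across
print(len(mvt_variants))  # 3 x 2 x 2 = 12 combinations to split traffic across
```

Twelve cells instead of two or three means slicing the same traffic twelve ways, which is a big part of why I point first-time testers to A/B.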
In my experience, the greatest performance change usually comes from a creative that is dramatically different from the one the client is running, rather than from a “shifting of elements” within a creative, such as moving a button or tweaking copy. Holding a control creative and testing distinctly different creative against it will yield interesting results and is never a bad thing. This is especially true when you’re testing a small amount of traffic and establishing benchmarks, as the rough math below suggests.
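To put rough numbers on the traffic point, here’s a back-of-the-envelope sketch using the standard two-proportion sample-size formula. The base conversion rate, the lift, and the simplifying assumption that every multivariate cell needs roughly the same sample as an A/B variant are all illustrative assumptions of mine, not client data.

```python
from math import sqrt, ceil
from scipy.stats import norm

def visitors_per_variant(p_base, p_target, alpha=0.05, power=0.8):
    """Rough visitors needed per variant to detect a move from p_base to
    p_target with a two-sided two-proportion z-test."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    p_bar = (p_base + p_target) / 2
    numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p_base * (1 - p_base) + p_target * (1 - p_target))) ** 2
    return ceil(numerator / (p_base - p_target) ** 2)

# Hypothetical example: detecting a lift from a 3% to a 3.6% conversion rate
per_variant = visitors_per_variant(0.03, 0.036)
print(per_variant * 2)    # total traffic for a two-variant A/B test
print(per_variant * 12)   # total traffic for a 12-combination multivariate test
```

On those made-up numbers, the multivariate test needs roughly six times the traffic of the A/B test before any single combination is read with confidence, which is exactly the bind a low-traffic site finds itself in.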
So when clients ask, as two did this week, “What’s the best strategy for online testing?” they’ve pretty much told me everything I need to know.