We recently concluded some interesting A/B/n tests for an online driving school’s landing page. I thought I’d share the results, as they’re a great example of the distinction between usable design and persuasive design.
We restricted our test to the main content area of the page.
To help develop test hypotheses, we conducted a heuristic usability review of the page. Then we threw in a few informal user tests for good measure. Our subjects were not carefully screened and did not closely match the target group, but we usually figure “any test is better than no test”. Afterward, we were quite confident in our hypotheses, which included:
- The most important conversion event was an online course registration. Yet the link to the registration forms was merely one of four grey buttons of identical size and design. We hypothesized that we could increase conversions by making the most important button much more prominent.
- The page contained four sections with non-stop animation. We believed this was a huge distraction and hypothesized that conversions would increase if we froze the central area of animation and placed bulleted benefits there instead.
We produced three alternate designs that varied somewhat from one another.
Then we launched the test, confident of a big win. Yet after four weeks, Google Website Optimizer had still not revealed a winner. In fact, one version was actually performing slightly worse than the original!
Back to the drawing board. We brought in a fresh set of eyes (thanks, Darcy). He agreed that the revised versions were certainly more usable, but noted that they weren’t as much fun. Could it be that for this particular target market, the animation we found distracting and annoying was actually stimulating and motivational?
We stopped the original test and developed a new one. We revised our test page, adding the animation back in and putting the bulleted benefits below it.
It worked! After a few weeks, GWO declared our revised page a winner: it was converting over 15% better than the original. That’s a big deal, because in this experiment a conversion was an actual sale. The test should pay for itself in just a few weeks, and from then on it’s all profit for our client.
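If you’re curious what “declaring a winner” boils down to, here’s a minimal sketch of a two-proportion z-test, the kind of significance check that underlies most A/B testing tools. This isn’t GWO’s exact math, and the visitor and sale counts below are purely hypothetical, chosen only to illustrate roughly what a 15% relative lift looks like:

```python
# Hypothetical illustration: is a ~15% relative lift in conversion rate
# statistically significant? The visitor/conversion counts are made up.
from math import sqrt, erfc

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)                # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error of the difference
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))                        # two-sided p-value
    return p_a, p_b, z, p_value

# Original page: 400 sales from 10,000 visitors (4.0% conversion)
# Revised page:  460 sales from 10,000 visitors (4.6% conversion, ~15% relative lift)
p_a, p_b, z, p = two_proportion_z_test(400, 10_000, 460, 10_000)
print(f"original: {p_a:.2%}  revised: {p_b:.2%}  z = {z:.2f}  p = {p:.3f}")
```

With these made-up numbers the test would come out significant at p < 0.05; with a smaller lift or less traffic it can easily run for weeks without reaching significance, which is consistent with our first round never producing a winner.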
For me, there were two main takeaways. First, more usable does not always mean more effective or persuasive. In this particular case, perhaps our target audience needed a bit more visual stimulation to hold their interest.
Second, though I still advocate that “any user test is better than no user test”, I’ll add the caveat that sometimes user tests can be misleading. Sometimes — especially when you’re trying to establish how persuasive a site is rather than just how usable — it’s important to recruit subjects who closely match the target audience. In this particular case, our test subjects were probably somewhat too sophisticated and (dare I say it?) cynical. Even if they found the animation to be fun and engaging, perhaps they thought it would be uncool to admit it.