In January of 2009, YouTube approached Cardinal Path (then VKI Studios) to develop a test using Google’s own Website Optimizer. Working together, VKI Studios and YouTube implemented one of the largest Google Website Optimizer (GWO) experiments in history, with a whopping 1024 combinations and more traffic than any reasonable site should ever have to handle. The results: a 15.6% increase in YouTube sign-ups.
Last month, we interviewed Neal McGann and Michael Straker, the team behind the experiment, to ask what goes into creating such a massive test. This was originally released exclusively to subscribers to the VKI Studios newsletter, but today we’re making it available to our blog readers. As always, if you’re interested in GWO or other landing page testing, be sure to talk to us.

KENT: So tell me a little about how this YouTube project came to be. I understand that a lot of it came together due to Curtis Lee.
MICHAEL: Curtis… we used to work with Curtis when he was at Google; he was one of the guys who worked with us on GWO [ed: that’s Google Website Optimizer, used for A/B and multivariate conversion testing] things at Google. So when he moved over to YouTube and wanted GWO things done, he came to us.

NEAL: Yeah. We’d worked with him back when he was at Google and he had thought that we were pretty good. They really wanted to foster a testing culture in the organization, and hadn’t run a large-scale GWO test on YouTube before. In addition, YouTube wanted to increase account sign-ups, so it all came together.

KENT: Tell me a little about developing the test and what kind of things you were thinking about when developing this recipe.
MICHAEL: It started with a heuristic review: analyzing potential weak spots, stumbling spots, and areas for improvement, and then developing hypotheses about what could make pages better. It was, in essence, a pretty simple experiment, but what was interesting was the very high traffic, the number of combinations, and how successful it was… really, there are so few websites you could do something like this on.

NEAL: I think Michael originally wanted a test with like 6,000 combinations or something insane. It was ridiculous. Wanted to test absolutely everything…

MICHAEL: *laughs* There were, at one point, 1,800 combinations, not 6,000, but we ended up with just over 1,000.

KENT: Does anything change when you’re dealing with a lot of combinations?
MICHAEL: Well, the main reason you don’t normally do that many combinations is that you’re spreading the data out between so many different versions that it’s going to take forever to get results. It’s only on a page like YouTube that you can do it. So that was what was fun about this: you have the freedom to test as many combinations as you want (within reason) [GWO allows you to test up to 10,000 combinations]. Normally, you have to really scale it back and restrict the number of things you test, but YouTube’s traffic gave us much more freedom than you usually get.

KENT: What ideas did you have that you decided not to go with?
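The trade-off Michael describes is straightforward arithmetic: in a full-factorial multivariate test, the combination count is the product of the variants in each page section, and traffic gets split evenly across all of them. The sketch below illustrates this with a hypothetical section breakdown (the interview never discloses the actual one; five sections of four variants each is just one way to reach the reported 1,024 combinations) and an illustrative traffic figure:

```python
from math import prod

# Hypothetical breakdown -- the actual section/variant structure of the
# YouTube test was not disclosed. Five sections with four variants each
# is one way to arrive at the 1,024 combinations reported.
variants_per_section = [4, 4, 4, 4, 4]
combinations = prod(variants_per_section)
print(combinations)  # 1024

# With traffic split evenly, each combination sees only a sliver of the
# total -- which is why high-combination tests need enormous traffic.
daily_visitors = 1_000_000  # illustrative figure, not from the interview
per_combination = daily_visitors / combinations
print(per_combination)  # 976.5625 visitors per combination per day
```

Doubling the variants in just one section would double the combination count and halve the data each combination collects, which is why most sites have to "scale it back" as Michael puts it.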
NEAL: There was an animation. At one point we had an animated button that… well, it wasn’t quite an animation, but it did kind of rotate. It was like an animated GIF, and the message might change or the image might change. We didn’t go with that, though.

KENT: Is there anything you can say about language use in the context of this kind of test?
MICHAEL: We thought that the original messaging didn’t really stand out; it was too subtle, so people would miss the sign-up link. I think the original version was “sign up for a YouTube account,” which didn’t sound like fun—it wasn’t a benefit or anything—so we thought we’d play with different wording, like “join the community” and “upload videos,” and all of that. Then we went to [YouTube] with them and they came back to us and gave us confirmations and rejected a few. Like, we couldn’t say the word free.

KENT: Really?
MICHAEL: Yeah, apparently Google doesn’t like that. But yeah, the big question was what benefit was going to be most compelling. What do people perceive as the greatest benefit? YouTube had some specific ideas there and we had some ideas, and we did some informal in-person lab user testing as to why people would want to join. Was it to upload videos? To rate and comment on videos? YouTube thought that playlists and annotating videos would be important, but I would have thought that for newbies that wouldn’t be that exciting. The crux of it was that there was only room [on the page] for one benefit statement, so we wanted to develop a whole bunch and test each to see which one would turn out best. Then, we wanted to see which combination [of benefit statement and call to action] would work the best.

KENT: What determined your color choices and design, and can you address the potential for contradiction between design aesthetics and testing?
NEAL: The color we took largely from the existing design. We inferred a color palette based on what was there. As for testing, the idea of the test is to present different variations and combinations to users, and you express this through design… As for contradictions, well… I guess (and this is very reductionist) the point of aesthetic design is to make things blend and be harmonious, but if something’s very harmonious, then it’s not going to stand out and grab attention. The boss used to say that he wanted to be able to see the call to action from six feet away, and in order to see a call to action from six feet away it has to be pretty big and pretty ugly.

MICHAEL: Yeah, and as we thought, the ugliest one won. Also, there was the box [around the benefits section] itself, and how the box was laid out. The original was very subtle (gray on gray) and the one that won—which was the one we thought would win—was white with a black box around it. It stood out more than the subtle gray one.

KENT: What concerns did Google/YouTube have?
NEAL: They were concerned about GWO breaking YouTube. Whenever anyone hits the page, a GWO control script runs and serves the variations from Google’s servers to the user (and for the sake of this I’m going to differentiate between Google and YouTube), so they get the JavaScript and CSS from YouTube’s servers and then content from Google’s servers separately, which adds latency to the whole thing. So, if you have all that going on, it might impact the user experience and their servers. Then, of course, Google was worried that six gazillion users might flood Google’s GWO servers… And yeah, we probably did have too many combinations, and they were concerned about that also.

MICHAEL: …the head technical guy at Google for GWO suggested we have fewer combinations, but we said “no, we’re not going to restrict our test, fix your tools!” *laughs*

KENT: Are there any concerns you have post-test?
NEAL: Well, the results were good. We are very happy with them. We got a victory, which was good. The problem is I wasn’t too thrilled with the aesthetics of some of the variations, though they were very effective—they succeeded in their goal of increasing sign-ups. Ultimately, the most compelling aspect of the test recipe was the big ugly button we stuck in the masthead, and, quite frankly, I didn’t like that one, but from the start we knew it was going to win. It was big and red and ugly. I liked the white one, which was nice and harmonious, but it ultimately didn’t stand out. Aesthetics versus business goals.

KENT: And do you credit the majority of the success to that?
NEAL: Well, obviously it was a number of factors, and distinguishing how much each of the elements accounted for is all but impossible, but I’m sure that the red button played a major role in the eventual success of the test, yes.

KENT: And what were the results?
MICHAEL: A 15.6% increase in conversions, which is… well… I can’t say how many people that was, but let’s say that with the number of visitors that YouTube gets, that is huge.
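The interview gives only the relative lift, not absolute sign-up counts, but a result like this can be sanity-checked for statistical significance with a standard two-proportion z-test. The visitor and conversion figures below are entirely hypothetical, chosen only to reproduce the reported 15.6% lift:

```python
from math import sqrt

# Hypothetical counts -- the interview discloses only the relative lift.
n_control, conv_control = 500_000, 10_000  # 2.0% baseline sign-up rate
n_variant, conv_variant = 500_000, 11_560  # 2.0% * 1.156 = 2.312%

p1 = conv_control / n_control
p2 = conv_variant / n_variant

# Pooled standard error for the difference of two proportions.
p_pool = (conv_control + conv_variant) / (n_control + n_variant)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_control + 1 / n_variant))
z = (p2 - p1) / se

print(f"relative lift: {(p2 - p1) / p1:.1%}")  # relative lift: 15.6%
print(f"z-score: {z:.1f}")  # about 10.7, far above the 1.96 threshold
```

At YouTube-scale traffic, even a much smaller lift clears the conventional 95% significance bar (z > 1.96) easily, which is part of what made such a high-combination test feasible in the first place.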