A/B Testing – Rip it Up and Start Again

It’s a common trap: as people invested in finding a solution, we can quickly become blinkered by the small details until we can no longer see the wood for the trees.

A store of knowledge is only as useful as knowing when to step back, let go of assumptions, and go searching again for cold, hard facts.

We’re firm believers in the numbers here at OneDesk – and two articles doing the rounds at HQ recently drive home the importance of tearing assumptions asunder and basing decisions on raw testing instead.

Obama Assumes Nothing


The success of the Obama 2008 election campaign has been much documented, and our first article, by Brian Christian in Wired, presents a behind-the-scenes look at the role of A/B testing as a key driver behind that success. He explains:

“Over the past decade, the power of A/B testing has become an open secret of high-stakes web development. It’s now the standard (but seldom advertised) means through which Silicon Valley improves its online products. Using A/B, new ideas can be essentially focus-group tested in real time: Without being told, a fraction of users are diverted to a slightly different version of a given web page and their behavior compared against the mass of users on the standard site. If the new version proves superior—gaining more clicks, longer visits, more purchases—it will displace the original; if the new version is inferior, it’s quietly phased out without most users ever seeing it. A/B allows seemingly subjective questions of design—color, layout, image selection, text—to become incontrovertible matters of data-driven social science.”
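To make that diversion step concrete, here’s a minimal sketch in Python – the helper name, experiment label, and traffic split are our own illustrative choices, not anything from the Wired article. Hashing the user ID means every visitor consistently lands in the same variant, with no assignment state to store:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, b_fraction: float = 0.5) -> str:
    """Deterministically bucket a user into variant 'A' or 'B'."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0x100000000  # map first 8 hex digits to [0, 1)
    return "B" if bucket < b_fraction else "A"

# Hypothetical example: divert 10% of traffic to the experimental page.
variant = assign_variant("user-4821", "donate-button-photo", b_fraction=0.10)
```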

Running tests, the team tried different photos and call-to-action buttons – and the degree to which these improved the results for Obama’s pages illustrates the power of the method: the team credited their improved versions of the campaign pages with roughly four million of the campaign’s thirteen million signups.

A/B Testing the Strength of a Smile

Meanwhile, a post from 37signals gets into the nitty-gritty of their team’s learning experience testing different messaging and imagery across their landing pages as they aimed to increase the conversion rate of visitors.

The results they saw from different configurations bore little correlation to their pre-test assumptions and produced some huge successes – on one occasion increasing conversion rates by upwards of 100%.
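A jump that big deserves a sanity check before declaring victory. Here’s a sketch of a standard two-proportion z-test asking whether a doubled conversion rate could plausibly be noise – 37signals’ actual sample sizes and analysis aren’t in their post, so every number below is made up for illustration:

```python
import math

def conversion_lift(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Relative lift of B over A plus a two-sided p-value (two-proportion z-test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # P(|Z| >= |z|) for a standard normal
    return (p_b - p_a) / p_a, p_value

# Hypothetical numbers: variant B roughly doubles a 2% baseline conversion rate.
lift, p = conversion_lift(conv_a=40, n_a=2000, conv_b=82, n_b=2000)
print(f"lift: {lift:.0%}, p-value: {p:.4f}")  # ~105% lift, p well under 0.05
```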

One of their findings was that including a picture of someone smiling nudged conversion rates higher – though they emphasize that what worked for them may not translate into similar successes for others.

As you’d expect, we’re conducting our own tests on landing pages and various sections of the site as we roll out a revamp of our branding, so don’t be surprised if some pages are behaving a little differently than usual.

Assumptions are not as valuable as results, and A/B testing is a brilliant way of doing it wrong, quickly.

And so, as they sing in Glasgow – rip it up and start again.

Do you have experiences to share with A/B testing? Let us know in the comments below!
