Empirical Design Using Analytics

Workshop: “Click the Big Red Button: Tips & Techniques for Optimizing Conversion and A/B Testing”
Date: Monday, 4/16/2007
Presenter: Avinash Kaushik, Analytics Evangelist

Avinash’s talk was funny and enlightening. I got a good sense of how important it is to drive design decisions using empirical data rather than relying on intuitions or even well-established heuristics. I also have a new term in my lexicon: “HiPPO” (the Highest-Paid Person’s Opinion). Essentially, everyone has an opinion on design and assumes that they understand the user. In many organizations, the higher up on the pay scale one goes, the stronger that opinion gets. And if the opinion happens to be wrong, it becomes harder and harder to correct.

This is not so much an issue at Instructables, where we have a tight-knit (and fairly horizontal) team of five, but it has been a huge issue at various companies I’ve worked for over the last ten years. Hard data rescues you from the perilous seas of subjectivity.

Avinash also gave a good overview of the methods that one has at one’s disposal (A/B testing, Multivariate Testing, and Experience Testing.) These range from the simple and easy-to-implement to comprehensive strategies that allow companies to fully understand the impact of each design decision on the bottom line.

As usual, full notes are after the jump.

More on the speaker:

  • Blog: Occam’s Razor
  • Consultant: ZQ Insights
  • Book: Web Analytics: An Hour a Day
  • And, of course, he’s the Analytics Evangelist for Google.

Avinash begins the presentation with yet another slide poking fun at Web 2.0 logos (probably the fourth one I’ve seen at the expo.) The joke is that this is most of what has changed from the Web 1.0 logos that he displays.

He notes, however, that very few of the 1.0 logos are from companies that are still around. Fatal mistakes of Web 1.0:

  • No monetization model for eyeballs
  • Not enough focus on serving the needs of the customer

How does one’s company make it to Web 3.0?

The key is to optimize the visitor’s follow-on activity. First, Segment, segment, segment. Break apart your data by geography, time of day, referral source, type of browser, etc. Use all means at your disposal. Then, per segment, measure revenue, order size, and visitors. Don’t just look at conversion rates. Find your most valuable segments. Sometimes, you will find that the segments with higher conversion rates are actually redirecting to less-valuable activities. Take this into account!
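To make the advice concrete, here is a minimal sketch of per-segment measurement in Python. The record fields and sample values are hypothetical, not from the talk; the point is simply that for each segment you report revenue, average order size, and visitors alongside conversion, rather than conversion alone.

```python
from collections import defaultdict

# Hypothetical visit records; the field names here are illustrative.
visits = [
    {"referrer": "search", "revenue": 40.0},
    {"referrer": "search", "revenue": 0.0},
    {"referrer": "email",  "revenue": 120.0},
    {"referrer": "email",  "revenue": 80.0},
    {"referrer": "direct", "revenue": 0.0},
]

def segment_metrics(records, key):
    """Group records by a segment key; report visitors, revenue,
    average order size, and conversion for each segment."""
    buckets = defaultdict(list)
    for r in records:
        buckets[r[key]].append(r["revenue"])
    report = {}
    for seg, revenues in buckets.items():
        orders = [v for v in revenues if v > 0]
        report[seg] = {
            "visitors": len(revenues),
            "revenue": sum(revenues),
            "avg_order": sum(orders) / len(orders) if orders else 0.0,
            "conversion": len(orders) / len(revenues),
        }
    return report

print(segment_metrics(visits, "referrer"))
```

With metrics like these side by side, a segment with a high conversion rate but tiny order sizes no longer looks deceptively valuable.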

The simplest advice: Reduce abandonment. This is the most effective first step towards raising conversion rates and revenues.

Find the Minority of Visitors that Matter

You’ll find that not many do. Take a sober look at the Primary Purpose of users when they come to your site. After all, you can only measure Task Completion when you know what the intended task was. Resign yourself to the fact that many people are not coming to your site with the purpose of doing business. Many others are there to register products, get customer service, etc. Avinash shows a pie chart, segmenting the primary purpose of visitors to several large sites. He progressively chops off all the people who will never buy something or watch anything. The resulting slice is just 37% or 24%, depending on the site. The true opportunity is thus pretty small.

Design Empirically

The key is to convert this small slice. How? Testing, testing, testing. Experiment. Don’t follow the HiPPO (Highest-Paid Person’s Opinion.) Your boss is not your user. Your designs (and certainly the HiPPO’s designs) will most often be wrong, and even when they’re right, they’ll be stale in short order. You must experiment and innovate, constantly.

There is a progression of methods that you can use to do this. Here’s the list, from the fastest and easiest to the most complicated:

  • Customer Surveys [This was not actually in the talk, but I saw it on Avinash’s blog.]
  • A/B Testing
  • Multivariate Testing
  • Experience Testing

A/B Testing

A/B testing is simple testing of one variation of your site against the control. A small number of users can be given the alternate version and their behavior can be measured against the baseline.
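A sketch of the statistics behind such a comparison, assuming a two-proportion z-test with a normal approximation (the talk didn’t prescribe a particular test, and the conversion counts below are made up):

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # pooled standard error
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal CDF (erf-based approximation).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control converts 200 of 5000 visitors; the variant converts 250 of 5000.
z, p = two_proportion_z(conv_a=200, n_a=5000, conv_b=250, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The takeaway is the shape of the workflow: hold out a baseline, split traffic, and only declare a winner once the difference clears a significance threshold.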

Case Study: Intuit once developed a Flash-based shopping cart. So sexy that Avinash wanted to marry it [his words], and yet a failure! Users simply didn’t get the transaction done. The empirical data is what counts, and here it directly contradicted everyone’s intuitions, to the tune of hundreds of thousands of dollars.

The key: “Learn to be wrong, quickly.” Analytics allow you to know which designs are working, fast, to improve, and to know quantitatively how much your improvements affect your bottom line.

Multivariate Testing

A/B testing is easy to get into, but it is unfortunately limited because you can test only one change at a time. The next level is multivariate testing. This allows you to mix and match several changes simultaneously. Most tools will let you embed JS in content areas, allowing the content to be varied automatically. If you have three factors you want to vary, MVT lets you try all combinations at once. Once you have a statistically significant set of data, your analytics tool will show you how each factor is affecting conversion (or any kind of user action you measure.)

Avinash showed several examples (1-800 Flowers and Picasa) that showed once again out designers’ and business owners’ intuitions were incorrect when modeling the desires of the user.

Experience Testing

Unfortunately, MVT can sometimes lock you into a local optimum. The user may prefer one particular link on a page, but perhaps the whole experience later on in the conversion ends up being poor. To really know what works, you need to MVT the whole pipeline.

Enter experience testing. Once the user has diverged MVT-wise on an initial page, the entire experience on follow-up pages is altered in a way that matches that initial segmentation. Entire user experiences can then be compared, empirically. This is an expensive approach, but the results are incredibly powerful.
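One mechanical prerequisite for this is keeping a user in the same experience across every page of the funnel. A minimal sketch, assuming deterministic bucketing by a hash of the user ID (the arm names are hypothetical):

```python
import hashlib

EXPERIENCES = ["control", "streamlined_checkout"]  # hypothetical arms

def assign_experience(user_id: str) -> str:
    """Deterministically bucket a user so that every page they visit
    renders the same end-to-end experience."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(EXPERIENCES)
    return EXPERIENCES[bucket]

# The same user always lands in the same arm, across the whole funnel.
assert assign_experience("user-42") == assign_experience("user-42")
```

Because the assignment is a pure function of the user ID, every page can compute it independently, with no shared state needed to keep the experience consistent.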


Unfortunately, there are no vendors who have off-the-shelf solutions for doing Experience testing, yet. For everything else, though, there’s plenty of good software:

  • Offermatica
  • Optimost
  • Sitespect
  • Google’s Website Optimizer

In a nutshell, get beyond HiPPO-based decision making. Design empirically. This is best for everyone, including the HiPPO.

Resources:

  • Testing Primer: http://snipurl.com/exptest
  • Build a Successful Program: http://snipurl.com/testingsuccess


One Response to “Empirical Design Using Analytics”

  1. ericnguyen Says:

    Presentation notes have just been posted: http://www.web2expo.com/presentations/webex2007/berkowitz_web.ppt