In Case of Data Blindness, Click Here.

June 30, 2014

I love Google Analytics. It's done more than any other tool or resource to introduce people to the world of data and data analysis. It's also immensely powerful. And free.

The only problem with Google Analytics is that, like so many data analysis platforms these days, it follows the "gather some data now, figure out what to do with it later" pattern. It's not the only culprit. Big Data technology now makes it practical to adopt this gather first approach everywhere. You can throw all your data into a big bucket and when you need to find something out, you can go digging.

It never used to be this way. Larger, more established organizations still have teams of Business Intelligence specialists. They build data transformation processes, set up reporting platforms, and implement the reports that managers ask for. It used to be that you needed these people, and when you hired them you got something extra for free: the time and space to think about whether you were asking the right questions. Not any more.

When you take away the techies, and all you have are managers exploring their data in Google Analytics, QlikView or Tableau, it turns out you don't magically get insights[1]. As nice as the standard reports and fancy charts are, people end up saying

Huh, our bounce rate went up 2.3 points this month... I wonder why?

Nice charting and drag & drop interfaces do sell analytics tools, but they also give people data blindness. Like snow blindness, only data-y. There's so much to play with that you never get past the playing and down to some serious work.

So when you find yourself succumbing to data blindness here's what you should click:

The Minimise Button

Yes, you press the minimise button.

Step one to doing analytics well is to stop playing around, to step back, and to think about what it is we're trying to find out. As the experts at InciteBI put it on Twitter, start with an understanding of what you want to know.

So how do you find the right question to ask?

Well, you could try some of the common ones, such as:

  • How many of my visitors make a purchase?
  • Which sites send us the most traffic?
  • Do [group X] spend more with us than [group Y]?
  • Is paying our team overtime actually profitable?
  • How accurate are we at estimating jobs?
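
Once you've picked a question, answering it is usually a short, concrete computation rather than a tour of dashboards. Here's a minimal sketch for the first question above; the session records are made-up illustration data, standing in for whatever export your analytics tool gives you:

```python
# Toy visit log: one record per session, with a purchase flag.
# The data shape here is hypothetical -- substitute your own export.
sessions = [
    {"visitor": "a", "purchased": True},
    {"visitor": "b", "purchased": False},
    {"visitor": "c", "purchased": False},
    {"visitor": "d", "purchased": True},
]

# "How many of my visitors make a purchase?" as a single number.
conversion_rate = sum(s["purchased"] for s in sessions) / len(sessions)
print(f"{conversion_rate:.1%} of sessions ended in a purchase")
```

The point isn't the code, it's that the question fixes exactly which number you need before you open a single report.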

Alternatively, you could start by developing theories about your customers and then using data to prove or disprove them. Here are some theories you might want to consider:

  • Most of my social traffic comes from Facebook, not Twitter
  • People are interested in our products, but our prices are too high
  • Our content marketing strategy is profitable
  • Our site works well on a mobile device
  • Ads that take people to specific landing pages convert better than ads that take people to our homepage
  • It pays to give people an introductory offer
  • Most support tickets come from people using feature X

All of these theories are testable using data you're probably already gathering, and each theory has an important implication for how you run your business if proven true.
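
To make that concrete, here's a sketch of testing the first theory in the list. The referral counts are invented for illustration; in practice they'd come from your analytics tool's traffic-sources report:

```python
# Hypothetical referral counts for social sources -- the numbers
# are made up; plug in figures from your own traffic report.
referrals = {"facebook.com": 1240, "twitter.com": 310, "linkedin.com": 95}

social_total = sum(referrals.values())
facebook_share = referrals["facebook.com"] / social_total

# Theory: most of my social traffic comes from Facebook, not Twitter.
theory_holds = facebook_share > 0.5
print(f"Facebook share of social traffic: {facebook_share:.0%}")
print("Theory holds" if theory_holds else "Theory disproved")
```

If the theory holds, you know where to spend your social effort; if it's disproved, you've learned something just as useful.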

If you're struggling to come up with a theory to test, you could also start with examples of what you might do differently in your business, and then consider what sort of report would convince you that that was the right call. For example:

  • Raise prices
  • Lower prices
  • Expand to a new territory
  • Sell a new product
  • Hire more people
  • Fire a particular customer
  • Buy more stock
  • Switch supplier

The more data we gather, the more business starts to look like a form of social science. As Scott Adams (the creator of Dilbert) noted in his recent piece about startups, the "pivot" is more than just a clichéd startup term, it's an important part of the scientific discovery process.

And if you thought the parallel I was drawing between business and social psychology was just a cute metaphor, think again. Facebook recently revealed that it had been performing a giant psychology experiment on its users to see how their moods might be altered. While the moral implications of turning your users into lab rats are questionable, the trend is clear: we're all social psychologists now. And to be good scientists, we must think clearly about what we're measuring, why we're measuring it, and what the answer means.


[1] That's not entirely true. Sometimes it helps to get a lot of eyeballs on the data and see what happens, but often it's a waste of time.