Business reports and dashboards often contain pointless, distracting or downright unhelpful numbers. Numbers that do not give the reader any actionable insight. Here are some of the types of metrics to avoid, and why.
Totals
Examples of totals you don't need are:
- Total users
- Total page views
- Total revenue
- Total units sold
- Total downloads
Totals are great vanity metrics. They make you feel good, but they lump together every data point you've ever gathered in a way that offers no clues as to what you did well, what you did poorly, or how quickly you're growing.
It might be helpful to know how much it's costing in marketing to get each download, or where the people who downloaded came from, or whether the rate of downloads has changed much since last week. But a grand total is useless.
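To make the contrast concrete, here's a minimal Python sketch (the download numbers are invented for illustration) showing why a grand total hides exactly the trend information a rate of change reveals:

```python
# Hypothetical weekly download counts -- figures invented for illustration.
weekly_downloads = [1200, 1350, 1500, 1100]

# The vanity metric: one big number, no hint of trend or timing.
total = sum(weekly_downloads)
print(f"Total downloads: {total}")

# The actionable view: week-over-week change shows growth accelerating,
# then a sharp drop in the final week that the total completely hides.
for prev, curr in zip(weekly_downloads, weekly_downloads[1:]):
    change = (curr - prev) / prev * 100
    print(f"{prev} -> {curr}: {change:+.1f}%")
```

The total looks healthy either way; only the rate of change surfaces the final week's decline, which is the number you could actually act on.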
One possible exception to this rule is when you want to convince a potential customer with some social proof. Many popular books proudly report "Over one million copies sold". This is a sensible strategy, but in this case you're not tracking a total for internal decision makers and so the need for actionable insight is not relevant.
Dead End Metrics
Dead end metrics are metrics that leave you thinking "so what?". There are many examples but a lot depends on what kind of business you're in. One example might be:
On average, our customers have 2.3 cats.
There's probably no way for you to capitalise on that statistic. Unless, of course, you make and sell cat furniture! In general, beware of metrics that are interesting but do not lead to any obvious course of action.
Correlation Metrics
An example of a correlation metric would be:
70% of people who bought from us visited our FAQ page first.
Is this just a correlation or did the FAQ page actually make them buy? If you believe the FAQ page caused people to buy, then you should probably put the copy from the FAQ page on your landing page. But it might be that people who were already going to buy often visit the FAQ page en route to the shopping basket. Correlation metrics can be a good source of ideas for further experimentation, but that's all they are. Correlation does not imply causation.
Non-significant Metrics
An example of a non-significant metric would be:
There's an "80%" chance that making the Buy Now button green drives more sales.
Although 80% sounds convincing, what we're saying is that there's a 1 in 5 chance the green 'Buy Now' button did nothing to improve sales at all and the increase was purely random. If you run 5 experiments like this, chances are that at least one of them will give you a false positive. When experimenting with different marketing materials like this, you should look for a 95% chance (or higher) that one version performs better than the other before acting on the result. In practice that means A/B testing is only viable when you have enough visitors to build a large sample.
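Here's a short Python sketch of how that confidence figure can be computed for a button test, using a standard two-proportion z-test (the visitor and conversion counts are invented for illustration):

```python
from math import erf, sqrt

def confidence_b_beats_a(conv_a, n_a, conv_b, n_b):
    """One-sided two-proportion z-test: probability that variant B's
    observed lift over A is not just random noise."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 0.5 * (1 + erf(z / sqrt(2)))

# Small sample: a 10% vs 13% conversion rate only reaches ~83% confidence,
# well short of the 95% bar -- a classic non-significant metric.
print(confidence_b_beats_a(20, 200, 26, 200))

# Same conversion rates with 10x the traffic clears 95% comfortably.
print(confidence_b_beats_a(200, 2000, 260, 2000))
```

The same 3-point lift goes from "could easily be noise" to statistically significant purely because the sample grew, which is why A/B testing demands high traffic.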
Micro Metrics
Imagine a single step in a long supply chain where two possible delivery companies could be used. One provider is cheap and slow, the other is expensive and fast. Looking at the big picture, it's not hard to imagine that slower deliveries might have knock-on effects and cost more in the long run. But if you're only measuring shipping costs at that one point in the supply chain, then the actionable insight might appear to be to choose the cheaper provider.
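A tiny Python sketch makes the trap visible (every figure here is invented for illustration):

```python
# Hypothetical per-shipment costs -- all figures invented for illustration.
def end_to_end_cost(shipping_fee, delay_days, holding_cost_per_day, rush_penalty):
    """Shipping fee plus the downstream costs the micro metric ignores:
    stock held up in transit and occasional rush charges to catch up."""
    return shipping_fee + delay_days * holding_cost_per_day + rush_penalty

cheap_carrier = end_to_end_cost(shipping_fee=5.00, delay_days=4,
                                holding_cost_per_day=1.50, rush_penalty=2.00)
fast_carrier = end_to_end_cost(shipping_fee=9.00, delay_days=1,
                               holding_cost_per_day=1.50, rush_penalty=0.00)

# The micro metric (shipping fee alone) says the cheap carrier wins;
# the end-to-end view says the opposite.
print(f"Cheap carrier, end to end: {cheap_carrier:.2f}")
print(f"Fast carrier, end to end:  {fast_carrier:.2f}")
```

Measured at the one step, 5.00 beats 9.00; measured end to end, the "cheap" carrier costs more.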
It's important to always keep an eye on your top-level metrics: customer acquisition cost (CAC), lifetime value (LTV) and cost of goods sold (COGS). While it can be beneficial to optimise tiny parts of the overall process, if you go too far down into the minute details you could find you're not moving the needle on those top-line metrics at all. Reporting micro metrics can motivate people to waste time moving the bottlenecks around while getting nowhere overall.
Reporting these useless metrics wastes time, not just in building the reports, but in lost focus and distraction every time a manager looks at them. My advice is to leave these things out of your internal reports and focus on the numbers that can give you actionable insights. In next week's post I'll talk a bit more about what those metrics are.