
Big Data, Maps, and the Precision Tradeoff


While reading Discovering Statistics Using R, written by the peerless Andy Field, I came across his quick explanation of sampling and why it's done in science – and it tied together some broader thinking I've been chewing on recently. Let's talk about abstraction.

We’ve all heard this one before, right? “The map is not the territory.” When we break this down a bit, it can tell us some interesting things. A map is useful because it communicates geography in an abstract way – it trades a certain amount of precision for a larger view, a more abstract vision of the lay of the land. A map that is perfectly precise would need to be the same size as the area it is representing – which would not help you find your way to the nearest gas station!

Consider the globe: the amount of information a globe leaves out is monumental, and yet it is still highly useful – its utility comes precisely from what it abstracts away. Here we can see that information does not have to be perfectly precise to be useful – and in fact sometimes we can have _too much precision_, as with the lost traveler seeking a gas station.

Note, too, that if you’re lost somewhere between Wausau and Sheboygan, a perfectly precise map and a perfectly abstract globe are equally, perfectly, useless.

A scientific experiment that uses a sample operates in the same way: you run the experiment on a subset of the population – the sample – and then aim to generalize your findings to the population at large. Interestingly, increasing precision by growing the sample size does not necessarily make the findings more useful, but it does make them more costly – and at some point each additional research subject yields diminishing returns.

Again we see, in science, a need to balance precision with utility – you could, conceivably, survey every English-speaking human, but it's not obvious that your results would be much more useful than if you had surveyed only 10% – or less.
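A minimal sketch of why those returns diminish: the standard error of an estimated proportion shrinks with the square root of the sample size, so a hundredfold increase in subjects buys only a tenfold increase in precision. (This is the standard 95% margin-of-error formula for a proportion; the sample sizes below are illustrative, not drawn from any particular study.)

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated from n subjects."""
    return z * math.sqrt(p * (1 - p) / n)

# Each tenfold jump in sample size shrinks the error by only ~sqrt(10)
for n in [100, 1_000, 10_000, 100_000, 1_000_000]:
    print(f"n = {n:>9,}: \u00b1{margin_of_error(n):.4f}")
```

Surveying a million people instead of ten thousand multiplies your cost a hundredfold while narrowing the margin of error from about ±1% to about ±0.1% – a gain few findings actually need.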

As statistics and semi-scientific testing (I'm looking at you, ad-hoc hypothesizers!) become more popular among Big Data enthusiasts and Growth Engineers, it's important that we keep in mind the need to balance precision, abstraction, and utility.

There comes a point where we can become so precise that we are no longer creating any good (and certainly no increase in revenue) – imagine discovering precisely the manner in which 19-year-old Scottish males named Chris use your product. While precise, how helpful is this? How actionable is this information? Where’s the utility?

In the same way, there comes a point at which abstraction is a barrier to action – anyone who has faced down the pure, unadulterated data barf of an untouched Google Analytics account can attest to that!

Let’s try to emulate Aristotle in finding the golden mean between these poles – considering the final utility of a study or experiment first, then adjusting the abstraction accordingly.

Don’t Confuse Your Success for Customer Success

Data is awesome. Scientists have known this for a while, but now we have Big Data, which some go so far as to call a "natural resource" – and watching a site like Growth Hackers only confirms that we are more interested in our data, and what it can tell us, than ever before.

This is a cautionary post. I love Google Analytics. I take great pride in being a part of a data-informed company, and I think solid data analysis and the drawing of insights from that analysis has a place in any modern business.

That part we can all agree on. That part is easy.


What I want to distinguish here is the difference between your success as a company and the success of your customer. It is harder to focus on customer success when your data provides actionable insight that could trade their success for yours. I don't mean to preach about which one you should prefer: I'm a pragmatist, and I can appreciate that sometimes, to keep the doors open, you have to make compromises. I'd simply encourage you to be honest with yourself, and to recognize when you're acting for your customer and when you're acting for yourself.

Maybe some examples would help illustrate what I mean.

  • Pop-up ads were gone for a while – remember? But now they’re back. Visually disruptive ad campaigns are the easiest example in this category. They may lead to more clicks, to additional ad revenue, but they are clearly not leading your customers to success. They are on your site to engage with your content and your products – obscuring those things with an external (or internal!) ad is putting your success ahead of theirs, plainly.
  • Opt-out or cancellation buttons and screens that include passive-aggressive or semi-threatening language are becoming popular – "I don't want to maximize my income." "Leaving now may leave you at risk!" – these are, again, plainly putting a win for the company ahead of a win for the customer. You may minimize loss, but you're not only putting aside hospitality, you're being a bit of a bully.
  • A/B testing is a huge part of growth engineering and data collection: a button placed differently, a header image removed or altered, adjustments tested to see what converts and what leads to more traffic. Try to construct your A/B tests with customer success in mind. Their success is not usually tied as closely to conversions and page views as yours is – I can't tell you what their success looks like, but they sure can!
  • When defining your Goals in a tool like Google Analytics, the same sort of thinking applies: yes, knowing the path your customers take to the final purchase confirmation page is important, but it is also worth considering the (much larger) group that does not convert. Identifying where they drop off, and using a tool like Qualaroo to find out why they leave, would help focus on their success.
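To make that last point concrete, here is a small sketch of the drop-off computation for a purchase funnel. The step names and visitor counts are invented for illustration – they are not pulled from any real Analytics account.

```python
# Hypothetical funnel: (step name, visitors who reached it)
funnel = [
    ("Landing page", 10_000),
    ("Product page", 4_200),
    ("Cart", 900),
    ("Purchase confirmation", 310),
]

def drop_off(funnel):
    """Yield (step, visitors, fraction lost versus the previous step)."""
    prev = None
    for step, visitors in funnel:
        lost = 0.0 if prev is None else 1 - visitors / prev
        yield step, visitors, lost
        prev = visitors

for step, visitors, lost in drop_off(funnel):
    print(f"{step:<22} {visitors:>7,}  dropped: {lost:.0%}")
```

The biggest percentage drop marks where to point a survey tool like Qualaroo and ask the much larger non-converting group why they left – that conversation, not the conversion count, is where the customer-success insight comes from.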

Keep collecting data. Keep drawing actionable insights from it, but remember: the data doesn't tell the whole story. Additional conversions and decreasing customer churn may look great on a quarterly spreadsheet, but you need to dig deeper to see whether your customers are really getting the best experience they can have.