Tag: big data

Small Data: A Case Study



Big Data is a Big Thing, an idea that often goes hand in hand with words like “Enterprise” and “scientist.” Today I’d like to share a story from my past to illustrate that data, experimentation, and testing are entirely accessible to business owners of all flavors and sizes, not just massive corporations with a dedicated team of growth hackers, data scientists, and an in-house barista.

Two jobs before Automattic, I worked for a small chain of artisan bakeries in Providence, Rhode Island, called Seven Stars. There are three locations (a very small chain), and it is owned by a lovely couple who brought me on to design and execute an improved employee training system. Once that was up and running under its own steam (after about 18 months), I became a bit more of a general utility player for them – finding problems and then solving them. It took great trust on their part, but I like to think I earned that trust through efficiency gains, improved revenues, and tastier coffee.

During a conversation with one of the owners, he mentioned that he had a real gripe with muffins – not only were they one of the more involved pastries that we sold, they also had the slimmest margins. A situation fraught with possibility. I asked him a few more questions, and headed back to my shared office to dig through some of our historical point-of-sale data. I didn’t know it at the time, but what I was about to embark on was the retail bakery version of growth hacking.

At the time, we offered three different muffins every day, with the selection rotating from day to day – Blueberry, Corn, and Pumpkin, say, on Monday; Chocolate, Bran, and Blueberry on Tuesday; and so on.

After establishing a baseline (easily done with today’s computerized point-of-sale systems), I proposed an experiment: we would produce only two kinds of muffins per day, and only the ones with the strongest current sales. We’d do this for six weeks, then take a look at the data and decide from there – or, as I’d say today, we would then iterate on the process.

And, thankfully, since this is a case study, it worked! After six weeks, the sales of each store had retained their pre-experiment growth rate. Now, this may not sound like a success – sales growth had not changed? How can an experiment be a success if sales growth did not improve?

Sales growth may not have changed (up or down), but the numbers behind the sales growth had shifted; muffin sales fell significantly, but other areas (specifically scones, which, interestingly, sat next to the muffins in the display) grew to match the decrease.
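For a shop with a computerized point-of-sale system, the underlying comparison is simple enough to sketch in a few lines. Here is a minimal version in Python – the categories and weekly dollar figures are invented for illustration, not Seven Stars’ actual numbers:

```python
# Hypothetical weekly sales (in dollars) per category, before and after the
# change. Real numbers would come from point-of-sale exports.
before = {"muffins": 1200, "scones": 900, "coffee": 2400}
after = {"muffins": 800, "scones": 1310, "coffee": 2410}

total_before = sum(before.values())
total_after = sum(after.values())

# Overall sales barely move...
print(f"Total: {total_before} -> {total_after} "
      f"({(total_after - total_before) / total_before:+.1%})")

# ...but the per-category breakdown shows where the dollars shifted.
for item in before:
    change = after[item] - before[item]
    print(f"{item:>8}: {before[item]:>5} -> {after[item]:>5} ({change:+d})")
```

With these made-up numbers, the total is nearly flat while muffins drop and scones pick up the slack – exactly the shape of result described above.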

If I were to guess, I would suggest that folks who had been buying a muffin (perhaps the third, dropped, variety) were not simply abandoning their purchase, but rather buying another item, possibly even at the same price point. And since muffins were our worst item revenue-wise, anything else represented greater revenue for the bakery. Moving bakery labor from muffins to another product was a second win, since muffins were also our most laborious and frustrating product.

I like to think of this kind of data implementation as Small Data – using the information that you have to run experiments that are within your grasp for small, consistent wins. You don’t need a data scientist on staff, you don’t need a degree in statistics, you just need to know your business and have a curious mind. Data can work for everyone – all you need is a willingness to experiment.



Don’t Confuse Your Success for Customer Success

Data is awesome. Scientists have known this for a while, but today we have Big Data, which some are going so far as to call a “natural resource” – watching a site like Growth Hackers only confirms that we are more interested in our data, and what it can tell us, than ever before.

This is a cautionary post. I love Google Analytics. I take great pride in being a part of a data-informed company, and I think solid data analysis and the drawing of insights from that analysis has a place in any modern business.

That part we can all agree on. That part is easy.


What I want to distinguish here is the difference between your success as a company and the success of your customer. It is harder to focus on customer success when your data provides actionable insight that could trade their success for yours. I don’t mean to preach about which one you should prefer: I’m a pragmatist, and I can appreciate that sometimes you have to make compromises to keep the doors open. I’d encourage you to be honest with yourself, and simply recognize when you’re acting for your customer and when you’re acting for yourself.

Maybe some examples would help illustrate what I mean.

  • Pop-up ads were gone for a while – remember? But now they’re back. Visually disruptive ad campaigns are the easiest example in this category. They may lead to more clicks and additional ad revenue, but they are clearly not leading your customers to success. Your visitors are on your site to engage with your content and your products – obscuring those things with an external (or internal!) ad is plainly putting your success ahead of theirs.
  • Opt-out or cancellation buttons and screens that include passive-aggressive or semi-threatening language are becoming popular – “I don’t want to maximize my income.” “Leaving now may leave you at risk!” – these are, again, plainly putting a win for the company ahead of a win for the customer. You may minimize losses, but you’re not only putting aside hospitality, you’re being a bit of a bully.
  • A/B testing is a huge part of growth engineering and data collection: a button placed differently, a header image removed or altered – testing adjustments to see what converts and what leads to more traffic. Try to construct your A/B tests with customer success in mind. Their success is not usually tied as closely to conversions and page views – I can’t tell you what their success looks like, but they sure can!
  • When defining your Goals in a tool like Google Analytics, the same sort of thinking applies: yes, knowing the path your customers take to the final purchase confirmation page is important, but it is also worth considering the (much larger) group that does not convert. Identifying where they drop off, and using a tool like Qualaroo to find out why they leave, would help you focus on their success.
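The drop-off analysis in that last point boils down to comparing visitor counts at each step of a funnel. A minimal sketch in Python – the step names and counts are entirely hypothetical, not from any real analytics export:

```python
# Hypothetical funnel: how many visitors reached each step.
# These names and numbers are invented for illustration.
funnel = [
    ("landing", 10_000),
    ("product page", 4_200),
    ("cart", 1_100),
    ("purchase confirmation", 310),
]

# Compare each step with the next to see where people leave.
for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    dropped = count - next_count
    print(f"{step} -> {next_step}: {dropped} drop off ({dropped / count:.0%})")
```

The step with the steepest percentage drop is where a follow-up survey tool (Qualaroo, in the example above) earns its keep: the data tells you where people leave, and the survey tells you why.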

Keep collecting data. Keep drawing actionable insights from it, but remember: the data doesn’t tell the whole story. Additional conversions and decreasing customer churn may look great on a quarterly spreadsheet, but you need to dig deeper to see whether they are really giving your customers the best experience they can have.