Great article, but I think it somewhat misunderstands the impetus for the concept. "Data has its place" sounds obvious precisely because "data-driven" has been such a successful concept. The alternative perspective, which used to be very common in our industry and still pops up from time to time, is that metrics are something you write for debugging and business decisions are made by gut feeling or abstract philosophical analysis. (Most software companies had to make decisions this way in the pre-cloud era, because it wasn't usually feasible to collect usage metrics.)
I worked at a self-described "data-driven" company, and the analogy senior leadership liked to make was that the company was like a machine learning algorithm, using data (particularly A/B tests) to do "gradient descent" of the product into its optimal form.
My first take-away was that using data to make decisions is tremendously, tremendously powerful. A/B tests, in particular, can help determine causality and drive any key metric you want in the direction you want to. Short-term, it seems to work great.
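To make the A/B readout concrete: a typical "did the variant win?" decision boils down to a two-proportion z-test. This is a minimal sketch, not anyone's actual pipeline, and the conversion counts are invented for illustration:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF (math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: control converts 500/10000, variant 600/10000
z, p = two_proportion_ztest(500, 10_000, 600, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 reads as "ship it"
```

That last line is exactly the short-term power (and temptation) described above: a clean, seemingly objective green light for whichever metric you chose to test.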
Long-term, it fails. Be purely data-driven, without good intuition and long-term bets (the kind that can't be "proven" with data), and the product loses its soul. You can (and should) invest in metrics that are more indicative of the long term. And you should use data to help guide and improve your intuition.
But data is not a substitute for good judgment, or for a deep understanding of your users and their problems, or of "where the puck is going". It's just a tool. It's a very powerful tool, but if it's your main or only tool, you will lose.
Totally agreed that data is needed if you want to act. But metrics should serve as top-level benchmarks that tell you whether things are going well. If they aren't, investigating why that is the case can only be done using the data.
Being data-driven for the sake of being data-driven is indeed becoming an issue. The resources spent measuring and analysing data are overwhelmingly larger than they should be in most cases. Cohorts of "data scientists" and "managers" dive headlong into data without much (if any!) first-principles thinking. People tend to replicate metrics without much thought about their relevance to the specific situation. Thinking properly is a very hard skill to acquire (the hardest?), and most do everything they can to avoid it.
"What you measure affects what you do. If you don't measure the right thing, you don't do the right thing." -- Joseph Stiglitz
Does anyone actually buy data driven? Seems like a great methodology to adopt if you need to absolve yourself of responsibility for your failures. Most questions people try to apply “data driven” to are so ridiculously complex I’d need to see some incredible methodology and ground-breaking understanding of human behavior to put any faith in them. There are just too many unknowns and confounds.
Data driven is great when you’re monitoring computer performance, but that’s a domain humans have nearly built from the ground up. And even then it can still be very hard to utilize that data. Trying to apply the same to systems we barely understand seems fraught with error.
Being data-driven is detrimental when it replaces common-sense (talking to users, collecting feedback, improving based on feedback).
Before the internet (and being able to track every single action), successful companies were built. It can be done. Using data to drive decisions has some value, but it's not the end-all solution, it's merely a piece of the puzzle.
That is a much better approach. It incorporates the humility you need to have with this type of data. That system works because it does not assume or expect accuracy; it works around the problem in a different way.
So, I guess my original sentiment is more like: "data-driven" is not a good thing on its own, because you need more than just a bunch of data, you need analytics.
It may seem more semantic than anything else, but I think there are some real differences in how people (especially managers) perceive "data-driven" vs. some other term like "analytics-driven". "Data is not magic" should be a catch phrase spread far and wide among the non-technical business world.
>It's disturbing how often success is measured by optimizing a single metric, and how resistant people can be to recognizing issues with this approach.
This is very true. The zeal for data-driven approaches sometimes reminds me of "craniometry", where people tried to gauge intelligence by measuring the shape of one's skull.
The trade-off of using quantitative methods is always that you might lose too much meaning. The good thing about data-driven approaches is that they are transparent and enable objective decision-making, but people need to pay close attention and stay alert that whatever it is they are measuring still has some qualitative justification.
I'm curious if one of the 'myths' of a data driven company is that you can instantly begin making decisions fed by 'real-time data' and learn after-the-fact from feedback loops. But for many legacy businesses the data pipeline for their important KPIs still moves slowly.
And then the data that does come in quickly becomes over-valued because everyone was sold the idea of instant gratification. So there is pressure to react to things quickly like meaningless web traffic metrics or local sales data - which may fluctuate heavily on a daily basis - instead of waiting for relevant patterns to emerge over longer periods.
Statistical significance and error rates are then overlooked in the name of a cargo cult data culture.
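One concrete way the overlooked error rates bite: if a team eyeballs many metrics or slices at the usual 5% threshold, the chance of at least one spurious "win" grows quickly. A toy calculation (the threshold and metric counts are my own illustrative choices, assuming independent tests):

```python
# Probability of at least one false positive when checking k independent
# metrics, each at a 5% significance level (family-wise error rate).
alpha = 0.05
for k in (1, 5, 20):
    p_any = 1 - (1 - alpha) ** k
    print(f"{k:>2} metrics -> {p_any:.0%} chance of a spurious 'win'")
```

With 20 metrics the odds of a meaningless "significant" result are roughly two in three, which is why daily dashboard-watching without multiple-comparison corrections produces so much noise-chasing.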
This is why business books can be dangerous or even destructive, as business advice from one person's experience is sold as generic design patterns that apply to every business - which isn't the case. This is why understanding the business inside-and-out is the most important attribute, then having MBA-esque skills/toolset is useful second. So you take the reality of the business into full consideration and apply tools to it, rather than seek out tools and pigeonhole your business into them.
People frequently overestimate the role of data in decision-making. Metrics, numbers and other quantitative information don't tell the full story. For a CEO to make decisions, the full picture must include qualitative information - risks, opportunities, market events, competitors' actions, etc. Metrics are just part of the full picture. Far less significant than many BI developers and "data teams" tend to think.
>"Data should not dictate your strategy," Nguyen says, "But you should understand what data tells you and also what its limits are."
Listened to this last night and found it revelatory. All the 'data driven' talk misses the point. Yeah, sure, data is nice and should inform decisions, but at the end of the day, ethical, good business will drive the data.
Totally agree. Data is not a substitute for good judgment, or for deeply understanding your users and their pain points or your industry. You can use data to guide decisions but as soon as you try to rely on it completely, the part of your brain that does critical reasoning just turns off.
Agreed! It all probably started with big tech promoting the “data-driven decision” paradigm. Of course, in many cases this approach is effective, but it’s not a panacea and has its limits. It’s tempting to interpret availability of any data as an amazing untapped resource, but in many cases analyzing it is just a massive waste of resources and could be more effectively replaced with traditional tools (surveys and such).
I liked this article and agree with the point - when it comes to innovation, new products, and new markets, instinct and experience could probably help more than data.
Using data for decision making is more relevant for more mature processes, where the data exists to learn from. Being data-driven is especially important for data-intensive sectors where the complexity of issues cannot be fathomed just by being brilliant; it's there that you need to be able to analyse the intricacies using the data to find possible answers.
I'd like to see a solid argument for a new style of management that uses statistics in new ways. For the most part, my experience with the phrase "data driven" has been a negative one. Most of the time, when I have a client that claims to be "data driven", they are using a style of argument to avoid direct, honest conversations. When I wrote "When companies make a fetish of being data driven they reward a passive aggressive style" I did my best to explain what I've seen:
"As far as I know, there has never been a company that said “We want the worst informed people to make the decisions” so in a sense all companies have always valued data. But they didn’t make a fetish out of it. They simply expected people to be well informed, and to make intelligent arguments, based on what they know. That would have been true at General Motors in 1950. That much has probably been true at most companies for centuries. When management says that the company is going to be “data driven” they are implicitly asking for a particular type of interaction to happen in meetings, an elaborate dance where people hide their emotions and quote statistics."
My perception might be skewed, of course. In the broad range of practical areas I worked in I rarely came across any data-driven papers. Mind giving some examples?
I think data driven is orthogonal to opportunity driven and vision driven. With the first, data is used to find opportunities; with the second, it shows you whether your strategy toward your vision is working or not.
Then there is politics driven development: you want to do something and search the trove of data for data that supports what you are doing. Or you look at the data in a strategy meeting and then ignore it (I've seen this happen mostly in board meetings of large companies).
Sometimes you've got a philosophical, gut, or value-driven idea about how something should be that disagrees with an interpretation of data. In an increasingly developer-focused startup environment, people choose the most easily quantifiable metric to make decisions. At the same time, it's hard to ever claim you have enough data or the right data to make some decisions. I've struggled with the balance between Vision and Data as drivers for decision-making. Sometimes I feel like people, for lack of a better word, resort to data to make decisions. How have you dealt with this in the past and how would you in the future?
I've heard of this happening in a lot of places — companies want to be "data-driven", but then leadership simply ignores the data. I think being data-driven has to be built into company culture; otherwise it's too easy to just ignore the results and ship.
The place I currently work is data-driven (perhaps to a fault). Every change is wrapped behind an experiment and analyzed. Engineers play a major role in this process (responsible for analysis of simple experiments), whereas the data org owns more thorough, long-term analysis. This means there are a significant number of people invested in making numbers go up. It also means we're very good at finding local maxima, but struggle to ship larger changes that land somewhere else on the graph.
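The local-maximum failure mode can be shown with a toy greedy optimizer: accept only small changes that improve the metric (like shipping only winning A/B variants), and you climb the nearest hill, never the tallest one. The metric landscape here is invented purely for illustration:

```python
import math

def metric(x):
    # Toy landscape: a local peak near x=1 (height ~1) and a higher
    # global peak near x=4 (height ~2), with a valley in between.
    return math.exp(-(x - 1) ** 2) + 2 * math.exp(-(x - 4) ** 2)

def greedy_climb(x, step=0.1):
    """Accept a small change only if the metric improves --
    the experiment-gated shipping process in miniature."""
    while True:
        best = max((x - step, x, x + step), key=metric)
        if best == x:
            return x
        x = best

x_near = greedy_climb(0.0)   # starts near the small hill, stalls at x ~= 1
x_far = greedy_climb(5.0)    # starts near the big hill, reaches x ~= 4
```

Reaching the higher peak from the left requires first walking downhill through the valley, i.e. shipping a change that hurts metrics for a while, which is exactly what a strictly experiment-gated process refuses to do.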
Some of the best advice I've heard related to this is for leadership to be honest about the "why". Sometimes we just want to ship a redesign to eventually find a new maximum, even though we know it will hurt metrics for a while.