13 questions to deliver value with data

#4 Once you see it, then what?

Let’s be real — data’s supposed to help us make decisions, but most of the time it’s all just a big mess. I know, I create that mess too. And I bet you’ve spent way too many hours staring at numbers, wondering, “What the hell does that mean?” You check the customer churn rate and have no idea if it includes cancellations, inactivity, or is just plain wrong. Or you see a spike in website traffic and can’t tell if it’s that marketing campaign you worked so hard on, bots, or just bad data. It’s easy to get lost in all of that. But hey, what if we stopped and actually figured out what we really need to know?

At heart, we humans are great at asking tons of questions. I’ll stop at 13 today — though I know we could ask many more. I just have a thing for the number 13. It’s bad luck for some, but I’d say it’s bad luck if you don’t go through them.

1. Who produces and/or owns the data?

It all starts with a very boring question, I’m sorry. But if you get this wrong, all the following questions stop being relevant. Your Data Engineers and Data Analysts are just burning cash if the data sources rely on manual inputs, lack a data catalog, have no documentation, no ownership, and no accountability.

2. How is the data validated?

Are your data sources aligned? Are you drowning in duplicates? Next thing you know, you’re chasing different versions of the same numbers. It’s the classic “Why are the numbers different?” game, and suddenly you’ve got three teams digging into it. Again: money, time, and opportunity cost. Still boring, but beginnings are like that.
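Much of that "why are the numbers different?" chasing can be cut short with a few automated sanity checks. A minimal sketch in plain Python, using hypothetical record fields `customer_id` and `churn_date` (real pipelines would use a dedicated tool, but the idea is the same):

```python
from collections import Counter

def validate(rows):
    """Return a list of human-readable data-quality issues."""
    issues = []
    # Duplicate keys are the classic source of diverging numbers
    counts = Counter(r["customer_id"] for r in rows)
    dupes = sum(c - 1 for c in counts.values() if c > 1)
    if dupes:
        issues.append(f"{dupes} duplicate customer_id rows")
    # Missing values silently skew any rate computed downstream
    missing = sum(1 for r in rows if r.get("churn_date") is None)
    if missing:
        issues.append(f"{missing} rows missing churn_date")
    return issues

rows = [
    {"customer_id": 1, "churn_date": "2024-01-01"},
    {"customer_id": 2, "churn_date": None},
    {"customer_id": 2, "churn_date": "2024-02-01"},
    {"customer_id": 3, "churn_date": "2024-03-01"},
]
print(validate(rows))  # ['1 duplicate customer_id rows', '1 rows missing churn_date']
```

Run this before anyone sees the dashboard, and the three-team investigation becomes a one-line warning.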

3. Are we driving a decision, raising awareness, or just… collecting data?

This is a mystery not just for data teams but for stakeholders too. If the goal is to drive a decision, the analysis has to be laser-focused on the next action. If it’s about raising awareness, it’s more about storytelling and trends. Over-communication is key here — without it, the domino effect kicks in.

4. Once we see the data, then what?

People rarely stop to ask: “What if this number is 100 instead of 10? How does that change my decision?” Instead, it’s “Give me all the data so I can see it.” And then… nothing happens. It’s cognitively expensive to fully evaluate the implications, but skipping this step becomes “company expensive.” And it’s rarely just one person indulging that curiosity: the bigger the company, the more curiosity there is. Multiply it across teams and there’s your bill.
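One cheap way to force that evaluation is to write the decision rule down before pulling the number. A toy sketch with entirely made-up thresholds and actions:

```python
def next_action(weekly_signups: int) -> str:
    """Pre-committed decision rule, agreed on *before* seeing the data."""
    if weekly_signups < 10:
        return "pause paid ads and investigate the funnel"
    if weekly_signups < 100:
        return "keep current spend"
    return "scale the campaign"

print(next_action(10))   # keep current spend
print(next_action(100))  # scale the campaign
```

If no branch of the rule changes what you’d do, the request was curiosity, not a decision.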

5. What are the risks of not doing it?

It can be an Occam’s Razor case — sometimes the best course of action is to avoid unnecessary intervention. Before rushing to put out the fire, ask whether it actually is a fire or just smoke. Sometimes, doing nothing is the smartest move.

6. What are the stakeholders’ expectations?

Success needs a definition. Is it cost savings? Efficiency? Making someone’s life easier? Data has a way of making people expect magic. Before you carry that weight, manage expectations first. Knowing what success looks like helps prevent overpromising and underdelivering.

7. How much confidence in the data do we need — 95%, 80%, 60%, …?

If you’re changing your lead strategy based on churn rate, you probably want 95% confidence. If you’re just testing if people click a button? 60% might do. It’s not about eliminating uncertainty — it’s about knowing when it’s acceptable.
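Those confidence levels map to concrete interval widths. A quick sketch of a normal-approximation confidence interval for a rate, standard library only (z ≈ 1.96 gives roughly 95% confidence, z ≈ 1.28 roughly 80%):

```python
import math

def proportion_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation confidence interval for a proportion.

    z = 1.96 -> ~95% confidence, z = 1.28 -> ~80%.
    """
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return (p - half, p + half)

# 120 churned customers out of 1,000: churn is 12% +/- ~2 points at 95%
low, high = proportion_ci(120, 1000)
print(round(low, 3), round(high, 3))  # 0.1 0.14
```

If a decision flips anywhere inside that 10–14% band, you need more data; if it doesn’t, you already have your answer.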

8. Do multiple teams consume the data?

Different teams, different languages. A Managing Director and a Product Manager will use the same numbers very differently. Keep that in mind when communicating insights, not only because of the context, but because of the range and scale of the decisions they make.

9. What is the data literacy level?

This is a global weakness. Most people cannot keep up, and few are interested in learning. When building an analysis, assess the level of detail required. Someone unfamiliar with the topic will need far more context than someone already deep in it.

10. What is the decision-making cadence?

If it’s for a quarterly strategic decision, go deep. If it’s operational and needs to run daily, do we really need to overcomplicate it? Maybe a manager just needs to know if the website’s load time has spiked, not a full user journey analysis.
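For that operational case, the whole "analysis" can be a threshold check. A sketch of a load-time spike detector over recent measurements (hypothetical numbers, in milliseconds):

```python
from statistics import mean, stdev

def is_spike(history, latest, z_threshold=3.0):
    """Flag `latest` if it sits more than z_threshold std devs above the recent mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest > mu
    return (latest - mu) / sigma > z_threshold

recent = [210, 195, 205, 200, 190]  # recent page load times in ms
print(is_spike(recent, 480))  # True
print(is_spike(recent, 205))  # False
```

Pipe the boolean into whatever channel the manager actually reads, and the daily question is answered without a single dashboard.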

11. What level of granularity is needed?

Does someone need a one-pager or the ability to slice and dice across 20 dimensions? Should it be a dashboard, a report, or just an automated Slack message? People don’t just have different problems — they have different attitudes about consuming data. It’s an objective world within a subjective world, gotta deal with that.

12. Can we recycle and automate this?

The number of times I’ve reused the same table or query is ridiculous. Automating those processes saves time, reduces errors, and speeds up decision-making. Call it self-service, call it becoming data-literate — but do it.
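The same reuse applies to queries: wrap them once with parameters instead of re-pasting SQL with hand-edited dates. A sketch against an in-memory SQLite table, with a hypothetical `customers` schema:

```python
import sqlite3

def monthly_churn(conn, month: str) -> int:
    """Count customers whose churn_date falls in the given 'YYYY-MM' month."""
    sql = """
        SELECT COUNT(*) FROM customers
        WHERE strftime('%Y-%m', churn_date) = ?
    """
    return conn.execute(sql, (month,)).fetchone()[0]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (customer_id INTEGER, churn_date TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "2024-01-15"), (2, "2024-01-20"), (3, "2024-02-03")],
)
print(monthly_churn(conn, "2024-01"))  # 2
```

Once it’s a function, scheduling it is trivial, and nobody re-derives the churn definition from scratch every month.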

13. What’s next?

After all this, did the needle move or did nothing happen? If stakeholders had all the data and still made no decision, what went wrong? Wrong format? No urgency? Bad timing? Just politics? It’s not just on stakeholders — data teams need to reflect and adjust too.

More data ≠ higher value

Data is only as useful as the questions we ask. Too often, we focus on collecting more when we should be focusing on collecting better. Go back to the questions. Ask yourself what actually matters, what decision needs to be made, and whether the data in front of you helps or just distracts.
