
In 3 words: timeliness, methodology, and digestibility

A few weeks ago, I wrote about building systems to generate more quality insights. I explained how you could increase your team's output by working on areas such as processes, tools, culture, etc., but I never defined what I meant by “quality”. So this week, we'll do a deep dive into this idea.
Often, when someone talks about quality with regard to a data study, we immediately jump to “ensuring the analysis is sound and the results are reliable”. I believe this is only one part of the definition. From my 8+ years of experience in analytics, for a data analysis to be “good work”, it needs to be a mix of three fundamental elements:
- It answers a real need with timely precision.
- It's underpinned by a robust, tried-and-tested methodology.
- It’s digestible by the organization.
Let’s dive in!
For a data analysis to be truly impactful, it's crucial that it targets a real, well-defined need. This means understanding exactly what problem is being addressed, identifying who it affects, recognizing why it's relevant at this specific moment, and being clear on how the results of the analysis can be concretely used. The precision of this understanding directly correlates with the value your analysis brings to your end users.
And it's critical to pinpoint an actual need, as opposed to a perceived one. This ensures that the analysis is not just theoretically useful but practically applicable. It will ensure that on the last day of the project, when you present it to your stakeholder, you don't get questions such as “so… what now?”. It makes the difference between providing insightful, actionable data and offering information that, while interesting, may not be immediately useful.
For example, a retail company might perceive a need to analyze customer demographics broadly, but the actual need could be understanding the purchasing patterns of a specific age group. The latter directly influences marketing strategies and inventory decisions, thereby having a more profound impact on business operations.
Equally important is the timeliness of the analysis. This aspect comprises two key elements: the relevance of the need at the present time, and the speed at which the analysis is provided.
- Relevance of the Need: The needs of companies are often time-sensitive and can evolve rapidly, especially if you are in a fast-paced organization. An analysis that addresses a current, pressing issue is far more valuable than one that arrives too late, or one that has been done too early. For instance, an analysis of consumer trends in the lead-up to a major holiday season can be invaluable for a business in terms of stocking and marketing, but if it comes after the season has begun, the opportunity is lost.
- Promptness of the Analysis: The speed at which the analysis is delivered is equally critical, as it feeds into the relevance of the need. And it is an important factor to consider, as you may sometimes have to make trade-offs between the thoroughness of the study and its speed (e.g. if there's a new trend on social media and your organization wants an analysis to capitalize on a viral topic, you can't take 2 months to come back with results).
In summary: the odds of success for your data analysis are significantly greater when it precisely identifies and addresses a real, current need, and when it's delivered in a timely fashion, ensuring maximum relevance and impact.
Way too often, I see data analyses that don't follow any standard methodology. And while this doesn't necessarily mean the study won't be good, you greatly reduce your chances of producing high-quality work by not following a proven methodology.
A structured, standardized approach ensures thoroughness and also enhances the credibility and replicability of the analysis.
One methodology that I find easy to follow is the CRoss Industry Standard Process for Data Mining (CRISP-DM) framework. After almost a decade in the field, it's still my go-to framework when starting an analysis from scratch. This framework, which is said to be the standard “data science” / “data analysis” process, has 6 main phases:
- Business Understanding: During this phase, the data analyst needs to be thorough in understanding the “business context” of the ask: what is the pain point we're trying to solve, what did we do previously, who are the “actors”, what are the risks, resources, etc., and, very importantly, what the success criteria for the project would be.
- Data Understanding: This phase involves getting acquainted with the data. It's about descriptive & exploratory analysis of the data, and the identification of data quality issues. It's your own “preliminary survey,” where you start to understand the nuances and potential of the data.
- Data Preparation: This phase is about selecting the data you want to work with, with the rationale for inclusion/exclusion, then cleaning and transforming the data into a format suitable for analysis. It's like preparing the ingredients before cooking a meal: essential for a good final result.
- Modeling: The idea of “modeling” can be daunting for some people, but a model can be as simple as “setting a certain threshold” for a true/false metric (for example, if your project is about understanding/defining churn; see the sketch after this list). During this phase, various modeling techniques are applied to the prepared data, so that you can benchmark them against one another and understand which ones are the most successful.
- Evaluation: The models are now critically assessed to make sure they meet the business objectives and the success criteria that were set in phase #1. This often leads to insights you can use to circle back and revise your business understanding.
- Deployment: The final phase involves applying the model to real-world data and situations, effectively putting the analysis into action, and starting to use the insights to improve the operations of the team.
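To make the middle phases a bit more concrete, here is a minimal Python sketch (using pandas) of what Data Understanding, Data Preparation, Modeling, and Evaluation could look like for a hypothetical “define churn” project. The toy data, column names, and thresholds are illustrative assumptions, not a prescription; in a real project, the ground truth and success criteria would come from the Business Understanding phase.

```python
# Minimal sketch of CRISP-DM phases 2-5 for a hypothetical "define churn" project.
# All column names, thresholds, and the toy data are illustrative assumptions.
import pandas as pd

# --- Data Understanding: get acquainted with the data and spot quality issues ---
users = pd.DataFrame({
    "user_id": [1, 2, 3, 4, 5, 6],
    "days_since_last_session": [2, 45, None, 90, 10, 60],
    "purchased_next_30d": [True, False, True, False, True, False],  # hypothetical ground truth
})
print(users.describe(include="all"))
print("Missing values per column:")
print(users.isna().sum())

# --- Data Preparation: select the rows to keep (documenting why) and clean them ---
prepared = users.dropna(subset=["days_since_last_session"])

# --- Modeling: a "model" can be as simple as a threshold on inactivity ---
candidate_thresholds = [30, 60]  # days of inactivity before we call a user "churned"

# --- Evaluation: benchmark each candidate against the success criteria from phase 1 ---
for threshold in candidate_thresholds:
    predicted_churn = prepared["days_since_last_session"] > threshold
    actual_churn = ~prepared["purchased_next_30d"]
    agreement = (predicted_churn == actual_churn).mean()
    print(f"Threshold of {threshold} days -> agreement with ground truth: {agreement:.0%}")
```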
This framework increases the odds that your analysis is robust by forcing you to go through these different steps, while leaving room for creativity.
Digestibility just isn’t nearly simplifying complex information and making your slide deck easier to know. It involves two integral points: (1) fostering a deep level of comprehension from the audience, and (2) enabling them to use these insights in a practical, impactful manner. This process is akin to how the body not only breaks down food but additionally utilizes the nutrients to fuel various functions.
Fostering a Deep Level of Comprehension from the Audience
Achieving this requires making the information accessible and ensuring it resonates with the audience. This is where subject matter experts (SMEs) play a crucial role. By involving SMEs early in the analysis process, their domain knowledge can guide the framing and interpretation of the data, ensuring that the analysis aligns with real-world contexts and is presented in a way that is meaningful to the intended audience.
Another key strategy to boost digestibility is the implementation of a ‘stage-gate’ process, involving regular check-ins and updates with the stakeholder or receiving team. This approach avoids overwhelming them with a bulk of complex information at the end of the study. Instead, stakeholders are brought along on the journey, allowing them to assimilate new insights progressively. It also opens avenues for continuous feedback, ensuring that the analysis stays aligned with the evolving needs and expectations of the audience.
Imagine you are in a large organization implementing a new data-driven strategy. If the data team only presents the final analysis without prior engagement, stakeholders may find it difficult to grasp the nuances or see its relevance to their specific contexts. However, by involving these stakeholders at regular intervals, through periodic presentations or workshops, they become more familiar with the data and its implications. They can offer valuable feedback, steering the analysis towards the areas most pertinent to them, thus ensuring that the final output is not just comprehensible but immediately actionable and tailored to their needs.
Enabling The Audience to Apply the Insights
Actionability revolves around translating this deep comprehension into real-world applications or decisions. It's about ensuring that the audience can effectively use the insights to drive tangible results. It's about really thinking about the “last mile” between your analysis and real-life impact, and how you can help remove any friction to adopting your insights.
For example, if you are working on a project where your goal is to define user churn, making your study more digestible might include creating a dashboard that allows your business stakeholders to see concretely what your results look like.
Other ideas include running workshops, developing interactive visualizations, etc.: anything to make it easier for the team to hit the ground running.
In summary: the digestibility of a data analysis project is significantly enhanced by involving SMEs from the outset and maintaining ongoing communication with stakeholders. This collaborative approach ensures that the study is not only comprehensible but also directly relevant and valuable to those it is meant to benefit.
Successful data analysis is an amalgamation of technical proficiency, strategic alignment, and practical applicability. It's not just about following a set of steps, but about understanding and adapting those steps to the unique context of each project. Timeliness, proper execution, and addressing real organizational needs are the pillars that support the bridge connecting data analysis with organizational success. The ultimate goal is to transform data into actionable insights that drive value and inform strategic decision-making.
Hope you enjoyed reading this piece! Do you have any suggestions you'd like to share? Let everyone know in the comment section!
PS: This article was cross-posted to Analytics Explained, a newsletter where I distill what I learned in various analytics roles (from Singaporean startups to SF big tech), and answer reader questions about analytics, growth, and careers.