
A couple of simple frameworks to pinpoint your organization’s analytical needs and learn how to make it more data-driven

Understanding your organization’s analytical maturity can give you a powerful edge as a data professional. It will make your “non-analytical” decisions better informed (from project prioritization to how to present your findings) and help you formulate a long-term goal. And that’s truly an edge: not a lot of data professionals take this step back to design long-term goals (and even fewer deliver against those long-term goals).
This article is split into 3 parts:
- Part 1: Understanding Analytical Maturity
- Part 2: Moving stages
- Part 3: What’s a “mature” organization
Let’s dive in!
Any organization (team, product, company, etc.) at a given time is at a certain stage of analytical maturity. Similar to humans who “crawl, walk, and then run”, organizations go through the same stages. It’s one of those immutable laws of nature: everything goes through creation, development, and maturity.
There are a few interesting frameworks¹ on analytical maturity, with different components and emphases. From personal experience, I found that assessing an organization through the following 4 components is the most useful and actionable approach:
- Its needs: Roger D. Peng and Elizabeth Matsui wrote in “The Art of Data Science” that there are 6 main types of questions: Descriptive, Exploratory, Inferential, Predictive, Causal, and Mechanistic. The type of questions you are being asked is usually a great indicator of your org’s level of maturity: a low-maturity org will mostly be interested in descriptive and exploratory data studies, whereas an advanced-maturity org will ask more predictive and causal questions.
- Its people: another key component of analytical maturity is the people, both in terms of capabilities and capacity: how many data resources does the organization have, and how strong are their skills?
- Its tools &amp; processes: are there standardized tools for the data professionals? Are there standardized processes (e.g. prioritization, templates, etc.) for the data team?
- Its culture: what’s the split between intuition and data in decision-making?
Depending on how your organization scores on each of these components, it will fall into one of these 3 stages:
- The launch stage: At this stage, the organization needs basic reporting to understand what has already happened (“hindsight”). There is no central data team; you might not even have data analysts. Data studies are done by a few data-savvy operators on top of their 9-to-5 jobs. There is also no tooling, no process, and no clear agreement on which lens should be used when studying a particular phenomenon. This leads to a lot of noise (e.g. multiple teams with different churn definitions that lead to disagreement down the road). On the cultural side, while everyone agrees that data should inform the decision-making process, because of the lack of data (or mistrust in the data) a lot of decisions are made via “informed gut feeling”.
- The development stage: The organization has good visibility into its market and some of the key metrics it should be tracking. Now it needs to understand why things are evolving in a certain way (“insight”). Teams start being supported by data professionals (either embedded within them or within a centralized data team). The data infrastructure slowly shifts from Google Spreadsheets to more robust tooling. To triage and prioritize all the data requests, the first few data professionals establish basic prioritization principles and a ticketing system (e.g. a Google Form). Common lenses are adopted across teams, and as a result, data is increasingly relied upon for decision-making. Non-data professionals get smarter about which data questions to ask, and with the new tooling they can start looking at the data themselves.
- The maturity stage: The organization understands why things are moving in a certain way and can now predict and influence future changes (“foresight”). Centralized data teams form and act as proactive thought partners (vs the “reactive support” of the previous stage). Tools, processes, and metrics are standardized. Data is expected in every decision-making process.
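To make the scoring idea concrete, here is a small illustrative sketch. The component names, the 1-to-3 scale, and the thresholds are my own assumptions for the example, not part of any formal maturity model:

```python
# Illustrative only: score each of the 4 components from 1 (launch-like)
# to 3 (maturity-like) and map the average score to a stage label.
def maturity_stage(scores: dict) -> str:
    """Map 1-3 scores for the 4 components to a maturity stage."""
    expected = {"needs", "people", "tools_and_processes", "culture"}
    if set(scores) != expected:
        raise ValueError(f"expected scores for {sorted(expected)}")
    avg = sum(scores.values()) / len(scores)
    # Thresholds are arbitrary cut-offs chosen for this example.
    if avg < 1.7:
        return "launch"
    if avg < 2.4:
        return "development"
    return "maturity"

print(maturity_stage({"needs": 1, "people": 2, "tools_and_processes": 1, "culture": 1}))
```

In practice you would score each component from interviews and surveys rather than gut feel, but writing the rubric down is what forces the conversation.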
The image above is a simplification of real life. In reality, organizations can score very differently on each component, but you get the gist of it. The beauty of this framework is that:
- It gives you a structured way to identify the critical factors hindering your organization’s analytical growth.
- It allows you to pinpoint where your org is in its journey, and what’s next for it.
That’s really why you get a powerful edge when you know how to use this framework: it gives you a way to understand where you are and where you could be, and to diagnose why you are not there yet. Your job then is “only” to set a strategy to remove the roadblocks, which is exactly what we’ll see in the next part.
Richard Rumelt wrote in “Good Strategy Bad Strategy”: “The core of strategy work is always the same: discovering the critical factors in a situation and designing a way of coordinating and focusing actions to deal with those factors”.
That also holds when you want to move your organization’s analytical maturity: you need to pinpoint the critical factors that will let you move to the next stage and design a plan to get there. The framework we saw above (the one that breaks analytical maturity down into 4 components: the needs of the organization, its data resources, its processes &amp; tools, and its data culture) can help you pinpoint the gaps in your organization. But pinpointing is only 20% of the job; let’s discuss the remaining 80%.
Good strategy, bad strategy framework
I love Richard Rumelt’s book, and I think it gives an outstanding framework to think about this. He explains that a good strategy has 3 elements:
- A diagnosis: the most important part of the framework is the diagnosis; it’s the base of your entire logical approach. Your diagnosis should let you understand the current situation, but also the cause, the “why” behind the organization being there.
- Some guiding policies: from this diagnosis, you can derive guiding policies that, once you’re on your journey to grow analytical maturity, will make your decision-making process easier and help you stay on track over time.
- A coherent action plan following the above: armed with your diagnosis and your guiding policies, your main task is to determine where you want to be, under what timeline, and how you’re going to get there.
Starting with a diagnosis
“A well-stated problem is half-solved” John Dewey
The idea is to understand the current situation and the real “why” behind it. You don’t want to tackle symptoms: your goal is to go all the way to the root cause and fix what needs to be fixed.
Here are a few tips on how to do a good diagnosis:
- Start from the 4 dimensions we saw previously (needs / people / tools &amp; processes / culture): assess your organization using this lens and get to the root cause in each of these areas.
- Get data on current pain points and solutions:
- Interview people: get to know people, their jobs, their decision-making process, and how they use data in their day-to-day job
- Shadow people: similarly, shadowing people can be a great way to get a deeper understanding of their day-to-day jobs, and can uncover insights that you wouldn’t get if you only interviewed them
- Send a survey: depending on the size of your organization, sending a survey can help you get more quantitative data. Bonus: it also lets you start tracking your org’s sentiment toward “analytics”, giving you a benchmark you can report against later on.
- Do a “literature review”, both internally (review previous work and understand how people tried to solve earlier pain points, whether they succeeded or not, and why) and externally (a lot of content is available for free on the internet, and chances are the problems you’re thinking about have been documented and discussed before, either in a nice article on HBR or on an obscure forum for analytics aficionados). It’s always extremely helpful to get other people’s perspectives on how to solve different problems.
- Practice the 5 “whys”: ask yourself why each time you uncover a new insight. You want to take a bird’s-eye view of things and understand the key reasons for the situation the organization is in. Note that this isn’t necessarily an easy task, especially if you have been inside the company for a long time and are used to things the way they are.
Deriving guiding policies
“Everyone has a plan until they get punched in the mouth” Mike Tyson
The diagnosis will uncover patterns that should allow you to derive guiding policies. Those guiding policies will come in handy in a couple of different situations:
- When defining your action plan: think of these as guardrails on the freeway: they will help you stay on track and make sure the problem you diagnosed actually gets solved
- When faced with a situation you weren’t expecting: you can use your different policies to facilitate and guide your decision-making, which will give you incredible peace of mind
- When making trade-offs or saying no to stakeholders: saying no is always complicated, but it is essential for a good strategy. By making your policies clear and having your stakeholders agree to them, pushing back on their requests will be an easier pill for them to swallow.
The hardest part of guiding policies is sticking to them, just like in life.
Establishing an Action Plan
This action plan should be coherent and cohesive, and cover the different components of analytical maturity.
How to set up an action plan:
- Find subject-matter experts in the organization you’re supporting, and work with them on the plan:
- Walk them through your diagnosis and your guiding policies, and brainstorm with them on what the next steps should be and over which time frame.
- If you are in a fast-paced organization, consider optimizing for optionality: give yourself time to move the maturity of the organization, but also the ability to answer “fire drills” and time-sensitive questions
- Think outside of your organization: if you are supporting one part of a larger company, also think about how you will interact with the other analytical functions, and add that to your plan
- Set up success criteria: whenever qualitative work is being done, don’t skip establishing success criteria. Like any other work, you should be able to say whether it was a success once you’re done with it. So set binary success criteria that will tell you how you did, and put some thought into them, making sure they properly represent what you are trying to solve.
- Set up reporting processes and a timeline: doing the work is important, but if no one knows about it or uses what you built, are you really creating value? Establishing a proper reporting process lets you achieve multiple goals at once:
- It gives visibility of your work to a larger audience and facilitates collaboration opportunities
- It facilitates a go-to-market strategy for your new analytical product (as you have a venue to promote your new dashboards and reports)
- Ensure leadership buy-in: you can’t build a culture around data without the support of your leads. Present the plan to them and garner their support to ensure smooth sailing toward your goal
The formula for success
The FS newsletter shared this tiny thought the other day:
“The recipe for success:
- The courage to start.
- The discipline to focus.
- The confidence to figure it out.
- The patience to know progress isn’t always visible.
- The persistence to keep going, even on the bad days.”
Ultimately, that is what it all boils down to. You need the courage to start the conversation around your organization’s analytical maturity and where it should be, the discipline to develop your action plan (while addressing the immediate fire drills), the confidence to find the right solution despite potential naysayers, and the patience and persistence to keep moving forward.
And hopefully, you’ll reach the end goal: building an Analytically Mature Organization.
I have been talking a lot about AMOs and we’ve seen how to grow one, but I never concretely outlined what an analytically mature organization is, and why it’s so great. So here is part 3, with concrete examples of what a mature analytical organization does differently!
An AMO is an organization that understands the complex dynamics of its market, and which activities can influence it.
Analytically mature organizations have clear visibility into how their activities (“input metrics”) drive short-term results (“output metrics”), which in turn drive long-term outcomes (“outcome metrics”).
- Example: an analytically mature marketing org will know the impact of sending promotional emails (input: # of sent emails) on driving new sign-ups (output: # of sign-ups), and to what extent those sign-ups will convert to paying users down the road (outcome: # of paid users). They will use different ratios (sign-ups vs sent) and benchmark their different campaigns against each other, helping them improve their craft.
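The input / output / outcome chain lends itself to a simple funnel computation. The campaign names and numbers below are made up for illustration:

```python
# Hypothetical campaign data: inputs (emails sent), outputs (sign-ups),
# and outcomes (paid users) for two promotional campaigns.
campaigns = {
    "spring_promo": {"emails_sent": 50_000, "sign_ups": 1_250, "paid_users": 125},
    "autumn_promo": {"emails_sent": 40_000, "sign_ups": 1_600, "paid_users": 240},
}

for name, c in campaigns.items():
    sign_up_rate = c["sign_ups"] / c["emails_sent"]    # output per input
    conversion_rate = c["paid_users"] / c["sign_ups"]  # outcome per output
    print(f"{name}: sign-up rate {sign_up_rate:.1%}, conversion {conversion_rate:.1%}")
```

Benchmarking these ratios across campaigns is exactly the kind of comparison that helps a marketing org improve its craft.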
Mature orgs also have a clear understanding of the key factors influencing their topline metrics. They can seamlessly perform root cause analysis to understand the evolution of those top metrics and take corrective actions.
- Example: a sales org will be able to determine which channels and customer segments to prioritize based on where there might be headwinds or lucrative opportunities. They have perfected their investigation process, to the point they were able to automate it, and at this stage, an algorithm directly surfaces the right insight to the right people.
The data needs have shifted toward more “complex” questions, such as opportunity sizing, causal impact tracking, etc. These are harder questions that require deep domain expertise as well as advanced statistical methodologies.
- Example: an analytically mature HR org will want to start looking into what can drive employee retention and/or success, and to do so, it will start running causal impact analyses to extract the key factors that are predictors of success.
An analytically mature organization is an organization where a few specialized data teams collaborate.
- The whole backbone of an analytically mature organization relies on clean data, which is why, in an analytically mature organization, you have data engineers who create pipelines, datasets, and databases, and who commit to very strict rules and “service level agreements” (SLAs) so that data can be easily consumed by different downstream teams (such as data science or business intelligence).
- You also have product managers working alongside those data engineers to make sure the right databases are built to solve the organization’s most pressing pain points, and building tools to improve data discoverability (which is, even in a very mature organization, always a complicated topic).
- You have data scientists who consume all this data and turn it into deeper insights for product and business users, allowing the organization to make better decisions. They are usually a fairly central team, with their work influencing the roadmaps of both the upstream and the downstream teams (i.e. their needs will influence the roadmap of the data engineering team, and their findings will often influence the work of other analyst teams).
- Finally, you have business/data/financial analysts, who support both strategic decisions and day-to-day operations.
To give a concrete example with a big retailer:
- Data engineers will build the right pipelines to make sure we have daily databases with the store name, its location, its inventory, the # of sales per item, etc.
- Data scientists will use those databases to run “market basket analysis” to uncover which items are most often bought together.
- Business analysts will take those findings and look into how to operationalize them within different stores. They will build metrics to track the “operationalization” (and potentially set OKRs for different stores against those).
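A minimal version of such a market basket analysis just counts pair co-occurrences across baskets; the transactions below are toy data for illustration:

```python
from itertools import combinations
from collections import Counter

# Toy transactions: each set is the items bought together in one basket.
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "cereal"},
    {"bread", "butter", "cereal"},
]

# Count how often each unordered pair of items appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

print(pair_counts.most_common(1))  # bread & butter co-occur in 3 of 4 baskets
```

Production systems use association-rule metrics (support, confidence, lift) over millions of baskets, but the underlying idea is the same.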
An analytically mature organization is an organization with robust tools and standardized processes, allowing different teams to derive insights faster and with a higher level of quality.
- In an AMO, robust data governance processes have been implemented, making it easier for people to consume data. Analysts don’t have to spend hours double-checking each data source; they can trust a few certified databases and metrics, which greatly saves their precious time.
- Multiple tools have been built (or implemented) to standardize typical data studies, which leaves less room for error for individual contributors and allows more people to get the insights they need.
- Example: instead of having to run your own statistical tests for A/B experiments, you have a tool in which you simply input the data and it does that for you automatically.
- Similarly, from a project management perspective, the usual “steps” of a study have been mapped, formalized &amp; standardized (everything from the prioritization decision-making process to the internal go-to-market of a study). Thanks to those formalized processes, it is easier for the org to understand who’s doing what, and how to collaborate with the different data teams.
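As an illustration of the A/B-testing idea, the statistics behind such a tool can be a standard two-proportion z-test. The function below is a minimal sketch, and the experiment numbers are made up:

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal CDF, expressed with erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 200/10,000 control conversions vs 260/10,000 treatment.
z, p = two_proportion_z_test(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The point of wrapping this in a tool is that individual contributors no longer hand-roll the formula (and its assumptions) for every experiment.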
Finally, an analytically mature organization is an organization where everyone is data-savvy.
- As knowledge management has been a priority (and not just an afterthought), people find it easy to locate resources and support to answer their data questions
- There are also a few inspiring and tenured “data leaders” who have started organizing an internal “data aficionado” ecosystem (more on that in the next article!)
- Internal training is available and upskills people, no matter where they are on their data journey
- Data forums are “cool”: they are where great conversations happen and big decisions are made. Data teams are regarded as “thought partners” and are brought to the table when key decisions are made. Every decision is informed, if not driven, by data.
In summary, you have a very well-oiled machine. Everything is set up so that the data teams can focus on generating quality insights, and the barrier to entry for data usage has been lifted, allowing interested individuals to start deriving insights and improving their day-to-day jobs. It’s a utopia.
This article was cross-posted to Analytics Explained, a newsletter where I distill what I learned in various analytical roles (from Singaporean startups to SF big tech) and answer reader questions on analytics, growth, and careers.
¹: Analytics Maturity Models: An Overview by Karol Król and Dariusz Zdonek