Analytics and Context

Executive Summary

  • Context is crucial in data analytics because the purpose of data analytics is always practical: to improve future performance
  • The context of a decision is the totality of the conditions that constitute the circumstances of the specific decision
  • The three key characteristics of the context of human behaviour in a social setting are (i) uniqueness; (ii) “infinitiveness”; and (iii) uncertainty
  • There are five inter-related implications for data analysts if they accept the critical importance of context:

Implication 1: The need to recognise that datasets and analytical models are always human-created “realisations” of the real world.

Implication 2: All datasets and analytical models are de-contextualised abstractions.

Implication 3: Data analytics should seek to generalise from a sample rather than testing the validity of universal hypotheses.

Implication 4: Given that every observation in a dataset is unique in its context, it is vital that exploratory data analysis investigates whether or not a dataset fulfils the similarity and variability requirements for valid analytical investigation.

Implication 5: It can be misleading to consider analytical models as comprising dependent and independent variables.

As discussed in a previous post, “What is data analytics?” (11th Sept 2023), data analytics is best defined as data analysis for practical purpose. The role of data analytics is to use data analysis to provide an evidential basis for managers to make evidence-based decisions on the most effective intervention to improve performance. Academics do not typically do data analytics since they are mostly using empirical analysis to pursue disciplinary, not practical, purposes. As soon as you move from disciplinary purpose to practical purpose, then context becomes crucial. In this post I want to explore the implications for data analytics of the importance of context.

              The principal role of management is to maintain and improve the performance levels of the people and resources for which they are responsible. Managers are constantly making decisions on how to intervene and take action to improve performance. To be effective, these decisions must be appropriate given the specific circumstances that prevail. This is what I call the “context” of the decision – the totality of the conditions that constitute the circumstances of the specific decision.

              In the case of human behaviour in a social setting, there are three key characteristics of the context:

1. Unique

Every context is unique. As Heraclitus famously remarked, “You can never step into the same river twice”. You as an individual will have changed by the time that you next step into the river, and the river itself will also have changed – you will not be stepping into the same water in exactly the same place. So too with any decision context; however similar to previous decision contexts, there will be some unique features, including of course that the decision-maker will have experience of the decision from the previous occasion. In life, change is the only constant. From this perspective, there can never be universality in the sense of prescriptions on what to do for any particular type of decision irrespective of the specifics of the particular context. A decision is always context-specific and the context is always unique.

2. “Infinitive”

By “infinitive” I mean that there are an infinite number of possible aspects of any given decision situation. There is no definitive set of descriptors that can capture fully the totality of the context of a specific decision.

3. Uncertainty

All human behaviour occurs in the context of uncertainty. We can never fully understand the past, which will always remain contestable to some extent with the possibility of alternative explanations and interpretations. And we can never know in advance the full consequences of our decisions and actions because the future is unknowable. Treating the past and future as certain or probabilistic disguises but does not remove uncertainty. Human knowledge is always partial and fallible.

              Many of the failings of data analytics derive from ignoring the uniqueness, “infinitiveness” and uncertainty of decision situations. I often describe it as the “Masters of the Universe” syndrome – the belief that because you know the numbers, you know with certainty, almost bordering on arrogance, what needs to be done, and all will be well with the world if only managers would do what the analysts tell them to do. This lack of humility on the part of analysts puts managers offside and typically leads to analytics being ignored. Managers are experts in context. Their experience has given them an understanding, often intuitive, of the impact of context. Analysts should respect this knowledge and tap into it. Ultimately the problem lies in treating social human beings who learn from experience as if they behave in a very deterministic manner similar to molecules. The methods that have been so successful in generating knowledge in the natural sciences are not easily transferable to the realm of human behaviour. Economics has sought to emulate the natural sciences in adopting a scientific approach to the empirical testing of economic theory. This has had an enormous impact, sometimes detrimental, on the mindset of data analysts given that a significant number of data analysts have a background in economics and econometrics (i.e. the application of statistical analysis to the study of economic data).

              So what are the implications if we as data analysts accept the critical importance of context? I would argue there are five inter-related implications:

Implication 1: The need to recognise that datasets and analytical models are always human-created “realisations” of the real world.

The “infinitiveness” of the decision context implies that datasets and analytical models are always partial and selective. There are no objective facts as such. Indeed, the Latin root of the word “fact” is facere (“to make”). Facts are made. We frame the world, categorise it and measure it. Artists have always recognised that their art is a human interpretation of the world. The French painter Paul Cézanne described his paintings as “realisations” of the world. Scientists have tended to designate their models of the world as objective, which tends to obscure their interpretive nature. Scientists interpret the world just as artists do, albeit with very different tools and techniques. Datasets and analytical models are the realisations of the world by data analysts.

Implication 2: All datasets and analytical models are de-contextualised abstractions.

As realisations, datasets and analytical models are necessarily selective, capturing only part of the decision situation. As such they are always abstractions from reality. The observations recorded in a dataset are de-contextualised in the sense that they are abstracted from the totality of the decision context.

Implication 3: Data analytics should seek to generalise from a sample rather than testing the validity of universal hypotheses.

There are no universal truths valid across all contexts. The disciplinary mindset of economics is quite the opposite. Economic behaviour is modelled as constrained optimisation by rational economic agents. Theoretical results are derived formally by mathematical analysis and their validity in specific contexts investigated empirically, in much the same way as natural science uses theory to hypothesise outcomes in laboratory experiments. Recognising the unique, “infinitive” and uncertain nature of the decision context leads to a very different mindset, one based on intellectual humility and the fallibility of human knowledge. We try to generalise from similar previous contexts to unknown, yet to occur, future contexts. These generalisations are, by their very nature, uncertain and fallible.

Implication 4: Given that every observation in a dataset is unique in its context, it is vital that exploratory data analysis investigates whether or not a dataset fulfils the similarity and variability requirements for valid analytical investigation.

Every observation in a dataset is an abstraction from a unique decision context. One of the critical roles of the Exploration stage of the analytics process is to ensure that the decision contexts of each observation are sufficiently similar to be treated as a single collective (i.e. sample) to be analysed. The other side of the coin is checking the variability. There needs to be enough variability between the decision contexts so that the analyst can investigate which aspects of variability in the decision contexts are associated with the variability in the observed outcomes. But if the variability is excessive, this may call into question the degree of similarity and whether or not it is valid to assume that all of the observations have been generated by the same general behaviour process. Excessive variability (e.g. outliers) may represent different behavioural processes, requiring the dataset to be analysed as a set of sub-samples rather than as a single sample.
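The variability check described above can be sketched in code. The following is a minimal illustration of my own (hypothetical numbers, not from the post), using a standard Tukey-fence rule as one possible way to flag observations whose variability is excessive enough to suggest they were generated by a different behavioural process:

```python
import numpy as np

def iqr_outliers(values, k=1.5):
    """Flag observations outside the Tukey fences (Q1 - k*IQR, Q3 + k*IQR)."""
    v = np.asarray(values, dtype=float)
    q1, q3 = np.percentile(v, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return (v < lower) | (v > upper)

# Hypothetical sample: one extreme value stands out against an
# otherwise similar set of observations, prompting the question of
# whether it reflects the same general behaviour process.
sample = [10.2, 9.8, 11.1, 10.5, 9.9, 10.8, 47.0]
flags = iqr_outliers(sample)
print([x for x, f in zip(sample, flags) if f])
```

A flagged observation is not automatically discarded; as the implication above suggests, it is a prompt to ask whether the dataset should be analysed as sub-samples rather than as a single sample.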

Implication 5: It can be misleading to consider analytical models as comprising dependent and independent variables.

Analytical models are typically described in statistics and econometrics as consisting of dependent and independent variables. This embodies a rather mechanistic view of the world in which the variation of observed outcomes (i.e. the dependent variable) is to be explained by the variation in the different aspects of the behavioural process as measured (or categorised) by the independent variables. But in reality these independent variables are never completely independent of each other. They share information (often known as “commonality”) to the extent that for each observation the so-called independent variables are extracted from the same context. I prefer to think of the variables in a dataset as situational variables – they attempt to capture the most relevant aspects of the unique real-world situations from which the data has been extracted but with no assumption that they are independent; indeed quite the opposite. And, given the specific practical purpose of the particular analytics project, one or more of these situational variables will be designated as outcome variables.
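The commonality among so-called independent variables is easy to demonstrate. In this hypothetical sketch (invented variables, not from the post), two predictors extracted from the same context – advertising spend and staff headcount, both driven by firm size – turn out to be strongly correlated even though a model would treat them as “independent”:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Hypothetical situational variables sharing a common driver (firm size):
size = rng.normal(100, 20, n)
advertising = 0.5 * size + rng.normal(0, 5, n)   # spend scales with size
headcount = 0.3 * size + rng.normal(0, 5, n)     # staffing scales with size

# The two "independent" variables share substantial information:
r = np.corrcoef(advertising, headcount)[0, 1]
print(f"correlation between the 'independent' variables: {r:.2f}")
```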

Read Other Related Posts

What is Data Analytics? 11th Sept 2023

The Six Stages of the Analytics Process, 20th Sept 2023

Measuring Trend Growth

Executive Summary

  • The most useful summary statistic for a trended variable is the average growth rate
  • But there are several different methods for calculating average growth rates that can often generate very different results depending on whether all the data is used or just the start and end points, and whether simple or compound growth is assumed
  • Be careful of calculating average growth rates using only the start and end points of trended variables since this implicitly assumes that these two points are representative of the dynamic path of the trended variable and may give a very biased estimate of the underlying growth rate
  • Best practice is to use all of the available data to estimate a loglinear trendline, which allows for compound growth and avoids having to calculate an appropriate midpoint of a linear trendline to convert the estimated slope into a growth rate

When providing summary statistics for trended time-series data, the mean makes no sense as a measure of the point of central tendency. By definition, there is no point of central tendency in trended data. Trended data are either increasing or decreasing in which case the most useful summary statistic is the average rate of growth/decline. But how do you calculate the average growth rate? In this post I want to discuss the pros and cons of the different ways of calculating the average growth rate, using total league attendances in English football (the subject of my previous post) as an illustration.

              There are at least five different methods of calculating the average growth rate:

  1. “Averaged” growth rate: use g_t = (y_t – y_{t-1})/y_{t-1} to calculate the growth rate for each period, then average these growth rates
  2. Simple growth rate: use the start and end values of the trended variable to calculate the simple growth rate with the trended variable modelled as y_{t+n} = y_t(1 + ng)
  3. Compound growth rate: use the start and end values of the trended variable to calculate the compound growth rate with the trended variable modelled as y_{t+n} = y_t(1 + g)^n
  4. Linear trendline: estimate the line of best fit for y_t = a + gt (i.e. simple growth)
  5. Loglinear trendline: estimate the line of best fit for ln y_t = a + gt (i.e. compound growth)

where y = the trended variable; g = growth rate; t = time period; n = number of time periods; a = intercept of the line of best fit

These methods differ in two ways. First, they differ as to whether the trend is modelled as simple growth (Methods 2, 4) or compound growth (Methods 3, 5). Method 1 is effectively neutral in this respect. Second, the methods differ in terms of whether they use only the start and end points of the trended variable (Methods 2, 3) or use all of the available data (Methods 1, 4, 5). The problem with only using the start and end points is that there is an implicit assumption that these are representative of the underlying trend with relatively little “noise”. But this is not always the case and there is a real possibility of these methods biasing the average growth rate upwards or downwards as illustrated by the following analysis of the trends in football league attendances in England since the end of the Second World War.
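The five methods can be sketched as follows. This is my own illustrative implementation, not code from the post; in particular, the midpoint used to convert the linear-trendline slope into a growth rate is taken here as the mean of the series, which is one common choice:

```python
import numpy as np

def growth_rates(y):
    """Five ways of estimating the average growth rate of a trended series y."""
    y = np.asarray(y, dtype=float)
    n = len(y) - 1                             # number of growth periods
    t = np.arange(len(y))

    averaged = np.mean(y[1:] / y[:-1] - 1)     # Method 1: average the period growth rates
    simple = (y[-1] / y[0] - 1) / n            # Method 2: y_{t+n} = y_t(1 + ng)
    compound = (y[-1] / y[0]) ** (1 / n) - 1   # Method 3: y_{t+n} = y_t(1 + g)^n
    slope, _ = np.polyfit(t, y, 1)             # Method 4: linear trendline...
    linear = slope / np.mean(y)                # ...slope converted at a midpoint level
    logslope, _ = np.polyfit(t, np.log(y), 1)  # Method 5: loglinear trendline
    loglinear = np.exp(logslope) - 1
    return averaged, simple, compound, linear, loglinear

# On a noiseless 2% compound trend, Methods 1, 3 and 5 recover 2% exactly,
# while Method 2 (simple growth) overstates the underlying rate.
y = 100 * 1.02 ** np.arange(20)
print([f"{g:.4f}" for g in growth_rates(y)])
```

Even without noise, the methods disagree because they embody different growth models – a small demonstration of why the choice of method matters.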

Figure 1: Total League Attendances (Regular Season), England, 1946/47-2022/23

This U-shaped timeplot of total league attendances in England since the end of the Second World War splits into two distinct sub-periods of decline/growth:

  • Postwar decline: 1948/49 – 1985/86
  • Current revival: 1985/86 – 2022/23

Applying the five methods to calculate the average annual growth rate of these two sub-periods yields the following results:

Method | Postwar Decline 1948/49 – 1985/86 | Current Revival 1985/86 – 2022/23*
Method 1: “averaged” growth rate | -2.36% | 2.28%
Method 2: simple growth rate | -1.62% | 3.00%
Method 3: compound growth rate | -2.45% | 2.04%
Method 4: linear trendline | -1.89% | 1.75%
Method 5: loglinear trendline | -1.95% | 1.85%
*The Covid-affected seasons 2019/20 and 2020/21 have been excluded from the calculations of the average growth rate.

What the results show very clearly is the wide variability in the estimates of average annual growth rates depending on the method of calculation. The average annual rate of decline in league attendances between 1949 and 1986 varies between -1.62% (Method 2 – simple growth rate) and -2.45% (Method 3 – compound growth rate). Similarly the average annual rate of growth from 1986 onwards ranges from 1.75% (Method 4 – linear trendline) to 3.00% (Method 2 – simple growth rate). To investigate exactly why the two alternative methods for calculating the simple growth rate during the Current Revival give such different results, the linear trendline for 1985/86 – 2022/23 is shown graphically in Figure 2.

Figure 2: Linear Trendline, Total League Attendances, England, 1985/86 – 2022/23

As can be seen, the linear trendline has a high goodness of fit (R2 = 93.1%) and the fitted endpoint is very close to the actual gate attendance of 34.8 million in 2022/23. However, there is a relatively large divergence at the start of the period, with the fitted trendline having a value of 18.2 million whereas the actual gate attendance in 1985/86 was 16.5 million. It is this divergence that accounts in part for the very different estimates of average annual growth rate generated by the two methods despite both assuming a simple growth model. (The rest of the divergence is due to the use of the midpoint to convert the slope of the trendline into a growth rate.)

              So which method should be used? My advice is to be very wary of calculating average growth rates using only the start and end points of trended variables. You are implicitly assuming that these two points are representative of the dynamic path of the trended variable and may give a very biased estimate of the underlying growth rate. My preference is always to use all of the available data to estimate a loglinear trendline which allows for compound growth and avoids having to calculate an appropriate midpoint of a linear trendline to convert the estimated slope into a growth rate.

Read Other Related Posts

Competing on Analytics

Executive Summary

  • Tom Davenport, the management guru on data analytics, defines analytics competitors as organisations committed to quantitative, fact-based analysis
  • Davenport identifies five stages in becoming an analytical competitor: (1) Analytically impaired; (2) Localised analytics; (3) Analytical aspirations; (4) Analytical companies; and (5) Analytical competitors
  • In Competing on Analytics: The New Science of Winning, Davenport and Harris identify four pillars of analytical competition: distinctive capability; enterprise-wide analytics; senior management commitment; and large-scale ambition
  • The initial actionable insight that data analytics can help diagnose why an organisation is currently underperforming and prescribe how its future performance can be improved is the starting point of the analytical journey

Over the last 20 years, probably the leading guru on the management of data analytics in organisations has been Tom Davenport. He came to prominence with his article “Competing on Analytics” (Harvard Business Review, 2006) followed up in 2007 by the book, Competing on Analytics: The New Science of Winning (co-authored with Jeanne Harris). Davenport’s initial study focused on 32 organisations that had committed to quantitative, fact-based analysis, 11 of which he designated as “full-bore analytics competitors”. He identified three key attributes of analytics competitors:

  • Widespread use of modelling and optimisation
  • An enterprise approach
  • Senior executive advocates

Davenport found that analytics competitors had four sources of strength – the right focus, the right culture, the right people and the right technology. In the book, he distilled these characteristics of analytic competitors into the four pillars of analytical competition:

  • Distinctive capability
  • Enterprise-wide analytics
  • Senior management commitment
  • Large-scale ambition

Davenport identifies five stages in becoming an analytical competitor:

  • Stage 1: Analytically impaired
  • Stage 2: Localised analytics
  • Stage 3: Analytical aspirations
  • Stage 4: Analytical companies
  • Stage 5: Analytical competitors

Davenport’s five stages of analytical competition

Stage 1: Analytically Impaired

At Stage 1 organisations make negligible use of data analytics. They are not guided by any performance metrics and are essentially “flying blind”. What data they have are of poor quality, poorly defined and unintegrated. Their analytical journey starts with the question of what is happening in the organisation, which provides the driver to collect more accurate data to improve their operations. At this stage, the organisational culture is “knowledge-allergic”, with decisions driven more by gut-feeling and past experience than by evidence.

Stage 2: Localised Analytics

Stage 2 sees analytics being pioneered in organisations by isolated individuals concerned with improving performance in those local aspects of the organisation’s operations with which they are most involved. There is no alignment of these initial analytics projects with overall organisational performance. The analysts start to produce actionable insights that are successful in improving performance. These local successes begin to attract attention elsewhere in the organisation. Data silos emerge with individuals creating datasets for specific activities and stored in spreadsheets. There is no senior leadership recognition at this stage of the potential organisation-wide gains from analytics.

Stage 3: Analytical Aspirations

Stage 3 in many ways marks the “big leap forward” with organisations beginning to recognise at a senior leadership level that there are big gains to be made from employing analytics across all of the organisation’s operations. But there is considerable resistance from managers with no analytics skills and experience who see their position as threatened. With some senior leadership support there is an effort to create more integrated data systems and analytics processes. Moves begin towards a centralised data warehouse managed by data engineers.

Stage 4: Analytical Companies

By Stage 4 organisations are establishing a fact-based culture with broad senior leadership support. The value of data analytics in these organisations is now generally accepted. Analytics processes are becoming embedded in everyday operations and seen as an essential part of “how we do things around here”. Specialist teams of data analysts are being recruited and managers are becoming familiar with how to utilise the results of analytics to support their decision making. There is a clear strategy on the collection and storage of high-quality data centrally with clear data governance principles in place.

Stage 5: Analytical Competitors

At Stage 5 organisations are now what Davenport calls “full-bore analytical competitors” using analytics not only to improve current performance of all of the organisation’s operations but also to identify new opportunities to create new sustainable competitive advantages. Analytics is seen as a primary driver of organisational performance and value. The organisational culture is fact-based and committed to using analytics to test and develop new ways of doing things.

To quote an old Chinese proverb, “a thousand-mile journey starts with a single step”. The analytics journey for any organisation starts with an awareness that the organisation is underperforming and data analytics has an important role in facilitating an improvement in organisational performance. The initial actionable insight that data analytics can help diagnose why an organisation is currently underperforming and prescribe how its performance can be improved in the future is the starting point of the analytical journey.

Read Other Related Posts

The Keys to Success in Data Analytics

Executive Summary

  • Data analytics is a very useful servant but a poor leader
  • There are seven keys to using data analytics effectively in any organisation:
  1. A culture of evidence-based practice
  2. Leadership buy-in
  3. Decision-driven analysis
  4. Recognition of analytics as a source of marginal gains
  5. Realisation that analytics is more than reporting outcomes
  6. Soft skills are crucial
  7. Integration of data silos
  • Effective analysts are not just good statisticians
  • Analysts must be able to engage with decision-makers and “speak their language”

Earlier this year, I gave a presentation to a group of data analysts in a large organisation. My remit was to discuss how data analytics can be used to enhance performance. They were particularly interested in the insights I had gained from my own experience both in business (my career started as an analyst in Unilever’s Economics Department in the mid-80s) and in elite team sports. I started off with my basic philosophy that “data analytics is a very useful servant but a poor leader” and then summarised the lessons I had learnt as seven keys to success in data analytics. Here are those seven keys to success.

1. A culture of evidence-based practice

Data analytics can only be effective in organisations committed to evidence-based practice. Using evidence to inform management decisions to enhance performance must be part of the corporate culture, the organisation’s way of doing things. The culture must be a process culture by which I mean a deep commitment to doing things the right way. In a world of uncertainty we can never be sure that what we do will lead to the future outcomes we want and expect. We can never fully control future outcomes. Getting the process right in the sense of using data analytics to make the effective use of all the available evidence will maximise the likelihood of an organisation achieving better performance outcomes.

2. Leadership buy-in

A culture of evidence-based practice can only thrive when supported and encouraged by the organisation’s leadership. A “don’t do as I do, do as I say” approach seldom works. Leaders must lead by example and continually demonstrate and extol the virtues of evidence-based practice. If a leader adopts the attitude that “I don’t need to know the numbers to know what the right thing is to do” then this scepticism about the usefulness of data analytics will spread throughout the organisation and fatally undermine the analytics function.

3. Decision-driven analysis

Data analytics is data analysis for practical purpose. The purpose of management one way or another is to improve performance. Every data analytics project must start with the basic question “what managerial decision will be impacted by the data analysis?”. The answer to the question gives the analytics project its direction and ensures its relevance. The analyst’s function is not to find out things that they think would be interesting to know but rather things that the manager needs to know to improve performance.

4. Recognition of analytics as a source of marginal gains

The marginal gains philosophy, which emerged in elite cycling, is the idea that making a large improvement in performance is often achieved as the cumulative effect of lots of small changes. The overall performance of an organisation involves a myriad of decisions and actions. Data analytics can provide a structured approach to analysing organisational performance, decomposing it into its constituent micro components, benchmarking these micro performances against past performance levels and the performance levels of other similar entities, and identifying the performance drivers. Continually searching for marginal gains fosters a culture of wanting to do better and prevents organisational complacency.
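The arithmetic of marginal gains is worth making explicit: small improvements compound multiplicatively. A toy calculation (illustrative numbers only, not from the post):

```python
# Twenty successive 1% improvements compound multiplicatively:
overall = 1.0
for _ in range(20):
    overall *= 1.01
print(f"cumulative gain: {overall - 1:.1%}")  # cumulative gain: 22.0%
```

Twenty 1% gains deliver not a 20% but roughly a 22% overall improvement, and the gap widens as the number of marginal gains grows.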

5. Realisation that analytics is more than reporting outcomes

In some organisations data analytics is considered mainly as a monitoring process, tasked with tracking key performance indicators (KPIs) and reporting outcomes often visually with performance dashboards. This is an important function in any organisation but data analytics is much more than just monitoring performance. Data analytics should be diagnostic, investigating fluctuations in performance and providing actionable insights on possible managerial interventions to improve performance.

6. Soft skills are crucial

Effective analysts must have the “hard” skills of being good statisticians, able to apply appropriate analytical techniques correctly. But crucially effective analysts must also have the “soft” skills of being able to engage with managers and speak their language. Analysts must understand the managerial decisions that they are expected to inform, and they must be able to tap into the detailed knowledge of managers. Analysts must avoid being seen as the “Masters of the Universe”. They must respect the managers, work for them and work with them. Analysts should be humble. They must know what they bring to the table (i.e. the ability to forensically explore data) and what they don’t (i.e. experience and expertise of the specific decision context). Effective analytics is always a team effort.

7. Integration of data silos

Last but not least, once data analytics has progressed in an organisation beyond a few individuals working in isolation and storing the data they need in their own spreadsheets, there needs to be a centralised data warehouse managed by experts in data management. Integrating data silos opens up new possibilities for insights. This is a crucial part of an organisation developing the capabilities of an “analytical competitor” which I will explore in my next Methods post.

Read Other Related Posts

The Six Stages of the Analytics Process

Executive Summary

  • The analytics process can be broken down further into six distinct stages:  (1) Discovery; (2) Exploration; (3) Modelling; (4) Projection; (5) Actionable Insight; and (6) Monitoring
  • Always start the analytics process with the question: “What is the decision that will be impacted by the analysis?”
  • There are three principal pitfalls in deriving actionable insights from analytical models – generalisability, excluded-variable bias, and misinterpreting causation

The analytics process can be broken down further into six distinct stages:

  1. Discovery
  2. Exploration
  3. Modelling
  4. Projection
  5. Actionable Insight
  6. Monitoring
Figure 1: The Six Stages of the Analytics Process

Stage 1: Discovery

The discovery stage starts with a dialogue between the analyst and decision maker to ensure that the analyst understands the purpose of the project. Particular attention is paid to the specific decisions for which the project is intended to provide an evidential basis to support management decision making.

The starting point for all analytics projects is discovery. The Discovery stage involves a dialogue with the project sponsor to understand both Purpose (i.e. what is expected from the project?) and Context (i.e. what is already known?). The outcome of discovery is Framing the practical management problem facing the decision-maker as an analytical problem amenable to data analysis. It is crucial to ensure that the analytical problem is feasible given the available data.

Stage 2: Exploration

The exploration stage involves data preparation particularly checking the quality of the data and transforming the data if necessary. A key part of this exploration stage is the preliminary assessment of the basic properties of the data to decide on the appropriate analytical methods to be used in the modelling stage.

Having determined the purpose of the analytics project and sourced the relevant data in the initial Discovery stage, there is a need to gain a basic understanding of the properties of the data. This exploratory data analysis serves a number of ends:

  • It will help identify any problems in the quality of the data such as missing and suspect values.
  • It will provide an insight into the amount of information contained in the dataset (this will ultimately depend on the similarity and variability of the data).
  • If done effectively, exploratory data analysis will give clear guidance on how to proceed in the third Modelling stage.
  • It may provide advance warning of any potential statistical difficulties.

A dataset contains multiple observations of performance outcome and associated situational variables that attempt to capture information about the context of the performance. For the analysis of the dataset to produce actionable insights, there is both a similarity requirement and a variability requirement. The similarity requirement is that the dataset is structurally stable in the sense that it contains data on performance outcomes produced by a similar behaviour process across different entities (i.e. cross-sectional data) or across time (i.e. longitudinal data). The similarity requirement also requires that there is consistent measurement and categorisation of the outcome and situational variables. The variability requirement is that the dataset contains sufficient variability to allow analysis of changes in performance but without excessive variability that would raise doubts about the validity of treating the dataset as structurally stable.

Stage 3: Modelling

The modelling stage involves the construction of a simplified, purpose-led, data-based representation of the specific aspect of real-world behaviour on which the analytics project will focus.

The Modelling stage involves the use of statistical analysis to construct an analytical model of the specific aspect of real-world behaviour with which the analytics project is concerned. The analytical model is a simplified, purpose-led, data-based representation of the real-world problem situation.

  • Purpose-led: model design and choice of modelling techniques are driven by the analytical purpose (i.e. the management decision to be impacted by the analysis)
  • Simplified representation: models necessarily involve abstraction with only relevant, systematic features of the real-world decision situation included in the model
  • Data-based: modelling is the search for congruent models that best fit the available data and capture all of the systematic aspects of performance
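To make the idea of a simplified, data-based representation concrete, here is the smallest possible analytical model: an ordinary least squares line relating one hypothetical situational variable to a performance outcome. The variable names and figures are invented for illustration; a real modelling stage would consider many candidate variables and functional forms.

```python
from statistics import mean

def fit_line(x, y):
    """Ordinary least squares for a single situational variable.

    Returns (intercept, slope) minimising the sum of squared residuals -
    the simplest possible model of outcome against context.
    """
    mx, my = mean(x), mean(y)
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

# Hypothetical data: weekly training hours vs performance score
hours = [2, 4, 6, 8]
score = [5.0, 6.0, 7.0, 8.0]
intercept, slope = fit_line(hours, score)
```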

The very nature of an analytical model creates a number of potential pitfalls which can lead to: (i) misinterpretation of the results of the data analysis; and (ii) misleading inferences as regards action recommendations. There are three principal pitfalls:

  • Generalisability: analytical models are based on a limited sample of data, but actionable insights require that the results of the data analysis are generalisable to other similar contexts
  • Excluded-variable bias: analytical models are simplifications of reality that focus on only a limited number of variables, but the reliability of the actionable insights demands that all relevant, systematic drivers of the performance outcomes are included; otherwise the results may be statistically biased and misleading
  • Misinterpreting causation: analytical models are purpose-led, so the model must capture causal relationships that allow interventions to resolve practical problems and improve performance; however, statistical analysis can only identify associations, and causation is ultimately a matter of interpretation

It is important to undertake diagnostic testing to try to avoid these pitfalls.
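As one illustrative diagnostic for excluded-variable bias: if the residuals from a fitted model correlate strongly with a candidate variable that was left out of the model, that is a warning sign. The data and the 0.5 correlation threshold below are hypothetical, chosen only to demonstrate the idea.

```python
from statistics import mean, stdev

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length sequences."""
    ma, mb = mean(a), mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)
    return cov / (stdev(a) * stdev(b))

# Residuals from a hypothetical fitted model, and a candidate variable
# that was omitted from that model (1 = away game, 0 = home game).
residuals = [0.4, -0.3, 0.5, -0.4, 0.2, -0.4]
away_games = [1, 0, 1, 0, 1, 0]

r = pearson(residuals, away_games)
bias_warning = abs(r) > 0.5   # threshold is illustrative only
```

Here the residuals track the omitted variable almost perfectly, suggesting the model is missing a systematic driver of performance.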

Stage 4: Projection

The projection stage involves using the estimated models developed in the modelling stage to answer what-if questions regarding the possible consequences of alternative interventions under different scenarios. It also involves forecasting future outcomes based on current trends.

Having constructed a simplified, purpose-led model of the business problem in the Modelling stage, the Projection stage involves using this model to answer what-if questions regarding the possible consequences of alternative interventions under different scenarios. The use of forecasting techniques to project future outcomes based on current trends is a key aspect of the Projection stage.

There are two broad types of forecasting methods:

  • Quantitative (or statistical) methods of forecasting e.g. univariate time-series models; causal models; Monte Carlo simulations
  • Qualitative methods e.g. Delphi method of asking a panel of experts; market research; opinion polls
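As an illustrative sketch of the quantitative family, a Monte Carlo simulation can project a range of future outcomes under uncertain period-on-period growth. All figures here (starting value, growth distribution, horizon) are hypothetical; the point is that the spread of simulated outcomes gives a feel for forecast uncertainty, not just a single point estimate.

```python
import random
from statistics import mean

def monte_carlo_forecast(current, growth_mean, growth_sd,
                         periods, runs=10_000, seed=42):
    """Project final outcomes by drawing a random growth rate per period.

    Returns the list of simulated final values across all runs; their
    spread indicates forecast uncertainty.
    """
    rng = random.Random(seed)
    finals = []
    for _ in range(runs):
        value = current
        for _ in range(periods):
            value *= 1 + rng.gauss(growth_mean, growth_sd)
        finals.append(value)
    return finals

finals = monte_carlo_forecast(current=100.0, growth_mean=0.02,
                              growth_sd=0.05, periods=4)
central_estimate = mean(finals)   # close to 100 * 1.02**4
```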

Stage 5: Actionable insight

During this stage the analyst presents an evaluation of the alternative possible interventions and makes recommendations to the decision maker.

Presentations and business reports should be designed to be appropriate for the specific audience for which they are intended. A business report is typically structured into six main parts: Executive Summary; Introduction; Main Report; Conclusions; Recommendations; Appendices. Data visualisation can be a very effective communication tool in presentations and business reports and is likely to be much more engaging than a set of bullet points, but care should be taken to avoid distorting or obfuscating the patterns in the data. Effective presentations must have a clear purpose and be well planned and well rehearsed.

Stage 6: Monitoring

The Monitoring stage involves tracking the project Key Performance Indicators (KPIs) during and after implementation.

The implementation plans for projects should, if possible, have decision points built into them. These decision points provide the option to alter the planned intervention if there is any indication of structural changes in the situation subsequent to the original decision. Hence it is important to track the project KPIs during and after implementation to ensure that targeted improvements in performance are achieved and continue to be achieved. Remember that data analytics does not end with recommendations for action. Actionable insights should always include recommendations on how the impact of the intervention on performance will be monitored going forward. Dashboards can be a very effective visualisation tool for monitoring performance.
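One simple way to automate such tracking is to flag post-implementation KPI readings that fall outside control limits derived from pre-intervention history. The baseline figures and the two-standard-deviation limits below are hypothetical; an out-of-limits reading at a decision point would suggest a structural change and a possible need to alter the planned intervention.

```python
from statistics import mean, stdev

def kpi_alerts(history, recent, n_sd=2.0):
    """Flag recent KPI readings outside control limits built from history.

    Limits are mean +/- n_sd standard deviations of the baseline values.
    Returns (index, value) pairs for out-of-limits readings.
    """
    m, s = mean(history), stdev(history)
    lower, upper = m - n_sd * s, m + n_sd * s
    return [(i, v) for i, v in enumerate(recent) if not (lower <= v <= upper)]

history = [50, 52, 49, 51, 50, 48, 51]   # hypothetical baseline KPI values
recent = [52, 60, 51]                    # post-implementation readings
alerts = kpi_alerts(history, recent)     # the reading of 60 is flagged
```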
