The Dismal Science: A Personal Reflection – Part Four

Part 4: The Case For Practice-Led Economics

Executive Summary

  • Economics should be the science of hope in dismal times
  • The essence of mainstream economics is captured by the Robbins conception of economics as the study of the allocation of scarce resources among competing ends
  • All of the limitations of mainstream economics flow directly from conceptualising economic behaviour as rational choice – its de-contextualised universality, the emphasis on formal (mathematical) logic as the principal route to knowledge, the reduction of uncertainty to probabilistic risk, and the inherent laissez-faire presupposition against policy activism
  • The essence of the radical Keynesian approach is the Marshall conception of economics as the study of the everyday business of life
  • Radical Keynesian economics is a pragmatist, practice-led approach, grounded in the reality of everyday human economic behaviour and seeking to develop impact theory that provides practical solutions to real-world problems

I started this series of posts with the proposition that Carlyle’s characterisation of economics as the dismal science could be interpreted in two different ways. The negative interpretation is that economics is dismal in its attachment to a rather Panglossian view that “all is best in the best of all possible worlds” with little to be done by way of policy activism by government beyond regulations to protect the competitiveness of markets. From this perspective, the essence of economics is the invisible hand theorem that the price mechanism can ensure a Pareto-optimal general equilibrium provided that markets are free of structural and informational imperfections. In contrast, the more positive interpretation of economics as the dismal science is that economics tries to understand the world in order to provide ways to improve the well-being of people particularly in dismal times. Economics from this more positive perspective is a source of hope that lives can be made better by appropriate interventions by central government and other agencies. I align myself wholeheartedly with the view that economics should be the science of hope in dismal times.

Much of the debate about the nature of economics and questions about the legitimacy of mainstream (neoclassical) economics can be summarised as the conflict between two fundamentally different conceptions of economics as a subject, what I will call the Robbins conception and the Marshall conception. In An Essay on the Nature and Significance of Economic Science (1932), Lord Robbins took the view that economics is the study of the allocation of scarce resources among competing ends. In so doing, Robbins rejected previous definitions of the subject matter of economics, including that of Alfred Marshall, Professor of Political Economy at Cambridge. In his Principles of Economics (first published in 1890 and arguably the principal economics textbook for the first half of the 20th Century), Marshall had provided a very different definition: that “economics is the study of the everyday business of life”.

The Robbins conception of economics captures the essence of the mainstream approach. Economic behaviour is conceptualised as a series of optimising choices by rational economic agents seeking to maximise their well-being (defined as utility for individuals and profits for firms) while operating as traders in markets regulated by the price mechanism. All of the limitations of mainstream economics flow directly from this conceptualisation of economic behaviour as rational choice – its de-contextualised universality, the emphasis on formal (mathematical) logic as the principal route to knowledge, the reduction of uncertainty to probabilistic risk, and the inherent laissez-faire presupposition against policy activism.

Mainstream economics ignores the broader context of human economic behaviour and imposes a universal frame of allocative choice in a market system. It adopts a rationalist, axiomatic approach to knowledge in which all economic behaviour is formalised as some form of constrained optimisation amenable to mathematical modelling. The market system is treated as structurally stable, with uncertainty reduced to merely a series of random shocks with a well-defined probability distribution. Seemingly sub-optimal market outcomes are modelled either as optimal equilibrium outcomes under conditions of structural and/or informational imperfections or as disequilibrium outcomes with slow speeds of adjustment towards the optimal equilibrium outcome, again due to structural and/or informational imperfections. Imperfectionist theories inevitably provide a weak basis for policy activism and tend to favour a more hands-off, laissez-faire approach by central government and other agencies, directed more at market reform to remove the imperfections impeding the operation of the market mechanism.

In aligning myself with a more radical vision of economics as the science of hope in dismal times, I adopt the Marshall conception of economics as the study of the everyday business of life. This summarises the essence of the radical Keynesian approach. Economics is grounded in the reality of everyday human economic behaviour. It is an inherently pragmatist approach of practice-led economics, what I called “impact theory” in Part 3 of this post. It is the “analytics” approach – analysis for practical purpose; analysis to provide practical solutions to real-world problems. It is an approach that is open to the possibility that human economic behaviour is complex, with often very different modes of activity that run much deeper than price-based allocative decisions. Formal mathematical modelling and empirical data analysis both have roles in gaining knowledge, just as in every other field of scientific endeavour. But there is a recognition that a pervasive feature of human life is uncertainty – we simply do not know the future. We act in anticipation of the future. Our actions are based on our beliefs, our understanding of the world, but always with a recognition that our beliefs are partial understandings and the world is continually changing. And our actions are the product of both reasoned judgment and emotional response to the specific context in which we find ourselves. Life is a process which we can influence but never fully control. Structures change, sometimes suddenly and catastrophically, leaving us asking “what is going on here?”, “what should we do?”. It is this pragmatist, practice-led vision of economics as the study of the everyday business of life to which this blog seeks to contribute.

Read Other Related Posts

The Keys to Success in Data Analytics

Executive Summary

  • Data analytics is a very useful servant but a poor leader
  • There are seven keys to using data analytics effectively in any organisation:
  1. A culture of evidence-based practice
  2. Leadership buy-in
  3. Decision-driven analysis
  4. Recognition of analytics as a source of marginal gains
  5. Realisation that analytics is more than reporting outcomes
  6. Soft skills are crucial
  7. Integration of data silos
  • Effective analysts are not just good statisticians
  • Analysts must be able to engage with decision-makers and “speak their language”

Earlier this year, I gave a presentation to a group of data analysts in a large organisation. My remit was to discuss how data analytics can be used to enhance performance. They were particularly interested in the insights I had gained from my own experience both in business (my career started as an analyst in Unilever’s Economics Department in the mid-80s) and in elite team sports. I started off with my basic philosophy that “data analytics is a very useful servant but a poor leader” and then summarised the lessons I had learnt as seven keys to success in data analytics. Here are those seven keys to success.

1. A culture of evidence-based practice

Data analytics can only be effective in organisations committed to evidence-based practice. Using evidence to inform management decisions to enhance performance must be part of the corporate culture, the organisation’s way of doing things. The culture must be a process culture, by which I mean a deep commitment to doing things the right way. In a world of uncertainty we can never be sure that what we do will lead to the future outcomes we want and expect. We can never fully control future outcomes. Getting the process right, in the sense of using data analytics to make effective use of all the available evidence, will maximise the likelihood of an organisation achieving better performance outcomes.

2. Leadership buy-in

A culture of evidence-based practice can only thrive when supported and encouraged by the organisation’s leadership. A “don’t do as I do, do as I say” approach seldom works. Leaders must lead by example and continually demonstrate and extol the virtues of evidence-based practice. If a leader adopts the attitude that “I don’t need to know the numbers to know what the right thing is to do” then this scepticism about the usefulness of data analytics will spread throughout the organisation and fatally undermine the analytics function.

3. Decision-driven analysis

Data analytics is data analysis for practical purpose. The purpose of management, one way or another, is to improve performance. Every data analytics project must start with the basic question “what managerial decision will be impacted by the data analysis?”. The answer to this question gives the analytics project its direction and ensures its relevance. The analyst’s function is not to find out things that they think would be interesting to know but rather things that the manager needs to know to improve performance.

4. Recognition of analytics as a source of marginal gains

The marginal gains philosophy, which emerged in elite cycling, is the idea that making a large improvement in performance is often achieved as the cumulative effect of lots of small changes. The overall performance of an organisation involves a myriad of decisions and actions. Data analytics can provide a structured approach to analysing organisational performance, decomposing it into its constituent micro components, benchmarking these micro performances against past performance levels and the performance levels of other similar entities, and identifying the performance drivers. Continually searching for marginal gains fosters a culture of wanting to do better and prevents organisational complacency.
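To make the decomposition-and-benchmarking idea concrete, here is a minimal Python sketch; the KPI names, current levels and benchmarks are entirely hypothetical.

```python
# A minimal sketch of marginal-gains benchmarking, assuming a hypothetical
# set of micro KPIs; names, values, and benchmarks are illustrative only.
micro_kpis = {
    # kpi: (current level, past average, peer average)
    "conversion_rate": (0.112, 0.105, 0.121),
    "retention_rate": (0.87, 0.88, 0.90),
    "cost_per_unit": (4.20, 4.35, 4.05),
}

# Benchmark each micro component against past and peer performance to
# surface where small, cumulative improvements might be found.
for kpi, (current, past, peers) in micro_kpis.items():
    vs_past = current - past
    vs_peers = current - peers
    print(f"{kpi}: vs past {vs_past:+.3f}, vs peers {vs_peers:+.3f}")
```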

5. Realisation that analytics is more than reporting outcomes

In some organisations data analytics is considered mainly as a monitoring process, tasked with tracking key performance indicators (KPIs) and reporting outcomes often visually with performance dashboards. This is an important function in any organisation but data analytics is much more than just monitoring performance. Data analytics should be diagnostic, investigating fluctuations in performance and providing actionable insights on possible managerial interventions to improve performance.

6. Soft skills are crucial

Effective analysts must have the “hard” skills of being good statisticians, able to apply appropriate analytical techniques correctly. But crucially, effective analysts must also have the “soft” skills of being able to engage with managers and speak their language. Analysts must understand the managerial decisions that they are expected to inform, and they must be able to tap into the detailed knowledge of managers. Analysts must avoid being seen as the “Masters of the Universe”. They must respect the managers, work for them and work with them. Analysts should be humble. They must know what they bring to the table (i.e. the ability to forensically explore data) and what they don’t (i.e. experience and expertise in the specific decision context). Effective analytics is always a team effort.

7. Integration of data silos

Last but not least, once data analytics has progressed in an organisation beyond a few individuals working in isolation and storing the data they need in their own spreadsheets, there needs to be a centralised data warehouse managed by experts in data management. Integrating data silos opens up new possibilities for insights. This is a crucial part of an organisation developing the capabilities of an “analytical competitor” which I will explore in my next Methods post.
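As a small illustration of why integration matters, here is a hedged sketch joining two hypothetical departmental extracts; every table and column name is invented for the example.

```python
import pandas as pd

# A sketch of silo integration, assuming two hypothetical departmental
# extracts keyed on a shared customer_id; all names are illustrative.
sales = pd.DataFrame({"customer_id": [1, 2, 3], "revenue": [120, 80, 200]})
support = pd.DataFrame({"customer_id": [1, 3], "tickets": [5, 1]})

# Joining the silos surfaces a cross-silo insight neither holds alone,
# e.g. revenue at risk from customers with high support-ticket volume.
merged = sales.merge(support, on="customer_id", how="left").fillna({"tickets": 0})
merged["at_risk"] = merged["tickets"] >= 3
print(merged)
```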

Read Other Related Posts

Moneyball: Twenty Years On – Part Three

Executive Summary

  • Moneyball is principally a baseball story of using data analytics to support player recruitment
  • But the message is much more general on how to use data analytics as an evidence-based approach to managing sporting performance as part of a David strategy to compete effectively against teams with much greater economic power
  • The last twenty years have seen the generalisation of Moneyball both in its transferability to other team sports and its applicability beyond player recruitment to all other aspects of the coaching function particularly tactical analysis
  • There are two key requirements for the effective use of data analytics to manage sporting performance: (1) there must be buy-in to the usefulness of data analytics at all levels; and (2) the analyst must be able to understand the coaching problem from the perspective of the coaches, translate that into an analytical problem, and then translate the results of the data analysis into actionable insights for the coaches

Moneyball is principally a baseball story of using data analytics to support player recruitment. But the message is much more general on how to use data analytics as an evidence-based approach to managing sporting performance as part of a David strategy to compete effectively against teams with much greater economic power. My interest has been in generalising Moneyball both in its transferability to other team sports and its applicability beyond player recruitment to all other aspects of the coaching function particularly tactical analysis.

The most obvious transferability of Moneyball is to other striking-and-fielding sports, particularly cricket. And indeed cricket is experiencing an analytics revolution akin to that in baseball, stimulated in part by the explosive growth of the T20 format over the last 20 years, especially the formation of the Indian Premier League (IPL). Intriguingly, Billy Beane himself is now involved with the Rajasthan Royals in the IPL. Cricket analytics is an area in which I am now taking an active interest and on which I intend to post regularly in the coming months after my visit to the Jio Institute in Mumbai.

My primary interest in the transferability and applicability of Moneyball has been with what I call the “invasion-territorial” team sports that in one way or another seek to emulate the battlefield where the aim is to invade enemy territory to score by crossing a defended line or getting the ball into a defended net. The various codes of football – soccer, rugby, gridiron and Aussie Rules – as well as basketball and hockey are all invasion-territorial team sports. (Note: hereafter I will use “football” to refer to “soccer” and add the appropriate additional descriptor when discussing other codes of football.) Unlike the striking-and-fielding sports where the essence of the sport is the one-on-one contest between the batter and pitcher/bowler, the invasion-territorial team sports involve the tactical coordination of players undertaking a multitude of different skills. So whereas the initial sabermetric revolution at its core was the search for better batting and pitching metrics, in the invasion-territorial team sports the starting point is to develop an appropriate analytical model to capture the complex structure of the tactical contest involving multiple players and multiple skills. The focus is on multivariate player and team performance rating systems. And that requires detailed data on on-the-field performance in these sports that only became available from the late 1990s onwards.

When I started to model the transfer values of football players in the mid-90s, the only generally available performance metrics were appearances, scoring and disciplinary records. These worked pretty well in capturing the performance drivers of player valuations and the statistical models achieved goodness of fit of around 80%. I was only able to start developing a player and team performance rating system for football in the early 2000s after Opta published yearbooks covering the English Premier League (EPL) with season totals for over 30 metrics for every player who had appeared in the EPL in the four seasons, 1998/99 – 2001/02. It was this work that I was presenting at the University of Michigan in September 2003 when I first read Moneyball.
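For the curious, a minimal sketch of the general kind of valuation model described might look as follows; the data are synthetic and the coefficients arbitrary, so only the broad modelling approach, not the numbers, reflects the original work.

```python
import numpy as np
import statsmodels.api as sm

# A hedged sketch of an hedonic player-valuation model: regress log transfer
# fee on basic career metrics (appearances, scoring, disciplinary records).
# All data here are synthetic and the coefficients are arbitrary placeholders.
rng = np.random.default_rng(0)
n = 200
appearances = rng.integers(10, 400, n)
goals = rng.integers(0, 100, n)
cards = rng.integers(0, 40, n)
log_fee = 10 + 0.004 * appearances + 0.02 * goals - 0.01 * cards \
          + rng.normal(0, 0.5, n)

X = sm.add_constant(np.column_stack([appearances, goals, cards]))
model = sm.OLS(log_fee, X).fit()
print(model.rsquared)  # goodness of fit (R-squared) of the fitted model
```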

My player valuation work had got me into the boardrooms and I had used the same basic approach to develop a wage benchmarking system for the Scottish Premier League. But getting into the inner sanctum of the football operation in clubs proved much more difficult. My first success was to be invited to an away day for the coaching and support staff at Bolton Wanderers in October 2004, where I gave a presentation on the implications of Moneyball for football. Bolton under their head coach Sam Allardyce had developed their own David strategy – a holistic approach to player management based on extensive use of sport science. I proposed an e-screening system of players as a first stage of the scouting process to allow a more targeted approach to the allocation of Bolton’s scarce scouting resources. Pleasingly, Bolton’s Performance Director thought it was a great concept; disappointingly, he wanted it to be done internally. It was a story repeated several times with both EPL teams and sport data providers – interest in the ideas but no real engagement. I was asked to provide tactical analysis for one club on the reasons behind the decline in their away performances, but I wasn’t invited to present and participate in the discussion of my findings. I was told later by email that my report had generated a useful discussion, but I needed more specific feedback to be able to develop the work. It was a similar story with another EPL club interested in developing their player rating system. Again the intermediaries presented my findings; the feedback was positive on the concept but then set out the limitations which I had listed in my report, all related to the need to use more detailed data than that with which I had been provided. Analytics can only be effective when there is meaningful engagement between the analyst and the decision-maker.

The breakthrough in football came from a totally unexpected source – Billy Beane himself. Billy had developed a passion for football (soccer) and the Oakland A’s ownership group had acquired the Earthquakes franchise in Major League Soccer (MLS). Billy had found out about my work in football via an Australian professor at Stanford, George Foster, a passionate follower of sport, particularly rugby league. Billy invited me to visit Oakland and we struck up a friendship that lasts to this day. As the owner of an MLS franchise, Oakland had access to performance data on every MLS game and, to cut a long story short, Billy wanted to see if the Moneyball concept could be transferred to football. Over the period 2007-10 I produced over 80 reports analysing player and team performance, investigating the critical success factors (CSFs) for football, and developing a Value-for-Money metric to identify undervalued players. We established proof of concept but at that point the MLS was too small financially to offer sufficient returns to sustain the investment needed to develop analytics in a team. I turned again to the EPL but with the same lack of interest as I had encountered earlier. The interest in my work now came from outside football entirely – rugby league and rugby union.

The first coach to take my work seriously enough to actually engage with me directly was Brian Smith, an Australian rugby league coach. I spent the summer of 2005 in Sydney as a visiting academic at UTS. I ran a one-day workshop for head coaches and CEOs from a number of leading teams, mainly in rugby league and Aussie Rules football. One of the topics covered was Moneyball. Brian Smith was head coach of the Parramatta Eels and had developed his own system for tracking player performance. Not surprisingly, he was also a Moneyball fan. Brian gave me access to his data and we had a very full debrief on the results when Brian and his coaching staff visited Leeds later that year. It was again rugby league that showed real interest in my work after I finished my collaboration with Billy Beane. I met with Phil Clarke and his brother, Andrew, who ran a sport data management company, The Sports Office. Phil was a retired international rugby league player who had played most of his career with his hometown team, Wigan. As well as The Sports Office, Phil’s other major involvement was with Sky Sports as one of the main presenters of their rugby league coverage. I worked with Phil in analysing a dataset he had compiled on every try scored in Super League in the 2009 season and we presented these results to an industry audience. Subsequently, I worked with Phil in developing the statistical analysis to support the Sky Sports coverage of rugby league, including an in-game performance gauge that included a traffic-lights system for three KPIs – metres gained, line breaks and tackle success – as well as predicting what the points margin should be based on the KPIs.
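A rough sketch of how such a traffic-lights gauge and KPI-based margin prediction might be wired together is given below; the thresholds and weights are hypothetical placeholders, not the values used in the Sky Sports system.

```python
# A sketch of a traffic-lights gauge for the three rugby league KPIs named
# above, plus a simple linear points-margin prediction from KPI differentials.
# Thresholds and weights are hypothetical, chosen only for illustration.
THRESHOLDS = {
    # kpi: (amber floor, green floor)
    "metres_gained": (1200, 1400),
    "line_breaks": (3, 5),
    "tackle_success": (0.85, 0.90),
}
MARGIN_WEIGHTS = {"metres_gained": 0.01, "line_breaks": 2.0, "tackle_success": 60.0}

def traffic_light(kpi: str, value: float) -> str:
    amber, green = THRESHOLDS[kpi]
    return "green" if value >= green else "amber" if value >= amber else "red"

def predicted_margin(own: dict, opp: dict) -> float:
    # Linear prediction of the points margin from KPI differentials.
    return sum(MARGIN_WEIGHTS[k] * (own[k] - opp[k]) for k in MARGIN_WEIGHTS)

own = {"metres_gained": 1450, "line_breaks": 4, "tackle_success": 0.91}
opp = {"metres_gained": 1300, "line_breaks": 2, "tackle_success": 0.88}
for k, v in own.items():
    print(k, traffic_light(k, v))
print(f"predicted margin: {predicted_margin(own, opp):+.1f}")
```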

But Phil’s most important contribution to my development of analytics with teams was the introduction in March 2010 to Brendan Venter at Saracens in rugby union. Brendan was a retired South African international who had appeared as a replacement in the famous Mandela World Cup Final in 1995. He had taken over as the Director of Rugby at Saracens at the start of the 2009/10 season and instituted a far-reaching cultural change at the club, central to which was a more holistic approach to player welfare and a thorough-going evidence-based approach to coaching. Each of the coaches had developed a systematic performance review process for their own areas of responsibility and the metrics generated had become a key component of the match review process with the players. My initial role was to develop the review process so that team and player performance could be benchmarked against previous performances. A full set of KPIs was identified with a traffic-lights system to indicate excellent, satisfactory and poor performance levels. This augmented match review process was introduced at the start of the 2010/11 season and coincided with Saracens winning the league title for the first time in their history. The following season I was asked by the coaches to extend the analytics approach to opposition analysis, and the sophistication of the systems continued to evolve over the five seasons that I spent at Saracens.

I finished at Saracens at the end of the 2014/15 season although I have continued to collaborate with Brendan Venter on various projects in rugby union over the years. But just as my time with Saracens was ending, a new opportunity opened up to move back to football, again courtesy of Billy Beane. Billy had been contacted by Robert Eenhoorn, a former MLB player from the Netherlands, who is now the CEO of AZ Alkmaar in the Dutch Eredivisie. Billy had become an advisor to AZ Alkmaar and had suggested that Robert get me involved in the development of AZ’s use of data analytics. AZ Alkmaar are a relatively small-town team that seek to compete with the Big Three in Dutch football (Ajax Amsterdam, PSV Eindhoven and Feyenoord) in a sustainable, financially prudent way. Like Billy, Robert understands sport as a contest and sport as a business. AZ has a history of being innovative, particularly in youth development, with a high proportion of their first-team squad coming from their academy. I developed systems similar to those I had built at Saracens to support the first team with performance reviews and opposition analysis. It was a very successful collaboration which ended in the summer of 2019 with data analytics well integrated into AZ’s way of doing things.

Twenty years on, the impact of Moneyball has been truly revolutionary. Data analytics is now an accepted part of the coaching function in most elite team sports. But teams vary in the effectiveness with which they employ data analytics, particularly in how well it is integrated into the scouting and coaching functions. There are still misperceptions about Moneyball, especially in regard to the extent to which data analytics is seen as a substitute for traditional scouting methods rather than being complementary. Ultimately an evidence-based approach is about using all available evidence effectively, not just quantitative data but also the qualitative expert evaluations of coaches and scouts. Data analytics is a process of interrogating all of the data.

So what are the lessons from my own experience of the transferability and applicability of Moneyball? I think that there are two key lessons. First, it is crucial that there is buy-in to the usefulness of data analytics at all levels. It is not just leadership buy-in. Yes, the head coach and performance director must promote an evidence-based culture, but the coaches must also buy in to the analytics approach for any meaningful impact on the way things actually get done. And, of course, players must buy in to the credibility of the analysis if it is to influence their behaviour. Second, the analyst must be able to understand the coaching problem from the perspective of the coaches, translate that into an analytical problem, and then translate the results of the data analysis into actionable insights for the coaches. There will be little buy-in from the coaches if the analyst does not speak their language and does not respect their expertise and experience.

Read Other Related Posts

The Dismal Science: A Personal Reflection – Part Three

Part 3: Great Expectations – Discovering the Meaning of Keynes

Executive Summary

  • I read Keynes’s General Theory initially for its understanding of the macroeconomic consequences of uncertainty, subsequently for its insights into economic methodology and how to conceptualise the capitalist economy, and more recently as an exemplar of a pragmatist impact theory.
  • The General Theory is an “impact theory” that seeks to change not only how we understand the world but, crucially, how we intervene to improve the world and the everyday business of life.
  • What really counts in the interpretation of an impact text is not the authenticity of the interpretation but the effectiveness of its actionable insights.
  • I claim no privileged position in understanding Keynes but only that Keynes has a privileged position in the formation of my own understanding of how real-world economies behave.

I came to Keynes by a rather circuitous route. Despite being immersed in the Keynesian-Monetarist debates, I never read Keynes’s General Theory as an undergraduate. The attitude to Keynes seemed similar to that towards the pioneering scientists in the natural sciences. You didn’t need to read Newton’s own words to understand the laws of motion, which are set out so clearly in modern textbooks; so too with Keynes, I was taught. My dissatisfaction with mainstream economics and the search for alternatives initially led away from Keynes in two main directions – Karl Marx and Herbert Simon. A final-year option on Marxian economics gave me a proper grounding in classical economics from Smith to Marx as well as an introduction to radical literatures that paralleled what I was studying in more mainstream courses on macroeconomics and labour economics. Marxian theories of crisis provided a radical alternative to macroeconomics while Marxian analysis of the labour process gave a radically different conceptualisation to that provided by implicit contract theory, segmented and dual labour markets theories, and New Keynesian imperfectionist theories. The course on Marxian economics also introduced an alternative perspective on the methodology of economics that broadened my horizons beyond Popper and Kuhn.

I was actually introduced to the work of Herbert Simon in my first-year undergraduate course on International Relations. I had to write a paper on the Cuban Missile Crisis and given my growing interest in economics, my tutor suggested that I read Graham Allison’s Essence of Decision. It was this book that persuaded me to specialise in economics since it introduced alternatives to the rational-agent model, particularly Simon and the behavioural theories of the firm of Cyert and March.

By the time I arrived at Cambridge for my graduate studies in the early 1980s, my interest in a more behavioural approach to economics had led me to the problem of understanding decision making under conditions of uncertainty, and that in turn led to the work of George Shackle. It was Shackle’s The Years of High Theory that ultimately opened my eyes to the importance of reading, really reading, The General Theory. So from the outset I approached The General Theory influenced by Shackle and Joan Robinson with a focus on Chapter 12 and what Keynes had to say about long-term expectations. I was very fortunate at Cambridge to be taught macroeconomics by Geoff Harcourt and Bob Rowthorn, a great combination for someone strongly influenced by both radical Keynesian and Marxian critiques of mainstream economics. Both were wonderful role models – radical economists with a thorough knowledge of the history of the subject who combined excellent theoretical skills with a deep understanding of real-world economies, and a strong commitment to improving the lives of others.

The early 1980s was an exciting time to be in Cambridge. Rod O’Donnell was just finishing his PhD on the philosophical foundations of Keynes’s General Theory and attending one of his seminars made me realise the importance of Keynes’s Treatise on Probability. This was reinforced by Gay Meeks who taught the graduate course on Philosophical Issues in Economics. She, more than anyone, was the real starting point for what I call the “New Fundamentalist Keynesian” project of re-reading Keynes’s General Theory from the perspective of A Treatise on Probability. Gay’s paper, ‘Keynes on the rationality of decision procedures under uncertainty: the investment decision’, was first completed in 1976 and circulated around Cambridge for many years but only published in the early 1990s in her edited volume, Thoughtful Economic Man. (It was reconnecting with Gay Meeks and Geoff Harcourt in 2018 that, more than anything, convinced me to return to economics and persuaded me that I still had something significant to contribute to the subject.)

The Cambridge-Australian influence on my thinking was strong – Geoff Harcourt and Rod O’Donnell as I have mentioned, but also Peter Kriesler whose work showed me the importance of Kalecki’s contribution. Another Australian I met at Cambridge was Murray Milgate but, unfortunately, I have to admit that his influence on my thinking was more negative. I disagreed profoundly with the Neo-Ricardian project led by John Eatwell and Murray Milgate to create an alternative economics based on Sraffa’s theory of value and Keynes’s principle of effective demand. I attended Eatwell’s undergraduate lectures as well as taking his graduate class. Milgate’s doctoral research on the Neo-Ricardian interpretation of Keynes was published as the book, Capital and Employment, while I was at Cambridge. A detailed critique of the Eatwell-Milgate position became the central focus of my graduate dissertation. I argued that the Neo-Ricardian model, like so much of classical and neoclassical economics, is a static equilibrium model devoid of historical time. In particular, Eatwell and Milgate had relegated both short-term and long-term expectations to a mere friction in Keynes’s model, whereas I saw the fragility of the state of long-term expectations as central to Keynes’s explanation of involuntary unemployment. In retrospect I realise that I fell into the trap that bedevils economic and political radicalism – devoting too much time and effort to arguing with other radicals about who possesses the “truth” rather than emphasising the commonality of purpose and focusing on countering the arguments of those advocating conservatism.

My 1991 Economic Journal paper, in which I drew on hermeneutics to understand the multiple interpretations of Keynes’s General Theory, reflects a recognition that the enduring power of The General Theory is its ability to generate multiple interpretations leading to a diversity of research efforts. For a practical economist the concern is the significance of the interpretation as an understanding of real-world economies and as a guide to action. Practical significance is the ultimate criterion of justification for any proposed interpretation of Keynes, not whether or not it represents what Keynes really meant. Authenticity is unattainable in interpretation; at best all we can achieve is an interpretation that is consistent with the text and other related evidence. The General Theory is an “impact theory” that seeks to change not only how we understand the world but, crucially, how we intervene to improve the world and the everyday business of life. What really counts in an impact text is the effectiveness of its actionable insights.

Keynes’s General Theory remains powerful and relevant today even though real-world economies have changed massively over the last 90 years or so. There are enduring insights into the behaviour of the economic system, and the limitations of how mainstream economics conceptualises and theorises that behaviour. I read The General Theory initially for its understanding of the macroeconomic consequences of uncertainty, subsequently for its insights into economic methodology and how to conceptualise the capitalist economy, and more recently as an exemplar of a pragmatist impact theory. I claim no privileged position in understanding Keynes but only that Keynes has a privileged position in the formation of my own understanding of how real-world economies behave.

Read Other Related Posts

The Six Stages of the Analytics Process

Executive Summary

  • The analytics process can be broken down further into six distinct stages:  (1) Discovery; (2) Exploration; (3) Modelling; (4) Projection; (5) Actionable Insight; and (6) Monitoring
  • Always start the analytics process with the question: “What is the decision that will be impacted by the analysis?”
  • There are three principal pitfalls in deriving actionable insights from analytical models – generalisability, excluded-variable bias, and misinterpreting causation

The analytics process can be broken down further into six distinct stages:

  1. Discovery
  2. Exploration
  3. Modelling
  4. Projection
  5. Actionable Insight
  6. Monitoring
Figure 1: The Six Stages of the Analytics Process

Stage 1: Discovery

The discovery stage starts with a dialogue between the analyst and decision maker to ensure that the analyst understands the purpose of the project. Particular attention is paid to the specific decisions for which the project is intended to provide an evidential basis to support management decision making.

The starting point for all analytics projects is discovery. The Discovery stage involves a dialogue with the project sponsor to understand both Purpose (i.e. what is expected from the project?) and Context (i.e. what is already known?). The outcome of discovery is Framing the practical management problem facing the decision-maker as an analytical problem amenable to data analysis. It is crucial to ensure that the analytical problem is feasible given the available data.

Stage 2: Exploration

The exploration stage involves data preparation particularly checking the quality of the data and transforming the data if necessary. A key part of this exploration stage is the preliminary assessment of the basic properties of the data to decide on the appropriate analytical methods to be used in the modelling stage.

Having determined the purpose of the analytics project and sourced the relevant data in the initial Discovery stage, there is a need to gain a basic understanding of the properties of the data. This exploratory data analysis serves a number of ends:

  • It will help identify any problems in the quality of the data such as missing and suspect values.
  • It will provide an insight into the amount of information contained in the dataset (this will ultimately depend on the similarity and variability of the data).
  • If done effectively, exploratory data analysis will give clear guidance on how to proceed in the third Modelling stage.
  • It may provide advance warning of any potential statistical difficulties.

A dataset contains multiple observations of performance outcomes and associated situational variables that attempt to capture information about the context of the performance. For the analysis of the dataset to produce actionable insights, there is both a similarity requirement and a variability requirement. The similarity requirement is that the dataset is structurally stable in the sense that it contains data on performance outcomes produced by a similar behaviour process across different entities (i.e. cross-sectional data) or across time (i.e. longitudinal data). The similarity requirement also requires consistent measurement and categorisation of the outcome and situational variables. The variability requirement is that the dataset contains sufficient variability to allow analysis of changes in performance but without excessive variability that would raise doubts about the validity of treating the dataset as structurally stable.
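A minimal sketch of these exploration-stage checks on a synthetic dataset (all names and values are illustrative):

```python
import numpy as np
import pandas as pd

# A sketch of exploration-stage checks: missing values, the location and
# spread of the outcome, and suspect observations. Data are synthetic.
rng = np.random.default_rng(1)
df = pd.DataFrame({"outcome": rng.normal(1.2, 0.1, 100)})
df.loc[5, "outcome"] = None   # a missing value
df.loc[10, "outcome"] = 9.9   # a suspect value

print(df.isna().sum())           # missing values by column
print(df["outcome"].describe())  # location and spread (variability check)

# Flag suspect values, here anything more than 3 standard deviations out
z = (df["outcome"] - df["outcome"].mean()) / df["outcome"].std()
print(df[z.abs() > 3])
```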

Stage 3: Modelling

The modelling stage involves the construction of a simplified, purpose-led, data-based representation of the specific aspect of real-world behaviour on which the analytics project will focus.

The Modelling stage involves the use of statistical analysis to construct an analytical model of the specific aspect of real-world behaviour with which the analytics project is concerned. The analytical model is a simplified, purpose-led, data-based representation of the real-world problem situation.

  • Purpose-led: model design and choice of modelling techniques are driven by the analytical purpose (i.e. the management decision to be impacted by the analysis)
  • Simplified representation: models necessarily involve abstraction with only relevant, systematic features of the real-world decision situation included in the model
  • Data-based: modelling is the search for congruent models that best fit the available data and capture all of the systematic aspects of performance

The very nature of an analytical model creates a number of potential pitfalls which can lead to: (i) misinterpretation of the results of the data analysis; and (ii) misleading inferences as regards action recommendations. There are three principal pitfalls:

  • Generalisability: analytical models are based on a limited sample of data but actionable insights require that the results of the data analysis are generalisable to other similar contexts
  • Excluded-variable bias: analytical models are simplifications of reality that focus on only a limited number of variables, but the reliability of the actionable insights demands that all relevant, systematic drivers of the performance outcomes are included; otherwise the results may be statistically biased and misleading
  • Misinterpreting causation: analytical models are purpose-led so there is a necessity that the model captures causal relationships that allow for interventions to resolve practical problems and improve performance but statistical analysis can only identify associations; causation is ultimately a matter of interpretation

It is important to undertake diagnostic testing to try to avoid these pitfalls.
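By way of illustration, here is a hedged sketch of basic diagnostic checks aimed at the three pitfalls, using synthetic data; it is indicative only, not a complete diagnostic workflow.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

# Synthetic data: x2 is correlated with x1, so omitting x2 biases x1's
# estimated coefficient (the excluded-variable pitfall).
rng = np.random.default_rng(2)
n = 300
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + 0.8 * rng.normal(size=n)
y = 2.0 + 1.5 * x1 + 0.8 * x2 + rng.normal(size=n)

X_restricted = sm.add_constant(x1)
X_full = sm.add_constant(np.column_stack([x1, x2]))

# Excluded-variable check: compare the x1 coefficient with and without x2
restricted = sm.OLS(y, X_restricted).fit()
full = sm.OLS(y, X_full).fit()
print(restricted.params[1], full.params[1])  # biased (~2.0) vs unbiased (~1.5)

# Misleading-inference check: test the residuals for heteroscedasticity
lm_stat, lm_pval, f_stat, f_pval = het_breuschpagan(full.resid, X_full)
print(f"Breusch-Pagan p-value: {lm_pval:.3f}")

# Generalisability check: hold out data and assess out-of-sample fit
fit = sm.OLS(y[:240], X_full[:240]).fit()
pred = fit.predict(X_full[240:])
print(f"out-of-sample RMSE: {np.sqrt(np.mean((y[240:] - pred) ** 2)):.3f}")
```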

Stage 4: Projection

The projection stage involves using the estimated models developed in the modelling stage to answer what-if questions regarding the possible consequences of alternative interventions under different scenarios. It also involves forecasting future outcomes based on current trends.

Having constructed a simplified, purpose-led model of the business problem in the Modelling stage, the Projection stage involves using this model to answer what-if questions regarding the possible consequences of alternative interventions under different scenarios. The use of forecasting techniques to project future outcomes based on current trends is a key aspect of the Projection stage.

There are two broad types of forecasting methods:

  • Quantitative (or statistical) methods of forecasting e.g. univariate time-series models; causal models; Monte Carlo simulations (illustrated in the sketch after this list)
  • Qualitative methods e.g. Delphi method of asking a panel of experts; market research; opinion polls
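As an illustration of the quantitative route, here is a minimal Monte Carlo what-if sketch; the distributions and effect sizes are purely assumed for the example.

```python
import numpy as np

# A minimal Monte Carlo what-if sketch: project next-period performance
# under a status quo and a hypothetical intervention. All distributions
# and effect sizes are illustrative assumptions only.
rng = np.random.default_rng(3)
n_sims = 10_000
baseline_growth = rng.normal(0.02, 0.01, n_sims)       # status quo scenario
intervention_uplift = rng.normal(0.01, 0.005, n_sims)  # uncertain uplift

status_quo = 100 * (1 + baseline_growth)
with_intervention = 100 * (1 + baseline_growth + intervention_uplift)

# Compare the scenario distributions, not just point forecasts
for name, sims in [("status quo", status_quo), ("intervention", with_intervention)]:
    lo, hi = np.percentile(sims, [5, 95])
    print(f"{name}: mean {sims.mean():.1f}, 90% interval [{lo:.1f}, {hi:.1f}]")
```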

Stage 5: Actionable Insight

During this stage the analyst presents an evaluation of the alternative possible interventions and makes recommendations to the decision maker.

Presentations and business reports should be designed to be appropriate for the specific audience for which they are intended. A business report is typically structured into six main parts: Executive Summary; Introduction; Main Report; Conclusions; Recommendations; Appendices. Data visualisation can be a very effective communication tool in presentations and business reports and is likely to be much more engaging than a set of bullet points, but care should be taken to avoid distorting or obfuscating the patterns in the data. Effective presentations must have a clear purpose and be well planned and well rehearsed.

Stage 6: Monitoring

The Monitoring stage involves tracking the project Key Performance Indicators (KPIs) during and after implementation.

The implementation plans for projects should, if possible, have decision points built into them. These decision points provide the option to alter the planned intervention if there is any indication that there have been structural changes in the situation subsequent to the original decision. Hence it is important to track the project KPIs during and after implementation to ensure that targeted improvements in performance are achieved and continue to be achieved. Remember, data analytics does not end with recommendations for action. Actionable insights should always include recommendations on how the impact of the intervention on performance will be monitored going forward. Dashboards can be a very effective visualisation for monitoring performance.
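A bare-bones sketch of KPI tracking with a built-in decision point might look like this; the target, tolerance and weekly values are hypothetical.

```python
import numpy as np

# A sketch of post-implementation KPI monitoring with a simple alert rule
# acting as a decision point; target, tolerance and data are hypothetical.
target, tolerance = 0.75, 0.05
weekly_kpi = np.array([0.74, 0.77, 0.76, 0.71, 0.69, 0.68])

for week, value in enumerate(weekly_kpi, start=1):
    if value < target - tolerance:
        print(f"week {week}: {value:.2f} ALERT - revisit the intervention")
    else:
        print(f"week {week}: {value:.2f} on track")
```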

Read Other Related Posts

The Dismal Science: A Personal Reflection – Part Two

Part 2: Mainstream Macroeconomics – A Nothing-New Consensus?

Executive Summary

  • The postwar Neoclassical Synthesis in mainstream macroeconomics has been replaced by the New Consensus Macroeconomics which combines elements of both New Keynesian Economics and New Classical Economics
  • New Keynesian Economics is effectively just the macroeconomics of market failure with policy activism justified by price and wage stickiness due to structural and informational imperfections
  • New Classical Economics integrated the rational expectations hypothesis into a market-clearing equilibrium model of the macro economy to provide a powerful argument against policy activism
  • The New Consensus Macroeconomics, just like the earlier Neoclassical Synthesis, reduces the Keynesian-Neoclassical argument over the effectiveness of policy activism to a purely empirical question of the speeds of adjustment and the degree of price and wage stickiness

When I started to study economics in the mid-1970s, it was a period of rapid theoretical development in mainstream macroeconomics. The established post-war consensus in mainstream macroeconomics built around the Neoclassical Synthesis was breaking down because of the obvious disconnect between the rational-agent/constrained-optimisation models of microeconomics and the simple IS-LM and AD-AS macro models based on seemingly ad hoc behavioural assumptions. The search was on for the appropriate microfoundations of macroeconomics. Those of a more Keynesian disposition focused initially on disequilibrium models, providing a sophisticated treatment of quantity adjustments when prices are slow to adjust. But this still left open the question as to why prices, particularly wages, were sticky. Ultimately this led to the emergence of the New Keynesian Economics (NKE) in the 1980s/1990s with a veritable proliferation of choice-theoretic models of price and wage stickiness. The invisible-hand theorem of self-equilibrating markets requires a whole set of structural and informational conditions to be met. Relaxing any of these assumptions and allowing for imperfections could explain why prices and wages might be slow to adjust, or indeed never adjust fully, to the perfectly competitive (full-employment) equilibrium. The NKE is effectively just the macroeconomics of market failure.

The alternative search for microfoundations focused on the dynamic behaviour of rational agents under conditions of stochastic uncertainty. The New Classical Economics (NCE) rejected the ad hoc assumption of adaptive expectations that had been used in the expectations-augmented Phillips curve (EAPC) to justify the possibility of effective short-run stabilisation policy. The NCE adopted instead the rational expectations hypothesis (REH), assuming that rational economic agents are fully informed of the systematic behaviour of the economy, with expectational errors due to unpredictable shocks to the economic system. The NCE integrated the REH into a market-clearing equilibrium model of the macro economy to provide a powerful theoretical argument against policy activism. The NCE argued for a Panglossian world characterised by the REH, information-efficient markets, and policy irrelevance. Two influential NCE propositions were Ricardian equivalence and the Lucas critique, both of which reinforced policy irrelevance. Ricardian equivalence implies that forward-looking rational agents treat debt-financing and tax-financing as equivalent, so that fiscal policy has no first-order impact on aggregate demand. The Lucas critique recognises the endogeneity of behavioural responses to policy changes with forward-looking rational agents, making the construction of optimal policy responses to economic shocks something of a will-o’-the-wisp.

The New Consensus Macroeconomics combines elements of both NKE and NCE. The simple macro model is in many ways just a modern version of the IS-LM and AD-AS models of the Neoclassical Synthesis. The IS and AD curves are retained, with the supply side represented by the EAPC (incorporating the REH) and the LM curve replaced by a Taylor-type monetary rule relating the nominal interest rate to deviations in inflation and output from their target levels. The case for activist stabilisation policy remains an empirical question of the speeds of adjustment and the degree of price and wage stickiness. In a very real sense the New Consensus Macroeconomics is a Nothing-Fundamentally-New Consensus Macroeconomics.
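For concreteness, one common textbook statement of this three-equation model runs roughly as follows; notation varies across authors and the exact specification here is illustrative rather than definitive:

```latex
\begin{align*}
y_t &= \mathbb{E}_t\,y_{t+1} - \sigma\,(i_t - \mathbb{E}_t\,\pi_{t+1}) + \varepsilon^{d}_t && \text{(IS/AD curve)}\\
\pi_t &= \beta\,\mathbb{E}_t\,\pi_{t+1} + \kappa\,(y_t - y^{*}) + \varepsilon^{s}_t && \text{(EAPC with REH)}\\
i_t &= r^{*} + \pi^{*} + \phi_{\pi}\,(\pi_t - \pi^{*}) + \phi_{y}\,(y_t - y^{*}) && \text{(Taylor-type rule)}
\end{align*}
```

Here the degree of price and wage stickiness enters through the slope parameter on the output gap in the EAPC, which is why the policy-activism question collapses into an empirical one.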

Related Post

Moneyball: Twenty Years On – Part Two

Executive Summary

  • Financial determinism in pro team sports is the basic proposition that the financial power to acquire top playing talent determines sporting performance (sport’s “law of gravity”)
  • The Oakland A’s under Billy Beane have consistently defied the law of gravity for over a quarter of a century by using a “David strategy” of continuous innovation based on data analytics and creativity

Financial determinism in pro team sports is the basic proposition that sporting performance is largely determined by the financial power of a team to acquire top playing talent. This gives rise to sport’s equivalent of the law of gravity – teams will tend to perform on the field in line with their expenditure on playing talent relative to other teams in the league. The biggest spenders will tend to finish towards the top of the league; the lowest spenders will tend to finish towards the bottom of the league. A team may occasionally defy the law of gravity – Leicester City winning the English Premier League in 2016 is the most famous recent example – but such extreme cases of beating the odds are rare.

Governing bodies tend to be very concerned about financial determinism since it can undermine the uncertainty of outcome – sport, after all, is unscripted drama where no one knows the outcome in advance. It is a fundamental tenet of sports economics that uncertainty of outcome is a necessary requirement for spectator interest and the financial stability of pro sports leagues. Hence why governing bodies have actively intervened over the years to try to maintain competitive balance with revenue-sharing arrangements (e.g. shared gate receipts and collective selling of media rights) and player labour market regulations (e.g. salary caps and player drafts). And financial determinism creates the danger that teams without rich owners will incur unsustainable levels of debt in pursuit of the dream of sporting success and eventually collapse into bankruptcy (as Leeds United fans know only too well given their experience in the early 2000s).

Major League Baseball (MLB), like the other North American Major Leagues, has actively intervened in the player labour market via salary caps, luxury taxes on excessive spending and a player draft system to try to reduce the disparity between teams in the distribution of playing talent. But financial determinism is still strong in the MLB, as can be seen in Figure 1 which shows the average win rank and average wage rank of the 30 MLB teams over the 26-year period, 1998 – 2023 (1998 was Billy Beane’s first season as GM at the Oakland A’s). There is a very strong correlation between player wage expenditure and regular-season win percentage (r = 0.691). The three biggest spenders – New York Yankees, Boston Red Sox and LA Dodgers – have been amongst the five most successful teams over the period, with the New York Yankees topping both charts (with an average win rank of 5.8 and an average wage rank of 1.8).

Figure 1: Financial Determinism in the MLB, 1998 – 2023    

The standout team in defying the law of gravity is the Oakland A’s. Over the 26-year period, their average wage rank has been 25.5 but their average win rank has been 13.0, which gives a rank gap of 12.5. Put another way, the A’s have had the 3rd lowest average wage rank over the last 26 years but are in the top ten in terms of their average win rank. Looking at Figure 1, the obvious benchmarks for the A’s in spending terms are the Tampa Bay Rays, Miami Marlins and Pittsburgh Pirates, but all of these teams have had much poorer sporting performance than the A’s. Indeed, in terms of sporting performance as measured by average win rank, the A’s peers are the LA Angels, their Bay Area rivals the San Francisco Giants, the Houston Astros and the Cleveland Guardians (formerly Cleveland Indians), but all of these teams have had much higher levels of expenditure on player salaries.
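To make the rank arithmetic concrete, here is a toy sketch of the calculation; the seasons and rank values are invented, not the actual MLB figures behind Figure 1.

```python
import pandas as pd

# A toy sketch of the rank-gap calculation described above, using invented
# per-season data for three teams; the real figures come from 26 MLB seasons.
data = pd.DataFrame({
    "team":      ["A's", "A's", "Yankees", "Yankees", "Rays", "Rays"],
    "season":    [1998, 1999, 1998, 1999, 1998, 1999],
    "wage_rank": [27, 26, 1, 2, 29, 28],
    "win_rank":  [12, 14, 5, 6, 20, 22],
})

avg = data.groupby("team")[["wage_rank", "win_rank"]].mean()
avg["rank_gap"] = avg["wage_rank"] - avg["win_rank"]  # spending rank minus results rank
print(avg.sort_values("rank_gap", ascending=False))   # law-of-gravity defiers first
```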

Figure 2 details the year-to-year record of the A’s over the whole period of Billy Beane’s tenure as GM and then Executive Vice President for Baseball Operations. As can be seen, the A’s have consistently been amongst the lowest spenders in the MLB and, indeed, there are only two years (2004 and 2007) when they were not in the bottom third. The regular-season win percentage has been rather cyclical with peaks in 2001/2002, 2006, 2012/2013 and 2018/2019. The 2001 and 2002 seasons are the “Moneyball Years” covered by Michael Lewis in the book, when the A’s had the 2nd best win percentage in both seasons. As discussed in Part One of this post, the efficient market hypothesis (EMH) in economics suggests that any competitive advantage based on inefficient use of information by other traders will quickly evaporate when the informational inefficiencies become widely recognised. Hence, the EMH implies that the A’s initial success would be short-lived and other teams would soon “catch up” and start to use similar player metrics as the A’s. Which is exactly what happened. In fact, Moneyball led all other MLB teams to start using data analytics more extensively, some more than others. This is what makes the A’s experience so unique – other teams imitated the A’s in their use of data analytics and developed their own specific data-based strategies, but still the A’s kept punching well above their financial weight and making it to the post-season playoffs on several occasions. This suggests that the A’s have been highly innovative in developing analytics-based David strategies which have informed both their international recruitment and player development in their farm system. Just as in the land of the Red Queen in Through the Looking-Glass, so too in elite sport: when competing with analytics, you’ve got to keep running to stay still.

Success = Analytics + Creativity.

Figure 2: Oakland A’s Under Billy Beane, 1998 – 2023

Read Other Related Posts

What is Data Analytics?

Executive Summary

  • Data analytics is data analysis for practical purpose
  • The three D’s of analytics are Decision, Domain and Data
  • Data analytics is a key component of an evidence-based approach to decision making
  • Data analytics consists of four modes – exploration (descriptive), modelling (diagnostic), projection (predictive) and decision (prescriptive)

Data analytics is a much-used descriptor these days with its own myths and legends, usually of its successes. The best known of these analytics myths and legends include exploding manholes, vintage wine, pregnant teenagers, Moneyball, Google Flu and Hollywood blockbusters. But what is data analytics?

A useful starting point is Wikipedia’s definition that data analytics is “the discovery and communication of meaningful patterns in data”, which highlights the importance of communication. Data analytics is not just data analysis but also the effective presentation of the results. Data analytics always revolves around its intended audience. My own preferred definition is that data analytics is data analysis for practical purpose. This definition puts the stress on practical purpose. Being an empirically-minded academic in a business school, I am surrounded by data analysis but, much to the consternation of some of my colleagues, I have often said that academics don’t tend to do data analytics. Data analysis in business schools, as in other university faculties, especially in the social sciences, is primarily geared towards developing the academic discipline by publishing peer-reviewed journal articles. Data analytics is data analysis for practical, not disciplinary, purpose. Academic research does not necessarily produce actionable insights, whereas the whole point of data analytics is to provide an evidential basis for decisions on what to do. Effective data analytics is always what I now call “impact theory” – using data to understand the world in order to intervene to change the world for the better. Analytics as impact theory is the guiding vision of Winning With Analytics.

Data analytics can be summed up by the three D’s – Decision, Domain and Data. Data analytics is driven by the purpose of informing decisions by providing an evidential basis for decision makers to decide on the best available intervention to improve performance. Data analytics can only be effective if the data analysis is contextualised so that the practical recommendations are appropriate for the specific domain in which the decision makers are operating. And data analytics by definition involves the analysis of data, but that analysis must be driven by the decision and domain, hence why data is listed last of the three D’s.

The essence of data analytics is improving performance by knowing your numbers. Whatever the type of organisation – business, sport, public service or social – and irrespective of the level within the organisation, management is all about facilitating an improvement in performance. Management is about getting the best out of people (i.e. efficacy) and the most out of the available resources (i.e. efficiency). Ultimately, data analytics is about producing actionable insight to improve efficacy and efficiency.

Data analytics is often seen as just reporting key performance indicators (KPIs). Reporting KPIs is one of the tasks of business intelligence, but data analytics is much more than reporting KPIs. Indeed, one important task of data analytics is to identify the most useful set of KPIs. (The choice of KPIs will be the subject of a future post.) The various roles of data analytics can be summarised by the four modes of analytics, illustrated by the short sketch that follows this list:

  1. Exploration (what has been the level of performance?) – the descriptive role of summarising performance using descriptive statistics and data visualisation
  2. Modelling (why has performance changed?) – the diagnostic role of forensically investigating the causes of variation in performance levels
  3. Projection (how could performance be improved?) – the predictive role of projecting future performance based on recent performance trends and possible interventions
  4. Decision (what should be done?) – the prescriptive role of recommendations on the most appropriate intervention to improve current performance levels
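To make the four modes concrete, here is a minimal sketch in Python (3.10 or later), run end-to-end on an invented weekly KPI series; the data, the target and the intervention rule are all hypothetical, chosen purely for illustration.

```python
# A minimal sketch of the four modes of analytics on a toy weekly KPI.
# All numbers are invented; requires Python 3.10+ for
# statistics.linear_regression.
from statistics import mean, linear_regression

weeks = list(range(1, 9))
sales = [102, 98, 95, 97, 91, 90, 88, 85]  # hypothetical weekly KPI

# 1. Exploration (descriptive): what has been the level of performance?
print(f"Mean weekly sales: {mean(sales):.1f}")

# 2. Modelling (diagnostic): why has performance changed?
slope, intercept = linear_regression(weeks, sales)
print(f"Estimated trend: {slope:.2f} units per week")

# 3. Projection (predictive): how could performance develop?
horizon = weeks[-1] + 4
forecast = intercept + slope * horizon
print(f"Projected sales in week {horizon}: {forecast:.1f}")

# 4. Decision (prescriptive): what should be done?
target = 90  # hypothetical performance target
action = "intervene now" if forecast < target else "hold course"
print(f"Recommendation: {action} (target {target})")
```

In practice the diagnostic mode would investigate the drivers behind the trend rather than simply fitting it, but the progression from description to prescription is the point of the sketch.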

The Dismal Science: A Personal Reflection – Part One

Part 1: The Rip van Winkle Effect

Executive Summary

  • The Rip van Winkle effect is the move from an insider perspective to an outsider perspective after a significant period of time with little knowledge of what has happened in the interim
  • Carlyle viewed economics as the dismal science in a negative sense: a science whose dismal laissez-faire conclusion is to leave well alone, since activist policies cannot improve the economic condition
  • I take a much more positive view that economics is the dismal science in the sense of being the science of how to intervene to help human beings in dismal times.

As an undergraduate student in the mid-1970s, I remember reading Robert Gordon's 1976 Journal of Monetary Economics paper on recent developments in macroeconomics using the Rip van Winkle device. Gordon asked what Rip van Winkle would have found different about the theory of inflation and unemployment in 1976 had he been asleep for the previous ten years. I feel in a similar Rip van Winkle position today. After 20 years in economics, I left the academic subject in the mid-1990s. There were many reasons for my exit: partly disillusionment and loss of passion, partly because I thought that I really had nothing new left to say, and partly because I wanted to make a practical difference. After leaving economics, I moved into accounting and finance, then data analytics, still working as a business-school academic but also involved in supporting coaches and executives in elite sport. Now 25 years further on, my passion for economics has been reignited and I have started once again to present at economics conferences and publish in economics journals. But I now have a very different perspective on economics, what I'll call the "Rip van Winkle effect" of moving from being an insider to being a complete outsider with little knowledge of what other insiders have been up to in the interim. And these days I am not just an outsider to mainstream economics; I am an outsider to all economics, including radical economics. But crucially, my outsider's perspective is a much more pragmatic, practice-oriented perspective, not so heavily constrained by the demands of the academic discipline.

Thomas Carlyle, the 19th Century Scottish philosopher, described economics as the dismal science because, as he saw it, economics rather dismally claims that supply and demand is the secret of the Universe, which leads to the laissez-faire directive of "letting man alone" – a very dismal conclusion that we can do nothing to improve the economic condition. As a radical Keynesian, I fully understand Carlyle's attitude to (mainstream) economics. But I have always considered economics to be the dismal science in a more positive sense: the science of how to help human beings in dismal times. I started my studies of economics in the mid-1970s in a post-Watergate, post-Vietnam world suffering the consequences of the first oil price shock, the onset of a world recession and rampant stagflation. I was drawn to the later economics of John Maynard Keynes, developed in the dismal times of the 1930s when the world had to deal with the Great Depression, the rise of totalitarianism and the prospect of global military conflict. Economics to me has always been about making a positive difference to people's lives, particularly in dismal times. When we look at today's world and the immense problems it faces, the need for an activist economics is as great as ever. But sadly, economics is still failing to respond in any meaningful way to the challenges of our times, as wedded as ever to its core Panglossian instinct that all is for the best in the best of all possible worlds. Over the next three weeks I will provide my outsider's thoughts on the current state of economics, both mainstream and radical.

Moneyball: Twenty Years On – Part One

Executive Summary

  • The lasting legacy of Moneyball is as an exemplar of the possibilities of competitive advantage to be gained from the smarter use of data analytics as part of an evidence-based approach to decision-making
  • The technical essence of Moneyball is using on-base percentage (OBP) as the primary hitter metric in baseball for player recruitment
  • Moneyball shows how Billy Beane and the Oakland A’s developed a David strategy to take advantage of the inefficiency of other MLB teams in valuing the win contributions of players.

Unbelievably, it is twenty years this month since Michael Lewis's book, Moneyball: The Art of Winning an Unfair Game, was published. (The subtitle is really important, as I'll discuss later.) It is a book, along with the spin-off Hollywood movie starring Brad Pitt, that has had a massive impact on elite team sports around the world and fundamentally changed the way that teams do things. And it has been hugely significant to me personally. Moneyball quite simply changed my professional life.

              I’ve told the story so many times of how I came to read Moneyball for the first time. I was visiting the University of Michigan at the end of September 2003 to talk about the work I was doing in professional team sport, both academically and as a practitioner. I had developed a player valuation system to estimate transfer values of football players. I was being driven to Detroit airport on the Friday afternoon at the end of my visit when the prof who had invited me said, “You must read this new book, Moneyball. It’s you but baseball.” I purchased it in the airport at 6pm that evening and, partly due to a delay in my flight to Edmonton to visit a dear friend and fellow academic, the late Dr Trevor Slack, I completed my first read by 6am Saturday morning. I was blown away. I had been advocating a more data-based approach to player valuation and here was someone, Billy Beane, actually doing it at the elite level and creating a winning team on a very limited budget. A real-life case study of what I came to call a “David strategy” – a smart and financially sustainable way of competing against financial giants. Remember, those were the days when my local club, Leeds United, were on the brink of bankruptcy thanks to a financial strategy based more on a roll of the dice than on rational calculation. Smart thinking wasn’t much in evidence in that particular boardroom.

              It’s no surprise really that Moneyball is a baseball story, in the sense that the first analytics-based approach in a team sport was always most likely to occur in a striking-and-fielding sport such as baseball or cricket, for one very simple reason – the ease of data collection. At the core of striking-and-fielding sports is the one-on-one contest between pitcher/bowler and batter, easily recorded by paper-and-pencil methods. Hence, the essential performance data for baseball and cricket have been widely available from the earliest days. As a consequence, you do not need to be an “insider” working at the elite level of these sports to be able to analyse the data. Any fan with an interest in analysing baseball and cricket data has been able to do so. For example, Stephen Jay Gould, the evolutionary biologist who, with Niles Eldredge, developed the theory of punctuated equilibrium (and, incidentally, was a visiting undergraduate student at the University of Leeds), devoted a whole section of his book Life’s Grandeur: The Spread of Excellence from Plato to Darwin (Jonathan Cape, London, 1996) to the evolution of performance in baseball, particularly focusing on why no one has posted a batting average over 0.400 in the MLB since Ted Williams in 1941. Of course, the baseball fan par excellence with an interest in analysing the data is Bill James, and it was his analysis more than anything that inspired Billy Beane and the Oakland A’s.

              The technical essence of Moneyball is the use of on-base percentage (OBP) as the primary hitter metric for player recruitment. James had shown that OBP is a much better predictor of game outcomes than the two traditional hitting metrics – the batting average and the slugging average – both of which only allow for the batter’s ability to hit their way to base and take no account of their propensity to be walked to base. James actually proposed combining OBP and the slugging average, i.e. On-base Plus Slugging (OPS), as the preferred hitting metric. Effectively, conventional baseball wisdom treated walks as a pitcher error or a risk-averse pitching tactic rather than as evidence of the hitter’s skill in selecting which pitch to swing at and which to leave. It was this perception of walks that opened up the possibility of a “free lunch”. In economic terms, by using batting average and slugging average to value hitters and ignoring OBP, the baseball players’ labour market was informationally inefficient. It would be possible to buy runs more cheaply by targeting hitters with good batting/slugging averages but a high propensity to be walked to base. If this latter skill was not valued by the market, it could be bought for free.
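To fix ideas, here is a minimal sketch of the three hitting metrics using their standard definitions; the season stat line is hypothetical, not any real player’s record.

```python
# Minimal sketch of the hitting metrics discussed above, using their
# standard definitions. The sample stat line is hypothetical.

def batting_avg(hits: int, at_bats: int) -> float:
    """Traditional batting average: hits per at-bat (walks excluded)."""
    return hits / at_bats

def obp(hits: int, walks: int, hbp: int, at_bats: int, sac_flies: int) -> float:
    """On-base percentage: credits every route to base, including walks."""
    return (hits + walks + hbp) / (at_bats + walks + hbp + sac_flies)

def slg(singles: int, doubles: int, triples: int, homers: int, at_bats: int) -> float:
    """Slugging average: total bases per at-bat."""
    return (singles + 2 * doubles + 3 * triples + 4 * homers) / at_bats

# Hypothetical season: 150 hits (100 singles, 30 doubles, 5 triples,
# 15 home runs) in 500 at-bats, plus 90 walks, 5 hit-by-pitch, 5 sac flies.
avg = batting_avg(150, 500)
on_base = obp(150, 90, 5, 500, 5)
slugging = slg(100, 30, 5, 15, 500)
ops = on_base + slugging  # James's combined OPS metric
print(f"AVG {avg:.3f}  OBP {on_base:.3f}  SLG {slugging:.3f}  OPS {ops:.3f}")
```

On this invented line, a modest .300 batting average hides a .408 OBP driven by the 90 walks – exactly the on-base skill that the traditional metrics leave unpriced.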

              Moneyball soon found its way onto many business school reading lists as a real-world example of the efficient market hypothesis (EMH), which proposes that there is an inherent tendency for markets to eliminate informational inefficiencies wherever available information is being used incorrectly. As soon as one trader recognises the inefficiency, they will exploit it by buying under-priced assets and making a profit. In the case of Billy Beane, he acquired under-valued hitters, which meant that Oakland could punch way above their financial weight, buying more runs from their limited budget by being smarter than other teams in valuing the win contributions of players. And, in retrospect, it is no surprise that it was Michael Lewis who wrote Moneyball, since he started his professional life as a financial trader, well aware of how to use information to profit in markets. No wonder the story of Billy Beane and the Oakland A’s appealed to him. It is a story of enduring appeal not only for baseball but for all team sports and, indeed, for any organisation trying to find a David strategy to gain a competitive advantage by being smarter in its use of data. I will discuss this enduring appeal further in Part 2 next week.
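As a toy illustration of that market inefficiency, consider ranking hitters by what their on-base skill costs; every name, OBP figure and salary below is invented.

```python
# Toy illustration of buying on-base skill cheaply. Every name, OBP
# and salary is invented; this is not real MLB data.
players = [
    ("Slugger A", 0.340, 8.0),   # (name, OBP, salary in $m)
    ("Slugger B", 0.360, 10.0),
    ("Walker C",  0.390, 2.5),   # high OBP, cheap: the Moneyball target
    ("Walker D",  0.375, 3.0),
]

# Cost per unit of OBP: the lower the figure, the more cheaply the
# market is pricing the player's on-base skill.
for name, on_base, salary in sorted(players, key=lambda p: p[2] / p[1]):
    print(f"{name}: OBP {on_base:.3f}, ${salary:.1f}m, "
          f"${salary / on_base:.1f}m per unit of OBP")
```

The cheap high-OBP “walkers” sort to the top – the Moneyball buy – while the conventionally priced sluggers fall to the bottom.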
