Global warming is real and in large part due to human activities. But knowing that the climate is getting warmer on average is of limited use, since the impacts of climate change are felt at the local scale. Increasingly, state-of-the-art climate models are used to make high-resolution projections of the long-term changes to be expected in our immediate environment. One such endeavour is UKCP09, which aims to provide detailed information about the local effects of global climate change over the twenty-first century. It predicts, for instance, that under a medium emissions scenario the probability of a 20%–30% reduction in summer mean precipitation in London by 2080 is 0.5. This raises the question of exactly what such models deliver. Can they provide results as advertised? Specifically, do they offer effective decision support, and should they form the basis of economic planning and policy-making?
In a collaborative project with Leonard Smith, David Stainforth, Hailiang Du, Reason Machete and Seamus Bradley we study the methods used to generate such high-resolution projections and urge some caution. Given the acknowledged systematic errors in all current climate models, treating model outputs as the basis for policy-relevant probabilistic forecasts can be misleading. This casts doubt on our ability, today, to make trustworthy high-resolution predictions out to the end of this century.
An important aspect of this project is the discussion of an underappreciated problem: the effect of structural model error (SME) on the predictive ability of a non-linear dynamical model. While the limitations that sensitive dependence on initial conditions (SDIC) imposes on the predictive power of a non-linear model have been widely recognized and extensively discussed, the limitations stemming from SME have largely gone unnoticed. We show that if a non-linear model has even the slightest SME, its ability to generate decision-relevant probabilistic predictions is seriously compromised. Moreover, SME puts us in a worse epistemic situation than SDIC. Given a perfect model, one can take the effects of SDIC into account by substituting probabilistic predictions for point predictions. This route is foreclosed in the case of SME, which undercuts both point forecasts and accurate probabilistic forecasts. Using a betting scenario, we show that offering bets at odds based on model probabilities results in dramatic losses if the model has even the slightest SME. Relying on forecasts generated with an imperfect non-linear dynamical model can therefore be ruinous. This result is general and in no way restricted to climate models, but it has obvious implications for using climate models to make high-resolution forecasts, because these models are both non-linear and afflicted by SME due to inevitable idealisations and simplifications.
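The general point can be illustrated with a toy example. The sketch below is hypothetical and is not the system used in the papers listed here: it takes the logistic map as the "true" system and a map with a tiny structural perturbation (of size `eps`) as the imperfect model. It shows, first, that a point forecast from the model decorrelates from the true trajectory within a few steps, and second, that evolving the same initial-condition ensemble under both maps yields diverging probabilities for a simple event ("state > 0.5") at intermediate lead times.

```python
import random

# Hypothetical illustration: the "truth" is the logistic map; the "model"
# differs from it only by a small structural perturbation of size eps.

def truth_step(x):
    return 4.0 * x * (1.0 - x)

def model_step(x, eps=1e-3):
    # Convex combination of the true map and the identity, so the state
    # stays in [0, 1]; the structural error per step is of order eps.
    return (1.0 - eps) * 4.0 * x * (1.0 - x) + eps * x

# 1. Point forecasts: a single model trajectory decorrelates from the
#    true trajectory, even though the per-step error is tiny.
x_true = x_model = 0.3
max_gap = 0.0
for _ in range(25):
    x_true, x_model = truth_step(x_true), model_step(x_model)
    max_gap = max(max_gap, abs(x_true - x_model))
print(f"largest point-forecast error over 25 steps: {max_gap:.3f}")

# 2. Probabilistic forecasts: evolve one and the same initial-condition
#    ensemble under both maps and compare the probability each assigns
#    to the event "state > 0.5" at each lead time.
random.seed(0)
ensemble = [0.3 + random.uniform(-1e-6, 1e-6) for _ in range(2000)]
true_ens, model_ens = ensemble[:], ensemble[:]
for step in range(1, 16):
    true_ens = [truth_step(x) for x in true_ens]
    model_ens = [model_step(x) for x in model_ens]
    p_true = sum(x > 0.5 for x in true_ens) / len(true_ens)
    p_model = sum(x > 0.5 for x in model_ens) / len(model_ens)
    print(f"step {step:2d}: P_true = {p_true:.2f}  P_model = {p_model:.2f}")
```

At intermediate lead times the ensembles are still narrow but sit in different parts of the attractor, so the model can assign high probability to an event whose true probability is low; anyone offering bets at the model's odds would then face exactly the losses described above.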
Acknowledging the limits of predictability is no excuse for inaction. Climate change is one of the defining challenges of our age, and decisive action is imperative. The thrust of the argument is that we have to relinquish unreasonable demands for detailed projections and instead think about policy-making under uncertainty. We need to acknowledge the inherent limitations we face in making predictions about relevant climate variables, both for our ability to assess the impact of possible interventions and for our ethical assessment of them, and to develop techniques for dealing with these limitations. The challenge, then, is to think about decision-making under conditions of severe uncertainty: situations in which we lack complete information about the probabilities of possible future states of the world, about what actions will be available and what their outcomes will be, and about the desirability of those outcomes. Research into these issues is currently carried out within the Managing Severe Uncertainty project, a collaboration with my colleagues Richard Bradley, Katie Steele, Alex Voorhoeve and Charlotte Werndl.
‘An Assessment of the Foundational Assumptions in High-Resolution Climate Projections: The Case of UKCP09’, Synthese, available on the journal’s website as an Online First publication, with David A. Stainforth and Leonard A. Smith.
‘Laplace’s Demon and the Adventures of His Apprentices’, Philosophy of Science 81(1), 2014, 31–59, with Seamus Bradley, Hailiang Du and Leonard A. Smith.
‘The Myopia of Imperfect Climate Models: The Case of UKCP09’, Philosophy of Science 80(5), 2013, 886–897, with David A. Stainforth and Leonard A. Smith.
‘Model Error and Ensemble Forecasting: A Cautionary Tale’, in: Guichun C. Guo and Chuang Liu (eds.) Scientific Explanation and Methodology of Science, Singapore: World Scientific 2014, 58–66, with Seamus Bradley, Hailiang Du and Leonard A. Smith.
‘Probabilistic Forecasting: Why Model Imperfection Is a Poison Pill’, in: Hanne Andersen, Dennis Dieks, Gregory Wheeler, Wenceslao Gonzalez and Thomas Uebel (eds.) New Challenges to Philosophy of Science, Berlin and New York: Springer 2013, 479–491, with Seamus Bradley, Reason L. Machete and Leonard A. Smith.
My interest in predictability and climate modelling has grown out of earlier work on randomness and prediction in non-linear systems:
‘The Ergodic Hierarchy, Randomness, and Chaos’, Studies in History and Philosophy of Modern Physics 37, 2006, 661–691, with Joseph Berkovitz and Fred Kronz.
‘Chaos and Randomness: An Equivalence Proof of a Generalised Version of the Shannon Entropy and the Kolmogorov-Sinai Entropy for Hamiltonian Dynamical Systems’, Chaos, Solitons and Fractals 28, 2006, 26–31.
‘In What Sense Is the Kolmogorov-Sinai Entropy a Measure for Chaotic Behaviour? – Bridging the Gap Between Dynamical Systems Theory and Communication Theory’, British Journal for the Philosophy of Science 55, 2004, 411–434.