Roman Frigg

Climate Change

Global warming is real and in large part due to human activities. But knowing that the climate is getting warmer on average is of limited use, since the impact of climate change occurs at a local scale. Increasingly, state-of-the-art climate models are used to make high-resolution projections about the long-term changes to be expected in our immediate environment. One such endeavour is UKCP09, which aims to provide detailed information about the local effects of global climate change over the twenty-first century. It is predicted, for instance, that under a medium emissions scenario the probability of a 20%–30% reduction in summer mean precipitation in London in 2080 is 0.5. This raises the question of exactly what such models deliver. Can they provide results as advertised? Specifically, do they provide effective decision support, and should they form the basis of economic planning and policy-making?

In a collaborative project with Leonard Smith, David Stainforth, Hailiang Du, Reason Machete and Seamus Bradley we study the methods used to generate such high-resolution projections and urge some caution. Given the acknowledged systematic errors in all current climate models, treating model outputs as the basis for policy-relevant probabilistic forecasts can be misleading. This casts doubt on our ability, today, to make trustworthy high-resolution predictions out to the end of this century.

An important aspect of this project lies in the discussion of an underappreciated problem: the effect of structural model error (SME) on the predictive ability of a non-linear dynamical model. While the limitations that sensitive dependence on initial conditions (SDIC) imposes on the predictive power of a non-linear model have been widely recognised and extensively discussed, the limitations stemming from SME have largely gone unnoticed. We show that if a non-linear model has even the slightest SME, then its ability to generate decision-relevant probabilistic predictions is seriously compromised. Moreover, SME puts us in a worse epistemic situation than SDIC. Given a perfect model, one can take the effects of SDIC into account by substituting probabilistic predictions for point predictions. This route is foreclosed in the case of SME, which undercuts both point forecasts and accurate probabilistic forecasts. Using a betting scenario, we show that offering bets with odds based on model probabilities results in dramatic losses if the model has even the slightest SME. Relying on the forecasts generated with an imperfect non-linear dynamical model can therefore be ruinous. This result is general and in no way restricted to climate models, but it has obvious implications for using climate models to make high-resolution forecasts, because these models are both non-linear and have SME due to inevitable idealisations and simplifications.
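The point can be illustrated with a toy simulation. What follows is a minimal sketch, not the actual models studied in the project: the logistic map stands in for the forecasting model, a small tent-map admixture stands in for structural error in the "true" dynamics, and all parameter values (the 5% error weight, the lead time, the ensemble spread) are arbitrary choices for illustration. After only a handful of time steps, the probability the model assigns to a simple event diverges sharply from the probability under the true dynamics.

```python
import random

def model_step(x):
    """The model: logistic map with r = 4, chaotic on [0, 1]."""
    return 4.0 * x * (1.0 - x)

def truth_step(x, eps=0.05):
    """The 'truth': the model blended with a small tent-map term,
    standing in for structural model error (eps is an arbitrary
    illustrative value)."""
    tent = 1.0 - abs(2.0 * x - 1.0)  # tent map, also maps [0, 1] to [0, 1]
    return (1.0 - eps) * model_step(x) + eps * tent

def event_prob(step, x0, n=2000, t=6, spread=1e-3, seed=0):
    """Forecast probability that x_t > 0.5, estimated from an
    ensemble of n initial conditions within +/- spread of x0."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = min(max(x0 + rng.uniform(-spread, spread), 0.0), 1.0)
        for _ in range(t):
            x = step(x)
        if x > 0.5:
            hits += 1
    return hits / n

x0 = 0.3
p_model = event_prob(model_step, x0)  # probability under the model
p_truth = event_prob(truth_step, x0)  # probability under the truth
print(f"model: {p_model:.2f}  truth: {p_truth:.2f}")
```

With these settings the two forecast distributions end up on opposite sides of the threshold, so the model's probability and the true probability disagree radically. A bettor who set odds from `p_model` while outcomes were generated by `truth_step` would be systematically exposed, which is the mechanism behind the betting-scenario result described above.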

Acknowledging the limitations of predictability is no excuse for inaction. Climate change is one of the defining challenges of our age, and decisive action is imperative. The thrust of the argument is that we have to relinquish unreasonable demands for detailed projections and instead think about policy-making under uncertainty. We need to acknowledge the inherent limitations we face in making predictions about relevant climate variables, both as they bear on our ability to assess the impact of possible interventions and on our ethical assessment of those interventions, and we need to propose techniques for dealing with these limitations. The challenge, then, is to think about decision-making under conditions of severe uncertainty: situations in which we lack complete information about the probabilities of possible future states of the world, about what actions will be available and what their outcomes will be, and about the desirability of those outcomes.

Survey Articles:

My interest in predictability and climate modelling has grown out of earlier work on randomness and prediction in non-linear systems: