A Predictable Response
by Iain Coleman

“Prediction is very difficult, especially of the future.” So said the great quantum physicist Niels Bohr, and what is true of subatomic particles goes double for the global climate system. It is now well-established that human activity is causing the Earth to grow steadily hotter, and that this process will continue for some decades at least. But what consequences will this warming have, and how should we respond to it? This was the focus of the Modelling and Climate Change Research Workshop, held at the e-Science Institute on 19th June. The workshop consisted of a set of parallel demonstrations of various kinds of model that can answer some of these questions, from models of the whole Earth system to predictions of land use changes and the future of the economy.

Physical models of the global climate system are becoming ever more sophisticated as our computing power and scientific knowledge increase, and yet the uncertainties in the predictions are as large as ever. As Richard Essery of Edinburgh University’s Centre for Earth System Dynamics explained, this is because, while modelling of individual processes has improved, scientists keep adding new processes to the models. Every time a crude – but precise – parametrisation of some quantity is replaced by a sophisticated – but uncertain – model calculation, a little more noise is added to the system. The resulting predictions are hopefully more accurate than before, but they are no more precise.

Modelling individual parts of the climate system can be tricky enough – atmospheric convection and cloud processes are still crudely parametrised, for example – but it’s when it comes to linking all the pieces together into a truly global model that the problems really start. Processes on land and in the ocean typically occur much more slowly than those in the atmosphere, making it challenging to couple them together. Models of human inputs to the environment also need to be included, generally in the form of external forcings to the coupled model.

And it was those human factors that were the focus of most of the workshop’s models. Like cloud cover and sea ice, human activity has a feedback relationship with the global climate. Unlike these other processes, that feedback is intelligent and directed. Or at least, it could be.

The Regional Impact Simulator, REGIS, presented by Mark Rounsevell (Edinburgh), is a tool that goes further than simulating the impact of various climate change scenarios on regions of Britain. It also predicts the effects of various public policy choices that might mitigate or exacerbate the problems caused by global warming. This is far from being an academic model: it is driven by the needs of policymakers, and is intended to help them make informed choices about practical medium- and long-term planning.

In contrast to this vision of rational central planning, Corentin Fontein (Edinburgh) presented an agent-based model of land use that built up a picture of the future development of East Anglia from simulations of the economic and social choices made by individuals. The main focus is on the built environment, and the simulation is driven by property decisions made by individual actors in the context of public policy and scenarios of environmental change.
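To give a flavour of how such an agent-based simulation is put together, the short Python sketch below sets up a population of landowner agents, each of whom decides whether to develop their parcel in response to a simple price signal, subject to a planning cap. The agents, the decision rule and all the numbers are invented for this illustration; they are not taken from the model presented at the workshop.

```python
# A minimal, purely illustrative agent-based land-use sketch.
# The decision rule, price signal and "policy cap" are invented for
# this example; they do not come from the workshop model.

import random

random.seed(1)

N_PARCELS = 500     # land parcels in the region
YEARS = 50          # simulation horizon
POLICY_CAP = 0.4    # policy: at most 40% of parcels may be built on


class Landowner:
    def __init__(self):
        self.developed = False
        # each agent has its own threshold price at which it will build
        self.threshold = random.uniform(0.5, 1.5)

    def decide(self, price, cap_reached):
        # develop the parcel if the price signal exceeds the agent's
        # threshold and the planning cap has not yet been reached
        if not self.developed and not cap_reached and price > self.threshold:
            self.developed = True


def run():
    agents = [Landowner() for _ in range(N_PARCELS)]
    history = []
    for year in range(YEARS):
        built_fraction = sum(a.developed for a in agents) / N_PARCELS
        # crude price signal: demand grows over time, damped by existing supply
        price = 0.5 + 0.03 * year - 0.8 * built_fraction
        cap_reached = built_fraction >= POLICY_CAP
        for a in agents:
            a.decide(price, cap_reached)
        history.append((year, round(built_fraction, 3)))
    return history


if __name__ == "__main__":
    for year, frac in run()[::10]:
        print(f"year {year:2d}: developed fraction = {frac}")
```

Real land-use models replace the toy price signal and threshold rule with empirically grounded behaviour and spatially explicit data, but the basic loop is the same: agents observe their environment, make decisions, and those decisions collectively change the environment the next round of decisions is made in.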
The assumptions about central planning versus the free market, implicit in many models, are explicitly addressed in FARO, Foresight Analysis of Rural Areas of Europe, presented by Marc Metzger (Edinburgh). FARO looks at changes in agricultural land use on a European scale, under different scenarios that vary along two axes: whether technology and infrastructure are mainly developed in the public or the private sector, and whether or not the investment has the desired effect. The model combines quantitative and qualitative approaches to provide an analysis tool for EU policymakers concerned with the future of the rural economy.

Taking a more purely economic approach were two models that examine economic initiatives aimed at limiting climate change. Sandeep Kakani’s agent-based model looks at the EU’s Emissions Trading Scheme. This policy limits the total permitted carbon dioxide emissions from major industries in Europe, and divides that total up into tradeable emissions permits. The agents in the model are industries, power suppliers, market traders, brokers and regulatory bodies. Their interactions generate modelled emissions levels and permit prices as a function of time, allowing researchers and policymakers to explore the predicted behaviour of the scheme.

A more holistic view of economics is at the heart of ECCO – Evaluating Capital Creation Options – presented by Nigel Goddard (Edinburgh). This model addresses the physical, rather than financial, economy, with energy rather than money as the primary unit of value. The aim is to explore the physical limits of the economy: if something isn’t physically possible, it isn’t economically possible either. Everything we create has an amount of energy embodied in it, comprising all the resources that were used up in its manufacture. For our current way of life to continue, we need to maintain the input of energy that can be turned into houses, laptops and ready meals. The big question is this: as fossil fuels run out, and as we try to move away from carbon-based energy generation, can we switch to a new energy infrastructure while maintaining our way of life? The ECCO model seeks to answer this by predicting the effects of proposed policies on the amount of energy that can be embodied in human-made capital, and relating this to standards of living. Thus national policymakers and their advisors can explore which economic models can have the desired physical effects.
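The embodied-energy idea at the heart of ECCO can be made concrete with a toy calculation: the energy “cost” of an artefact is its direct process energy plus the embodied energy of everything that goes into making it. The Python sketch below does exactly that for an imaginary house; the product structure and every figure are invented for illustration and are not taken from the ECCO model or its data.

```python
# Toy embodied-energy calculation: an artefact's embodied energy is the
# direct energy used to make it plus the embodied energy of its inputs.
# All figures and the product structure are invented for illustration.

def embodied_energy(product, catalogue):
    """Recursively sum direct energy plus the embodied energy of all inputs (MJ)."""
    entry = catalogue[product]
    total = entry["direct_MJ"]
    for component, quantity in entry.get("inputs", {}).items():
        total += quantity * embodied_energy(component, catalogue)
    return total


# hypothetical catalogue: direct process energy and bill of materials
catalogue = {
    "steel_kg":    {"direct_MJ": 20.0},
    "concrete_kg": {"direct_MJ": 1.0},
    "glass_kg":    {"direct_MJ": 15.0},
    "house": {
        "direct_MJ": 50_000.0,   # on-site construction energy
        "inputs": {"steel_kg": 2_000, "concrete_kg": 80_000, "glass_kg": 500},
    },
}

if __name__ == "__main__":
    print(f"embodied energy of one house: {embodied_energy('house', catalogue):,.0f} MJ")
```

Scaled up from a single house to national stocks of infrastructure and consumer goods, this kind of accounting is what lets a physical-economy model ask whether a proposed energy transition can actually keep the capital stock, and hence the standard of living, in place.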
This focus on providing advice and insight to policymakers carried on to the concluding panel discussion, featuring Stewart Russell (ISSTI) and Steve Yearley (Genomics Forum). Policymakers need to be able to judge the usefulness and trustworthiness of models, and to make decisions between disputed models. At the same time, researchers need to be realistic about the world that the various public stakeholders operate in.

This linkage between natural science and public policy creates dilemmas for academic scientists. Hedge all results with the appropriate caveats, qualifiers and uncertainties, and policymakers get frustrated at the lack of clear answers to their questions, while mendacious critics exploit the guardedness of the predictions to dismiss the whole idea of global warming. On the other hand, oversimplifying the results and overstating the case risks damaging the public credibility of the entire scientific enterprise, particularly if the predictions are subsequently revised, and is arguably an abdication of professional and intellectual ethics.

Beyond the features of the individual models themselves, what this workshop illustrated was the increasing interdisciplinarity of climate studies. Economists need to know about atmospheric chemistry, land use researchers need to understand what technological innovations are likely to appear in the future, and regional planners need a quantitative understanding of the range of likely rainfall scenarios. Climate change is the overarching challenge of our time, and only by combining all our intellectual resources can we prepare ourselves for life on a rapidly changing planet.