Understanding cyclic vulnerability to reduce the risk of global collapse
Colin D Butler, Australian National University

Population vulnerability is cyclic, analogous to immunity. Following epidemics, surviving populations have sufficient antibodies to inhibit repeat infection until a sufficient number of immunologically vulnerable people accrue, due to waning immunity and the maturing of a new generation. Other forms of cyclic risk exist, driven by the waxing and waning of collective memory and behaviour, and amplified by the rise and fall of social mechanisms. Three examples are global conflict, inequality and economic history.

In the first, strong global social forces following World War II (WWII) led to a sufficiently vigorous social contract to inhibit very large-scale state violence, fortified by numerous institutions including the United Nations. Almost 70 years later, the “social immunity” generated by the two World Wars is still fairly powerful, though some of the institutions are weakening.

The second example concerns inequality. Following the Depression and WWII, sufficient social forces were liberated to reduce several forms of inequality: in the US, memory of the “gilded age” faded; in the UK, the National Health Service was born; and the global wave of decolonisation appeared unstoppable. However, many forms of inequality have gradually reappeared, including in most formerly Communist nations.

Economic history comprises the third example. Economic booms and busts have occurred since at least the Great Tulip frenzy (1634-37), and the cycle continues, not least because mainstream opinion in each new generation asserts that the problem has been solved – and a new generation of naive speculators and investors is seduced.

Today, global civilisation itself is threatened. This risk may be “emergent”, as defined by this meeting, but it is also ancient and recurrent. Numerous civilisations have collapsed in the past; what differs today is the global scale of the risk.
This is plausible due not only to globalisation but also to the convergence of several forms of risk “immuno-naïveté”. This vulnerability has also been described as arising from the “Cornucopian Enchantment”, a period since roughly 1980 in which most economists, decision makers and even the academy reached a quasi-consensus that the problem of scarcity had been permanently solved. This hubris seemed rational to a new generation, trained and rewarded to think that economics and ingenuity would of themselves solve all major problems; such pride was fortified (for a time) by data on cheap food, cheap energy and declining global hunger. However, in the last decade, data have accumulated showing not just diminishing reserves (e.g. oil) but also less contestable evidence such as rising prices (of oil and food), rising unemployment and increased social resentment. Nevertheless, most policy makers remain wedded to the “old-world thinking” that has helped create these developing, interacting crises.

What could be more important than reducing the emergent risk of global civilisation collapse? Failure to lower this risk may lead to a dramatic change in global consciousness, following a period likely to make the Dark Ages seem desirable. Instead, it is vital to “immunise” a sufficient number of people who can then demand, develop and support the requisite radical new policies. These include acceptance that resources are limited, development of green economic systems that price negative externalities, and revival of fairness of opportunity.

A Unified Concept of Risk and Uncertainty?
John Doyle, Caltech

Engineers, physicians, scientists, statisticians, economists, social scientists, neuroscientists and others all have notions of risk and uncertainty, which can differ mildly in terminology, emphases and details, or be frankly incompatible.
I'll focus here on a subset of disciplines that may superficially differ but have the potential for a deeper unification in theory and methods: systems, network and resilience engineering; evolutionary, social and cognitive psychology and neuroscience; evolutionary biology; emergency medicine and intensive care; engineering for natural and technological disasters; and statistics and mathematics. While large differences remain within each of these areas (e.g. engineers who build consumer and entertainment systems vs mission-critical infrastructure, evolutionary vs social psychologists, microbial ecosystem evolution and immunology vs population genetics), each has some exciting new developments that create the opportunity for a more unified framework. Particularly exciting from my theoretical perspective is a more unified mathematical foundation for computing, communications and control that appears relevant to all of these domains. Tradeoffs involving far-from-equilibrium dynamics, feedback (autocatalytic and control), robustness and efficiency play an essential role. I'll briefly describe the mathematical progress but focus on illustrating the ideas with, I hope, accessible and familiar case studies.

Emergent Risk: Financial Markets as Large Technical Systems
Donald MacKenzie

This paper will discuss the way in which automated trading has turned US stock markets and related ‘derivatives’ markets (especially the market in stock-index futures) into a large technical system, the ‘spinal cord’ of which comprises the fibre-optic links between the data centres in New Jersey in which stocks are traded and the data centre of the Chicago Mercantile Exchange, in the western suburbs of Chicago, where index futures are traded.
The paper will begin by briefly discussing the first full-scale crisis of automated trading, the so-called ‘flash crash’ that took place on the afternoon of May 6, 2010, focusing in particular on how that afternoon’s trading disruptions moved between Chicago and New Jersey. The next section of the paper will begin by discussing risks to individual firms (e.g. the $440 million losses incurred by Knight Capital in the first 45 minutes of trading on August 1, 2012) and then move on to emergent risk, ‘the threat to the individual parts produced by their participation in and interaction with the system itself’ (Centeno and Tham). Even ‘bug-free’ programs can interact in unexpected, even bizarre ways, as will be shown by a brief discussion of the interaction of pricing algorithms on Amazon.

The paper will then move on to discuss large-technical-system dynamics in the financial system, and how those dynamics differ from traditional views of the system (e.g. the efficient market hypothesis of financial economics). The current ‘delegitimation’ of the US stock markets will be highlighted, as will the possibility that the large technical system of US stock and stock-derivative markets has drifted into Perrow’s dangerous quadrant of tight coupling and high complexity (was the flash crash a ‘normal accident’?). The paper will end, nevertheless, by listing a variety of factors that imply that these issues are not the most serious risk currently faced by global finance: for example, delegitimation has the welcome side effect that Vaughan’s ‘normalization of deviance’ is not taking place with respect to automated trading.
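The interaction of ‘bug-free’ pricing algorithms mentioned above can be sketched in a few lines. The multipliers below are hypothetical, chosen only to illustrate the mechanism: each repricing rule is individually sensible, but because the product of the two daily multipliers exceeds 1, the coupled system drives prices up exponentially.

```python
# Sketch (hypothetical multipliers): two repricing bots, each reasonable
# in isolation, jointly produce runaway prices.

def simulate(price_a: float, price_b: float, days: int):
    """Each day, seller A slightly undercuts B, while seller B
    (holding no stock) marks up over A to cover buying A's copy."""
    history = [(price_a, price_b)]
    for _ in range(days):
        price_a = 0.998 * price_b   # A: undercut B by 0.2%
        price_b = 1.271 * price_a   # B: mark up 27.1% over A
        history.append((price_a, price_b))
    return history

hist = simulate(20.0, 20.0, 30)
# Joint multiplier per day is 0.998 * 1.271 ≈ 1.268 > 1, so both
# prices grow exponentially even though neither rule is "buggy".
print(f"day 30: A = {hist[-1][0]:,.2f}, B = {hist[-1][1]:,.2f}")
```

Neither rule ever misfires; the emergent behaviour lives entirely in their interaction, which is the sense of emergent risk the paper develops.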