Abstracts of articles on Crisis Management

Tutorial 1. Cognitive failures as causes of organizational crises

“Human error has been proposed as the most frequent cause of accidents. Wagenaar and colleagues (1990) argue that many unsafe acts are not simply slips, but intentional and reasoned actions that end in unforeseen results. They convincingly show that many accident prevention programs on the work floor cannot reduce, let alone eliminate, human error. In the second article, Reason (2000) – one of the coauthors of the previous article – distinguishes two approaches to the problem of human fallibility: the person and the system approach. He argues that high reliability organizations – a term that repeatedly recurs throughout the course – recognize that human variability is a force to harness in averting errors, but that they work hard to focus that variability, and that they are constantly preoccupied with the possibility of failure.”

- Wagenaar, W. A., Hudson, P. T. W., & Reason, J. T. (1990). Cognitive failures and accidents. Applied Cognitive Psychology, 4, 273-294.

“In this paper we argue that industrial accidents are the end-results of long chains of events that start with decisions at management level. Often these decisions create latent failures, which may remain hidden for a long time. Latent failures can be grouped into a limited number of classes, which are called general failure types. Examples are: wrongly designed machines, unsuitable work procedures, and incompatible goals. Latent failures will affect the psychological processes determining the actual behaviour of workers on the shop floor. Consequently, these workers will commit unsafe acts that, provided the system lacks appropriate defences, will cause accidents. Our discussion of accidents and their cognitive causes follows this logic, and stresses the point that, in modern industries, preventive action will be more effective when aimed at changes based on management decisions. The reason for this is that many unsafe acts are not simple slips, but intentional and reasoned actions that end in unforeseen results. Erroneous plans are not easily avoided on the shop floor, once they are invited by operational conditions replete with latent failures. However, the removal of latent failures requires a more thorough understanding of how they shape behaviour than is provided by our current insights in cognition.”

- Reason, J. T. (2000). Human error: Models and management. British Medical Journal, 320, 768-770.

“Two approaches to the problem of human fallibility exist: the person and the system approaches. The person approach focuses on the errors of individuals, blaming them for forgetfulness, inattention, or moral weakness. The system approach concentrates on the conditions under which individuals work and tries to build defences to avert errors or mitigate their effects. High reliability organisations—which have less than their fair share of accidents—recognise that human variability is a force to harness in averting errors, but they work hard to focus that variability and are constantly preoccupied with the possibility of failure.”

- Case: THE SCHIPHOL AIRPORT PRISON FIRE

“In the night from 26 to 27 October 2005, a huge fire raged in the prison complex of the airport of Schiphol, killing eleven persons and injuring another fifteen. This large crisis caused major upheaval in the Netherlands, leading the ministers of Justice and of Housing (VROM) to resign.”
Tutorial 2. Decision-making biases

“Top managers and subordinates are expected to make decisions and act. However, humans are poor decision-makers, and sometimes these poor decisions have big consequences. Although it seems quite easy to make good and rational decisions in normal routine situations, when humans face adversity and threat, making well-thought-out decisions suddenly becomes very difficult for individuals, teams, as well as for organizations. Why does decision-making become more difficult when people and/or organizations face (emerging) threats, disruptions, or crises? Why is it so difficult to keep a cool head? How do organizations organize the response to those disruptions? This session refers to threat-rigidity and decision-making biases as causes of organizational crises, as well as to the implications of threat-rigidity in changing (or not) the business models.”

- Staw, B. M., Sandelands, L. E., & Dutton, J. E. (1981). Threat-rigidity effects in organizational behavior: A multilevel analysis. Administrative Science Quarterly, 26, 501-524.

“This paper explores the case for a general threat-rigidity effect in individual, group, and organizational behavior. Evidence from multiple levels of analysis is summarized, showing a restriction in information processing and constriction of control under threat conditions. Possible mechanisms underlying such a multiple-level effect are explored, as are its possible functional and dysfunctional consequences.”

- Osiyevskyy, O., & Dewald, J. (2018). The pressure cooker: When crisis stimulates explorative business model change intentions. Long Range Planning, 51, 540-560.

“How do firms respond to critical threats, such as regulatory turmoils or disruptive innovations? After more than three decades of contradicting theoretical arguments and inconsistent empirical results, this question remains unresolved. One view argues for and finds evidence for amplified propensity to engage in change when organization members perceive critical threat. The other view supports the threat-rigidity phenomenon, reinforcing resistance to change through a focus on habitual practices and routines. To resolve this puzzle, we draw on the multi-dimensional framework of ‘crisis’ strategic issues processing, supplemented with the behavioral decision making perspective. In particular, we investigate the conditions of emergence of radical (explorative) business model change intentions within organizations in response to major threats. The resulting model suggests that cognitive moderators – perceived predictability and time pressure – lead to highly divergent results of critical threat perception, such that low predictability and high time pressure attenuate the threat-induced explorative business model change intentions. The model is tested in two empirical contexts, real estate brokerage and higher education, finding strong empirical support.”

- Kahneman, D., Lovallo, D., & Sibony, O. (2011). Before you make that big decision… Harvard Business Review, 89, 50-60.

“When an executive makes a big bet, he or she typically relies on the judgment of a team that has put together a proposal for a strategic course of action. After all, the team will have delved into the pros and cons much more deeply than the executive has time to do. The problem is, biases invariably creep into any team’s reasoning—and often dangerously distort its thinking.
A team that has fallen in love with its recommendation, for instance, may subconsciously dismiss evidence that contradicts its theories, give far too much weight to one piece of data, or make faulty comparisons to another business case. That’s why, with important decisions, executives need to conduct a careful review not only of the content of recommendations but of the recommendation process. To that end, the authors—Kahneman, who won a Nobel Prize in economics for his work on cognitive biases; Lovallo of the University of Sydney; and Sibony of McKinsey—have put together a 12-question checklist intended to unearth and neutralize defects in teams’ thinking. These questions help leaders examine whether a team has explored alternatives appropriately, gathered all the right information, and used well-grounded numbers to support its case. They also highlight considerations such as whether the team might be unduly influenced by self-interest, overconfidence, or attachment to past decisions. By using this practical tool, executives will build decision processes over time that reduce the effects of biases and upgrade the quality of decisions their organizations make. The payoffs can be significant: A recent McKinsey study of more than 1,000 business investments, for instance, showed that when companies worked to reduce the effects of bias, they raised their returns on investment by seven percentage points. Executives need to realize that the judgment of even highly experienced, superbly competent managers can be fallible. A disciplined decision-making process, not individual genius, is the key to good strategy.”

- Case: KODAK’s FAILURE (2010s)

“Eastman Kodak Company was founded in 1880 by George Eastman, who developed the first snapshot camera in 1888. For almost a hundred years, Kodak led the photography business with its innovations. In 1976 Kodak was the market leader, capturing 90% of the film market and around 85% of the camera market, and in 1981 Kodak surpassed the 10-billion-dollar mark. However, the company anchored itself to its traditional business of film and cameras, even as digital photography emerged. Although a Kodak engineer had managed to build a digital camera in the late 1970s, the company was never interested in the invention and was convinced that “no one would ever want to look at their pictures on a television set”. In the early 2000s, Kodak, once a dominant force in the photography industry, found itself on the brink of collapse. With digital technology on the rise, its persistent attachment to traditional film cameras led to its eventual bankruptcy in 2012.”

Tutorial 3. Socio-cultural causes of organizational crises

“Organizations operate in a social context with defined structures, rules, and culturally held beliefs. A crisis can arise because of some inaccuracy or inadequacy in the accepted norms and beliefs. Part of the effectiveness of organizations stems from their development of such cultures, but this very property also brings with it the danger of a collective blindness to important issues. Weick (1993) discusses the socio-cultural causes of crises as well as four sources that can make organizations more resilient. In the second article, Wolbers (2022) analyses how crisis managers from different teams made sense of the terrorist attack that happened inside a tram in Utrecht in 2019.”

- Weick, K. E. (1993). The collapse of sensemaking in organizations: The Mann Gulch disaster. Administrative Science Quarterly, 38, 628-650.
“The death of 13 men in the Mann Gulch fire disaster, made famous in Norman Maclean's Young Men and Fire, is analyzed as the interactive disintegration of role structure and sensemaking in a minimal organization. Four potential sources of resilience that make groups less vulnerable to disruptions of sensemaking are proposed to forestall disintegration, including improvisation, virtual role systems, the attitude of wisdom, and norms of respectful interaction. The analysis is then embedded in the organizational literature to show that we need to reexamine our thinking about temporary systems, structuration, nondisclosive intimacy, intergroup dynamics, and team building. The purpose of this article is to reanalyze the Mann Gulch fire disaster in Montana described in Norman Maclean's (1992) award-winning book Young Men and Fire to illustrate a gap in our current understanding of organizations. I want to focus on two questions: Why do organizations unravel? And how can organizations be made more resilient? Before doing so, however, I want to strip Maclean's elegant prose away from the events in Mann Gulch and simply review them to provide a context for the analysis.”

- Wolbers, J. (2022). Understanding distributed sensemaking in crisis management: The case of the Utrecht terrorist attack. Journal of Contingencies and Crisis Management, 30, 401-411.

“On Monday morning March 18, 2019 a terrorist opened fire inside a tram in the middle of the city of Utrecht. A key challenge in the Utrecht attack was making sense of the situation and organizing a coherent response in a distributed command and control structure. This is a recurrent challenge in crisis management. As command structures expand, sensemaking becomes distributed when groups at different locations develop partial images of a complex environment. While most sensemaking studies focus on how specific groups attempt to collectively construct a plausible representation of the situation, few accounts of distributed sensemaking have appeared. This study explains how crisis managers made sense of the volatile situation across different command structures. Twenty-five crisis managers from different teams were interviewed by making use of the critical decision methodology. The analysis points to five factors that influence the quality of distributed sensemaking: type of interdependence, sensitivity to operations, plausibility, hierarchy, and identity. It signals that updating one's sensemaking does not only require noticing discrepant cues but is especially related to key social-cognitive and organisational processes that stimulate doubt, questioning, and a plurality of perspectives.”

- Case: PARIS ATTACKS 2015

“On the evening of 13 November 2015, Paris was struck by an unprecedented series of coordinated terrorist attacks in various locations across Paris. The attacks began when three suicide bombers struck outside the Stade de France, during a football match. This was followed by several mass shootings, and a suicide bombing at cafés and restaurants. One of the most devastating incidents of the night took place at the Bataclan Concert Hall where gunmen stormed the venue, taking hostages and opening fire on the crowd. Shootings and bomb blasts left 130 people dead and hundreds wounded, with more than 100 in a critical condition. Seven of the attackers also died, while the authorities continued to search for accomplices.”
Tutorial 4. Organizational-technical causes of organizational crises

“There is a long-standing theoretical debate between two dominant schools on the origins of accidents and reliability: Normal Accident Theory (NAT) and High Reliability Theory (HRT; or High Reliability Organization – HRO). NAT holds that, no matter what organizations do, accidents are inevitable in complex, tightly coupled systems. HRT asserts that organizations can contribute significantly to the prevention of accidents. Shrivastava and colleagues (2009) provide a (critical) review of the NAT-HRT debate. Hopkins (2001) critiques Normal Accident Theory while re-examining the Three Mile Island nuclear power station accident. Leveson (2011) examines the assumptions and paradigms underlying safety engineering and proposes alternatives based on systems thinking.”

- Shrivastava, S., Sonpar, K., & Pazzaglia, F. (2009). Normal accident theory versus high reliability theory: A resolution and call for open systems view of accidents. Human Relations, 62, 1357-1390.

“We resolve the longstanding debate between Normal Accident Theory (NAT) and High-Reliability Theory (HRT) by introducing a temporal dimension. Specifically, we explain that the two theories appear to diverge because they look at the accident phenomenon at different points of time. We, however, note that the debate’s resolution does not address the non-falsifiability problem that both NAT and HRT suffer from. Applying insights from the open systems perspective, we reframe NAT in a manner that helps the theory to address its non-falsifiability problem and factor in the role of humans in accidents. Finally, arguing that open systems theory can account for the conclusions reached by NAT and HRT, we proceed to offer pointers for future research to theoretically and empirically develop an open systems view of accidents.”

- Hopkins, A. (2001). Was Three Mile Island a ‘normal accident’? Journal of Contingencies and Crisis Management, 9, 65-72.

“Perrow's normal accident theory suggests that some major accidents are inevitable for technological reasons. An alternative approach explains major accidents as resulting from management failures, particularly in relation to the communication of information. This latter theory has been shown to be applicable to a wide variety of disasters. By contrast, Perrow's theory seems to be applicable to relatively few accidents, the exemplar case being the Three Mile Island nuclear power station accident in the U.S. in 1979. This article re-examines Three Mile Island. It shows that this was not a normal accident in Perrow's sense and is readily explicable in terms of management failures. The article also notes that Perrow's theory is motivated by a desire to shift blame away from front line operators and that the alternative approach does this equally well.”

- Leveson, N. G. (2011). Applying systems thinking to analyze and learn from events. Safety Science, 49, 55-64.

“Major accidents keep occurring that seem preventable and that have similar systemic causes. Too often, we fail to learn from the past and make inadequate changes in response to losses. Examining the assumptions and paradigms underlying safety engineering may help identify the problem. The assumptions questioned in this paper involve four different areas: definitions of safety and its relationship to reliability, accident causality models, retrospective vs. prospective analysis, and operator error.
Alternatives based on systems thinking are proposed.”

- Case: 2020 BEIRUT EXPLOSION

“On 4 August 2020, a large amount of ammonium nitrate stored at the Port of Beirut (Lebanon) exploded, causing at least 218 deaths and 7,000 injuries, and leaving an estimated 300,000 people homeless. Approximately 2,750 tons of ammonium nitrate had been stored in a warehouse at the port without proper safety measures for the previous six years, after having been confiscated by Lebanese authorities from the abandoned ship MV Rhosus. The explosion caused extensive damage over the entire capital and was felt in countries such as Turkey, Syria, Israel and Cyprus, more than 250 km away. This was one of the biggest non-nuclear explosions in history.”

Tutorial 5. Teamwork and team adaptation in the midst of a crisis

“Today’s society faces a wide variety of crisis situations. Organizations, therefore, rely on (multidisciplinary) teams to address those crisis situations and deal with complex challenges under high-pressure conditions. Teams have to effectively coordinate their efforts and adapt to the changing demands imposed by the shifting environments. However, especially under stressful conditions, the ability of teams to perform effectively and in a timely manner may be compromised. The question is: Which team aspects and behaviors contribute to the effectiveness of (multidisciplinary) crisis management teams? The papers of this session serve as a starting point for this discussion. Santos et al. (2021) conducted an experimental longitudinal study to analyze the effect of a concept mapping intervention on transition and reacquisition adaptation via its effect on team cognition (i.e., task mental models and transactive memory systems). Uitdewilligen and Waller (2018) investigated information processing and decision-making behaviors in teams that participate in crisis management training simulations at the Port of Rotterdam.”

- Santos, C. M., Uitdewilligen, S., Passos, A. M., Marques-Quinteiro, P., & Maynard, M. T. (2021). The effect of a concept mapping intervention on shared cognition and adaptive team performance over time. Group & Organization Management, 46, 984-1026.

“Research has demonstrated the value of team adaptation for organizational teams. However, empirical work on interventions that teams can take to increase adaptive team performance is scarce. In response, this study proposes a concept mapping intervention as a way to increase teams’ ability to adapt following a task change. Particularly, this study examines the effect of a concept mapping intervention on team transition adaptation (the drop in performance after a change) and reacquisition adaptation (the slope of performance after the change) via its effect on task mental models and transactive memory systems. We conducted a longitudinal experimental study of 44 three-person teams working on an emergency management simulation. Findings suggest that the concept mapping intervention promotes reacquisition adaptation, task mental models, and transactive memory systems. Results also suggest that task mental models mediate the effect of the concept mapping intervention on reacquisition adaptation. A post hoc analysis suggests that the concept mapping intervention is only effective if it leads to high task mental model accuracy. Our study presents concept mapping as a practical intervention to promote shared cognition and reacquisition adaptation.”
- Uitdewilligen, S., & Waller, M. (2018). Information sharing and decision-making in multidisciplinary crisis management teams. Journal of Organizational Behavior, 39, 731-748.

“Multidisciplinary crisis management teams consist of highly experienced professionals who combine their discipline-specific expertise in order to respond to critical situations characterized by high levels of uncertainty, complexity, and dynamism. Although the existing literatures on team information processing and decision-making are mature, research specifically investigating multidisciplinary teams facing crisis situations is limited; however, given increasingly turbulent external environments that produce complex crisis situations, increasing numbers of organizations are likely to call upon multidisciplinary teams to address such events. In this paper, we investigate information processing and decision-making behaviors in an exploratory study of 12 organizational multidisciplinary crisis management teams. We identify three types of information sharing and track the emergence of distinct communicative phases as well as differences between high- and low-performing teams in the occurrence of sequences of information sharing behaviors. We close by discussing implications for research in this area and for managers of crisis management teams.”

- Case: EMERGENCY MANAGEMENT SIMULATION

“During the Emergency Management Simulation, you and your multidisciplinary team had to manage a crisis situation and make important decisions under high pressure.”

Tutorial 6. Planning and preparing for crisis

“In this tutorial, we broadly address the question to what extent it is possible to plan and prepare for organizational and societal crises. Preble (1997) explains how adding crisis management’s defensive/preventive capability to strategic management’s offensive market positioning orientation can yield a more comprehensive approach to the strategic management of organizations. Hede (2017) examines factors that contribute to crisis preparedness among municipal leaders. McConnell and Drennan (2006) take a more critical stance toward planning and preparedness. They argue that high levels of crisis preparedness are not ‘mission impossible’, but that they are certainly very difficult to achieve.”

- Preble, J. F. (1997). Integrating the crisis management perspective into the strategic management process. Journal of Management Studies, 34, 769-791.

“The fields of strategic management and crisis management have been evolving separately despite their potential for synergistic integration. This paper explicates how adding crisis management's defensive/preventative capability to strategic management's offensive market positioning orientation can yield a more comprehensive approach to the strategic management of organizations. The traditional strategic management process is reviewed first and then analysed with respect to the gap that exists in this orientation. Examining the differences and similarities in perspectives between strategic management and crisis management and then reviewing the crisis management process provides a basis to proceed with a synthesis of the two fields. The paper concludes with the presentation of a new integrated strategic management process model that pushes forward the boundaries of strategic management and internalizes crisis management activities into that process.”

- Hede, S. (2017). Perceptions of crisis preparedness and motivation: A study among municipal leaders. Safety Science, 95, 83-91.

“The need for communities to be prepared for a wide variety of critical events places considerable responsibility on local municipal leaders.
However, few studies have examined how these leaders themselves view crisis preparedness issues. The purpose of this study was to examine factors that contribute to three aspects of preparedness among municipal leaders: perceived municipal preparedness, perceived individual preparedness, and motivation for preparedness work. Six hypotheses were formulated. The research questions were investigated using data from a questionnaire sent out to all Swedish municipalities (N = 290) and four categories of municipal leaders respectively (N = 1101). The response rate was 67%. Data were analyzed by linear regression and logistic regression. Different factors predicted the three outcome variables, which indicates different mental concepts. The hypotheses were partly supported, and the results are discussed using self-efficacy theory. The findings have implications for understanding perceived preparedness and motivation, and can be used to e.g. develop crisis management exercises.”

- McConnell, A., & Drennan, L. (2006). Mission impossible? Planning and preparing for crisis. Journal of Contingencies and Crisis Management, 14, 59-70.

“Crisis management logic suggests that planning and preparing for crisis should be a vital part of institutional and policy toolkits. This paper explores the difficulties in translating this ideal into practice. It focuses on four key difficulties. First, crises and disasters are low probability events, but they place large demands on resources and have to compete against front-line service provision. Second, contingency planning requires ordering and coherence of possible threats, yet crisis is not amenable to being packaged in such a predictable way. Third, planning for crisis requires integration and synergy across institutional networks, yet the modern world is characterised by fragmentation across public, private and voluntary sectors. Fourth, robust planning requires active preparation through training and exercises, but such costly activities often produce a level of symbolic readiness which does not reflect operational realities. Finally, the paper reflects on whether crisis preparedness is a ‘mission impossible’, even in the post-9/11 period when contingency planning seems to be an issue of high political salience.”

- Case: BHOPAL GAS DISASTER

“The Bhopal gas disaster (or tragedy) was a chemical accident that happened on the night of 2-3 December 1984 at the Union Carbide India Limited (UCIL) pesticide plant in Bhopal, India. More than 40 tons of methyl isocyanate spilt out from the UCIL pesticide factory, turning the city of Bhopal into a massive gas chamber and immediately killing more than 3,800 people. In total, the gas leak killed more than 15,000 people and affected over 600,000 workers. It was India's first major industrial disaster, and the world's worst industrial disaster.”

Tutorial 7. Safety culture

“As incidents and accidents may have tragic consequences, safety is a top priority in many of today’s organizations. As you have learned in the previous sessions, accidents are due to a series of failures, faulty systems, and poor organizational conditions. Therefore, researchers have investigated ways to improve safety and developed models to suggest how safety-related concepts influence organizations and their employees. In this tutorial session, we discuss the background, conceptualization, and theoretical underpinnings of safety behaviors and safety culture.
Bisbey et al. (2021) review theoretical models of organizational safety culture and propose a framework for understanding the development of safety culture. Kao et al. (2019) conducted an empirical study in a construction company in the energy sector to examine whether safety knowledge affects safety behaviors through safety attitudes and whether supervisory safety attitudes impact the strength of these relationships.”

- Bisbey, T. M., Kilcullen, M. P., Thomas, E. J., Ottosen, M. J., Tsao, K., & Salas, E. (2021). Safety culture: An integration of existing models and a framework for understanding its development. Human Factors, 63, 88-110.

“Objective: This study reviews theoretical models of organizational safety culture to uncover key factors in safety culture development. Background: Research supports the important role of safety culture in organizations, but theoretical progress has been stunted by a disjointed literature base. It is currently unclear how different elements of an organizational system function to influence safety culture, limiting the practical utility of important research findings. Method: We reviewed existing models of safety culture and categorized model dimensions by the proposed function they serve in safety culture development. We advance a framework grounded in theory on organizational culture, social identity, and social learning to facilitate convergence toward a unified approach to studying and supporting safety culture. Results: Safety culture is a relatively stable social construct, gradually shaped over time by multilevel influences. We identify seven enabling factors that create conditions allowing employees to adopt safety culture values, assumptions, and norms; and four behaviors used to enact them. The consequences of these enacting behaviors provide feedback that may reinforce or revise held values, assumptions, and norms. Conclusion: This framework synthesizes information across fragmented conceptualizations to clearly depict the dynamic nature of safety culture and specific drivers of its development. We suggest that safety culture development may depend on employee learning from behavioral outcomes, conducive enabling factors, and consistency over time. Application: This framework guides efforts to understand and develop safety culture in practice and lends researchers a foundation for advancing theory on the complex, dynamic processes involved in safety culture development.”

- Kao, K.-Y., Spitzmueller, C., Cigularov, K., & Thomas, C. L. (2019). Linking safety knowledge to safety behaviours: A moderated mediation of supervisor and worker safety attitudes. European Journal of Work and Organizational Psychology, 28, 206-220.

“The thousands of deaths and disabilities due to workplace accidents and injuries each year emphasize the importance of safety research. Despite occupational safety research that has contributed to identifying antecedents of safety, little is known about why and how safety knowledge leads to safety behaviours and how personal and situational factors interact to promote occupational safety. Using a multilevel, multisource, and time-lagged research design, the present study investigates whether safety knowledge affects safety behaviours through safety attitudes and further tests whether supervisory safety attitudes can impact the strength of these relationships and play a role as moderators of the proposed mediated relationship.
Data were collected from workers (N = 177) and supervisors (N = 42) in a construction company in the energy industry at two time points. Results indicate full support for the moderated mediation model, demonstrating that worker safety attitudes partially mediate the relationship between safety knowledge and safety behaviours. Moreover, when supervisors had positive attitudes towards safety, both the direct relationship between worker safety attitudes and safety behaviours and the indirect relationship between safety knowledge and safety behaviours were more positive compared to when supervisors had negative safety attitudes. Theoretical and practical implications for occupational safety are discussed.”

- Case: AMAZON WAREHOUSE CONDITIONS

“Over the last few years, there have been several reported safety concerns related to Amazon warehouse working conditions. Workers have complained about unsafe working conditions and the injury risks they face when rushing to fill packages and get them to customers in two days or less. Although Amazon claims that its injury rate is decreasing, at the beginning of 2023 the U.S. Labor Department accused Amazon of failing to keep warehouse workers safe.”

Tutorial 8. Voice behavior during crisis

“The focus of this session is on voice behavior in the midst of a crisis or potential crisis. Morrison (2014) provides an understanding of when and why employees engage in voice and silence behaviors. Weiss et al. (2018) investigate the impact of leader language on voice in multiprofessional teams during crisis management. Hadley et al. (2023) discuss why employees are reluctant to speak up and identify behaviors that leaders can adopt to encourage employees’ voice behaviors.”

- Morrison, E. W. (2014). Employee voice and silence. Annual Review of Organizational Psychology and Organizational Behavior, 1, 173-197.

“When employees voluntarily communicate suggestions, concerns, information about problems, or work-related opinions to someone in a higher organizational position, they are engaging in upward voice. When they withhold such input, they are displaying silence and depriving their organization of potentially useful information. In this article, I review the current state of knowledge about the factors and motivational processes that affect whether employees engage in upward voice or remain silent when they have concerns or relevant information to share. I also review the research findings on the organizational and individual effects of employee voice and silence. After presenting an integrated model of antecedents and outcomes, I offer some potentially fruitful questions for future research.”

- Weiss, M., Kolbe, M., Grote, G., Spahn, D. R., & Grande, B. (2018). We can do it! Inclusive leader language promotes voice behavior in multiprofessional teams. The Leadership Quarterly, 29, 389-402.

“Although it is known that leaders can have a strong impact on whether employees voice work-related ideas or concerns, no research has investigated the impact of leader language on voice — particularly in professionally diverse contexts. Based on a social identity approach as well as on collectivistic leadership theories, we distinguish between implicit (i.e., First-Person Plural pronouns) and explicit (i.e., invitations and appreciations) inclusive leader language and test its effects on voice in multiprofessional teams.
We hypothesized that implicit inclusive leader language promotes voice especially among team members sharing the same professional group membership as the leader (in-group team members), while explicit inclusive leader language promotes voice especially among team members belonging to a different professional group (out-group team members). These hypotheses were tested in a field setting in which 126 health care professionals (i.e., nurses, resident and attending physicians), organized in 26 teams, managed medical emergencies. Behavioral coding and leader language analyses supported our hypotheses: Leaders' “WE”-references were more strongly related to residents' (in-group) and explicit invitations related more strongly to nurses' (out-group) voice behavior. We discuss how inclusive leader language promotes employee voice and explain why group membership functions as an important moderator in professionally diverse teams.”

- Hadley, C. N., Mortensen, M., & Edmondson, A. C. (2023). Make it safe for employees to speak up—especially in risky times. Harvard Business Review, Digital Articles, 1-8.

“In turbulent times like these, it’s natural for people to hold back and avoid taking risks at work. This can mean a reluctance to report mistakes, ask questions, offer new ideas, or challenge a plan. People, whether they’re aware of it or not, try to protect their reputations and jobs. Unfortunately, the same behaviors that feel risky to individual employees are precisely what their companies need in order to thrive in this uncertain economic climate. To solve this dilemma, we encourage leaders to adopt a “winning formula” for achieving a more psychologically safe workplace and the benefits it provides.”

- Case: WELLS FARGO’S FAKE ACCOUNT SCANDAL

“In September 2016, Wells Fargo, a once trusted and reputable financial institution, became the target of a criminal investigation after various U.S. regulatory bodies had discovered that its employees had created more than 3.5 million fake and unauthorized accounts. Over the course of four years, at least 5,000 Wells Fargo employees had participated in the scheme, where fake bank and credit card accounts were opened on behalf of clients without their consent. Leaders at Wells Fargo imposed very aggressive sales targets on employees and promoted cross-selling. Employees who tried to speak up about these practices to management or Human Resources were ignored, terminated, threatened, or worse. Some senior leaders were aware of the aggressive behaviors of supervisors but did not take the problem seriously. By the end of 2018, Wells Fargo faced civil and criminal suits reaching an estimated $2.7 billion.”

- CRM (Crew Resource Management) training: This programme is designed to develop the superior's listening skills, but also the subordinate's ability to express themselves and to challenge the superior. It was developed by a NASA psychologist in 1979 and has been the international standard for airlines since 1990. Its main aim is to maintain hierarchy but encourage a less authoritarian culture, encouraging subordinates to ask questions when they observe mistakes by superiors. Research has shown good effectiveness!

Tutorial 9. Crisis communication

“When a scandal or crisis hits, the organization needs to communicate messages to its stakeholders. However, corporate crises exacerbate stakeholder demands in such a way that conflict can arise between the interests of shareholders and crisis victims.
In addition, due to the advancements in technology and the reach of the global and social media, stakeholders can be quickly notified of an organization’s crisis. Therefore, organizations have to make (good) use of the opportunities that social media offers to communicate with stakeholders. Wang et al. (2021) develop a framework that explains how an organization’s response strategies can prevent, contain, and attenuate social disapproval. Hersel et al. (2023) explore how firms’ post-transgression crisis communication and executive dismissals jointly influence shareholder trust following financial misconduct.”

- Wang, X., Reger, R. K., & Pfarrer, M. D. (2021). Faster, hotter, and more linked in: Managing social disapproval in the social media era. Academy of Management Review, 46, 275-298.

“Negative social evaluations and the strategies a firm employs to manage them have garnered significant attention among management scholars. However, past research has given less attention to the social media era’s revolutionary effects on these dynamics. In the present paper, we theorize that the social media era’s greater velocity, emotionality, and communality increase the likelihood that social disapproval will spread faster and more empathetically among a more heterogeneous set of constituents. We develop a framework that explicates how a firm’s response strategies can prevent, contain, and attenuate “social disapproval,” which we define as constituents’ general enmity toward a firm. Our framework predicts that a firm’s greater initial transparency is more likely to prevent social disapproval, while a reticent response to others’ initial disclosures is more effective at containing it. Once social disapproval has spread, however, we posit that a firm can more effectively attenuate it by responding more deliberatively and accommodatively. Further, we theorize how traditional media era contingencies of event severity, actor prominence, and message inauthenticity take on additional meaning and have unexpected effects on the social disapproval management process in the social media era.”

- Hersel, M. C., Gangloff, K. A., & Shropshire, C. (2023). Mixed messages: Crisis communication-dismissal (in)coherence and shareholder trust following misconduct. Academy of Management Journal, 66, 638-666.

“We explore how firms’ post-transgression crisis communication and executive dismissals jointly influence shareholder trust following financial misconduct. We argue the coherence of a firm’s crisis management strategy—the degree to which its elements fit together consistently and logically—plays an important but previously unconsidered role in shareholder trust repair. We utilize multiple methods to abductively develop and test our theory. First, we conduct a qualitative comparative analysis of 51 cases of financial misconduct among S&P 1500 firms that disclosed a misstatement via press release and dismissed either the CEO or CFO within 90 days of the disclosure. Analyzing aspects of these firms’ crisis communication and the type of dismissal executed, this study reveals four configurations that highlight the influence of crisis management (in)coherence on shareholder trust. Second, a policy-capturing study examines the underlying mechanisms that drive shareholders’ perceptions and intended behaviors relative to manipulated crisis management strategies in a controlled setting.
Together, findings from these studies indicate that shareholder trust following misconduct depends in part on the (in)coherence between what firms say about their misconduct, how they communicate that information, and what they do to resolve the problem.”

- Case: KISSING SCANDAL AFTER 2023 WOMEN’S WORLD CUP WIN

“After Spain’s women’s World Cup win (20 August 2023), the president of the Spanish soccer federation, Luis Rubiales, kissed one of the players, Jennifer Hermoso, on the lips during the trophy ceremony. After the event, Hermoso communicated that she did not consent to the kiss, and Rubiales justified his behavior. Several Spanish ministers, politicians, and football players have voiced their discomfort with Rubiales’ behavior and justifications. Despite the pressure to resign, Rubiales has refused to do so. On 26 August, FIFA’s disciplinary committee suspended Rubiales as part of its investigation. This scandal is still ongoing.”

Tutorial 10. Crisis and organizational learning

“In this tutorial session, we discuss the possibility that there is something positive to take away from a crisis – that organizations become more resilient and better prepared to prevent and manage disruptive events. Christianson et al. (2009) illustrate how organizations can learn through (a series of) rare events. In their case, the rare event (i.e., a roof collapse) offered an opportunity for the organization to transform its identity from that of a museum to that of an attraction. Carmeli and Schaubroeck (2008) explore how failures can help employees unlearn previously appropriate behaviors to enhance organizational crisis-preparedness.”

- Christianson, M. K., Farkas, M. T., Sutcliffe, K. M., & Weick, K. E. (2009). Learning through rare events: Significant interruptions at the Baltimore & Ohio Railroad Museum. Organization Science, 20, 846-860.

“The collapse of the roof of the Baltimore & Ohio (B&O) Railroad Museum Roundhouse onto its collections during a snowstorm in 2003 provides a starting point for our exploration of the link between learning and rare events. The collapse occurred as the museum was preparing for another rare event: the Fair of the Iron Horse, an event planned to celebrate the 175th anniversary of American railroading. Our analysis of these rare events, grounded in data collected through interviews and archival materials, reveals that the issue is not so much what organizations learn "from" rare events but what they learn "through" rare events. Rare events are interruptions that trigger learning because they expose weaknesses and reveal unrealized behavioral potential. Moreover, we find that three organizing routines – interpreting, relating, and re-structuring – are strengthened and broadened across a series of interruptions. These organizing routines are critical to both learning and responding because they update understanding and reduce the ambiguity generated during a rare event. Ultimately, rare events provoke a reconsideration of organizational identity as the organization learns what it knows and who it is when it sees what it can do. In the case of the B&O Railroad Museum, we find that the roof collapse offered an opportunity for the organization to transform its identity from that of a museum to that of an attraction.”

- Carmeli, A., & Schaubroeck, J. (2008). Organisational crisis-preparedness: The importance of learning from failures. Long Range Planning, 41, 177-196.

“Organisational crises are relatively low-probability, high-impact situations that threaten the competitiveness and viability of an organisation.
As such, a key managerial challenge is to design and implement an organisational system that is capable of coping with these traumatic events. The results of this study indicate that learning from failures is an important facilitator of preparedness for both present and prospective crises. Although crisis experience and an industry's technological risk were not significantly related to crisis-preparedness, high-performing organisations reported higher levels of crisis-preparedness. We discuss how these findings may help managers to prepare their organisations more effectively for crisis situations.”

- Case: 2009-11 TOYOTA VEHICLE RECALLS

“Between 2009 and 2011, Toyota Motor Corporation recalled more than 20 million vehicles worldwide for various defects. The first two recalls were related to unintended acceleration issues. The third recall was related to hybrid anti-lock brake software. Toyota reached a $1.2 billion settlement with the U.S. Justice Department and was fined $50 million by the U.S. National Highway Traffic Safety Administration (NHTSA). Toyota was accused of being secretive, deceptive, and slow to respond. The company’s recovery from the crisis was aggravated by an earthquake and tsunami that hit Japan in 2011, which disrupted Toyota’s business operations and supply chain.”

Tutorial 11. Artificial Intelligence and Machine Learning for crisis management

“Artificial intelligence (AI) and machine learning (ML) can enhance crisis prevention and management by analyzing vast amounts of data to detect early warning signs, predict potential threats, and optimize resource allocation. However, there is growing concern that algorithms may reproduce biases via the people building them or through the data used to train them, which has been supported by empirical work. For example, AI systems may encode biases against racial, gender, and religious minority subgroups, which may have detrimental implications for crisis management. In this session, we discuss how AI (and ML) can be applied throughout the crisis management process, the biases in AI/ML, and how to mitigate them. Cao (2023) presents a systematic overview of emergencies, crises, and disasters, the AI for smart disaster resilience research landscape, and their challenges. Nishant et al. (2023) focus on how AI-based algorithms in ML can result in bias and poor decisions. Manyika et al. (2019) discuss two imperatives for action and what CEOs and top management teams can do to lead the way on bias and fairness.”

- Cao, L. (2023). AI and data science for smart emergency, crisis and disaster resilience. International Journal of Data Science and Analytics, 15, 231-246.

“The uncertain world has seen increasing emergencies, crises and disasters (ECDs), such as the COVID-19 pandemic, Hurricane Ian, global financial inflation and recession, misinformation disaster, and cyberattacks. AI for smart disaster resilience (AISDR) transforms classic reactive and scripted disaster management to digital proactive and intelligent resilience across ECD ecosystems. A systematic overview of diverse ECDs, classic ECD management, ECD data complexities, and an AISDR research landscape are presented in this article. Translational disaster AI is essential to enable smart disaster resilience.”

- Nishant, R., Schneckenberg, D., & Ravishankar, M. (2023). The formal rationality of artificial intelligence-based algorithms and the problem of bias. Journal of Information Technology. Advance online publication.
“This paper presents a new perspective on the problem of bias in artificial intelligence (AI)-driven decision-making by examining the fundamental difference between AI and human rationality in making sense of data. Current research has focused primarily on software engineers’ bounded rationality and bias in the data fed to algorithms but has neglected the crucial role of algorithmic rationality in producing bias. Using a Weberian distinction between formal and substantive rationality, we inquire why AI-based algorithms lack the ability to display common sense in data interpretation, leading to flawed decisions. We first conduct a rigorous text analysis to uncover and exemplify contextual nuances within the sampled data. We then combine unsupervised and supervised learning, revealing that algorithmic decision-making characterizes and judges data categories mechanically as it operates through the formal rationality of mathematical optimization procedures. Next, using an AI tool, we demonstrate how formal rationality embedded in AI-based algorithms limits its capacity to perform adequately in complex contexts, thus leading to bias and poor decisions. Finally, we delineate the boundary conditions and limitations of leveraging formal rationality to automatize algorithmic decision-making. Our study provides a deeper understanding of the rationality-based causes of AI’s role in bias and poor decisions, even when data is generated in a largely bias-free context.”

- Manyika, J., Silberg, J., & Presten, B. (2019). What do we do about the biases in AI? Harvard Business Review, Digital Articles, 2-5.

“Over the past few years, society has started to wrestle with just how much human biases can make their way into artificial intelligence systems—with harmful results. At a time when many companies are looking to deploy AI systems across their operations, being acutely aware of those risks and working to reduce them is an urgent priority. What can CEOs and their top management teams do to lead the way on bias and fairness? Among others, we see six essential steps: First, business leaders will need to stay up to date on this fast-moving field of research. Second, when your business or organization is deploying AI, establish responsible processes that can mitigate bias. Consider using a portfolio of technical tools, as well as operational practices such as internal “red teams,” or third-party audits. Third, engage in fact-based conversations around potential human biases. This could take the form of running algorithms alongside human decision makers, comparing results, and using “explainability techniques” that help pinpoint what led the model to reach a decision – in order to understand why there may be differences. Fourth, consider how humans and machines can work together to mitigate bias, including with “human-in-the-loop” processes. Fifth, invest more, provide more data, and take a multi-disciplinary approach in bias research (while respecting privacy) to continue advancing this field. Finally, invest more in diversifying the AI field itself. A more diverse AI community would be better equipped to anticipate, review, and spot bias and engage communities affected.”

- Case: DEEPMIND AI REDUCES GOOGLE DATA CENTRE COOLING BILL BY 40%

“From smartphone assistants to image recognition and translation, machine learning already helps us in our everyday lives. But it can also help us to tackle some of the world’s most challenging physical problems - such as energy consumption.
Large-scale commercial and industrial systems like data centres consume a lot of energy, and while much has been done to stem the growth of energy use, there remains a lot more to do given the world’s increasing need for computing power.”

- Case: HURRICANE MICHAEL, FLORIDA, 2018

“In October 2018, Hurricane Michael struck Florida with 150 mph winds and 31-foot waves. Recorded as one of the strongest storms to hit the Florida Panhandle in 100 years, the hurricane ultimately caused $25 billion in damages, according to the National Hurricane Center.”