Chapter 5: Decision Making

Table of Contents
1. Decision Making Elements
1.1 Types
1.2 Processes
1.3 Behavioral Forces
1.4 Social Loafing
2. Biases
2.1 Individual Biases
2.2 Groupthink

1. Decision Making Elements

1.1 Types

The distinction between programmed and nonprogrammed decisions is simple. "Decisions are programmed to the extent that they are repetitive and routine, to the extent that a definite procedure has been worked out for handling them so that they don't have to be treated from scratch each time they occur" (Simon, 1977, p. 46). On the other hand, decisions are nonprogrammed "to the extent that they are novel, unstructured and consequential" (Simon, 1977, p. 46). Nonprogrammed decisions are unique and lack clear parameters for reaching a solution, whereas programmed decisions are based on well-understood criteria. Managers can create rules and guidelines for programmed decisions based on existing facts, allowing them to decide quickly. Nonprogrammed decisions take longer to settle, since additional information, research, ideas, and views may be needed.

1.2 Processes

A decision process is a series of activities that starts with identifying a problem and finishes with taking action. A framework might be applied to classify these activities, or patterns in the activities can be permitted to emerge from an investigation and comparison of the examples. Imposing a framework to arrange the data has the drawback of giving an otherwise chaotic process the impression of order, and when this type of classification schema is used, decision processes may not show the tidy sequence of stages specified. The benefits of generalizability and structure, both of which are necessary for a large data base, were deemed to outweigh the disadvantages.

Rational decision-making model

The "flavor" of various models of rational choice is essentially determined by the assumptions made about the "givens," or restrictions, that must be met in order for rational adaptation to occur. The set of alternatives available for selection, the relationships that determine payoffs as a function of the alternative selected, and the preference orderings among payoffs are all restrictions that come with rational decision making. The rational decision-making model's selection of these constraints, and its rejection of others, involves implicit assumptions about which variables the rational organism can change (and hence optimize for rational adaptation) and which variables it must accept as fixed, as well as assumptions about the characteristics of the fixed variables. When making a decision, the organism must be able to assign specific payoffs to each conceivable outcome. This also entails the capacity to specify the exact nature of the outcomes, eliminating the possibility of unintended repercussions. The payoffs must be ordered in such a way that it is always feasible to indicate, in a consistent manner, whether one outcome is better than, comparable to, or inferior to another. Whether certainty rules or probabilistic rules are used, either the outcomes of alternatives must be known with certainty or precise probabilities must be attachable to them. In most global models of rational choice, all options are assessed before a decision is taken.
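Under these assumptions, the global model can be written compactly in the standard expected-utility form (a worked illustration added here, not drawn from the source; the symbols A, S, p, and u are introduced only for exposition). Given a known set of alternatives A, a set of possible states S with probabilities p(s), and a payoff function u defined for every alternative-state pair, the globally rational chooser selects

a^{*} = \arg\max_{a \in A} \sum_{s \in S} p(s)\, u(a, s)

which presupposes exactly the restrictions just described: a fully enumerated set A, payoffs specified for every outcome, a complete and consistent preference ordering over those payoffs, and either certainty about outcomes or precise probabilities attached to them.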
In real-life human decision-making, however, alternatives are frequently reviewed in sequence, and the mechanism that determines the order of review may or may not be known. When alternatives are investigated sequentially, we may treat the first satisfactory alternative that is analyzed as the one chosen. This conception is dynamic in the sense that the aspiration level at any one time is determined by the system's previous history. The payoffs in a given trial may be influenced not only by the option selected in that trial, but also by the options selected in previous trials. The apparent paradox to be faced when attempting to define the rational decision-making model in an organizational context is that the economic theory of the firm and the theory of administration both attempt to deal with human behavior in situations where that behavior is at least intendedly rational; at the same time, it can be demonstrated that if we accept the classical theory's global forms of rationality, the problems of the firm's or other organization's internal structure are essentially eliminated. When we substitute a choosing organism of limited knowledge and ability for economic man or administrative man, the paradox disappears and the theory emerges.

Bounded rationality model

Why bounded rationality? In four words: evidence, success, approach, and scarcity. There is a great deal of evidence in psychology and economics that bounded rationality matters. Economists who build rationality bounds into their models have had considerable success characterizing economic behavior that is not covered by traditional theory. Appeals to the traditional economic approach cut both ways, since the circumstances of a given situation may favor either bounded or unbounded rationality. Finally, respect for scarcity is a key concept of economics, and bounded-rationality models adhere to it: human cognition should be treated as a scarce resource.

Intuitive decision-making model

In today's corporate world, making judgments based on intuition is becoming more popular. In some situations intuition can be helpful, and it may even be the only way to make a decision. Intuition takes advantage of the way our brains are wired to work on problems subconsciously and bring the results to the surface when they are needed; a savvy company will recognize employees with this expertise. In one survey of CEOs about intuitive decision making, respondents described where their intuitive decision-making skill came from: 56 percent pointed to experience-based decisions, 40 percent to affect-initiated decisions, 23 percent to cognitive-based decisions, 11 percent to subconscious mental processes, and 10 percent to value-based decisions. Intuition can thus be considered a cognitive conclusion based on a decision maker's prior experiences and emotional inputs. Forty-two percent of these CEOs reported learning and refining their intuitive decision-making skills through a variety of situations. The researchers identified four types of benefits linked with intuitive decision making: it expedites decisions, improves the ultimate decision, supports personal development, and encourages decisions that are congruent with organizational culture.

1.3 Behavioral Forces

Decision making appears to be slowed by politics. The business firm is, by any plausible definition, outside the domain of political science, and economists, likewise, have largely ignored political systems unless they have an impact on the market. Recent research on the corporate firm as a decision-making coalition nonetheless has three important implications for political science:
1. Recent experience has shown that the business firm can be thought of as a political conflict system; indeed, the firm serves as a useful test of how non-governmental settings affect political phenomena.
2. The use of computer program models to analyze political systems within businesses supports the idea that computers may be an effective tool for dealing with political conflict systems in general.
3. The apparent theoretical resemblance between a political coalition in business and a political coalition in government suggests that the substantive aspects of recent behavioral models of the firm could be valuable as a foundation for equivalent models of governmental decision making.

The evidence also shows that fast decision makers use more information than slow decision makers, and they generate more alternatives, not fewer. Contrary to popular belief, this research found that while centralized decision making is not necessarily quick, a layered advisory process that prioritizes input from competent counselors is. Conflict resolution, but not conflict itself, appears to be important for decision speed. Finally, integrating strategic judgments with tactical plans speeds up rather than slows down decision making; this type of integration helps decision makers overcome the anxiety that comes with making high-stakes decisions. Fast decision making, in general, allows decision makers to keep up with change and is associated with high performance. This research revealed a pattern of emotional, political, and cognitive processes linked to the rapid closure of major decisions.

Decision making consequences

According to numerous studies released over the years, 50% of all organizational decisions fail. What is the reason for this? Managers who push solutions, limit the search for alternatives, and use power to accomplish their goals can be blamed for many of these failures; failure is caused by poor decision making and ineffective strategy, not by factors beyond managers' control, such as external restrictions. Here are some recommendations for making good decisions and avoiding negative consequences:

• Take control of your decision-making processes. When you take charge, your chances of success grow. Delegating to specialists or others who are supposed to promote your ideas may free up time for you to do other things, but it reduces your chances of success.
• Make an effort to comprehend. Signals that catch your attention can be signs of additional problems that are not as serious as they appear, or that are more urgent than they seem. Careful probing can reveal a window into a landscape that provides vital information about what needs to be fixed. The time spent thinking about the problem can pay off handsomely: a greater awareness of the issues that require your attention provides better direction as well as support for the chosen course of action.
• Start with an intervention and a goal to set your course. The rationale for action is established through intervention, and a goal that specifies the desired outcome broadens the scope of the search for fresh ideas. An open search pays off by lowering the risk of failure.
• Emphasize the importance of coming up with new ideas and putting them into action. A decision-making process should guide both thinking about action and taking action. Many decision makers understand the value of one but not the other, favoring idea creation over managing the situation's politics. For diplomatic action there is no substitute for clear reasoning, and the production of meaningful ideas as well as the effective dissemination of those ideas are both necessary.
• Make a list of several options. Taking into account a number of competing possibilities improves decision-making outcomes. The options that are not chosen are not thrown away: they help confirm the worth of the selected course of action and frequently provide suggestions for how to improve it.
• To develop these options, use one of the most effective choice-development strategies: consider creating a solution that combines integrated benchmarking, cycle search, and design. This broadens your search to include ideas from a number of other sources. The finest of them, or a combination of their best traits, provides a solution that increases your chances of success.
• Overcome impediments to action. To be successful, implementation strategies must overcome the social and political barriers to action. Intervention is the most effective technique for overcoming the social and political hurdles that can prevent a choice from being carried out. When employing intervention would divert your focus away from other, more important activities, participation is encouraged. Even if a decision appears urgent, avoid edicts and persuasion.

2. Biases

2.1 Individual Biases

According to Adam Hayes, bias is an illogical or irrational preference or prejudice held by an individual, which may also be subconscious. In clinical research, the term bias denotes a systematic distortion of the relationship between a treatment, risk factor, or exposure and clinical outcomes. Individual bias can be either cognitive, such as overconfidence, or motivational, such as wishful thinking. In addition, when making judgments in groups, decision makers and experts can be affected by group-level biases. These biases pose serious challenges to decision analysts, who need judgments as inputs to a decision or risk analysis model, because they can degrade the quality of the analysis. Some of the main types of individual bias are discussed below.

Google effect

The Google effect, also known as digital amnesia, is the tendency to forget information that is readily available through search engines like Google. We do not commit this information to memory because we know it is easy to access online. Suppose you're reading a book and encounter an unfamiliar word. You decide to Google the word to see its definition. A few days later, you encounter the word again, but you can't seem to remember what it means. This is the Google effect: because information is readily available online, we do not commit it to memory. It is so easy to "Google it" that we may find ourselves repeatedly looking up the same information online instead of committing it to memory. This bias exists not only for things we look up on search engines, but for most information that is easily accessible on our computers or phones. Do you know your parents' or your best friend's phone number by heart? The answer is probably no, and this is the Google effect at work.

IKEA effect

The IKEA effect, named after the Swedish furniture giant, describes how people tend to value an object more if they make (or assemble) it themselves. More broadly, the IKEA effect speaks to how we tend to like things more if we have expended effort to create them. When we do things like assemble a piece of furniture or bake a cake, it boosts our sense of self-efficacy. Not only does this feel good in the moment, but it also fulfills a deep psychological need.
This is partly why we see items that we put together ourselves as so much more valuable than they really are. Research has provided good evidence that this self-efficacy boost plays a role in the IKEA effect. In one experiment, researchers started out by giving participants four math problems to solve. One group got very easy problems (e.g., "How likely is it that a fair coin that is tossed once will come up heads?"), while the other received very difficult ones (e.g., "You have 4 coins. Three of the coins are normal, but one of them is heads on both sides. You pick a coin at random without looking. The coin you pick has heads on one side. What are the odds that if you flip the coin over, the other side will be tails?"). The goal of this part of the experiment was to manipulate people's sense of competence: the group that got the hard problems was likely to feel stressed out and incapable, while the easy-problem group didn't have its confidence shaken at all. After the math problems, participants were shown a picture of a bookcase from IKEA and asked whether they would prefer to buy it pre-assembled or to build it themselves. The results showed that people who had had their sense of competence challenged were more likely to say they'd prefer to assemble the bookcase on their own. In other words, feeling incapable at something increases our desire to prove ourselves and appear competent, leading us to inflate the value of things we have made.

Escalation of commitment

Escalation of commitment describes our tendency to remain committed to our past behaviors, particularly those exhibited publicly, even if they do not have desirable outcomes. The feeling that our future behaviors must align with the things we have said and done in the past severely compromises our ability to make good decisions. This is especially true when our initial decision has led to unfavorable outcomes, and it can also be problematic when our past behaviors do not align with our current values. Refusing to change one's stance may not only lead to undesirable results; it can also act as a barrier to personal growth. The ability to acknowledge flaws in our past behaviors with the goal of bettering ourselves is incredibly adaptive: it ultimately gives us greater self-insight and helps us make decisions in a more critical and logical manner. Imagine you're wrapping up your first year of university, majoring in anatomy and cell biology. You've always considered science to be your passion, so it came as no surprise to anyone when this was the path you chose. During your first semester, you enrolled in an elective course about the history of modern Europe. While you generally enjoyed your core anatomy classes, you found yourself enjoying your history class above all others. You enjoyed it so much, in fact, that you decided to take a couple of other history courses in your second semester. Throughout the year, a voice in the back of your mind has been pushing you to change your major and pursue a Bachelor of Arts in History. However, this decision goes against your stated future goals and everything you've ever said about yourself. There's nothing wrong with changing your mind, yet you feel pressured to keep things consistent. Your hesitation to change your major, even though it's what you truly want to do, is the result of escalation of commitment.

Framing effect

The framing effect is when our decisions are influenced by the way information is presented.
Equivalent information can be more or less attractive depending on which features are highlighted. Decisions based on the framing effect are made by focusing on the way the information is presented rather than on the information itself. Such decisions may be sub-optimal, because poor information or lesser options can be framed in a positive light, making them more attractive than options or information that are objectively better but cast in a less favorable light. Our choices are influenced by the way options are framed through different wordings, reference points, and emphasis. The most common framing draws attention to either the positive gain or the negative loss associated with an option. We are susceptible to this sort of framing because we tend to avoid loss.

Consider the following hypothetical: John is shopping for disinfectant wipes at his local pharmacy. He sees several options, but two containers of wipes are on sale. One is called "Bleach ox" and the other is called "Bleach-it." Both disinfectant wipes John is considering are the same price and contain the same number of wipes. The only difference John notices is that the Bleach ox wipes claim to "kill 95% of all germs," whereas the Bleach-it wipes say, "only 5% of germs survive." After comparing the two, John chooses the Bleach ox wipes; he doesn't like the sound of germs "surviving" on his kitchen counter. John's decision to buy Bleach ox over Bleach-it was informed by the framing effect. Although both products were equally effective at fighting germs, and essentially made the same claim, their claims were framed differently: Bleach ox highlighted the percentage of germs it did kill (a positive attribute), whereas Bleach-it highlighted how many germs it did not kill (a negative attribute).

Overconfidence

Overconfidence bias is the tendency for people to think they are better at certain abilities and skills than they actually are. This false assessment of our skill levels, stemming from an illusion of knowledge or control, can lead us to make rash decisions. For instance, an overconfident CEO might decide to acquire a startup that they see high potential in and believe will bring high returns, even though the startup's performance indicates otherwise. Previous successes or accomplishments may lead to an inflated ego. While leading with confidence is a good thing, it's important not to let it get in the way of logical thinking and decision making. To avoid overconfidence bias, consider the consequences: the decisions you make can have an impact on your company, so before committing to one, determine all the possible outcomes to ensure you're prepared for them. You can also ask for feedback. Getting feedback from your team can help you identify areas of improvement, whether related to your performance or your ideas, and constructive criticism can keep egos in check.

2.2 Groupthink

Groupthink is defined as "A strong concurrence-seeking tendency that interferes with effective group decision making" (Forsyth, 40). The events at Nanking occurred within an altered mode of thinking that essentially made the Japanese soldiers incapable of making rational decisions. From the outset, the Japanese soldiers living in the occupied city of Nanking formed an intensely cohesive group. Sociological research suggests that members of such unified groups lose the ability to appraise a situation realistically and to devise alternative action plans.
To maintain unity within the group, the Japanese based their decision-making process on reaching complete agreement, resulting in tragic errors of judgment that could otherwise have been avoided. To understand the concept of groupthink and how it relates to the events at Nanking, it is essential to examine the symptoms and the causes of this decision-making disorder.

Groupthink was first proposed as a psychological phenomenon by Janis (1972), who conceived of it as "a mode of thinking that people engage in when deeply involved in a cohesive in-group when the members' striving for unanimity overrides their motivation to realistically appraise alternative courses of action" (Janis, 1972, p. 8). Janis supported his hypothesis by analyzing several political-military fiascoes and successes differentiated by the occurrence or non-occurrence of antecedent conditions, groupthink symptoms, and decision-making defects (Moorhead, Ference, & Neck, 1991). In a later volume, Janis extended his study of groupthink by analyzing transcripts from the Watergate scandal along with memoirs and first-hand accounts of the key individuals involved, and he concluded that the Watergate cover-up decision resulted from groupthink (Janis, 1983).

Groupthink is prevalent in today's workplaces, especially as people try to fit in or adopt the posture of "not rocking the boat." In these scenarios, people set aside their beliefs or adopt the opinion of the rest of the group. People who are opposed to the decisions or overriding views of the group frequently remain quiet, preferring to keep the peace rather than disrupt the uniformity of the crowd. Groupthink is problematic because even people with good intentions are prone to making irrational decisions under overwhelming pressure from the group.

Signs of Groupthink

Groupthink may not always be easy to discern, but some signs are usually present, and there are some situations where it is more likely to occur. Janis identified several "symptoms" that indicate groupthink.

1. Illusions of Unanimity
This symptom leads members of a group or team to believe, falsely, that everyone agrees and feels the same way. It is much more challenging to speak up when everyone else in the group appears to be on the same page. People in a group suffering from this symptom often think they are doing what is best for the team by not breaking the group's unanimity with their differing opinions or ideas.

2. Illusions of Morality
Under the illusion of morality, members participating in the group decision-making process lose sight of their personal moral principles. Instead, belief in the overall character of the group overrides any personal sense of right and wrong. Groups that make huge errors in judgment tend to formulate incorrect conclusions about the group's true intentions.

3. Rationalization
Rationalization prevents members from reconsidering their beliefs and causes them to ignore warning signs. Excuses are made for obvious red flags. This is when team members convince themselves that, despite evidence to the contrary, the decision or alternative being presented is the best one. "Those other people don't agree with us because they haven't researched the problem as extensively as we have."

4. Stereotyping
As the group becomes more uniform in its views, members begin to see outsiders as possessing a different and inferior set of morals and characteristics from themselves.
These perceived negative characteristics are then used to discredit the opposition. Stereotyping leads in-group members to ignore or even demonize out-group members who may oppose or challenge the group's ideas, and it causes members of the group to ignore essential ideas or information. "Lawyers will find any excuse to argue, even when the facts are clearly against them."

5. Self-Censorship
Self-censorship causes people who might have doubts to hide their fears or misgivings. Rather than sharing what they know, people remain quiet and assume that the group must know best. The need to conform to the group's ideas leads individual members to censor their own opinions or views. "If everyone else agrees, then my thoughts to the contrary must be wrong."

6. Peer Pressure
When a team member expresses an opposing opinion or questions the rationale behind a decision, the rest work together to pressure or penalize that person into compliance. Members who pose questions or challenge the group are often seen as disloyal or traitorous. "Well, if you feel that we're making a mistake, you can always leave the team."

7. Illusions of Invulnerability
The illusion of invulnerability is another symptom of groupthink. Members of a group in which no one voices disagreement may perceive that the group is performing well. In essence, group members believe that their group could not possibly commit sweeping errors in judgment. Members are highly self-assured and confident in the group's decision-making ability, and this overconfidence leads them to believe they are invulnerable to any obstacle, allowing them to push aside clear and analytical thinking.

8. Mind Guards
Mind guards also contribute to groupthink. A mind guard is a member of the group who, to preserve the central group idea, withholds any information that might cause doubts to arise within the group. A mind guard assumes the responsibility of sheltering the other group members from any "controversial" information that may disrupt the overall group dynamic; if a mind guard receives negative outside details, he does not relay them to the group. A mind guard also applies pressure to dissenting members, ultimately forcing them into silence. To this end, the mind guard may employ various strategies to persuade the dissenter to change his opinion, such as arguing that the group may disintegrate if all members are not in total agreement. The goal of a mind guard is to prevent questions about the group's decisions from becoming apparent to the other group members.

Causes of Groupthink

There are several leading causes of groupthink, including group cohesiveness, group isolation, group leadership, and decision-making stress. Group friction is necessary for good decision making because it introduces different perspectives into the decision-making process. High levels of cohesiveness decrease the amount of verbal dissent within a tight group because of interpersonal pressure to conform. This high level of cohesiveness also creates self-censorship and apparent unanimity within the group, and in the absence of disagreement, alternative courses of action are never considered.

Another cause of groupthink is isolation. Often, the decisions being made or the actions being carried out must remain secret, which means that no outside opinions or thoughts are incorporated into the decision-making process.
Frequently, groups reach resolutions and carry them out without conferring with any external sources. One result of this extreme isolation is insulation from criticism, and this absence of objection can feed illusions of group invulnerability and morality.

The leadership of a group can also lead to groupthink, since complete control of the group by the leader can create an environment in which no one states their own opinion. When extremely authoritarian leadership is exercised within a group (as in the military), group discussions are often tightly controlled. If a leader makes his opinion clear at the outset of the talks, group members will, on many occasions, refrain from expressing any disagreement with the leader's position, and dissenting opinions tend to be suppressed through intimidation or by simply not allowing the dissenter to voice his objections.

Another common cause of groupthink is decisional stress. When a group is forced to make an important decision, everyone often harbors insecurity. Without being aware of it, group members will often attempt to reduce this decisional stress, since the insecurity is lessened if the decision is made quickly. Because there is minimal friction, the group can easily rationalize a decision with little disagreement: the positive consequences of the group's decision become the focus, while any adverse outcomes are minimized. By concentrating on minor details of decisions or actions, the group can overlook more significant issues that need attention. In high-pressure group decision making, attempts by members to reduce the stress associated with deciding often result in groupthink.

Groupthink and the Challenger Disaster

While part of a team or group, have you ever felt pressured to do something that led to a fateful decision? Pressure of exactly that kind contributed to the disaster of January 1986, when the orbiter Challenger exploded 73 seconds after launch. Groupthink theory could help explain how leaders and decision makers played a significant part in the disaster. America was becoming disinterested in spaceflight, and NASA saw its space shuttle program's popularity and excitement dwindling. Decision makers and top echelons at NASA and Morton Thiokol cared more about satisfying and entertaining their primary customer, the American people, than about the safety of the launch and its crew. After the explosion, the Rogers Commission examined the causes of the blast, and one of the "potentially catastrophic" elements was a rubber part called an O-ring. The article "Challenger Explosion: How Groupthink and Other Causes Led to the Tragedy" states, "The O-ring was known to be sensitive to the cold and could only work above 53 degrees." The temperature on the launch pad that morning was 36 degrees Fahrenheit. Given what NASA and Morton Thiokol knew, how did the launch get approved? Was it a lack of communication among the groups, a way to chase the publicity the organizations saw dwindling, a result of the group's internal and external pressures, or all three? National, group, and political pressure weighed on NASA and on Morton Thiokol, which built the solid rocket boosters, to get the Challenger launched on time. NASA had averaged only five missions a year after the projected frequency of the space shuttle program had been 50 flights a year. How could they keep America's interest if they could not fly as many missions as initially promised?
They diversified the astronaut crews with women, people of color, and scientists, but that proved insufficient to keep the country's attention. President Ronald Reagan was also set to highlight the launch in his State of the Union address that night. The only option NASA and Morton Thiokol felt they had was to continue with the launch as scheduled. As we now know, that faulty decision making led to seven people losing their lives. The effects of groupthink can be small or large, but regardless of the impact of a flawed decision, people need to be aware of groupthink and try to prevent it. Janis (1983) proposed a set of prescriptions for avoiding groupthink. The prescriptions generally focus on helping a group examine all relevant information and courses of action to ensure that it does not rush into a poorly informed and poorly reasoned decision. Had NASA and Morton Thiokol followed Janis's prescriptions for preventing groupthink, or examined all of the information before rushing into the launch under the pressure they felt, the outcome of that day might have been different. The Challenger tragedy led NASA to focus on a safer future in space by fixing communication and the management of safety at the organization.

Groupthink and the Rape of Nanking

During their occupation of the city of Nanking, Japanese forces perpetrated inconceivable acts of violence and disrespect toward human life. Our initial response to this occurrence is to ask how people can commit such atrocities against fellow human beings. To obtain a firmer grasp of what occurred at Nanking, we must look at the event from a sociological perspective. When undertaken by large groups of people, the decision-making process often produces unexpected (and illogical) results. The initial orders given to the Japanese soldiers to kill all prisoners were brutal in and of themselves, but the soldiers went well beyond those orders; ultimately, the soldiers at Nanking chose to mutilate, torture, and rape the city's inhabitants. Groupthink is a syndrome that develops in aggregates of people and often results in otherwise inexplicable decisions. In attempting to understand the events that culminated in the rape of Nanking, it is necessary to examine the group decision making that could have led to such an unanticipated and inhumane outcome.

Symptoms

There were many symptoms of groupthink present in the overall environment at Nanking. Within military situations, interpersonal pressures to conform are intense: tolerance for nonconformity is virtually non-existent, and extreme tactics to bring dissenters into line are common. Self-censorship was most likely another key symptom of groupthink at Nanking. Privately, many soldiers may have disagreed with what was occurring, but they chose to keep their doubts to themselves. This self-censorship produced the appearance of unanimity among the soldiers: even though many Japanese soldiers may have inwardly objected to the events taking place, their objections never surfaced because of the pressure to conform. If the "norm" appeared to be the torture of the Chinese captives, this false sense of unanimity discouraged each soldier from going against the overall group dynamic. Another symptom of groupthink common in military situations is an illusion of invulnerability.
Since the Japanese soldiers had conquered the entire city, their confidence in the group led them to believe that significant errors were impossible, and they viewed their decision-making process as infallible. The Japanese soldiers may also have experienced illusions that allowed them to warp their sense of morality. Individual morals were lost in the group's overwhelming desire to take total control of Nanking, and justification for the atrocities was embedded in the group's drive for the complete submission of the Chinese. The soldiers' shared illusion of morality outweighed any personal moral reflection. The Japanese soldiers also shared biased perceptions of the Chinese: they did not view the people of Nanking as whole people. The Chinese were simply the enemy, and the enemy did not deserve to live. A dehumanization occurred, resulting in countless mutilations and rapes. Finally, defective decision-making strategies illustrate the occurrence of groupthink at Nanking. The decision to bury prisoners waist-deep so that dogs could attack the upper half of their bodies was cruel, as was the decision to let soldiers compete in decapitation contests. Numerous people participated in arriving at these outrageous decisions; these group members lost sight of their overall goal and became wrapped up in the individual issues surrounding the disposal of the prisoners of war.

How to Avoid Groupthink

The challenge for any team or group leader is to create a working environment in which groupthink is unlikely to happen. It is also essential to understand the risks of groupthink: if the stakes are high, you need to make a real effort to ensure that you're making good decisions. To avoid groupthink, it is essential to have a process for checking the fundamental assumptions behind important decisions, validating the decision-making process, and evaluating the risks involved. For significant decisions, make sure your team does the following in its decision-making process:
a) Explores objectives.
b) Explores alternatives.
c) Encourages ideas to be challenged without reprisal.
d) Examines the risks if the preferred choice is chosen.
e) Tests assumptions.
f) Goes back, if necessary, and re-examines initially rejected alternatives.
g) Gathers relevant information from outside sources.
h) Processes this information objectively.
i) Has at least one contingency plan.

Many group techniques and decision support tools can help, as outlined below.

Groupthink Mitigation Tools

Group Techniques
Brainstorming: Helps ideas flow freely without criticism.
Modified Borda Count: Allows each group member to contribute individually, mitigating the risk that stronger and more persuasive group members dominate the decision-making process (see the sketch after this list).
Six Thinking Hats: Helps the team look at a problem from many different perspectives, allowing people to play "Devil's Advocate."
The Delphi Technique: Allows team members to contribute individually, with no knowledge of a group view and with a minor penalty for disagreement.

Decision Support Tools
Risk Analysis: Helps team members explore and manage risk.
Impact Analysis: Ensures that the consequences of a decision are thoroughly explored.
The Ladder of Inference: Helps people check and validate the individual steps of a decision-making process.
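To make the individual-contribution idea concrete, here is a minimal Python sketch of Borda-style scoring. The options, member rankings, and scoring rule are illustrative assumptions, and the adjustment the modified variant makes for partially completed ballots is omitted.

# A minimal, illustrative Borda-style aggregation (hypothetical options and rankings).
# Each member ranks the options independently; an option earns points equal to the
# number of options ranked below it, and the points are summed across members.

from collections import defaultdict

def borda_scores(rankings):
    """rankings: a list of lists, each ordered from most to least preferred."""
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for position, option in enumerate(ranking):
            scores[option] += n - 1 - position  # top choice earns n-1 points, last earns 0
    return dict(scores)

# Three members rank four hypothetical courses of action individually.
member_rankings = [
    ["expand", "partner", "wait", "exit"],
    ["partner", "expand", "wait", "exit"],
    ["wait", "partner", "expand", "exit"],
]

print(borda_scores(member_rankings))
# {'expand': 6, 'partner': 7, 'wait': 5, 'exit': 0} -> "partner" leads even though only
# one member ranked it first, because every member's full ranking counts.

Because each member's complete ranking enters the tally, a quieter member's preferences carry the same weight as a dominant member's, which is exactly the property this technique is meant to provide.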
How to Overcome Groupthink

If groupthink does set in, you must recognize and acknowledge it quickly so that the group can get back to functioning effectively. Follow these steps to do this:
• Even with good group decision-making processes in place, be on the lookout for signs of groupthink so that you can deal with them swiftly.
• If there are signs of groupthink, discuss them openly in the group. Once the problem is acknowledged, the group can consciously free up its decision making.
• Assess the immediate risks of any decision and the consequences for the group and its customers. If the risks are high (for example, a risk to personal safety), take steps to validate the decision before it is fully ratified. If appropriate, seek external validation, gather more information from outside, and test assumptions.
• Use the checklist above as a starting point for diagnosing what needs to change.
• Introduce formal group techniques and decision-making tools, such as the ones listed above, to avoid groupthink in the future.

References

Burke, L. A., & Miller, M. K. (1999). Taking the mystery out of intuitive decision making. Academy of Management Executive, 13(4).
Conlisk, J. (1996). Why bounded rationality? Journal of Economic Literature, 34(2), 669-700. https://www.jstor.org/stable/2729218
Druckman, J. (2001). The implications of framing effects for citizen competence. Political Behavior, 23(3), 225-256. https://www.jstor.org/stable/1558384
Eisenhardt, K. M. (1989). Making fast strategic decisions in high-velocity environments. Academy of Management Journal, 32(3), 543-576. https://www.jstor.org/stable/256434
Forsyth, D. R. (2014). The psychology of groups. In R. Biswas-Diener & E. Diener (Eds.), Psychology (Noba Textbook Series). Champaign, IL: DEF Publishers. http://nobaproject.com/textbooks/introduction-to-psychology-the-full-noba-collection
Gigerenzer, G., & Goldstein, D. G. (1996). Reasoning the fast and frugal way: Models of bounded rationality. Psychological Review, 103(4), 650-669.
Gruman, J. A., Schneider, F. W., & Coutts, L. M. (Eds.). (2016). Applied social psychology: Understanding and addressing social and practical problems (3rd ed.). SAGE Publications.
Hayes, A. (2021, September 27). What is bias in investing? Investopedia. https://www.investopedia.com/terms/b/bias.asp
Janis, I. L. (1972). Victims of groupthink. Boston: Houghton Mifflin.
Janis, I. L. (1983). Groupthink (2nd ed., revised). Boston: Houghton Mifflin.
Lovallo, D., & Kahneman, D. (2003, July). Delusions of success: How optimism undermines executives' decisions. Harvard Business Review. https://hbr.org/2003/07/delusions-of-success-how-optimism-undermines-executives-decisions
Makin, S. (2018, November 28). Searching for digital technology's effects on well-being. Nature. https://www.nature.com/articles/d41586-018-07503-w
March, J. G. (1962). The business firm as a political coalition. The Journal of Politics, 24(4), 662-678. https://www.jstor.org/stable/2128040
Moorhead, G., Ference, R., & Neck, C. P. (1991). Group decision fiascoes continue: Space Shuttle Challenger and a revised groupthink framework. Human Relations.
Nutt, P. C. (1984). Types of organizational decision processes. Administrative Science Quarterly, 29(3), 414-450. https://www.jstor.org/stable/2393033
Nutt, P. C. (1999). Surprising but true: Half the decisions in organizations fail. Academy of Management Executive, 13(4).
Reb, J., & Connolly, T. (2007). Possession, feelings of ownership, and the endowment effect. Judgment and Decision Making, 2(2), 107.
Simon, H. A. (1955). A behavioral model of rational choice. The Quarterly Journal of Economics, 69(1), 99-118. https://www.jstor.org/stable/1884852
Simon, H. A. (1977). The new science of management decision (rev. ed.). Englewood Cliffs, NJ: Prentice-Hall.
Staw, B. M. (1976). Knee-deep in the big muddy: A study of escalating commitment to a chosen course of action. Organizational Behavior and Human Performance, 16(1), 27-44. https://doi.org/10.1016/0030-5073(76)90005-2
Steele, R. R. (2009). Psychological motivation for genocide [Review of Why not kill them all? The logic and prevention of mass political murder, by D. Chirot & C. McCauley]. International Journal on World Peace, 26(2), 83-88. http://www.jstor.org/stable/20752887
Street, M. D. (1997). Groupthink: An examination of theoretical issues, implications, and future research suggestions. Small Group Research, 28(1), 72-93. https://doi.org/10.1177/1046496497281003