Sheila Jasanoff (sheila_jasanoff@harvard.edu)
Time: W 1:00-4:00 PM
Office: Harvard Kennedy School, L354; Phone: 617-495-7902
Location: 66-148
Assistant: Lauren Schiff (lauren_schiff@harvard.edu, 617-495-5636)
Office hours: By appt.
MIT Assistant: Debbie Meinbresse (meinbres@mit.edu, 617-452-2390)

EXPERTISE AND DEMOCRACY IN SCIENCE AND TECHNOLOGY POLICY
(STS 910 [MIT], IGA 317 [HKS] – Spring 2009)

Policymaking in today's complex, technologically advanced societies could not proceed without the continual involvement of experts. But who are experts, whom do they represent, what are the sources of their authority, and how can expertise be held to democratic controls? What kinds of institutions employ expertise, and what are the organizational characteristics of such institutions? This course takes a critical look at the assumptions underlying the use of expertise in policymaking and asks how the growing global reliance on experts affects the quality, effectiveness, and accountability of public policy and governance. Drawing on several bodies of literature—including law, political science, policy analysis, and science and technology studies—the course considers how expertise is defined and constituted in such areas as environmental, medical, defense, and public health policy, as well as in various types of legal proceedings. Case studies will be used to explore the basis for claims of expertise, the reasons for expert controversies, the relations between laypeople and experts, and the measures used to hold experts accountable in diverse national and international policy settings.

Readings
Required readings will be available online through the MIT course website. Harvard students and auditors should contact Lauren Schiff and Debbie Meinbresse (contact information above) to obtain access to the course website. Books for deeper exploration of the subject matter, but not required for purchase, include:
S. Jasanoff, The Fifth Branch: Science Advisers as Policymakers (1990)
S. Jasanoff et al., eds., Handbook of STS, 2nd edition (1995)
E. Hackett et al., eds., Handbook of STS, 3rd edition (2007)
M. Leach, I. Scoones, and B. Wynne, eds., Science and Citizens (2005)

Format and Intended Audience
• The course will be conducted in an interactive style, with considerable opportunity for class participation; brief lectures by the instructor will situate issues and readings as appropriate.
• The course is suitable for MIT graduate students in HASTS, TPP, DUSP, Engineering Systems, and the Sloan School. It is also appropriate for Harvard students in history of science, public policy, health policy, law, design, and other professional schools.
• Auditors will be accepted subject to space constraints, provided they attend class regularly, do the readings, and fully participate in discussion.

Requirements
• Regular attendance and class participation;
• Weekly written responses on the readings (ca. 250-350 words, frequency depending on class size);
• Written notes for class discussion during the semester (frequency depending on class size);
• Written analysis of a contemporary news story, book, or government report about expertise, using concepts and materials from the course (3-5 pages);
• Final paper (ca. 25 pages) on a topic and in a format to be negotiated with the instructor.

COURSE SYLLABUS
EXPERTISE AND DEMOCRACY IN SCIENCE AND TECHNOLOGY POLICY
(STS 910 [MIT], IGA 317 [HKS] – Spring 2009)

February 4: Why Do We Need Experts?
Experts are a "human kind." That is, they are a kind of person in the same way that these others are also kinds: parent, businessman, executive, or vagrant. But when do experts appear on the stage of modern society, why are they needed, what makes them different from lay people, wherein lies their political authority, and why do they deserve study in their own right by students of science and technology policy? This week provides preliminary answers to these questions.

Course aims and objectives, teaching methods, syllabus review, requirements.

***

T. Golan, Laws of Men and Laws of Nature: The History of Scientific Expert Testimony in England and America (Cambridge, MA: Harvard University Press, 2004), Ch. 1 ("'Where There's Muck There's Brass': The Rise of the Modern Expert Witness"), pp. 5-51.
H.J. Laski, "The Limitations of the Expert," Harper's Monthly Magazine 162 (1930), pp. 101-110.
A.M. Weinberg, "Science and Trans-Science," Minerva 10 (1972), pp. 209-222.
S. Jasanoff, The Fifth Branch: Science Advisers as Policymakers (Cambridge, MA: Harvard University Press, 1990), Ch. 1 ("Rationalizing Politics"), pp. 1-19.
T.F. Gieryn, "The Boundaries of Science," in S. Jasanoff et al., eds., Handbook of Science and Technology Studies (Thousand Oaks, CA: Sage Publications, 1995), pp. 393-443.
C.R. Sunstein and R. Zeckhauser, "Overreaction to Fearsome Risks," HKS Faculty Research Working Paper Series, http://ksgnotes1.harvard.edu/Research/wpaper.nsf/rwp/RWP08-079.

February 11: Controversy, Uncertainty, and Power

Experts, as is regularly observed, often have trouble agreeing on a diagnosis of problems, and hence also on solutions. This creates a paradox: society increasingly depends on experts; yet experts are unable to provide the certainty and security society desperately seeks. This week's readings explore the causes of expert disagreement from different perspectives.

E. Goffman, Frame Analysis: An Essay on the Organization of Experience (Cambridge, MA: Harvard University Press, 1974), pp. 21-39.
B. Martin and E. Richards, "Scientific Knowledge, Controversy and Public Decision Making," in Jasanoff et al., eds., Handbook of STS, pp. 506-526.
T.J. Pinch, "'Testing—One, Two, Three … Testing!': Toward a Sociology of Testing," Science, Technology and Human Values 18:25-41 (1993).
H. Collins and T. Pinch, The Golem at Large (Cambridge: Cambridge University Press, 1998), Ch. 1 ("A Clean Kill? The Role of Patriot in the Gulf War"), pp. 7-29.
B. Bimber, The Politics of Expertise in Congress (Albany: SUNY Press, 1996), Ch. 2 ("A Theory of the Politicization of Expertise"), pp. 12-24.
D. Sarewitz, Frontiers of Illusion (Philadelphia: Temple University Press, 1996), Ch. 5 ("The Myth of Authoritativeness"), pp. 71-96.
M. Leach, "Accommodating Dissent," Nature 450:483 (22 November 2007).

February 18: The Expert-Lay Divide

Whether or not experts agree among themselves, their status vis-à-vis laypeople is deemed secure: experts know more, hence are thought to deserve greater respect from policymakers and publics. This week we look at how the lay-expert boundary is created, maintained, or contested through strategies ranging from institutional practices to elite practitioner and academic analyses.

S. Breyer, Breaking the Vicious Circle (Cambridge, MA: Harvard University Press, 1993), pp. 55-81.
Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579 (1993), http://supct.law.cornell.edu/supct/html/92-102.ZS.html.
B. Wynne, "Misunderstood Misunderstandings: Social Identities and Public Uptake of Science," in A. Irwin and B. Wynne, eds., Misunderstanding Science? The Public Reconstruction of Science and Technology (Cambridge: Cambridge University Press, 1996), pp. 19-46.
H. Nowotny, P. Scott and M. Gibbons, Re-Thinking Science: Knowledge and the Public in an Age of Uncertainty (Cambridge: Polity, 2001), Ch. 14 ("Socially Distributed Expertise"), pp. 215-229.
H.M. Collins and R. Evans, "The Third Wave of Science Studies: Studies of Expertise and Experience," Social Studies of Science 32(2):235-296 (2002); commentaries by S. Jasanoff and B. Wynne, SSS 33(3):389-400 and 401-418 (2003).

February 25: The Discourses of Expertise

A considerable part of the power of experts derives from their ability to control a specialized technical language, or discourse. This week's readings introduce us to the construction of such discourses and their shaping of the terrain of public problems. We also consider the problems involved in crossing discursive boundaries, especially between lay and professional groups.

L. Winner, "On Not Hitting the Tar-Baby," in The Whale and the Reactor: A Search for Limits in an Age of High Technology (Chicago, IL: University of Chicago Press, 1986), pp. 138-154.
M. Mulkay, T. Pinch and M. Ashmore, "Colonizing the Mind," Social Studies of Science 17(3):231-256 (1987).
C. Cohn, "Sex and Death in the Rational World of Defense Intellectuals," Signs 12(4):687-718 (1987).
R. Rapp, "Extra Chromosomes and Blue Tulips: Medico-Familial Interpretations," in M. Lock, A. Young and A. Cambrosio, eds., Living and Working with the New Medical Technologies (Cambridge: Cambridge University Press, 2000), pp. 184-208.
H. Collins and R. Evans, Rethinking Expertise (Chicago: University of Chicago Press, 2007), Ch. 3 ("Interactional Expertise and Embodiment"), pp. 77-90.
Revisit Sunstein and Zeckhauser.

March 4: Do Artifacts Have Expertise?

Experts derive their authority as much from the management and control of technological artifacts as from the mastery of specialized discourses. In what ways is expertise related to the design, development, or deployment of technological objects? What special issues or problems are associated with technological (as opposed to scientific) expertise?

L. Winner, "Do Artifacts Have Politics?" in The Whale and the Reactor (Chicago: University of Chicago Press, 1986), pp. 19-39.
B. Latour, "Where Are the Missing Masses? The Sociology of a Few Mundane Artifacts," in W.E. Bijker and J. Law, eds., Shaping Technology/Building Society (Cambridge, MA: MIT Press, 1992), pp. 225-258.
R. Bjork, The Strategic Defense Initiative: Symbolic Containment of the Nuclear Threat (Albany, NY: State University of New York Press, 1992), pp. 115-132.
R. Sclove, Democracy and Technology (New York: Guilford, 1995), Ch. 3 ("'In Every Sense the Experts': Strong Democracy and Technology"), pp. 25-57.
M.A. Hajer, "Politics on the Move: The Democratic Control of the Design of Sustainable Technologies," Knowledge and Policy: The International Journal of Knowledge Transfer and Utilization 8 (1996), pp. 26-39.
A. Barry, Political Machines: Governing a Technological Society (London: Athlone, 2001), Ch. 7 ("Political Chemistry"), pp. 153-174.

March 11: Law's Construction of Expertise

The law is modern society's most powerful institution for mobilizing, testing, and validating expertise. We have already seen some examples of the law's role in constructing expertise. This week we look more closely at the legal rules governing expert testimony, as well as their intended and unintended consequences.

S.A. Cole, Suspect Identities: A History of Fingerprinting and Criminal Identification (Cambridge, MA: Harvard University Press, 2000), Ch. 2 ("Measuring the Criminal Body"), pp. 32-59.
Federal Rules of Evidence, Article VII ("Opinions and Expert Testimony"), http://www.law.cornell.edu/rules/fre/overview.html.
Kumho Tire Company v. Carmichael, 526 U.S. 137 (1999), http://supct.law.cornell.edu/supct/html/97-1709.ZO.html.
M.A. Berger, "Expert Testimony: The Supreme Court's Rules," Issues in Science and Technology XVI(4):57-63 (2000).
D.L. Faigman, "Is Science Different for Lawyers?" Science 297:339-340 (2002).
M. Angell, Science on Trial (New York: Norton, 1996), Ch. 5 ("Scientific Evidence: What It Is and Where It Comes From"), pp. 90-110.
S. Jasanoff, "The Eye of Everyman: Witnessing DNA in the Simpson Trial," Social Studies of Science 28(5-6):713-740 (1998).
J. Mnookin, "Expert Evidence, Partisanship and Epistemic Competence," Brooklyn Law Review 73:1009 (2008).

March 18: Scientific Advice and Its Social Problems

Injecting expert knowledge into public policy brings problems going beyond those of certifying experts and producing specialized knowledge. At stake in this translation process are problems of quality control, political pressure, and even the structured production of ignorance. This week's readings look at these problems, their causes, and proposed solutions.

S. Jasanoff, The Fifth Branch, Ch. 4 ("Peer Review and Regulatory Science"), pp. 60-83.
D.H. Guston, "Boundary Organizations in Environmental Policy and Science: An Introduction," Science, Technology & Human Values 26(4):399-408 (2001).
D. Michaels et al., "Advice Without Dissent," Science 298:703 (2002).
D. Ferber, "Overhaul of CDC Panel Revives Lead Safety Debate," Science 298:732 (2002).
S.G. Stolberg, "Bush's Science Advisers Drawing Criticism," New York Times, October 10, 2002.
N. Oreskes and E.M. Conway, "Challenging Knowledge: How Climate Science Became a Victim of the Cold War," in R.N. Proctor and L. Schiebinger, eds., Agnotology: The Making and Unmaking of Ignorance (Stanford, CA: Stanford University Press, 2008), pp. 55-89.
S. Jasanoff, "Judgment under Siege: The Three-Body Problem of Expert Legitimacy," in P. Weingart and S. Maasen, eds., Democratization of Expertise? Exploring Novel Forms of Scientific Advice in Political Decision-Making, Sociology of the Sciences Yearbook (Dordrecht: Kluwer, 2005), pp. 209-224.

*** MARCH 22-29: SPRING RECESS (NO CLASS) ***

April 1: Ethical Professionals and Professional Ethics

Would the problems of enlightened, knowledge-based policymaking vanish, or at least be reduced to manageable proportions, if experts were more ethical? How has the rise of ethics as a professional discourse influenced public policy? This week's readings offer varied answers.

Time, "Persons of the Year 2002," http://www.time.com/time/subscriber/personoftheyear/2002/poyintro.html.
C. Djerassi, "Basic Research: The Gray Zone," Science 261:972-973 (1993).
D.S. Greenberg, Science, Money, and Politics: Political Triumph and Ethical Erosion (Chicago: University of Chicago Press, 2001), Ch. 13 ("The Public Understanding of Science"), pp. 205-233.
Harvard Medical School, Guidelines for Conflicts of Interest, http://www.hms.harvard.edu/integrity/guide.html (ONLINE).
S. Shapin, "Trust, Honesty, and the Authority of Science," in R. Bulger et al., eds., Society's Choices: Social and Ethical Decision Making in Biomedicine (Washington, DC: National Academy Press, 1995), pp. 388-408.
T.O. McGarity and W. Wagner, Bending Science: How Special Interests Corrupt Public Health Research (Cambridge, MA: Harvard University Press, 2008), Ch. 8 ("Packaging Science"), pp. 181-203.
J. Evans, "Between Technocracy and Democratic Legitimation: A Proposed Compromise Position for Common Morality Public Bioethics," Journal of Medicine and Philosophy 31(3):213-234 (2006).

April 8: Getting Things Wrong; Learning from Experience

Expert agreement can be as much of a problem as expert disagreement, since convergence on the wrong knowledge can lead to insupportable consequences for society. This week we look at a variety of sociotechnical disasters and the lessons drawn from them by experts, policy professionals, and scholars of risk and disasters.

Video: Directed by Errol Morris (with Robert McNamara), The Fog of War: Eleven Lessons from the Life of Robert S. McNamara (2004).
S. Visvanathan with R. Kothari, "Bhopal: The Imagination of a Disaster," Lokayan Bulletin 3 (1985), pp. 48-76.
B. Wynne, "Unruly Technology," Social Studies of Science 18:147-167 (1988).
D. Vaughan, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA (Chicago: University of Chicago Press, 1996), Ch. 3 ("Risk, Work Group Culture, and the Normalization of Deviance"), pp. 77-118.
National Commission on Terrorist Attacks Upon the United States, 9-11 Commission Report, Ch. 11 ("Foresight and Hindsight"), pp. 339-360.
S. Jasanoff, "Restoring Reason: Causal Narratives and Political Culture," in B. Hutter and M. Power, eds., Organizational Encounters with Risk (Cambridge: Cambridge University Press, 2005), pp. 209-232.

April 15: Communicating Expertise

Improved communication between experts and the public serves two functions. First, it creates transparency, allowing citizens to review the basis for experts' conclusions. Second, it permits citizens to bring their own knowledges and beliefs into decisionmaking. What problems might arise in this communicative process?

D. Nelkin, Selling Science (New York: Freeman, 1987), Ch. 5 ("Media Messages, Media Effects"), pp. 70-84.
W.A. Gamson and A. Modigliani, "Media Discourse and Public Opinion on Nuclear Power: A Constructionist Approach," American Journal of Sociology 95(1):1-37 (1989).
S. Hilgartner, Science on Stage: Expert Advice as Public Drama (Stanford, CA: Stanford University Press, 2000), Ch. 1 (Introduction: extract), pp. 7-20; Ch. 2 ("Staging Authoritative Reports"), pp. 42-85.
S. Jasanoff, "Civilization and Madness: The Great BSE Scare of 1996," Public Understanding of Science 6:221-232 (1997).
T.J. Pinch, "The Golem: Uncertainty and Communicating Science," Science and Engineering Ethics 6(4):511-523 (2000).
B. Wynne, "Expert Discourses of Risk and Ethics on GMOs: Creating Public Alienation," Science as Culture 10:445-481 (2001).
S. Jasanoff et al., "Conversations with the Community: AAAS at the Millennium," Science 278:2066-2067 (1997).

April 22: Social Movements, Activism, and Expertise

In today's complex world, we cannot depend on formal mechanisms for the production of expertise to generate all knowledge relevant to policymaking. Social movements have played a key role in challenging and augmenting the knowledge of experts. This week's readings examine the phenomenon of epistemic social activism and its consequences for public knowledge and public reason.

R. Wachter, "AIDS, Activism, and the Politics of Health," New England Journal of Medicine 326:128-133 (1992).
S. Epstein, "Democracy, Expertise, and AIDS Treatment Activism," in D.L. Kleinman, ed., Science, Technology and Democracy (Albany: SUNY Press, 2000), pp. 15-27 (bibliography omitted).
M. Callon, "The Role of Lay People in the Production and Dissemination of Knowledge," Science, Technology and Society 4(1):81-94 (1999).
L. Schiebinger, Has Feminism Changed Science? (Cambridge, MA: Harvard University Press, 1999), Ch. 6 ("Medicine"), pp. 107-125.
M. Kirejczyk, "Parliamentary Cultures and Human Embryos," Social Studies of Science 29(6):889-912 (1999).
NIH Guidelines on the Inclusion of Women and Minorities as Subjects in Clinical Research, http://grants.nih.gov/grants/guide/notice-files/NOT-OD-00-048.html.

April 29: Democratizing Expertise

The problem of rule by unelected experts is at least a century old, but our understanding of the scope of the problem and its proposed solutions has changed over the years. This week, we review some recent debates surrounding the democratization of expertise.

P. Stern and H. Fineberg, eds., Understanding Risk (Washington, DC: National Academy Press, 1996), pp. 73-96.
European Commission, Report of the Working Group "Democratising Expertise and Establishing Scientific Reference Systems," Brussels, May 2001, http://europa.eu.int/comm/governance/areas/group2/report_en.pdf.
UK Parliamentary Office of Science and Technology, Open Channels: Public Dialogue in Science and Technology, Report No. 153, March 2001, pp. 1-22, http://www.parliament.uk/post/pr153.pdf.
World Conference on Science, "Declaration on Science and the Use of Scientific Knowledge," Budapest, Hungary (1999), http://www.unesco.org/science/wcs/eng/declaration_e.htm.
S. Visvanathan, "Knowledge, Justice and Democracy," in M. Leach, I. Scoones and B. Wynne, eds., Science and Citizens (London: Zed Books, 2005), pp. 83-94.
M. Kusch, "Towards a Political Philosophy of Risk," in T. Lewens, ed., Risk: Philosophical Perspectives (London: Routledge, 2007), pp. 131-155.
Revisit Sunstein and Zeckhauser.

May 6: Experts in a Global World

How have the nature and problems of expertise shifted as social problems spill across the borders of nation states? How do lay-expert disagreements play out on transnational stages, and can expertise be held democratically accountable when the "demos" in question is not the citizenry of a single nation state? Who are the experts that advise international bodies, and how is their expertise rendered accountable? Where does accountability fail?

S. Jasanoff, "Product, Process, or Programme: Three Cultures and the Regulation of Biotechnology," in M. Bauer, ed., Resistance to New Technology (Cambridge: Cambridge University Press, 1995), pp. 311-331.
M. Goldman, "Inventing the Commons: Theories and Practices of the Commons' Professional," in Goldman, ed., Privatizing Nature: Political Struggles for the Global Commons (New Brunswick, NJ: Rutgers University Press, 1998), pp. 20-53.
G.C. Bowker and S.L. Star, Sorting Things Out: Classification and Its Consequences (Cambridge, MA: MIT Press, 1999), Ch. 3 ("The ICD as Information Infrastructure"), pp. 107-133.
C.A. Miller, "Democratization, International Knowledge Institutions, and Global Governance," Governance 20:325-357 (2007).
K.M. Vogel, "Biodefense: Considering the Sociotechnical Dimension," in A. Lakoff and S.J. Collier, eds., Biosecurity Interventions: Global Health and Security in Question (New York: Columbia University Press, 2008), pp. 227-256.
May 13: Concluding Reflections

Can we develop more mature ways of relying on experts in complex, multicultural societies, confronting intractable, potentially catastrophic problems, aware of the limits of our collective knowledge and experience, and yet seeking rational policy choices? How?

D.A. Schon and M. Rein, Frame Reflection: Toward the Resolution of Intractable Policy Controversies (New York: Basic Books, 1994), pp. 3-58.
S. Jasanoff, "Technologies of Humility: Citizen Participation in Governing Science," Minerva 41:223-244 (2003).

*** FINAL PAPER DUE ***