DRAFT – PLEASE DO NOT CITE

FOLDING: INTEGRATING ALGORITHMS ON THE FLOOR OF THE NYSE

Daniel Beunza
London School of Economics
d.beunza@lse.ac.uk

Yuval Millo
University of Leicester
ym95@le.ac.uk

February 26, 2013

Abstract

What are the consequences of automation on the social structure of a market? So far, existing studies have focused on technologies designed to dilute social relations, overlooking instances where the opposite effect might take place. Our ethnography of the NYSE addresses this gap by exploring the automation of its trading floor at various points in time. First, we observed the NYSE prior to automation in 2003, and outline the functions traditionally played by its intermediaries. Second, we compared its two rounds of automation, in 2006 and in 2008, using a combination of ethnographic observation and interviews, and identify the mechanisms that allowed the automated NYSE to effectively preserve its structure. We refer to this as folding. Third, we use the 2008 round of automation as a natural experiment to test the separability of the social and material aspects of the market. Finally, we use the Flash Crash of May 2010 to examine the effects of automation on the ability of the NYSE to perform its traditional exchange functions. Our proposed notion of folding contributes to economic sociology by outlining how a market can retain the advantages provided by social structure while accessing the benefits of automation.

FOLDING: INTEGRATING ALGORITHMS ON THE FLOOR OF THE NYSE

The automation of financial markets poses new challenges to scholars of markets and policy makers. Trading in American stock exchanges is now almost entirely dominated by financial algorithms (Tabb, 2011).
As with other automation processes, financial algorithms have not only reduced labor costs but also transformed existing practices, altering the social structure of the market by displacing traditional intermediaries such as floor brokers and specialists. The risks posed by these changes became apparent with the Flash Crash of May 6, 2010, the fastest and second-largest percentage-point decline in the history of the Dow Jones, widely attributed to the use of trading algorithms (CFTC/SEC, 2011). The occurrence of the Flash Crash suggests that the consequences of automation have not been fully understood, and calls for an analysis that focuses on its more profound effects.

What are the consequences of automation on the social structure of the market? In grappling with financial automation, sociologists can benefit from an established literature on market intermediaries but need to consider the effects of technology on these intermediaries. In particular, sociologists have pointed to the embedded and institutionalized nature of securities exchange (Baker 1984, Abolafia 1996). But by virtue of the time in which this literature was developed, it presupposes the presence of a human at both ends of the transaction – a presence that automation has consigned to the past. Alternatively, research in the social studies of finance offers a clearer focus on technology (MacKenzie 2006; MacKenzie and Millo 2003, Beunza and Stark 2004), but its emphasis on financial models does not extend to the risks posed by automation. A related literature in science and technology studies has explicitly considered the effects of new technology on social relations. For instance, Callon (1998) has shown how technology is introduced in markets to dilute the effect of social ties, rendering the market closer to the atomized ideal espoused by economic theory.
But this is not the only possible outcome: for instance, Muniesa’s (2004) study of the automation of the Bourse de Paris points to a very different effect, namely, the preservation of the original social structure of the market. A more complete understanding of the effect of technology on social structure is thus needed to grasp the full consequences of market automation.

Our study explores this by examining the introduction of algorithms in the New York Stock Exchange (henceforth “the Exchange”). Founded in 1792, the Exchange is known for its iconic trading floor and specialized form of market makers – the “specialist system.” The NYSE’s trading floor remained vibrant through the 1990s even as rival exchanges in London and Paris closed theirs. But in 2006 the Exchange confronted the regulatory mandate to automate following the promulgation of Regulation National Market System by the Securities and Exchange Commission. The Exchange decided on an original automation design that would introduce algorithms while preserving its trading floor and intermediary structure of specialists and floor brokers. Its first attempt to do so was largely unsuccessful, leading to a steep drop in its market share (see Table 1). Its second attempt, introduced in 2008, led to a stable market share and a robust performance during the Flash Crash of 2010, suggesting the technical viability of integrating algorithms within an existing social structure.

--- Table 1 here ---

Our study asks four questions about the automation of the NYSE. First, what were the advantages of the original specialist system that the Exchange sought to preserve? We examine this with ethnographic observations on the floor of the Exchange before automation in 2003.
We categorize the advantages of the specialist system into three broad groups – coordination, sensemaking and norm enforcement – and find that these are consistent with sociological studies of intermediation (Simmel 1902 [1950], Burt 1992, Baker 1984, Obstfeld 2005) as well as ethnographies of trading floors (Abolafia 1996, Zaloom 2001, Pitluck 2007).

We ask a second, related question: how did the Exchange automate its processes while preserving its social structure? We identify the underlying mechanism by comparing the two rounds of automation at the Exchange, in 2006 and 2008, using a combination of ethnography and oral history interviews. On the basis of these, we develop the notion of folding. Drawing on a culinary metaphor, we define folding as a process of automation that preserves the original social structure of a market or organization. In the realm of cuisine, folding refers to the act of gently combining a delicate mixture into a thicker one without impeding the ability of both to work as desired. For instance, when adding chocolate powder to whipped cream the chocolate needs to be sprinkled on successive layers of cream rather than stirred into it; this prevents the thicker chocolate from bursting the air bubbles of the more delicate cream. In finance, we use folding to designate a form of automation that keeps the social structure in place, thus preserving its original function.

Third, was social structure preserved by retaining the original technology, or in a new technological setting? This question speaks to an important academic debate on the relationship between technology and society. In the past two decades sociologists of science have challenged the traditional notion of social structure by putting forth the view that the sphere of the social is inseparable from that of the material (Pickering 1993; see review by Orlikowski and Scott 2008).
Thus whereas sociologists rely on social structure to explain phenomena in markets, science scholars contend that structure cannot be abstracted from its material setting, and that it is produced by that material setting. Our study addresses this debate using the NYSE automation as a natural experiment to test the separability of the social from the material. We find that the NYSE managed to fold algorithms into the specialist system by creating a dual trading system that combined an automated matching engine with the traditional specialist auction on the floor. These two modes, however, did not coexist in parallel but were sharply separated in time through an elaborate switching mechanism explored below. This has two implications. First, the duplicative approach to folding chosen by the NYSE (a trading floor and a data center) suggests that the NYSE’s role structure could not be extended to a material setting different from the original – a form of inseparability. Second, and in spite of this inseparability, the specialist role existed in the minds of the actors at the NYSE with independence from any material basis. Indeed, the central reason for the second round of automation was to preserve the viability of specialists and floor brokers. We theorize these findings by building on Feldman and Pentland’s (2003) notion of ostensive routines, and conclude that social structure, although inseparable from its original material setting, is an explanatory factor rather than a consequence of social action.

Having explored the automation process at the NYSE, our study returns to the problem of automation for the market at large. What are the consequences of automation for market properties such as liquidity and stability? To answer this question we exploit the Flash Crash of 2010 as another natural experiment. The crash threatened all exchanges equally, including the Nasdaq, BATS and Direct Edge (CFTC/SEC 2011).
Yet the NYSE was the only exchange that managed to avoid cancelling its trades after the crash; all the other (automated) exchanges experienced hundreds of trade cancellations, undermining investor confidence. We conclude that an intermediary social structure performs important functions in the market that need to be taken into account in the ongoing design of market automation.

AUTOMATING THE MARKET INTERMEDIARY

Understanding market automation calls for a firm grasp of the ways in which market intermediaries shape the exchange of securities. A voluminous literature in the economics of market microstructure has explored these intermediaries using the industry term, “specialist system,” to describe the NYSE and other exchanges. Saar (2010) defines a specialist market as “a hybrid market structure that includes an auction component (e.g., a floor auction or a limit order book) together with one or more designated market makers (‘specialists’) who trade as dealers for their own account. The designated market makers have some responsibility for the market.” Saar thus points to three aspects of intermediaries encoded in the expression “specialist system”: a set of roles (specialist and broker), a practice (the call auction), and a material setup, primarily the trading floor. Structure, practice and materiality are thus bundled together in the academic conception of the NYSE intermediaries.

Efforts to understand the effects of automation of the NYSE need to take these three aspects into account. In doing so, we draw on two distinct intellectual traditions, economic sociology and science and technology studies. Economic sociologists have focused on the structural component of markets with the study of the intermediary. Appreciation for the intermediary goes back to the study of third-party mediation formulated by Simmel (1902 [1950]).
Simmel’s third party profits from exploiting the disunion of the other two, as elaborated in Burt’s (1992) concept of brokerage; or it can profit from moderating the forces that divide the group, as formalized by Baker (1984), Khurana (2003), and Obstfeld (2005). This structuralist approach is complemented by ethnographies that explored the material and embodied aspects of trading floors (Abolafia 1996, Zaloom 2001, Pitluck 2007). Trading, these contend, is not only shaped by the structure of roles and social ties but also by the material basis of trading such as the architecture of the building or the choice of open outcry technology.

What do financial intermediaries do?

Our own reading of the aforementioned literatures offers an answer to the question of what intermediaries accomplish. Financial intermediaries within exchanges are particularly relevant in markets characterized by opportunism and uncertainty (Baker 1984). Intermediaries provide coordination (Khurana 2003, Abolafia 1996), sensemaking (Zaloom 2001, Pitluck 2007) and norm enforcement (Baker 1984, Abolafia 1996), as well as give rise to their own form of opportunism (consistent with Burt 1992). We consider each of these mechanisms below.

Coordination. One key contribution of intermediaries is to facilitate exchange by coordinating the transacting parties. The theoretical mechanisms are particularly clear in Khurana’s (2002) study of a different form of intermediary, executive recruitment firms. These headhunters, he argues, are particularly useful in contexts of opportunism and uncertainty: as he puts it, in contexts of “few buyers and sellers, high risk to both parties, and institutionalized gaps between them” (Khurana 2003: 241). At its most basic, the coordinative function of headhunters entails matching, that is, mobilizing the intermediary’s contacts to expand the array of potential trading partners.
Coordination extends to buffering the uncertainties entailed in the transaction. It also includes pacing the rhythm at which the parties interact by dictating a schedule and offering their resources to help the transaction take place. While not explicitly labeling the process as coordination, ethnographies of trading floors have identified how brokers and market makers accomplish these functions. Abolafia (1996), for instance, underscored the role of NYSE specialists in matching buyers and sellers, and in buffering imbalances in demand and supply.

Sensemaking. Ethnographies of exchanges have also outlined how trading floors produce social cues that allow the transacting parties to engage in sensemaking, that is, give meaning to incoming orders and prices amidst market uncertainty. For instance, Zaloom (2001, 2006) has documented how these cues arise non-purposefully in the very act of trading: “because of the physical and emotional information conveyed with numbers,” she argues, “not all bids and offers are equal” (Zaloom 2001: 263). An order conveyed with a fearful voice, for instance, elicits a different response than one conveyed with confidence. A more purposeful form of sensemaking helps intermediaries overcome the problem of adverse selection identified in the economics literature. As microstructure economists have established, a customer agreeing to trade at a given price may be trading because he or she knows something that the other side does not, posing a risk for the latter (Glosten and Milgrom 1985). This so-called “lemons” problem discourages transactions, drying up liquidity. But as the ethnographic work of Pitluck (2008) shows, intermediaries can address this problem by crafting their discourse appropriately. For instance, they can disclose the identity of the seller in a limited fashion by using an abstract category rather than the full name.
This semi-identification can give meaning to the seller’s actions, eliminating the buyer’s suspicions of adverse selection while protecting his or her identity. As Pitluck (2008: 3) concludes, “a market’s anonymity is a social byproduct of market participants’ strategic interaction as they exchange, conceal, or reveal identity information.” We refer to the combined task of exchanging information and displaying social cues as partial disclosure.

Norm enforcement. The sociological literature discusses yet another intermediary role of the exchanges, namely, enforcing norms and limiting opportunism. Baker (1984) established that market makers in an exchange enforce norms such as not selling while prices are falling. They do so by “freezing out” opportunistic colleagues from trading (Baker 1984: 782). Norm enforcement has also been a traditional role of the specialist, as shown in Abolafia’s (1996) ethnographic study of the NYSE. Over the years, he found, the Exchange instituted formal controls over its specialists, including “affirmative” and “negative” obligations, an embryonic computerized auction, and a bureaucracy that awarded new listings to compliant specialists. These formal means of control were supplemented by an informal culture of “rule veneration” (Abolafia 1996) whereby the Exchange’s rulebook was repeatedly cited and known to everyone on the floor.

Opportunism. Sociologists have also explored the problems that intermediaries on trading floors can themselves create. These go back to the intermediary’s tendency, discussed by Burt (1992), to exploit his or her structural advantage. In the context of trading floors, these ideas are echoed in Abolafia’s (1996: 105) analysis of the NYSE prior to the reforms of the 1960s: “in the 1920s,” he explains, “specialists aided and abetted ‘bear raids’ and manipulated their stocks” (see also Brooks 1969, Sobel 1975).
The paradox is thus clear: while trading floors aim at limiting opportunism among transacting parties, they can also generate their own form of opportunism. Indeed, Abolafia (1996) posits a feedback model in which intermediary opportunism ebbs and flows: extreme opportunism prompts regulators and exchange bureaucracies to impose restraint, but the resulting reduction in opportunism leads to laxer controls. At the time of his study, 1990 to 1992, Abolafia was impressed by the restraint at the NYSE, which contrasted sharply with his observations on the culture of the investment banks and commodities exchanges. But according to his model, such restraint can be the breeding ground for opportunism later on.

In sum, sociological studies highlight three key functions of intermediaries in exchanges, as well as a danger of opportunism. Yet these studies are only the first step, for they all refer to a human standing between other humans. In outlining the effects of automation on the intermediary, studies should examine its effects on coordination, sensemaking, norm enforcement and opportunism. A greater analytical focus on technology is thus required, and more specifically a focus on the algorithmic technology that has automated financial exchanges.

How does automation reshape the market intermediary?

By automation we refer to the introduction of technology that increases labor productivity, as in the replacement of workers by machines. In finance, the native expression for automation is the “introduction of algorithms.”1 In theorizing the effect of these algorithms we draw on the broader literature on science and technology studies. Of particular relevance is research in actor-network theory, which has granted agency to objects (Latour 1991, Callon 1986). Specifically, Callon (1998) argued that technology is often introduced in markets with the express objective of reducing the effect of social relations on value.
In the example he gives, a change in the architecture of the strawberry auction house in the French town of Sologne reduced the influence that social ties between farmers and traders had on the value of strawberries (Callon 1998, Garcia-Parpet 2007). By separating or “disentangling” the strawberries from the social relations between buyer and seller, the French auction house redefined the value of the fruit around intrinsic properties such as weight, size and ripeness, bringing the market closer to the economic ideal of the atomized decision-maker (see also MacKenzie and Millo, 2003).

The sociological literature has also pointed to the possibility that automation might be a complement rather than a substitute for the existing structure. Muniesa’s (2004) study of the Bourse de Paris notes that the committee charged with automation initially entertained a design that preserved the trading floor, allowing automated and manual trading to complement one another. By keeping the trading terminals on the exchange floor rather than at the banks’ offices, the French planners hoped to have “something like criee groups [crowds] with computers” (Muniesa 2004: 16). But the committee eventually changed course and eliminated the trading floor altogether. Indeed, the historical site of the Paris Bourse at the Palais Brongniart is nowadays devoid of brokers or market makers, and used primarily to host corporate events. Although the complementarity between automation and social structure was not the focus of Muniesa’s study, we drew inspiration from his initial use of the term, as in “Folding a Market Into a Machine,”2 for our own definition.

1 Financial algorithms are akin to the algorithms used in social media: the relevance algorithms used by Google, the user recommendations on Amazon’s site, or the customized newsfeeds shown by Facebook. In all of these, the algorithm mediates the inputs of users to reproduce some of the advantages of the social context. In a similar vein, financial algorithms – whether in order matching or order execution – replace tasks traditionally performed by humans, such as making sales calls or working an order. By automating a task previously performed by intermediaries, algorithms can render these intermediaries redundant and thus reshape social structure.

Our interest in such complementarity goes beyond the theoretical. An automation path that preserves social structure could address a key problem that legal scholars have highlighted when discussing automation, namely, the loss of the norm-enforcement mechanisms that make markets and organizations viable (David, 2010). According to Lessig (2000), automation entails a fundamental change in the governance of social activity. Legal and social norms are replaced by computer code, and power is shifted to new groups with different interests (see Barley [1986] for a related argument at the level of the organization). Left unattended, Lessig adds, this replacement of rules by code is at risk of being hijacked by technologists. The latter may find ways to profit from opportunistic activities that were previously barred by social norms, yielding technologies that run ahead of the system’s ability to manage them. Lessig insists that automation must be designed in a way that produces what he calls “electronic communities” rather than “hyper-connected networks” dominated by opportunism. The complementarity that Muniesa hints at, if theoretically developed, may thus offer a framework to address Lessig’s concerns.

The separability between the social and the material

The distinct automation path of the NYSE speaks to a related academic debate over the sociological conceptualization of technology.
Starting with the work of Simmel and Weber, sociology has been premised on a notion of social structure that is independent of the material setting – “enduring and relatively stable patterns of relationship between different entities or groups” (Levi Martin 2012: 4). Science scholars have challenged this view of structure by contending that there is a co-constitution between the social and the material (Pickering 1993). Within science studies, actor-network theorists have argued that social life is made durable by material associations, and that as a result the social is a consequence rather than an explanation for action (Latour 1986, 2005). Whether in the form of co-constitution or in the stronger actor-network formulation of the social as outcome, these scholars posit an inseparability between the social and the material, challenging traditional sociological formulations of social structure (Orlikowski and Scott 2008).

2 The differences can be explained by the multiple meanings of the term folding. According to the American Heritage Dictionary of the English Language (4th ed., 2004), it is defined as “to make compact by doubling or bending over parts,” as in folding a sheet of paper or the laundry. But folding is also defined as “to blend (a light ingredient) into a heavy mixture with a series of gentle turns,” as in folding beaten egg into batter. While Muniesa built on the first of these meanings (compacting social relations in a material artifact), we developed our notion on the basis of the second meaning (blending new technology into an existing pattern of social relations).

More recently, Feldman and Pentland (2003) have contributed to this debate by detailing a different form of separability between the social and the material. The authors distinguish between the ostensive (abstract, ideal) and the performative (enacted, concrete) aspects of organizational routines.
Against Latour (1986), the authors argue that routines entail an ostensive aspect that is abstract, distributed, and manifested in overlapping narratives used by organizational actors. This view implies a different understanding of the role of routines in organizational change. Such change, the authors argue, can take place when performative routines differ from their ostensive ideal and the difference is retained. The work of Feldman and Pentland (2003) has implications for the aforementioned debate over the separation of the social and the material. Because the ostensive aspect of routines they emphasize is akin to social structure, their study can be read to imply that social structure can exist regardless of its material basis.

Our study provides a setting to advance this debate. If social structure and the material setting are indeed separable, organizations should be able to preserve their original structure when introducing a new technology. If, on the other hand, the two were inseparable we would not expect to see the original structure surviving automation. Our study speaks to this in connection with the automation of the NYSE.

RESEARCH METHODS

Research Site

Our ethnographic design departs in various ways from the canonical single-site, single-period approach to fieldwork. In addition to our primary period of observation during 2008-10, we relied on observations in 2003, on oral history interviews, on interviews within the field of securities trading during 2008-10, on follow-up interviews in 2011-12, and on the occurrence of the Flash Crash. We discuss these in greater detail below.

Data sources

Fieldwork at the NYSE 2008-10. Our primary data entails fieldwork at the NYSE during 2008-2010. By then, the regulatory mandate that forced the Exchange to automate had already been promulgated, and the Exchange was in the midst of profound change. The turmoil became clear in our difficulties in gaining access.
Our initial point of contact in the Research division of the Exchange was made redundant after our second interview. We persevered, and after several attempts were invited by an official to visit the floor. It was during that visit that we serendipitously met the chairman, Duncan Niederauer, who agreed to a case study of the transformation of the Exchange. As a result, over the period 2008 to 2010 we made 31 visits to the NYSE and interviewed 19 officials, including its chairman, the top management team and several floor governors. We conducted detailed observation of the floor booths of two brokers, VDL and Rosenblatt Securities, and of the post of Bank of America, later sold to Getco LLC. We interviewed the designated market makers and floor brokers responsible for these booths, as well as the clerks who worked with them. In addition, we observed two regular market openings, one market closing, and one special situation during the record-volume stock-rebalancing auction of Citigroup.

Observations in 2003. In making sense of the observations noted above, we benefited from previous fieldwork at the NYSE. We first visited the Exchange in June 2003. At the time we interviewed a specialist, a floor broker, a research official and a compliance officer, witnessed a market opening ceremony, and conducted observations on the trading floor and at the Luncheon Club. This provided a precious window of observation into a world that would subsequently disappear and gave us grounds to compare the Exchange before and after automation.

Follow-up after 2010. Our data also extends beyond 2010. We conducted five follow-up interviews in person at the NYSE during 2011, as well as five telephone interviews. We also conducted nine follow-up interviews in person during 2012.
These allowed us to gauge the response of the Exchange to the Flash Crash and lent historical continuity to the changes that the Exchange had put in place during 2008, including the sale of Bank of America’s floor operations to the algorithmic trading company Getco LLC. These changes also led us away from the Exchange for interviews with ex-specialists who had been laid off in the four years since our study began.

Interviews outside the NYSE. We complemented our ethnographic data with interviews of industry participants and academics in financial exchanges. This encompassed officials at competing exchanges, including the President and Chief Executive Officer of the International Securities Exchange, the Chairman and three officials at the American Stock Exchange, as well as three high-ranking officials at the Nasdaq. We also interviewed a high-ranking official at Goldman Sachs, which has both a sales trading desk and a dark pool. Our data also went beyond practitioners to include regulators, consultants and academics. Of these we interviewed two officials at the Securities and Exchange Commission, including Lawrence Harris, the Chief Economist of the SEC during the key period when Reg NMS was enacted. We also interviewed industry consultants and academic specialists on market microstructure, including Stephen Wunsch, Wayne Wagner, and William Harts. In addition, we interviewed influential market microstructure academics such as Lawrence Glosten and Charles Jones, and attended three academic seminars in market microstructure.

Analysis

To build theory from the case, we make use of various analytical strategies. Following Agar (1986), we identified breakdowns in our initial conception of the phenomenon and reconceptualized our thinking around them. We also used grounded theory as defined by Glaser and Strauss (1967) by conducting two within-case comparisons.
We first compare the two rounds of automation at the NYSE: the so-called Hybrid of 2006-08 and the New Generation Model implemented in 2008-10. Because the first automation round was relatively unsuccessful and the second one was more successful, the comparison allows us to identify the mechanism that underlies the simultaneous automation and preservation of the original social structure. Second, we compare the practices at the NYSE before automation in 2003 with those after the second automation in 2008-10. Finally, the Flash Crash of 2010 provided us with a form of natural experiment. The crash affected all the equities exchanges and trading venues in the US, but their performance varied dramatically. The differences in impact provide us with grounds to explore whether the NYSE’s automation design gave it capabilities superior to those of its counterparts.

The use of an extended ethnography also responds to the call by Feldman and Pentland (2008: 312) for studies that are both ethnographic and longitudinal in order to uncover the relationship between espoused (“ostensive”) structures and actual routines. We complemented our organizational ethnography with interviews and observations in the exchanges industry at large. We observed trading and interviewed the management of rival exchanges, attended technology fairs and conferences, and interviewed regulators as well as technologists. This contextualization was important given the external nature of the impetus that forced the Exchange to automate: because the decision to automate was taken by regulators outside the NYSE, it was critical to look beyond the Exchange to capture both sides of the debate that led to enforced automation (see Fligstein [2008] for a general elaboration of this argument).

While our main method for data collection is ethnographic, the methodological scope of our investigation is also historical sociology.
As such, we regard the data we collected – about the day-to-day actions of the actors at the NYSE – as being contingent upon wider, transformative events (Paige, 1999). Our choice of this method is motivated by three main reasons. First, the transformative events that financial markets faced in the last decade are similar in many respects, although on a narrower scale, to the grand events of discontinuity and change that were the backdrop for the social theories of Marx and Weber (Skocpol, 1984; Riley, 2006). Second, the automation of financial markets and its implications is clearly a “big” question – an issue that both academics and the wider public are interested in and that therefore calls for an investigation of historical scope (Mahoney and Rueschemeyer 2003). Third, the multiplicity of arenas and scales in which the automation of financial markets took place, and the complex potential causalities that accompany such events, fit the comparative-historical approach in historical sociology (Mahoney, 2004).

AUTOMATING THE NYSE

2003: The NYSE Before Automation

It did not take us long to witness the social nature of the NYSE. We first entered its iconic trading floor at 9:25 on the morning of May 23, 2003. We were there to observe the opening of the market, guided by an Exchange official named Murray Teitlebaum. Looking up from the floor we saw the chairman of the NYSE, Richard Grasso, standing on a podium and accompanied by a mixed troupe that included high-ranking military officers and Miss America (O. Jennifer Rose) dressed in full Beauty Queen costume – including bathing suit, crown and sash. As 9:30 am approached, the crowd of brokers on the floor began clapping, the bell clanged and a raft of camera flashes immortalized the moment. After the bell, a loudspeaker invited everyone to join as Miss America sang the national anthem to commemorate Memorial Day.
Such an elaborate beginning of the trading day -- with the exception of the anthem -- is performed daily at the NYSE. The ceremony was revamped by Grasso in 1995 in response to a technology-themed advertising campaign by the Nasdaq. By inviting celebrities and giving television stations access to the floor, Grasso created an event that could be broadcast: a live performance, in response to Nasdaq’s electronics. The strategy proved successful, and by 2003 the Exchange counted 55 international television networks broadcasting live from its floor throughout the day. Social interaction is thus not merely a means of conducting transactions at the NYSE, but a central feature of its brand and identity, celebrated and leveraged by management. The NYSE, we quickly learnt, rested on a social division of labor between specialists and floor brokers – the so-called specialist system. Specialists acted as both principals and agents on designated stocks; brokers represented clients’ orders. As principals, specialists were expected to “make a market” in designated stocks, that is, to act as counterparty to buyers and sellers; as agents, specialists were expected to hold call auctions. Specialists stood still at the trading post; brokers took clients’ orders from vertical telephone booths in the periphery of the floor and placed them by walking to the specialist’s post in the center of the floor. Specialists wore sober suits; clerks and brokers dressed in colorful jackets. Seen from above, then, the Exchange looked like many other markets: a few individuals standing still, surrounded by others walking between them. Like a market on a public square, and unlike the trading rooms of investment banks, the Exchange’s floor grouped together both sides of the transaction in the same space, allowing for face-to-face interaction across the sensitive buyer-seller divide. What functions did this peculiar setup allow?
We addressed this question with our ethnographic observation. We started by walking to the post of Robert Hardy, a specialist in several French companies who worked at the post of Fleet Financial. Hardy gave us the first clues into the nature of the specialists’ job, confirming the presence of the three mechanisms discussed above: coordination, sensemaking and norm enforcement.

Coordination. Hardy coordinated buyers and sellers by conducting call auctions at his post at designated times, with a clerk behind him typing the prices he dictated on a computer terminal known as the Display Book. To do so, Hardy established the price at which demand and supply equilibrated, an activity that the Exchange denoted “price discovery.” These call auctions had a special feature: they batched all the orders before setting a price, “pouring” orders “like water on a swimming pool,” as someone explained to us. This reassured brokers that their orders would be processed at the same time, preventing the proverbial rush to the fire exit. An ex-specialist gave us the following example:

Let’s say you are a seller for 200,000; you are a seller for 200,000; you are a seller for 200,000; I’m a specialist, I come in and say ‘Calm down, all right, just everybody calm down, what do you have to do? [moving his head left] What do you have to do? [moving right] What do you have to do?’ ‘Ok the market right now is $20 bid for 100 shares and a million shares offered at $21’ you show it on the screen. ‘Ok what do you want to do? You want to sell 100 shares now at the dollar? Ok now you sold 100 shares, now the market is $19 for 100 shares, do you want to sell another 100 shares?’ ‘ok let’s calm down, let’s see if we can find some buyers, let’s see what happens at various prices, let’s talk this thing out, let’s do business’.

The quotation suggests that price discovery is a peculiar form of economic intervention, smoothing prices by managing people.
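The batching at the heart of a call auction can be conveyed with a minimal sketch (our illustration, not the NYSE’s actual procedure; the order sizes and prices are invented): all orders are collected first, and a single clearing price is then chosen to maximize the volume that can be matched.

```python
def call_auction_price(bids, asks):
    """Toy single-price call auction. Orders are (price, size) tuples.
    Returns the clearing price that maximizes executable volume."""
    candidates = sorted({p for p, _ in bids} | {p for p, _ in asks})
    best_price, best_volume = None, 0
    for p in candidates:
        demand = sum(s for price, s in bids if price >= p)  # buyers willing to pay p
        supply = sum(s for price, s in asks if price <= p)  # sellers willing to accept p
        if min(demand, supply) > best_volume:
            best_price, best_volume = p, min(demand, supply)
    return best_price, best_volume

# All orders are batched and cleared together at one price -- "poured like
# water on a swimming pool" -- rather than matched one by one on arrival.
bids = [(21, 300), (20, 500), (19, 200)]
asks = [(19, 400), (20, 300), (22, 500)]
print(call_auction_price(bids, asks))  # (20, 700)
```

Because every order in the batch receives the same price, no broker gains by trading first: this is the property that, in the specialists’ hands, prevented the proverbial rush to the fire exit.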
Such pacing of the rhythm resulted in “slowing the transacting, preserving the flow,” as another specialist explained. The role of time, we realized, was particularly important in trading because liquidity is a temporal variable. Liquidity refers to the availability of counterparts during a given interval of time. Batching orders, as the specialists did, extended the length of time used for matching, increasing the likelihood of finding a counterparty. In this sense, the specialists’ role in price discovery is similar to that of the headhunters analyzed by Khurana (2003): by managing the speed of the interaction between buyer and seller, both headhunters and specialists ensured that the rhythm of activity was not destructive to the successful completion of the transaction. Indeed, in their efforts at pacing trading, the specialists went as far as to routinely freeze the computerized Display Book in order to prevent it from sending orders during the call auction. Freezing later became the basis for an SEC lawsuit discussed below. Another form of coordination was the specialists’ role as market makers, dealing in stock for their own proprietary account in order to limit volatility. Specialists, Hardy explained, were “like shock absorbers,” compensating the pressures of supply and demand to make prices more stable. This contributed to fulfilling one of their affirmative obligations, namely, keeping a “fair and orderly market.” To do this, Hardy relied on the information he had about the upcoming orders in the book. The resulting temptation to sell ahead of a client’s order, “front-running” the customer, was partly managed by a negative obligation: the specialist was not allowed to trade for his own account at a price at which an unexecuted agency order he or she was holding could be executed.
In practice, then, the specialists took a position in the reverse direction of the latest price movement: if the specialist received many sell orders, he would want to buy (Rutigliano interview). The specialists also relied on price and order book signals: as we saw, Hardy used price charts with trend lines as a retail investor would. Such was the Exchange’s preference for public information that the use of personal mobile phones was not allowed on the floor. In sum, the specialists’ market-making activity performed a coordinative function and relied primarily on trading flow information to do so.

Sensemaking. Our observations also underscored the importance of sensemaking. We observed this at Hardy’s post when a short broker named Salvatore walked over. “Salvatore and I,” Hardy said, touching Salvatore’s shoulder in appreciation, “worked together fifteen years ago.” But Salvatore’s visit was not a courtesy call: “still here snooping?” asked another broker when Salvatore arrived at the post. Salvatore, it turned out, had a large buy order; instead of simply handing it over to Robert he told him about it. “I think it’s a little heavy,” Robert replied, suggesting that there were many other buy orders at that time, and that Salvatore might want to come back later. This partial disclosure of book orders enabled Salvatore to better time his order. The practice, known as “giving a view,” allowed specialists to provide information that elicited additional orders without compromising the positions of the existing ones. We observed this yet again as we followed Dan, another floor broker, from one specialist post to another. When Dan approached a specialist with a question, the answer was “stock’s hanging in there, lots of machine buying, Morgan’s a seller, Merrill has an interest.” In subsequent interviews with industry consultants we learnt just how elaborate and important the conversations between buyers and sellers on the floor were.
Conversations were crucial for matching large blocks of shares, where the problem of adverse selection is most acute. Adverse selection limits the willingness of specialists and floor brokers to disclose the size of their position for fear of influencing the other side. But as we learnt from microstructure consultant Wayne Wagner, some disclosure is inevitable: given the adversarial nature of transacting, one side cannot interest the other without saying something about the size and nature of the block (Wagner interview). As Wagner eloquently wrote in a trade journal, this complicates matters, as even a minimal disclosure exposes the actor to opportunism: “it is impossible to draw a black and white distinction between seeking liquidity and violating confidentiality (…) The market maker cannot accelerate liquidity arrival without revealing trading interest” (Wagner, 2004: 5). Matching large orders thus calls for a high level of trust among the negotiating partners. It was this trust that allowed NYSE specialists to tap into the latent demand and supply for stocks held by institutional investors (Wunsch interview; see also Wunsch 2011). In short, sensemaking at the NYSE took place through partial disclosure and was particularly important for large blocks of shares.

Norm enforcement. We learnt about the Exchange’s norm enforcement mechanisms from Robert Hardy over an elaborate breakfast at the NYSE’s Luncheon Club. Conflicts were managed on the floor through a combined mechanism of formal rules and informal norms. This mechanism included the figure of the floor governor. Governors worked for the Exchange, and controlled a bureaucratic system of listings allocation that rewarded the specialists who followed the norms. Hardy provided us with various examples of instances in which he controlled his self-interest.
Referring to one particular instance, he said:

The governor said to me, ‘Bobby, that was a good trade.’ I lost money on twelve consecutive orders, and then made money on the last one. But all got the same price.

The implication of this quotation is that, as Abolafia (1996) argued, specialists were characterized by restraint as much as by eagerness to profit. The Exchange had developed internal mechanisms that rewarded specialists for leaving money on the table for the sake of the good functioning of the system. Formal organization was not the only means of norm enforcement. As we followed Dan, the floor broker mentioned above, from one post to the next, we also noticed how he greeted, backslapped and saluted by nickname everyone he met on his way. Everyone on the floor was Johnny, Jimmy or Bobby; there were no Johns, Jameses or Roberts. Indeed, over the course of our many interactions -- whether at the elevator, on the floor, or elsewhere -- we observed that actors had developed an ability to make a quick joke that acknowledged the presence of the other without being formulaic. Interactions were humorous, fast, witty and casual. This was consistent with the findings of Baker (1984), who emphasized the importance of network cohesiveness in forestalling opportunism and bringing about an orderly market. In sum, our observations at the NYSE in 2003 showed that the Exchange appeared to fulfill the three functions of intermediaries discussed by sociologists. But our fieldwork also revealed the limitations of the trading floor.

Physical limits of the floor. Chief among the limitations of the floor was its almost complete disregard of new information technology. Indeed, as the Exchange expanded from one to five rooms over the decades, walking to each post became increasingly difficult for brokers. We understood this as we overheard an elevator conversation between two brokers.
Both used a special cloakroom to change their shoes before entering the floor, to avoid back pain. One of them had even gone beyond rubber soles and begun “experimenting” with his son’s skateboarding shoes. Similarly, as trading volume rose, the clerks who worked for the specialists found typing the orders on their Display Books increasingly difficult, with some doing so at the unlikely pace of seven keystrokes per second (Pastina interview). On so-called Marlboro Friday of 1993 (named after the sharp fall in Philip Morris’ price) the clerk at the Philip Morris post famously typed on his Display Book at such furious speed that the machine ended up breaking down from overheating. In short, by 2003 the Exchange seemed to be bursting at the seams, prompting us to ask ourselves (and write in our notes) whether we might not be seeing a world about to disappear.

Opportunism. Although our limited observations in 2003 did not hint at any form of broker opportunism, soon after our initial fieldwork the Exchange confronted various lawsuits against its Chairman and specialists. In July 2003, news that the board of the NYSE had granted Richard Grasso a combined retirement and compensation package of $190 million prompted widespread media outcry. Grasso’s resignation did not put an end to the threat to the institution: in May 2004, the SEC and New York State’s attorney general Eliot Spitzer submitted a civil lawsuit accusing Grasso and other board members of manipulating the NYSE board, and in October 2006 the New York Supreme Court ordered Grasso to repay the NYSE part of the compensation package. However, this decision was reversed in 2008 following an appeal. Shortly after the Grasso scandal erupted in 2003, the SEC also sued a number of the specialists.
They were accused of neglecting their obligations by engaging in inter-positioning (unnecessarily placing an order at a price between current bids and offers), in front-running (trading ahead of a client’s order), and in freezing the Display Book, as mentioned above. Following an internal investigation, in October 2003 the NYSE fined five of the seven specialist firms $150 million for habitual abuse of their market roles. The specialist firms also agreed to pay $240 million to settle with the Exchange, but the SEC persisted with the case separately. In 2006, however, a judge reversed the conviction of leading specialist David Finnerty in the SEC court case (Colesanti, 2008). The lawsuits prompted a top management change at the NYSE that paved the way for automation. Grasso had been a vocal defender of the trading floor and a detractor of automation; following Grasso’s departure, the Exchange’s board appointed John Reed, the ex-CEO of Citigroup, as interim chairman. Reed was a noted promoter of technology in finance, known for introducing the ATM in the 1970s. He introduced a new governance structure at the Exchange and led a search that culminated in the appointment of another technology enthusiast, John Thain, as Chief Executive (Gasparino 2006). In barely more than a year, the NYSE had lost its chief executive, changed its governance procedures, and appointed a new one. Yet these changes were soon overshadowed by an all-encompassing regulatory reform introduced by the SEC.

2004-08: Algorithms vs. intermediaries

Starting in 2005, a sequence of changes in regulation and technology combined to create an algorithm-based approach to securities exchange that challenged the dominance of the specialists at the NYSE.
These changes go back to the late 1960s: starting then, and over the course of the following four decades, the SEC mandated automation to put in place a form of managed competition among exchanges, known as the National Market System. The NMS, as it is generally known, connects the various exchanges via order routers, and routes every order to the exchange with the best price. The NYSE, long considered a “slow” manual exchange relative to automated competitors, was initially excluded from the SEC’s requirement to connect to the National Market. But by the early 2000s the SEC, with Lawrence Harris as chief economist, opposed the NYSE’s exemption on the grounds that it gave specialists an unfair advantage in the form of a “look-back option” (Harris interview). In December 2005 the SEC promulgated Regulation National Market System (Reg-NMS), requiring disclosure and immediate tradability of prices in all the exchanges of the National System, including the NYSE. This provision, known as Rule 611 and informally referred to as the “trade-through rule,” forced the Exchange to respond to an order within a second. Because the humans on the floor could take up to 30 seconds, the NYSE found itself in urgent need of accommodating automated trading. Reg-NMS had profound intellectual roots. According to Muniesa (2007), the computer-inspired view of financial markets that lies at the root of automation was first sketched in the utopian vision of an economist, Fischer Black (1971). The renowned economist, who also pioneered the computerization of libraries and hospitals as a consultant for Arthur D. Little (Mehrling 2005), was a strong advocate of automating the NYSE (Harts interview). At the heart of Black’s proposal lay a vision of trading as processing information, and of the Exchange as a self-organized book of orders -- namely, a database.
The critical role of price discovery performed by NYSE specialists, Black argued, could be left to investors posting their own bids and asks through limit orders. As for the specialists’ obligation to “keep fair and orderly markets,” Black interpreted it as a mandate to dampen price volatility. This form of intervention, he argued, was inconsistent with an efficient market and random-walk prices. In an automated exchange, Black (1971: 87) concluded, “there will be little need for dealers, market makers, or block positioners who maintain quotes for their own accounts.” In the following four decades, academic economists specializing in exchanges coalesced into a subfield known as “market microstructure” and debated the relative benefits of automation; by the mid-2000s, support for automation was widespread (e.g. Hendershott et al 2011). Beyond the work of academics, Reg-NMS would not have been possible without the efforts of private entrepreneurs. Between 1971 and 2004 these contributed to fulfill Black’s vision of a disintermediated and computerized exchange in various ways. The first was Instinet, a private company that offered computer links between institutions in the early 1970s, “with no delays or intervening specialists” (Behrens interview). In 1996, taking advantage of the then-new Internet technology, online trading entrepreneurs founded Island Inc. The company formed a so-called electronic communication network (ECN) that matched and executed internally the orders sent by its clients (Katz interview). At the heart of an ECN was a matching algorithm with explicit rules for prioritizing orders: unlike dealer-dominated exchanges like Nasdaq, ECNs delegated the pricing of stocks to an algorithm.
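A price-time priority rule of the kind ECNs used can be sketched minimally as follows (our toy illustration; real matching engines were far more elaborate): an incoming limit order executes against the best resting order on the opposite side for as long as the prices cross, and any remainder joins the book, ranked by price and then by arrival time.

```python
import heapq

class ToyBook:
    """Minimal price-time priority order book (illustrative only)."""
    def __init__(self):
        self.bids = []    # max-heap via negated price: (-price, arrival, size)
        self.asks = []    # min-heap: (price, arrival, size)
        self.arrival = 0  # at equal prices, earlier orders trade first

    def limit(self, side, price, size):
        """Submit a limit order; returns the trades it generates."""
        self.arrival += 1
        trades = []
        book, opp = (self.bids, self.asks) if side == "buy" else (self.asks, self.bids)
        while size > 0 and opp:
            key, arrival, rest = opp[0]            # best resting order
            best_price = key if side == "buy" else -key
            crosses = price >= best_price if side == "buy" else price <= best_price
            if not crosses:
                break
            fill = min(size, rest)
            trades.append((best_price, fill))      # trade at the resting price
            size -= fill
            if fill == rest:
                heapq.heappop(opp)
            else:
                heapq.heapreplace(opp, (key, arrival, rest - fill))
        if size > 0:                               # remainder rests in the book
            heapq.heappush(book, (-price if side == "buy" else price,
                                  self.arrival, size))
        return trades

book = ToyBook()
book.limit("sell", 20, 100)        # rests in the book
book.limit("sell", 21, 100)        # rests in the book
print(book.limit("buy", 21, 150))  # [(20, 100), (21, 50)]
```

Prices here are set entirely by the resting orders and the priority rule; there is no market maker with discretion, which is the sense in which, in an ECN, “every customer is a dealer.”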
In effect, this did away with the market maker’s prerogative to set prices: as one of our interviewees put it, in an ECN “every customer is a dealer.” In 1998, the SEC put the ECNs on an equal footing with Nasdaq market makers with Regulation Alternative Trading Systems, giving them access to the inter-exchange order-routing system the SEC had built in the 1970s. Following this move, and unable to compete, a nearly-bankrupt Nasdaq acquired three ECNs in 2003 and replaced its market makers with algorithms. As a Nasdaq executive explained, “the ECNs won” (Concannon interview). The private efforts at automation intensified during the 2000s. Wall Street banks and brokers, long opposed to the Exchange’s dominance, worked alongside new automated exchanges like Bats Exchange and Direct Edge. Run as consortia for the benefit of the banks, these exchanges offered low prices and fast speeds, and quickly took market share away from the NYSE (Wolkoff interview, Williams interview). These new exchanges instituted a system of payment for order flow: by offering rebates to customers who entered limit orders that were not immediately executed, they compensated them for “adding liquidity” to the system (Harris interview). The first venue to do so was the ECN Island (MacKenzie and Pardo-Guerra 2013). Partly in response, buy-side funds such as Fidelity followed suit by developing technology for automated order execution. To do this, technology firms like Pragma and Deep Value developed algorithms that broke up large orders into small parts, replicating the specialists’ ability to minimize the market impact of large orders by “working an order.” The best known among these were the so-called VWAP, or “volume-weighted average price,” algorithms. These helped ECNs by providing a mechanism to handle large orders, rendering the matching capability of the NYSE floor less unique.
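The intuition behind a VWAP algorithm can be shown with a short sketch (our own simplification; the profile numbers are invented and production algorithms add many refinements): the parent order is sliced in proportion to the volume the market typically trades in each interval, so that the resulting fills track the day’s volume-weighted average price.

```python
def vwap_slices(parent_size, volume_profile):
    """Split a large 'parent' order across intervals in proportion to
    each interval's share of expected volume, limiting market impact."""
    total = sum(volume_profile)
    slices = [parent_size * v // total for v in volume_profile]
    slices[-1] += parent_size - sum(slices)   # absorb rounding in the last slice
    return slices

def vwap(fills):
    """Volume-weighted average price of a list of (price, size) fills --
    the benchmark against which execution quality is judged."""
    return sum(p * s for p, s in fills) / sum(s for _, s in fills)

# A 10,000-share order over a U-shaped intraday profile (heavier open and close)
print(vwap_slices(10000, [30, 15, 10, 15, 30]))  # [3000, 1500, 1000, 1500, 3000]
```

By spreading the order across the day rather than presenting it to a specialist, such slicing reproduced in code the specialists’ craft of “working an order.”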
By the end of the 2000s, the VWAP had become the metric by which sales traders were evaluated, thus institutionalizing the use of algorithms -- even for the purpose of evaluating humans. Beyond execution, so-called “high frequency trading” firms introduced algorithms into their trading strategies, unleashing large volumes of transactions and coupling them with frequent cancellation of orders. Their activity in electronic market-making predominantly involved “providing liquidity,” that is, posting limit orders that others execute against. This was conducted by large hedge funds such as Chicago-based Citadel, as well as by specialist firms like Chicago-based GETCO (Global Electronic Trading Co.), Kansas City-based Tradebot, and Amsterdam-based Optiver. The automated alternative to the NYSE was further developed with the rise of so-called “dark pools.” Broker-dealer firms such as Goldman or Credit Suisse started to offer a specialized order-matching service in which prices were only displayed after a trade had been executed. In this manner, dark pools avoided publicizing bids and asks, allowing fund managers to trade large blocks of shares without creating price movements against them or having their orders sliced by execution algorithms. By 2006, the combination of the SEC’s regulation, automated exchanges, algorithmic execution, high-frequency trading and dark pools made it possible for investors to trade stocks without the intermediaries at the NYSE. A combination of economists, entrepreneurs and regulators had offered an alternative model: the exchange as a database. And in an important sense, their efforts appear successful: as microstructure economists like Hasbrouck and Saar (2010) have argued, spreads in the most-often traded indexes have narrowed following the introduction of Reg-NMS (see also Hendershott et al., 2011).
In sociological terms, the automation enforced by the SEC can be read as an attempt to disentangle equities trading from the social relations between brokers and specialists. As someone told us at the NYSE, the SEC believed that the NYSE was “nothing more than an old boys’ network, a bunch of guys who take care of each other.” The SEC, with Lawrence Harris as its chief economist, was not unlike the chairman of the strawberry auction house described by Callon (1998): a reformer who found the relationships between market participants much too close for the proper functioning of the market. Indeed, there was a clear sense of technological optimism among industry participants in 2008. Automation would not only improve market structure, but also eliminate opportunism. For instance, according to Gregory Maynard, Systems and Product Strategy Director at electronic pioneer International Securities Exchange, automation would reduce behavior among traders to the explicitly permitted, eliminating ambiguity. According to him,

On the floor there are all these rules that you’re not allowed to disobey. In the computer there are all these rules that you’re allowed to follow and … nothing else. The rules tell you what you can do in the computer, they tell you what you can’t do on the floor (Maynard interview; our emphasis).

In other words, to Maynard automation was a form of better norm enforcement. This view is exactly the reverse of Lessig’s (2000): whereas Lessig warned about the risk of opportunism posed by automation, Maynard expected dubious manual practices to disappear in the process of translating norms into code. His argument for automation was not only one of efficiency but a moral one. By the end of 2005, the discussion about automation had concluded, and Reg-NMS was in place. In 2006 the head of the NYSE, John Thain, began a long march towards automation.
He started with demutualization in 2006, enriching the Exchange’s existing members by buying their seats, and shifting control of the Exchange’s board in the process. Thain continued by diversifying the Exchange’s business away from floor trading, acquiring the large ECN Archipelago and rebranding it as part of the NYSE, thus achieving an appearance of higher market share (Concannon interview). In addition, Thain moved away from trading low-margin US equities by merging with Euronext, a European exchange conglomerate active both in equities and in higher-margin derivatives. The combined effect of these moves was to reduce the power of the specialists, as well as to reduce the relative importance of the NYSE floor to ten percent of the Exchange’s overall revenue. Having seized control and diversified the business away from the specialists, Thain proceeded to lead the first round of automation, culminating in a new trading model tellingly known as the Hybrid Market.

2008-10: The automation of the NYSE

We returned to the floor of the NYSE in February 2008. Our goal was to understand how the Exchange had addressed the challenge posed by automation. We started with a guided visit to the floor with Murray Teitlebaum, the same official whom we had met in 2003. This time, however, the floor looked remarkably emptier: “things have been very, very difficult,” Murray explained. He added that the Hybrid, the NYSE’s response to Reg-NMS, had failed to live up to its promise. The NYSE had lost the bulk of its market share, dropping from a peak share of 83 percent in 2003 to 27 percent in 2009, and specialist firms had laid off clerks and specialists to the point that three of the five trading rooms of the NYSE had closed.

---- Table 1 here ----

In effect, Hybrid was attempting a difficult compromise between automated and manual trading.
The Exchange had disabled the artificial limit that constrained its previous automated system (called “Direct+”) to trades smaller than 1,000 shares. It also preserved the specialists at the post, giving customers a choice between automatic and manual auctions. But it soon became clear that this was not working as intended. A basic problem was speed: the NYSE’s servers and routers were designed for reliability rather than timeliness (Leibowitz interview), and processing an order took 360 milliseconds while the competitors were in the ten-millisecond range (Pastina interview) and in some cases even less (MacKenzie and Pardo-Guerra 2013). A more serious problem was the incompatibility between algorithms and specialists: while algorithms matched orders in a continuous auction, specialists engaged in a discrete call auction that made it impossible for them to interact with a continuous electronic order flow (Wunsch interview). As an ex-specialist explained, “the bid that you think you are matching has already been hit and it’s offered there [somewhere else]” (O’Donnell interview). As others put it, “the order walks away from you.” As a result, the participation rates of the specialists, which measured the degree to which they engaged as principals, fell from 20 percent to between 1 and 2 percent. One specialist firm even handed back its own license rather than attempting to sell it, arguing that it was worth nothing. Similarly, floor broker participation rates (their percentage of volume traded, relative to the total) fell from 10 percent to 3-4 percent. At some point, an Exchange official recounted, “it became clear that while the operation was a success, the patient was really dying” (Pastina interview). The extent of the dissatisfaction at the Exchange became clear when we met with Robert Hardy, the specialist we had followed in 2003. We found him at the post of a different firm, LaBranche & Co, tall and impeccably dressed as usual. Yet this time he appeared concerned.
“I don’t see the market coming back until financials are doing better,” he told us about stock prices, resorting to the sensemaking routine he knew well. As he spoke we noticed a peculiar artifact resting on top of one of his monitors: a small statue of a bull with a folded cylinder of paper protruding from behind it. The bull was labeled “Hybrid” and the cylinder read “ECNs.” When we asked about it, the clerk at Hardy’s post gave us an embarrassed look. “No, it’s ok,” Hardy said to him. And, to us: “that’s what we think is happening with the market right now. We’re getting screwed by the ECNs.” One of the specialists went on to describe the problems that other NYSE officials had already highlighted: a proliferation of ECNs and dark pools had led to a system that was unstable and unfair. “Stocks are now very volatile, very thin margins. Before, it used to be everyone on an equal footing. Now, the people who have the bigger computer and the more money are winning. It’s a poor system.” Amidst its limitations, Hybrid had one promising feature: it could shift between automated and manual trading. This feature, known as the Liquidity Replenishment Point (LRP), shifted the auction from automatic to manual when prices moved beyond a certain volatility threshold. It was aimed at dampening volatility during crises by conducting a call auction at the specialist post. Hybrid thus restored the specialists’ ability to pace the rhythm of trading. It also recreated the crowd of floor brokers at the post, allowing them to engage in sensemaking.

A second attempt at automation: the New Generation Model. A change in the management of the Exchange in December 2007 created an opening for reforming Hybrid. Following the departure of John Thain for Merrill Lynch, the Exchange appointed his second in command, Duncan Niederauer, as chief executive officer.
Niederauer started by investing $500 million in automation, building a state-of-the-art data center in Mahwah, New Jersey, that would allow high-frequency traders to co-locate their servers. But Niederauer also had an appreciation for the NYSE’s heritage, and made clear his interest in maintaining the floor while continuing to invest in technology, a strategy described as “all things to all people.” To reform Hybrid, Niederauer assembled a team of executives who were familiar with the NYSE but had also enjoyed successful careers outside it. This included ex-specialists such as Michael Rutigliano, ex-customers like Joseph Mecane, and ex-competitors such as Larry Leibowitz. As Niederauer explained, “the management team that’s here observed [the NYSE] as an outsider.” Niederauer created two additional jobs, Specialist Liaison and Floor Broker Liaison, to ensure that the needs of those two collectives were taken into account in the redesign of the system. Niederauer thus brought in a governance framework for the design of automation that gave voice to the users, rather than just catering to technologists.

The management team at the Exchange went on to debate how to redesign Hybrid. There were two key issues in the discussions. First, how to preserve the interaction between brokers and specialists? Some wanted to maintain the obligation for brokers to conduct trades by walking to the specialist’s post, while others did not (Willis interview). A second issue under discussion was how to regain block trading. Some participants argued for an order type that would be invisible to participants (O’Donnell interview). Our key observation from the interviews that recounted these discussions is the attention that the participants devoted to debating the specialist role. “What does it mean to be a specialist?” asked Leibowitz philosophically, in conversation with us.
“Being a floor broker or a specialist,” Rutigliano told us, “was a ballet … I get goose bumps thinking about it.” It was, a floor governor and ex-specialist summed up, “what I did best” (Barry interview). Indeed, the director of floor operations of a leading floor brokerage firm, Gordon Charlop, went as far as enrolling in a doctoral program in management and writing a dissertation about the specialist. He argued that at the NYSE, “the move away from the distinctive floor trading system to an electronically mediated one shows signs of isomorphic forces at work” (Charlop, 2009: ii). In short, the management team at the Exchange were not just part of the social structure, but also reflexive observers of it.

The new system that emerged from these debates was labeled the New Generation Model, and launched in November 2008. The new system introduced a clear separation in time between automatic and manual trading. It also introduced a number of measures aimed at increasing specialist activity during automated trading. We consider these below.

The new specialists. The Exchange started by relaxing some restrictions that specialists confronted, allowing them to use algorithms to interact with the algorithmic order book. To do so, the Exchange removed the agency function of the specialists: they would no longer represent customers’ orders, and would act only as principals. In turn, specialists gave up their advance look at the order book. These changes pushed the specialists away from price setting to a more peripheral but tenable role as participants. The Exchange also altered the economics of the specialists, introducing parity and subsidies. Parity was aimed at incentivizing specialist participation, and subsidies would ensure the specialist system survived. Given the fundamental nature of these changes, the specialists’ title was changed to “designated market maker” (DMM).
The Exchange also gave the new specialists new tools to interact with the algorithmic order book. For this it relied on interface design expert Brad Paley. Paley increased the usability of the algorithms by building a graphical user interface that allowed for the use of different algos simultaneously. The interface allowed the specialist to alter the size of the disparity that triggered buy and sell orders. The narrower the disparity, the more “aggressive” the strategy, like “fishermen who fish with different net sizes” (Paley interview). Paley also included two extra grey boxes on the specialists’ screen showing the most recent large trades and the largest holders of the stock, giving the new specialists a form of social cue. In this way, Paley’s work sought to algorithmically reproduce the opportunities for sensemaking that specialists had on a manual trading floor. This “steering wheel for algos” would, Paley hoped, restore the specialists’ ability to compete by increasing the usability of the algos.

But the transition to algorithmic trading at the specialist post was a long and complex process. We grasped the dimension of the challenge in June of 2009 during a visit to the specialist post of Bank of America. There, we met and observed the work of the designated market maker for Goldman Sachs, Peter Giachi. Giachi stood at his post, but instead of talking to the floor brokers who walked up to him, as we saw Robert Hardy do (not a single broker did during the hour we visited the post), he focused on six screens in front of him. His trading strategy reproduced the mean-reversion approach he used in a manual environment, but now the algorithm did the information processing. “Before,” he explained, he would see “sell sell, and suddenly sell, buy, sell, sell, buy, and go ‘this is it, this is it.’” Now, his algorithm replicated the approach: when the price of a stock moved more than three dollars away from the VWAP, it bought.
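The rule Giachi describes lends itself to a compact sketch. The following Python fragment is our own illustrative reconstruction, not the firm’s code: the function names are ours, and the symmetric sell leg is an assumption we add for completeness, since Giachi only described the buy side.

```python
def vwap(trades):
    """Volume-weighted average price over a list of (price, size) trades."""
    total_value = sum(price * size for price, size in trades)
    total_volume = sum(size for _, size in trades)
    return total_value / total_volume


def mean_reversion_signal(price, trades, band=3.0):
    """Trade signal based on distance from VWAP.

    The three-dollar band follows Giachi's description; the sell leg
    is our own assumption, added for illustration.
    """
    reference = vwap(trades)
    if price < reference - band:
        return "buy"
    if price > reference + band:
        return "sell"
    return "hold"
```

On this stylized reading, the algorithm does the information processing (tracking the VWAP tick by tick) that the specialist previously did by watching the sequence of buys and sells at the post.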
Yet Giachi did not seem willing to delegate his trading to the algorithm completely. “I’ve got seventeen algos,” he explained. “They’re carrying the noise back and forth. They have no mind of their own. But what they allow me to do is wait till I can commit capital.” Giachi, we concluded, was using technology as an aid to his manual skills. The integration of algorithms and specialists had restored the latter’s ability to interact with the order book, but it had yet to yield trading strategies that leveraged the distinct possibilities of the algos.

Two years after our visit to Giachi, Bank of America sold the post where Giachi worked. The buyer was Getco LLC. The move came as a surprise, given Getco’s standing as a leading algorithmic market maker: why would the company expand into the trading floor? The firm, a Getco executive explained to us, was not planning on making changes to the specialist’s market-making, but rather to focus on a new area, client services. The firm,

[is hoping to do] a much better job of servicing the clients of the post, the forty companies that each DMM is a specialist for. These companies need a lot of communicating with the DMM because their investor relations officer often wants to know what’s going on with the stock. More than one half of the time is service. We have three full time people upstairs doing it.

Indeed, he explained, one of the key advantages of the NYSE over Nasdaq in attracting listings was the existence of a live person on the floor. Getco was thus building on the sensemaking function of the specialists, but shifting the focus from internal communication (with floor brokers) to external communication, with listed companies.
Another reason for Getco’s move appeared to be the legitimacy gained from a presence on the floor: as a Getco spokesman told the Financial Times, “it’s important to demonstrate that we’re in this for the long haul and align ourselves with an institution that’s been around for a couple hundred years” (Demos 2012). This second reason is particularly interesting, as it can be seen as an attempt by algorithmic traders to enter a voluntary regime of norm enforcement by locating themselves on the floor of the NYSE, pointing to a possible role for the NYSE as an institution that privately enforces norms for high-frequency traders.

In addition to new tools, the New Generation Model introduced other changes to the specialists’ job. These integrated norm enforcement within algorithmic trading. The specialists’ quoting obligations were retained, although reduced; and the Exchange created a new variant of the new specialist, the Supplemental Liquidity Provider, who did not have to be on the floor and had similar (though lighter) obligations to the designated market makers on the floor. The decision to have market makers outside the floor was a logical implication of automation: in a world of algorithms, actors no longer need to be in the same space to trade. By allowing traders outside the floor, and especially by subjecting them to obligations, the Exchange leveraged algorithms for the purpose of norm enforcement.

The NYSE also made an unsuccessful effort to recreate block trading. It developed a new type of order that would not be visible in the book, the Non-Displayed Reserve Order (O’Donnell interview), and developed block trading platforms known as MatchPoint, as well as a feature known as BlocTalk. But these efforts, which can be seen as attempts at reproducing partial disclosure, were to no avail, as trading in large blocks of shares had shifted to dark pools.
Indeed, by 2012 the combined market share of dark pools and internalized order matches within broker-dealers added up to an unprecedented one third of total volume in US equities trading (see MacKenzie and Pardo-Guerra 2013: 45).

The new floor brokers. As with the specialists, the Exchange re-equipped floor brokers. Instead of buying and selling from the specialist as they used to, brokers were to do so from the algorithmic book directly. To that end, the NYSE allowed brokers to transform their rudimentary handheld terminal (a tablet-like portable computer) to support execution algorithms. In this way, brokers went from using the handheld for trade annotation to using it for trade execution, putting them on a footing similar to that of off-floor brokers.

We observed these changes in action at the broker’s booth of VDM Institutional Brokerage in August 2009. There, we saw Benedict Willis use his NYSE-designed handheld (known as eBroker) with a stylus. “By standing here,” he said, “I am technically in every one of the crowds.” Instead of walking frantically from one specialist post to another, Willis just tapped on the screen. These new handhelds thus reproduced some of the sensemaking possibilities of a trading crowd at the post with a messaging application. On one of his screens, Benedict had a list of tags with the badges of the other floor brokers who were buyers or sellers in a given stock. He explained,

You can just tap a button, you can look at a stock and find out who the players were in it and then you can actually tap the line where the broker’s badge is and you’ll get like a messaging window.

Benedict demonstrated this by opening up a screen, writing a question mark and sending it to a colleague. After a few seconds, the colleague replied with another handwritten note saying “just stray. sorry,” meaning that he did not have any specific information and apologized for not being able to provide insight.
Nevertheless, the handheld and a brokers’ crowd are not complete substitutes. While a crowd creates unintended communication trajectories (Hutchins 1995), the handheld requires brokers to purposefully communicate with each other. Indeed, the lack of crowds at the specialist post was a cause for concern for Benedict Willis. The management team at the NYSE concurred: as one official explained, “at the pods it’s a busy beehive, but cross pollination is not happening,” suggesting that brokers and specialists were not engaging in sensemaking during routine trading (Pastina interview).

In a related decision, the Exchange relaxed the internal rules about what brokers could trade on the floor. Traditionally, floor brokers were barred from trading with others outside the floor; by relaxing this restriction the NYSE allowed brokers to buy or sell from other exchanges. We saw the effects of the new policy at the brokerage booth of Rosenblatt Securities. During our visit, we were struck by the large number of screens, which covered the entire wall of the booth. We soon found out why: Rosenblatt traded on different exchanges from the floor of the NYSE, and different trading venues required different screens. “If I am only at one center [exchange] I am missing all this volume, I’ve got a problem,” explained Gordon Charlop, director of floor operations. The brokers, we eventually understood, were re-aggregating the liquidity that had been fragmented across exchanges by Reg-NMS.

Market open and close. The New Generation Model also preserved the market open and close auctions as they were in 2003. These auctions, with prices shouted at the post, offered partial disclosure of the interests of the actors in the crowd, allowing the specialists to open the stock at the price they thought it was going to reach during the first minutes of trading rather than at the price where the last buy and sell orders crossed.
According to an ex-specialist, this prevented manipulation and led to opening prices that were “more representative” of supply and demand (O’Donnell interview). Specialists traditionally benefited from these auctions because they could see who bid at various prices, and could use the information in matching orders. Preserving the manual open and close auctions also gave the new specialists a critical source of revenue, around 70 percent of their total income, according to Rutigliano. It also facilitated sensemaking during regular trading hours by preserving a routine of daily interaction between brokers and specialists, much as soldiers develop preparedness during peacetime by constantly rehearsing battle situations.

New use of space. The NYSE complemented the new tools discussed so far by building new booths for brokers, replacing their vertical phone stations with horizontal desks and with stools that allowed them to sit rather than stand. The new booths, known as “market pods,” also gave brokers room to use four screens rather than one. The goal, according to Lou Pastina, was to achieve better integration between automation and social interaction on the floor. “They can trade on their screens most of the day,” he explained, “and then for the open, close and when something happens, they can simply take their handhelds and go to the post where the action is happening.” A more cynical view, voiced by a floor participant, was that the Exchange put in the booths as a response to the dwindling numbers on the floor: the new booths “give the appearance that the place is full.” The Exchange also redesigned the specialist posts by eliminating obsolete plasma screens previously used for manual auctions, resulting in a less cluttered post. Finally, the Exchange added a Starbucks coffee bar, highlighting the less hectic nature of work in an automated environment.
Folding

As we have detailed so far, the introduction of the New Generation Model in 2008 marked a shift from the NYSE’s first attempt to automate, Hybrid. More importantly, the New Generation Model managed to preserve the role of the specialists and floor brokers. According to NYSE officials, the new specialists increased their participation from three percent in 2007 to thirteen percent in 2009, and brokers tripled their participation from two to six percent during the same time (Rutigliano interview). Given the subsidies handed out to specialists, it is unclear on the basis of our data whether the New Generation Model is profitable or sustainable for the Exchange, but we can safely conclude that it kept the specialists and brokers alive.

How was folding accomplished? We infer the mechanisms behind folding from the differences and similarities between Hybrid and the New Generation Model. The two were similar in three key respects: both had a trading floor, both could shift from algorithmic to manual trading during crises, and both preserved manual trading during the market open and close. This suggests that a dual trading mode (manual and algorithmic) was a key part of the folding that we observed. At the heart of the NYSE’s efforts at folding thus lie two different and expensive material structures: a trading floor and a data center. And standing between these two are the Liquidity Replenishment Points, an organizational provision that shifts the trading mode between manual and algorithmic.

In addition to similarities, there were also differences between the two rounds of automation at the NYSE, and these hint at additional theoretical features of successful folding. First, the New Generation Model eliminated manual trading from normal trading hours, suggesting that a strict separation in time is necessary for algorithms to coexist with humans.
Second, the state-of-the-art data center that the Exchange built in Mahwah (the size of three football stadiums) brought its algorithmic order-matching engine up to speed with rival exchanges. Both of these aspects, separation and greater speed, pertain to the pace of trading. For folding to be viable, the speed of the algorithmic mode needs to be in line with that of other exchanges, and manual and automatic trading must be separated in time so that their different speeds do not clash. The other differences between Hybrid and the New Generation Model point to the need for new tools and a more integrative platform. The NYSE equipped brokers and specialists to engage in algorithmic trading during normal trading hours, suggesting that retooling (building a dedicated material basis) is required for intermediaries to be active in algorithmic mode. Finally, the Exchange changed its rules to develop the floor into a more inclusive platform, loosening restrictions and introducing a new form of off-floor specialist.

Sociomaterial duplication. Our comparison between Hybrid and the New Generation Model suggests that the core mechanism behind folding is a reorganization of agency around two distinct modes, manual and automatic. These are separated in time but not in place: during crises and the open and the close, the Exchange functions exactly as it did traditionally, based on the specialist auction at the post; during normal trading hours the Exchange functions somewhat like other automated exchanges, matching orders algorithmically. The key difference is that brokers and specialists participate in this market from the trading floor, though in a different role from the one they play during manual trading. The manual and the automatic thus do not coexist in time, in line with their different rhythms.
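The dual-mode arrangement, with the Liquidity Replenishment Point as the switch between modes, can be rendered as a toy state machine. The sketch below is our own stylized reconstruction, not the Exchange’s logic: the class, the single price-move threshold, and the numbers are illustrative assumptions, and the actual LRP rules were considerably more involved.

```python
from enum import Enum


class Mode(Enum):
    AUTOMATIC = "automatic"  # continuous algorithmic matching
    MANUAL = "manual"        # call auction at the specialist post


class DualModeMarket:
    """Toy sketch of dual-mode trading with an LRP-style switch.

    The threshold is a simple fractional price move; the real LRP
    rules were more elaborate. All names and values are illustrative.
    """

    def __init__(self, lrp_threshold=0.05):
        self.mode = Mode.AUTOMATIC
        self.lrp_threshold = lrp_threshold
        self.reference_price = None

    def on_trade(self, price):
        """Update the mode after a trade at the given price."""
        if self.reference_price is None:
            self.reference_price = price
            return self.mode
        move = abs(price - self.reference_price) / self.reference_price
        if self.mode is Mode.AUTOMATIC and move > self.lrp_threshold:
            # LRP tripped: hand control to the manual call auction.
            self.mode = Mode.MANUAL
        self.reference_price = price
        return self.mode

    def resume_automatic(self):
        """Once the call auction re-establishes a price, algorithmic
        matching resumes."""
        self.mode = Mode.AUTOMATIC
```

The sketch makes the sociomaterial point concrete: a small price move leaves the algorithmic engine in charge, while a large one hands the market back to the specialist auction, so the two modes alternate in time rather than coexist.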
The result is not just material duplication, but what we call, following the terminology of Orlikowski and Scott (2008), sociomaterial duplication. By this we mean the use of seemingly redundant technologies for the purpose of sustaining multiple social structures. This mechanism is in some sense similar to the dual-engine system used by some oil tankers: one low-consumption engine for sailing in the open sea, and another, less efficient but more versatile, for approaching the harbor and docking.

A second component of this reorganization of agency is the creation of synergies between the two modes, manual and algorithmic. These centered on providing the intermediaries on the floor with the means to continue working in algorithmic mode. This required a change in practices: as Leibowitz put it, during the day they “become high frequency traders,” that is, they engage with the algorithmic matching engine rather than among themselves. To make this possible, the Exchange reduced the specialists’ role from price-setters to premium users, and gave them parity and subsidies. We see these changes as components of sociomaterial duplication, because they allow the actors to operate in a dual mode.

Separability of the material and the social

The efforts at folding at the NYSE speak to the aforementioned debate over the separability of the social and the material. Because the NYSE sought to preserve the specialist system in a different technology, its attempt can be read as a natural experiment that tests whether social structure exists independently of the material setting. Specifically, the test can be seen as follows: were the Exchange able to extend the roles of specialist and broker to the algorithmic mode, one could conclude that a given social structure can exist in different material settings, and thus that the social is separate from the material.
If, on the other hand, the Exchange was unable to extend its structure to a different material setting, one could conclude that the social and the material cannot be separated.

The NYSE’s natural experiment offers two lessons. First, because the NYSE’s attempt at folding was based on duplicating its modes of operation, the Exchange cannot be said to have extended the specialist structure to an automated setting. As noted, the specialists and brokers were internally described as operating as high-frequency traders, and did not interact in call auctions at the post during normal opening hours (“cross pollination is not happening”). We interpret this as a fundamentally different role for the intermediaries. This suggests that the specialist role was inseparable from manual auctions on the floor, and hints at the possibility that the social and the material are indeed inseparable.

The natural experiment provides a second lesson. Even though the social and the material appeared to be inseparable, the social structure appeared to have an existence apart from its material basis. Indeed, one of our more puzzling observations was the extent to which the management team at the Exchange was focused on the survival of “the specialist.” That these executives would pay such attention to an abstract role suggests that social structure existed in what Feldman and Pentland (2003) call the “ostensive” realm, that is, apart from any specific material setting. Such ostensive (as opposed to material) existence was critically important because it guided the efforts of the Exchange in designing its automation. Thus, in contrast to Latour (2005), who sees the social only as an outcome of material associations and not as an explanatory cause for economic phenomena, our observations at the Exchange suggest that the social, as in the abstract role called “the specialist,” was an explanatory cause for social action, providing a roadmap for the NYSE in its automation.
We conclude that social structure is thus not simply an emerging consequence of action, but an orienting guide for it. Such ability of social structure to shape action does not play out in a “latent” form or through the “hidden social forces” criticized by Latour (1986), but through the ability of organizational actors to be reflexive about their own social world. It is not that social structure shapes society because some academic sociologist says so, but because the actors themselves mobilized structure, and specifically roles such as “the specialist,” to achieve their own objective, which in this case was the preservation of the intermediary functions performed by the NYSE. We expand on this last point below.

Effects of folding

Our analysis so far has emphasized the process of automation, but not yet considered its outcome. Did folding prove effective for the Exchange? This question shifts the focus from the roles to the functions played by the intermediaries at the Exchange: were they able to continue providing the coordination, sensemaking and norm enforcement they provided in 2003? Our analysis suggests the NYSE retained some (but not all) of the social properties of the original trading floor. In manual trading, the New Generation Model preserved the sensemaking advantages of partial disclosure during the open and the close, and during crises. The New Generation Model also retained norm enforcement in the form of the positive obligations of the new specialists and the supplemental liquidity providers. Finally, it sacrificed the coordinating functions of price discovery and the matching of blocks. In automated trading, the Exchange preserved norm enforcement, coordinated buyers and sellers through the matching algorithm as algorithmic exchanges do, and offered no form of sensemaking. In short, the NYSE preserved its original functions in manual mode, and lost a significant part of them in automated mode (see Table 2).
---- Insert Table 2 ----

In commercial terms, folding appears to have had a somewhat positive effect on the performance of the NYSE: while its share of the US equities market did not return to its 2003 peak of 82 percent after the New Generation Model was introduced in 2008, it remained constant at around 25 percent. Automation may thus have contributed to stopping the market share decline of the NYSE, but not to reversing it. Similarly, automation did not contribute to improving the market capitalization of the parent company of the NYSE, the NYSE Euronext Group, which fell by half in the aftermath of Reg-NMS and never recovered. But it is difficult to draw conclusions from the evolution of the share price. The failure of the New Generation Model to boost the stock valuation can also be due to the vastly smaller importance of equities trading in the Group’s overall profits, and is consistent with recent news that the management of the Exchange agreed to an acquisition by the Intercontinental Exchange (ICE) (Tabb, 2012).

The Flash Crash. The commercial impact of the redesigned NYSE is not the only way to measure its success, and perhaps not the most relevant one. The redesign was put to a critical test during the Flash Crash of May 6, 2010. The official report on the crash blamed a Kansas-based fund that set the parameters of its Sell Algorithm too aggressively (SEC/CFTC 2011). The rapid selloff of 75,000 E-Mini contracts prompted what organizational theorists would describe as a breakdown in sensemaking. Indeed, as high-frequency trading funds absorbed part of the selling volume (with a net long position of 3,300 contracts), their volume of transactions went up to as much as 140,000 E-Mini contracts. This is usual for high-frequency funds, as they routinely issue numerous order cancellations in the process of trading. Such high volume of transactions had an unexpected effect on the Sell Algorithm.
As the official report explains, “the Sell Algorithm … responded to the increased volume by increasing the rate at which it was feeding the orders into the market, even though orders that it already sent to the market were arguably not yet fully absorbed” (SEC/CFTC 2011: 3). In other words, the mistaken response of the Sell Algorithm to the trades of the other algorithmic traders flooded the market, prompting a sharp decrease in price. At the root of the problem was a decision rule on the part of the Sell Algorithm that proved fatal: it used trading volume as a proxy for liquidity, whereas in fact, as the official report argues, “in times of significant volatility, high trading volume is not necessarily a reliable indicator of market liquidity” (SEC/CFTC 2011: 3). The fatal interaction between decision rules in various algorithmic participants can be seen as an algorithmic breakdown in sensemaking.

The Flash Crash offers an opportunity to understand the effect of automation on the American equities market. As noted above, a primary goal of the automation was disintermediation. This has been largely achieved, with more than three quarters of traded volume now handled by algorithmic order matching at Nasdaq, Bats, or dark pools and internalizers. But because one of the stock exchanges, the NYSE, retained its intermediary structure, the impact of the Flash Crash on the various exchanges can shed light on the merits of an intermediary structure. Simply put: which exchanges performed better during the Flash Crash?

In that regard, the performance of the NYSE during the crisis appears to be vastly superior to that of algorithmic exchanges. We focus on one key figure: order cancellations. Such was the dislocation of prices during the Flash Crash that the SEC decided to cancel all trades beyond a 20 percent band of the price prevailing twenty minutes before the crash.
This led to massive cancellations in all exchanges, except for the NYSE. As stated by Jane Kissane, legal counsel of the NYSE, in a letter to the SEC (Kissane 2010: 4):

In the aftermath of May 6, other exchanges … engaged in a much criticized process of cancelling approximately 15,000 trades as ‘clearly erroneous.’ In contrast, not a single NYSE trade (excluding NYSE Arca, its electronic version) was required to be cancelled.

This lack of cancellations was not due to disengagement: the NYSE’s market share between 2:30 pm and 3 pm was 26 percent, as compared to 21 percent on prior days (Kissane 2010: 5). As a result of the NYSE’s shift to manual mode, the counsel adds, prices on the NYSE were far less volatile than prices on electronic exchanges. One interesting case is Arca, the electronic version of the NYSE. Arca experienced trade cancellations similar to those of other algorithmic exchanges, which is further confirmation of the superiority of the shift to manual trading. It is not that the NYSE did better because of its location or brand (Arca had both), but because of its reliance on the intermediaries on the floor. We elaborate on this point.

In explaining the superior performance of the NYSE, officials point to the Exchange’s ability to switch from automated to manual auctions with the Liquidity Replenishment Points, which were highly active during the crash. While on a normal trading day there are around 50 LRPs activated, on the day of the Flash Crash there were more than 70,000 (Pellechia interview). In a much-discussed article in Tabb Forum, microstructure specialist Stephen Wunsch (2010) concurs:

The partially manual LRPs allowed the Big Board to apply some measure of old-fashioned reasonability tests to price formation. As a consequence, no NYSE trades printed at zero or anywhere close to it. Unlike all the other stock exchanges, the NYSE did not have to break any trades.
As Wunsch notes, the reason for the NYSE’s superior performance was its ability to sustain sensemaking. Humans on the floor could draw on social cues and their prior experience to establish that the sudden drop in the Dow Jones was purely due to factors internal to the market, with no economic news that could justify it. As soon as the crash hit stocks like Accenture and Procter and Gamble, floor brokers were running to the post of the designated market maker and conferring among themselves. “What the market makers had to remember was,” an Exchange official explains, “what’s happening everywhere else is not real” (Pastina interview; our emphasis).

A related reason for the greater stability of the NYSE was norm enforcement. As Wunsch (2010) emphasizes, once the problems started, high-frequency traders withdrew liquidity from the system:

Their high frequency market makers, sensing trouble, disappeared. With little else in their books, the market orders pushed prices to where the stub quotes were, producing ridiculous trade prices. With no floor governors or other manual processes to spot the difference between real trades and market structure failure, the electronic NMS printed them all.

This argument was echoed by NYSE officials. For instance, the low price in one of the most erroneously traded stocks, Procter and Gamble, was $39 in other exchanges but $56 at the NYSE. The reason for the difference, according to an Exchange official, was that unlike market makers at database exchanges, NYSE specialists had a positive obligation to commit capital (Mecane interview). Back in 2008, he explained,

One of the flaws of electronic markets is that in general people don’t have obligations with respect to the market so they come and go as they please. So if they get nervous about a situation, a macroeconomic event, a political event, they go away.

In other words, the absence of obligations can create bouts of illiquidity.
Indeed, in the wake of the crash, the SEC sought to emulate the advantages of the NYSE’s Liquidity Replenishment Points by mandating individual-stock circuit breakers across exchanges in the US (Metha 2011). Unfortunately, these do not emulate the judgment provided by the specialists, and were often unnecessarily triggered (Wunsch 2011).

In sum, the NYSE performed far better than the algorithmic exchanges during the Flash Crash. It increased its overall participation, its designated market makers honored their obligations, its prices were less volatile, and it did not cancel any trades. Such superior performance points to the weakness of the purely automated model, and to the limits of the information-processing conception of securities trading held by Black (1971) and others. Instead, it points to the strengths of the market intermediary and to the sociological notion that markets beset by opportunism and uncertainty need intermediaries. The specialists and floor brokers at the NYSE provided coordination, sensemaking and norm enforcement that helped market actors confront radical uncertainty and limited the individual incentive to pull out under crisis.

DISCUSSION AND CONCLUSION

Our study contributes to economic sociology by engaging with the scholarly debate on the effects of automation in markets. Existing studies have portrayed automation as a dilution of social relations. Our study of the NYSE proposes instead the notion of folding, that is, a design of automation that preserves the original social structure of a market or organization. Folding, our study suggests, can be accomplished by incorporating new technologies while preserving the original ones, together with a mechanism that shifts between the two. The duplicative approach followed by the NYSE also points to the difficulties of separating a given structure of social relations from the original material basis that sustained it.
At the same time, the prominence of social structure in the discussions on automation at the Exchange suggests that structure shapes action, rather than being a consequence of it.

Our study also offers public policy implications. The redesign of the NYSE was put to the test during the Flash Crash of May 6, 2010. The solidity of the Exchange’s performance under crisis suggests that the SEC’s vision of markets as information-processing devices may lack mechanisms to deal with uncertainty and opportunism. By contrast, our study points to the value of intermediary structures and offers a guideline for the introduction of algorithms in markets that preserves those structures. Future attempts at automating a securities market, as in the ongoing reform of the American equities market or in Europe’s MiFID, should take into account the functions performed by the original intermediaries.

REFERENCES

Abolafia, M. 1996. Making Markets: Opportunism and Restraint on Wall Street. Cambridge, MA: Harvard University Press.

Baker, W. 1984. The Social Structure of a National Securities Market. American Journal of Sociology 89(4): 775-811.

Barley, S. R. 1986. Technology as an occasion for structuring: evidence from observations of CT scanners and the social order of radiology departments. Administrative Science Quarterly 31: 78-108.

Beunza, D. and D. Stark. 2004. Tools of the trade: the socio-technology of arbitrage in a Wall Street trading room. Industrial and Corporate Change 13(2): 369-400.

Black, F. 1971. Toward a fully automated stock exchange, part I. Financial Analysts Journal 27(4): 28.

Arnuk, S. and J. Saluzzi. 2010. Broken Markets: How High Frequency Trading and Predatory Practices on Wall Street Are Destroying Investor Confidence and Your Portfolio.
Upper Saddle River, NJ: FT Press.

Brooks, J. 1969. Once in Golconda: A True Drama of Wall Street 1920–1938. New York: Harper & Row.

Burt, R. S. 1992. Structural Holes: The Social Structure of Competition. Cambridge, MA: Harvard University Press.

Callon, M. 1986. Some Elements of a Sociology of Translation: Domestication of the Scallops and the Fishermen of St Brieuc Bay. Pp. 196-233 in Power, Action and Belief: A New Sociology of Knowledge, edited by John Law. London: Routledge & Kegan Paul.

_____. 1998. The Laws of the Markets. London: Blackwell Publishers.

Carruthers, B. and Stinchcombe, A. 1999. The social structure of liquidity. Theory and Society 28(3): 353-382.

Christie, W. and Schultz, P. 1994. Why do NASDAQ market makers avoid odd-eighth quotes? Journal of Finance 49: 1813-1840.

Colesanti, J. 2008. Not Dead Yet: How New York's Finnerty Decision Salvaged the Stock Exchange Specialist. Journal of Civil Rights and Economic Development 23(1): 1-34.

D’Adderio, L. 2008. The Performativity of Routines: Theorizing the Influence of Artifacts and Distributed Agencies on Routines Dynamics. Research Policy 37(5): 769-789.

David, P. 2010. May 6th – Signals from a Very Brief but Emblematic Catastrophe on Wall Street. Working Paper. http://ssrn.com/abstract=1641419. Accessed December 2011.

Demos, T. 2011. Getco acquires BofA marketmaking slots. Financial Times, November 30. Available at http://www.ft.com/cms/s/0/fd600ff4-1b77-11e1-8b1100144feabdc0.html#axzz2HOXK9l9a. Accessed December 2012.

Domowitz, I. 1993. A taxonomy of automated trade execution systems. Journal of International Money and Finance 12(6): 607.

Fligstein, N. 2010. Response to Kenneth Zimmerman. Economic Sociology: The European Electronic Newsletter 11: 53.

Garcia-Parpet, M-F. 2007. The Social Construction of a Perfect Market: The Strawberry Auction at Fontaines-en-Sologne. Pp. 20-53 in D. MacKenzie, F. Muniesa and L. Siu (eds.), Do Economists Make Markets?
On the Performativity of Economics. Princeton, NJ: Princeton University Press.

Gasparino, C. 2007. King of the Club: Richard Grasso and the Survival of the New York Stock Exchange. New York: Collins Business.

Glaser, B. G., and A. L. Strauss. 1967. The Discovery of Grounded Theory. Chicago: Aldine.

Glosten, L., and P. Milgrom. 1985. Bid, Ask, and Transaction Prices in a Specialist Market with Heterogeneously Informed Traders. Journal of Financial Economics 14: 71–100.

Grant, J. and Stafford, J. 2011. Studies say no link between HFT and volatility. Financial Times, September 8. Available at http://www.ft.com/cms/s/0/38452490-da07-11e0-b199-00144feabdc0.html#axzz1j4CUW4r9.

Hasbrouck, J. and Saar, G. 2010. Low-Latency Trading. Johnson School Research Paper Series No. 35-2010. Available at http://ssrn.com/abstract=1695460 or http://dx.doi.org/10.2139/ssrn.1695460. Accessed December 2012.

Hendershott, T., Jones, C. M. and Menkveld, A. J. 2011. Does algorithmic trading improve liquidity? Journal of Finance 66: 1–33.

Hutchins, E. 1995. How a cockpit remembers its speeds. Cognitive Science 19: 265-288.

Khurana, R. 2002. Market Triads: A Theoretical and Empirical Analysis of Market Intermediation. Journal for the Theory of Social Behaviour 32: 239–262.

Kissane, J. 2010. Letter to the SEC, File No. 265-26, Joint CFTC-SEC Advisory Committee on Emerging Regulatory Issues. http://www.sec.gov/comments/265-26/265-26-26.pdf

Knorr Cetina, Karin and Urs Bruegger. 2002. Global Microstructures: The Virtual Societies of Financial Markets. American Journal of Sociology 107(4): 905-950.

_____ and Alexandru Preda. 2007. The Temporalization of Financial Markets: From Network to Flow. Theory, Culture and Society 24: 116-138.

Latour, B. 1987. Science in Action: How to Follow Scientists and Engineers Through Society. Cambridge, MA: Harvard University Press.

_____. 1991. Technology is society made durable. In Law, J. (ed.), A Sociology of Monsters: Essays on Power, Technology and Domination. London: Routledge.

Latour, B. 2005.
Re-assembling the Social: An Introduction to Actor-Network Theory. Oxford: Oxford University Press.

Saar, Gideon. 2010. Specialist Markets. In Encyclopedia of Quantitative Finance, ed. Rama Cont. John Wiley & Sons.

Securities and Exchange Commission and Commodity Futures Trading Commission. 2010. Findings Regarding the Market Events of May 6, 2010. Report of the staffs of the CFTC and SEC to the Joint Advisory Committee on Emerging Regulatory Issues, September 30.

Lessig, L. 2000. Code and Other Laws of Cyberspace. New York: Basic Books.

MacKenzie, Donald. 2006. An Engine, Not a Camera: How Financial Models Shape Markets. Cambridge, MA: MIT Press.

_____. 2012. Mechanizing the Merc: The Chicago Mercantile Exchange and the Rise of High-Frequency Trading. Working paper. http://www.sps.ed.ac.uk/__data/assets/pdf_file/0006/93867/Merc11.pdf. Accessed July 2012.

Mehrling, P. 2005. Fischer Black and the Revolutionary Idea of Finance. Hoboken, NJ: John Wiley and Sons.

Mirowski, P. 2002. Machine Dreams: Economics Becomes a Cyborg Science. Cambridge: Cambridge University Press.

Muniesa, F. 2004. Assemblage of a market mechanism. Journal of the Center for Information Studies (5): 11-19.

_____. 2007. Market technologies and the pragmatics of prices. Economy and Society 36(3): 377-395.

_____. 2011. Is a stock exchange a computer solution? Explicitness, algorithms and the Arizona Stock Exchange. International Journal of Actor-Network Theory and Technological Innovation 3(1): 1-15.

Orlikowski, W. J. 1992. The Duality of Technology: Rethinking the Concept of Technology in Organizations. Organization Science 3(3): 398-427.

Pardo-Guerra, J. P. 2010. Creating flows of interpersonal bits: the automation of the London Stock Exchange, 1955-1990. Economy and Society 39(1): 84.

Feldman, M. and Brian T. Pentland. 2003. Reconceptualizing organizational routines as a source of flexibility and change.
Administrative Science Quarterly 48: 94-118.

Pitluck, A. 2008. The Social Production of Anonymity in Markets. Paper presented at the ASA annual meeting, 2008.

Scott, Susan V. and Barrett, Michael I. 2005. Strategic risk positioning as sensemaking in crisis: the adoption of electronic trading at the London International Financial Futures and Options Exchange. Journal of Strategic Information Systems 14(1): 45-68.

Stone, Harold S. 1972. Introduction to Computer Organization and Data Structures. New York: McGraw-Hill.

Simmel, Georg. 1902 [1950]. The Sociology of Georg Simmel. Toronto, Ontario: Free Press.

Sobel, R. 1975. NYSE: A History of the New York Stock Exchange, 1935–1975. New York: Weybright and Talley.

Tripsas, M., and Gavetti, G. 2000. Capabilities, cognition, and inertia: Evidence from digital imaging. Strategic Management Journal 21(10/11): 1147-1162.

Wagner, W. 2004. The Market-Maker in the Age of the ECN. Journal of Investment Management 2(1): 4-15.

Wunsch, Stephen. 2008. Challenges to the sell-side. In Wayne Wagner (ed.), Meeting the Noble Challenges of Funding Pensions, Deficits, and Growth. New York: Wiley Finance.

_____. 2010. War on Wealth: The SEC, the National Market System and the Flash Crash. Tabb Forum.

_____. 2011. ‘Straitjacket’ (parts 1-3). Available at http://www.tabbforum.com/opinions/. Accessed 9.1.2011.

Zaloom, Caitlin. 2003. Ambiguous Numbers: Trading Technologies and Interpretation in Financial Markets. American Ethnologist 30(2): 258–272.

Zaloom, C. 2006. Out of the Pits: Traders and Technology from Chicago to London. Chicago: University of Chicago Press.

Mahoney, J. 2004. Comparative-historical methodology. Annual Review of Sociology 30(1): 81–101.

Mahoney, J. and Rueschemeyer, D., eds. 2003. Comparative Historical Analysis in the Social Sciences. Cambridge: Cambridge University Press.

Paige, J. 1999. Conjuncture, comparison, and conditional theory in macrosocial inquiry.
American Journal of Sociology 105: 781–800.

Riley, D. 2006. Waves of historical sociology. International Journal of Comparative Sociology 47: 379–386.

Skocpol, T., ed. 1984. Vision and Method in Historical Sociology. Cambridge: Cambridge University Press.

Table 1. Market Share of the NYSE.

Year    Market share
2000    80.00%
2001    80.61%
2002    78.14%
2003    77.92%
2004    76.07%
2005    76.03%
2006    71.00%
2007    44.27%
2008    29.41%
2009    27.16%
2010    27.78%
2011    26.86%
2012    23.80%

Note: Data are market share of all US equities turnover, Tape A. Source: Thomson Reuters.

Table 2. Functions of the NYSE’s New Generation Model.

Function            Automated trading (9:30 am to 4:00 pm)    Manual trading (market open and close, and LRP activation)
Pacing              No                                        Yes
Buffering           Yes                                       Yes
Matching            No                                        Yes
Sensemaking         No                                        Yes
Norm enforcement    Yes                                       Yes
Coordination

APPENDIX A. Glossary of financial terms.

Parity: the ability of a market maker such as the NYSE specialist to participate in an order at the same price (on par) as the customer.

Look-back option: a financial option that allows investors to "look back" at the underlying prices occurring over the life of the option and then exercise based on the underlying asset's optimal value.

Stub quote: an offer to buy or sell a stock at a price so far away from the prevailing market that it is not intended to be executed, such as an order to buy at a penny or an offer to sell at $100,000.