Modernist tools for a-modernist ontologies – yet another example of cutting-edge equivocation?

This paper is premised on a growing discomfort on my part concerning the rhetorical presentation of digital methods in relation to social science and humanities research promoted by, among others, Bruno Latour. This discomfort relates to how the relevance and purpose of these tools and methods are presented and argued for, to the epistemological and ontological assumptions that these arguments build upon, and not least to their relation to constructivist STS and ANT.

In her article "Cutting-Edge Equivocation: Conceptual Moves and Rhetorical Strategies in Contemporary Anti-Epistemology", Barbara Herrnstein Smith points out the problem, notably performed by, among others, the feminist STS scholar Donna Haraway, of attempting to steer between a realist, objectivist position on the one hand and a relativist, constructivist position on the other. This strategy, often applauded by peers and intended to avoid the pitfalls of the extremes, whether realist or relativist, is unfortunately not without costs, Smith argues. One of the costs is "devil blackening", that is, generalizing and simplifying the different positions considerably: realism is reduced to the naïve, positivist and determinist, and relativism to "anything goes" and the idea that we construct the world as we see fit. Moreover, Smith also points, with respect to Haraway's argument about situated knowledges, to what she calls a 'conceptual instability'. Conceptual instability occurs with the attempt to steer between the allegedly naïve extremes and establish a moderate and arguably sensible middle ground. It occurs, for instance, when Haraway, in her much-cited passage on situated knowledges, argues that "our problem is how to have simultaneously an account of radical historical contingency for all knowledge claims … and a no-nonsense commitment to faithful accounts of a 'real' world". According to Smith, the problem is that Haraway seeks to establish a position that, at first glance, might be thought of as intellectually sophisticated, but which, further scrutinized, might in fact turn out to be anything but. How, we may ask, is it possible to have it both ways? To both subscribe to a notion of the radical historical contingency of all knowledge claims and simultaneously insist on the idea that some knowledge claims are superior and more faithful accounts of the real world than others? Smith points out that the very act of putting the word 'real' in quotation marks seems nicely to express this conceptual instability: the attempt at both subscribing to a notion of an objective, out-there reality and denouncing it at the same time.

When reading through and teaching the contributions developed by, among others, Bruno Latour and Tommaso Venturini in relation to digital methods, similar concerns arise. In the following I will point out and discuss a few examples of conceptual instability which I have encountered in their presentations and, I dare to say, promotion of digital methods.

Now and then - Expert knowledge trickling down?
In the presentation of the MACOSPOL project, short for "Mapping Controversies in Science and Technology for Politics", which is funded by a huge ERC grant and includes several universities besides Sciences Po, Paris, Latour motivates the project by giving the following diagnosis of our contemporary condition:

"The problem with information – the old way – was that it was trickling down from very authoritative sources … to the general public. Now this is no longer the case. You have many different people producing information. Information is different from the different sources and what is new is that very often the experts themselves disagree with one another, which means that the poor citizen is now bombarded by very very different types of information." (MACOSPOL teaser, 0:35-1:00)

There is quite a lot to remark on in this quotation. Firstly, Latour applies a somewhat crude diagnostic of now and then: one where there was once a time when things were a specific way and now they have changed radically. This is not at all unlike the way Latour himself characterizes the logic of the moderns in his 1993 book We Have Never Been Modern. What Latour argues in WHNBM is that modernist reasoning and the progressionist project are fueled exactly by creating and establishing crude and radical distinctions between past and present, a past to be remedied by a present, or between the moderns and their counterpart others such as indigenous peoples. The problem with this type of reasoning is of course not that it is 'modernist', whatever that might mean anyway, but its straw-man-like qualities, in which the research project gains leverage and relevance by resting on crude and overly simplistic distinctions. Following Herrnstein Smith, we should be cautious about considering such diagnoses intellectually sophisticated and well-conceived when they may in fact be the exact opposite: crude and ill-conceived. One might of course also argue that this is simply a matter of presenting and popularizing the MACOSPOL project in a somewhat common-sensical manner, as expected and necessitated in order to receive funding, but it is in this regard worthwhile considering the other performative consequences of such articulations, the degree to which they also affect how research is carried out and knowledge produced and circulated. I will return to this in the concluding part of the paper.

Moreover, the quotation also includes another interesting feature: the idea that back in the days of 'old information', information and arguably knowledge 'was trickling down from the experts and other authoritative sources', whereas nowadays (arguably mainly due to the internet) information comes from a range of sources and with greater uncertainty. What is noteworthy here is not the idea that in the past there was arguably less information to go around; this idea is fairly well accepted, although it of course raises questions about what qualifies as information and information production. What is more problematic is the notion that information was 'trickling down'. So back in the old days, knowledge was diffused from the expert throughout society and the docile laymen received and accepted it? What is at stake here is the notion of translation central to ANT, which states that knowledge and technologies are translated, and thus transformed, by those who take them over.
Translation is central to ANT, since it challenges structuralist and determinist ideas of how society and the social are formed and continuously and iteratively transformed. Or as Latour suggests in Reassembling the Social: An Introduction to Actor-Network-Theory: it is only the poorly executed analysis of the social that identifies diffusion and intermediaries, that is, actors that simply receive and circulate knowledge without translating and transforming it, whereas the thorough, and hence well-executed, ANT analysis recognises translation and mediators (actors that translate). However, it seems that the principle of translation is strategically suspended by Latour when he refers to the old days: back then, diffusion worked and there were only intermediaries? So after all, structuralism is not principally misconstrued, as one might have thought having read the main parts of Latour's work; it only comes with an expiration date? Did it cease working, or functioning, around the 1960s or 70s, conveniently with the rise of postmodernity? Last, in the quotation Latour also suggests that in the old days knowledge production was uncontroversial, whereas today controversies arise all over. Again, it seems that this argument rests on a strategic simplification and omission of many insights concerning the controversial nature of knowledge production due to its contingent, uncertain, heterogeneous and material qualities. Central tenets of both ANT and STS are thus seemingly dismissed, forgotten or strategically displaced by Latour in this, granted, brief and popular presentation of MACOSPOL.

Taking digital methods and the arguments relating to their relevance into account, Latour and Venturini argue somewhat immodestly for the great promise of digital methods. They state:

"Up to now, access to collective phenomena has always been both incomplete and expensive. Compared to their colleagues in the natural sciences, social scientists have always been relatively poorly equipped. While physicists could follow billions of atoms in their accelerators and biologists could grow millions of microbes under their microscopes, social scientists could not simultaneously maintain breadth and depth in their observations. Their methods offered them a bipartite view of social existence, as they could either focus on specific interactions or skim the surface of global structures…. This structuralist vision is due to a great extent to the fact that the social sciences have never had methods to reconnect micro and macro and show how global phenomena are built by the assemblage of local interactions. Digital technology promises to revolutionize this situation, providing the social sciences the possibility of following each thread of interaction and showing how social life is woven together by their assemblage." (Latour & Venturini, p. 2-3)

In the above, we find the argument that digital methods may 'fill the gap' between the micro and the macro, and it is also in this connection that the term quali-quantitative methods is coined. The argument basically goes that a divide exists between, on the one hand, studying the grand structures of society with tools such as statistics, based on crude and simplistic aggregation of the many minuscule actions and positions of the actors, but with the benefit of scope, and, on the other hand, studying particular actors and their actions and intentions thoroughly, for the benefit of detail and sophisticated insight, but at the cost of scope and generalizability.
In Venturini and Latour's presentation, the social sciences thus come across as somewhat amputated and accordingly disappointed, arguably because they cannot do what the natural sciences are able to do with their methods and instruments. There are several things to note about this argument.

First of all, the implied difference between the social sciences and the natural sciences seems premised on an inherently and surprisingly classical idea of the natural sciences as having privileged and complete access to the phenomena they study. In the account made by Latour and Venturini, the natural sciences seem to possess the infamous god's-eye view of their object. This is a notion of the natural sciences that seems far from how they are presented in constructivist contributions to STS and ANT such as Laboratory Life and Science in Action (authored, as we know, by Latour & Woolgar and by Latour respectively). We thus also see a belittling of the social sciences as inferior to the natural sciences, a hierarchization of the sciences that adheres to classical rationalist philosophical thinking about the sciences and their 'value' and validity.

Secondly, we also find another built-in assumption, namely the idea that phenomena are all alike: that there is no difference between, say, studying social interaction between human beings in a society and studying bacteria in a petri dish. There are good reasons to treat different phenomena symmetrically and thus as equally complex or simple, and we might add that at least in this respect the authors are consistent with ANT in their arguments. However, another STS scholar, and evidently a great inspiration for Latour's work, Isabelle Stengers, has pointed out that one of the reasons for the hierarchization of the sciences, with high-energy physics presumably ranking at the top of the hierarchy, is precisely the idea that every object and phenomenon is equally complex. This entails that when Newtonian mechanics can predict the movement of planetary bodies or falling objects, the failure to predict human behavior and societal development falls on the social sciences and their lack of intellectual capacity, methodological rigor or adequate tools and techniques. It is not considered to be due to the complexity of the phenomena or, more adequately, to the specific ways the different disciplines address and inquire into their objects. In fact, Stengers suggests, it is exactly due to the classical representationalist and positivist ideals of the hard sciences, and their corresponding production of tools and methods that seek to delimit and simplify their objects, that simple and predictable objects emerge.

Thirdly, it is difficult not to notice the implicit dream of a unified science lurking in the above argument, not to mention the techno-optimist appraisal of digital methods as the tool that can reconcile the deep gap (and trauma) of the social sciences. We thus see an inherently representationalist notion of the social sciences: if and when we can reconcile the micro-macro gap, then we will be able to produce 'full' and arguably adequate, correct representations of the formation of the social.
So the 'problem' is not the performative aspects of knowledge production, where every attempt at investigating and representing is inevitably also an intervention, a specific way of addressing, configuring, cutting and 'doing' an object that will always be accompanied and paid for by absences, by ways of not seeing and not asking, as pointed out throughout the history of ideas by prominent scholars and scientists such as Niels Bohr, Ludwik Fleck and Michel Foucault, together with the whole ensemble of contemporary STS scholars such as Herrnstein Smith, Stengers, Haraway, Barad, Pickering, Woolgar, Law, Bloor, Barnes, Collins, Mol and Latour. No, the problem as posed by Latour and Venturini is the good old (and, we might add, inherently modernist, positivist and progressionist) one: we just haven't had the right tools for the job to give us the full picture before, but we will at some point, and luckily, according to Latour and Venturini, these tools seem to be almost readily at hand.

This said, and relating to Smith's critique of contemporary intellectual inconsistency, this is also where Venturini and Latour, strategically and at certain points, are cautionary. So, having been presented with the great promise of these tools for tracing the social in the making, and thus fulfilling, according to Latour, Gabriel Tarde's project, we are afterwards told that obviously these tools do not deliver any full or panoptic views of the matter. Latour & Venturini state:

"Thanks to digital traceability, researchers no longer need to choose between precision and scope in their observations: it is now possible to follow a multitude of interactions and, simultaneously, to distinguish the specific contribution that each one makes to the construction of social phenomena. Born in an era of scarcity, the social sciences are entering an age of abundance. In the face of the richness of these new data, nothing justifies keeping old distinctions. Endowed with a quantity of data comparable to the natural sciences, the social sciences can finally correct their lazy eyes and simultaneously maintain the focus and scope of their observations."

And then shortly after:

"No research method offers a panoptic vision of collective existence and quali-quantitative methods are no exception. Digital methods can only offer an oligoptic vision of society (Latour and Hermant, 1988), exactly as traditional methods. However, for the first time in the history of the social sciences, this vision will at least be continuous, spanning from the tiniest micro-interaction to the largest macrostructure. … Social existence is not divided in two levels, as traditional methods led us to believe. Micro-interactions and macro-structures are only two different ways of looking at the same collective canvas, like the warp and weft of the social fabric. There - in the unity generated by the multiplication of differences, in the stability produced by the accumulation of mutations, in the harmony hatching from controversies, in the equilibrium relying on thousands of fractures - lie the marvel of communal existence. Qualitative and quantitative methods have too long hid this spectacle from us. Digital methods will open our eyes."

So here one wonders: how does this add up? How can one argue that a gap has been filled, or is about to be, that two opposite ends have been joined, and then a couple of lines later disclaim this and suggest that in fact nothing has changed, that we still only have access to partial views?
It seems, as Smith also suggested with regard to Haraway, that some partial or oligoptic views are better than others, or that some oligoptic views are less oligoptic than others. But this would obviously imply that some partial views are closer to full views and thus see more, and although this does not necessitate subscribing to the notion of an ideal, full panoptic view, the god's-eye view as Haraway calls it, it would still entail that some views see more than others, and thus we return to a realist and representationalist ontology. Latour & Venturini's cautionary moderation of their argument does not, to my understanding, actually lead to a well-balanced argument in favor of promoting digital methods while simultaneously being duly respectful of the constructivist and partial qualities of research. On the contrary, the argument seems to be an ill-conceived attempt at having one's cake and eating it too.

In sum, what I have been trying to argue here is that there are substantial conceptual instabilities in how digital methods are being presented, especially when taking into consideration the proclaimed strong heritage from ANT. OK, so what is the problem with this? Is it not just a matter of phrasing digital methods research in popular, easy-to-go-around pitching formats, and then, having secured the funding, digital methods researchers can do the interesting, abstract stuff? Maybe this is the case; maybe it is just empty gestures, strictly form and no content, simply a matter of practicing managerialist ANT, forming alliances, talking the talk, and mobilizing the network. What does it matter whether the presentation of digital methods is inconsistent? Inconsistency and heterogeneity are what networks are made of. This may very well be so, but what is at stake here for my part is exactly to point out that the assumed alliance between constructivist ANT and digital methods attempted by Latour and Venturini is not consistent. The two do not, at least conceptually, go neatly together. Pointing this out does not mean that separating the popular pitch from the actual research is not a convenient way for STS researchers to phrase matters in order to preserve their integrity. However, to apply such a logic rests exactly on the notion of the possibility of separating and compartmentalizing the form (the popular, easy-to-go-around versions) on the one hand and the content (the actual, sophisticated and abstract research) on the other. This is exactly the same dichotomous move that so much STS research has shown does not actually reflect practice, whether in the natural sciences, in work, or in the design of technological systems. So maybe this type of reasoning is mainly convenient, and not well thought through.

An example: the concept of situated knowledge as presented by students. Situated knowledge advocates making explicit one's position and being accountable; a putting at risk of one's position, an attempt at explicating how one's knowledge and position on a given matter is grounded and contextualized. By students it is sometimes translated into an excuse or a defensive position: since we are all situated, our knowledge claims could have been different, and thus what we propose should not be taken for more than just a specific position, but consequently also for a creation that cannot be challenged due to its exclusive specificity: "others might have made something else out of this, but because we are situated in the manner we are, this is what came out of our unique position".
Consequently, we see how, by way of an argument intended to put one's position at risk and make it accountable, one ends up with the creation of a decisively strong position, produced as a consequence of its very situatedness.