
ARTICULATING THE WORLD:
EXPERIMENTAL SYSTEMS AND CONCEPTUAL UNDERSTANDING
Joseph Rouse, Wesleyan University
Abstract: Attention to scientific practice offers a novel response to philosophical queries
about how conceptual understanding is empirically accountable. The locus of the issue is
thereby shifted, from perceptual experience to experimental and fieldwork interactions.
More important, conceptual articulation is shown to be not merely “spontaneous” and
intralinguistic, but instead involves establishing a systematic domain of experimental
operations. The importance of experimental practice for conceptual understanding is
especially clearly illustrated by cases in which entire domains of scientific investigation
were first made accessible to articulated conceptual understanding. We thereby see more
clearly how experimental systems themselves, and not merely the theories and models
they make possible, have an intentional directedness and “representational” import.
Why does the philosophy of scientific practice matter to philosophy? This question is
prompted by a dramatic change in the place of philosophy of science within the discipline. Both
logical empiricist philosophy of science and its early post-empiricist successors were influential
for other philosophical work on mind, language, knowledge, and even ethics. More recent work
in the philosophy of science now provides a better understanding of the diverse sciences. Yet
these philosophical advances in understanding the sciences remain isolated from the
philosophical mainstream in metaphysics, epistemology, and the philosophy of language and
mind. In the United States, where the predominant meta-philosophy is some version of
naturalism, this disconnection is ironic. Science sets the horizons for philosophical inquiry, but
recent philosophy of science has played a minimal role in shaping the conception of science that
other philosophers invoke.
My question about the philosophical significance of the philosophy of scientific practice
gains urgency in this context. Will the philosophy of scientific practice merely become a
narrower specialist niche within an already isolated sub-discipline of philosophy? Or does
attention to scientific practice promise to restore the philosophy of science to a more central
place in philosophy?
I believe that the philosophy of scientific practice can indeed make important
contributions to philosophy more generally, in part by challenging naive conceptions of science
often taken for granted elsewhere. Here I shall only consider a single prominent issue. Kant
famously proclaimed that “Thoughts without content are empty, intuitions without concepts are
blind” (1929, A51/B75). Yet the question of how concepts acquire content from their relation to
experience has troubled philosophy ever since Kant’s proclamation. Logical empiricist and early
post-empiricist philosophy of science notoriously struggled with this issue. More recently, the
question has gained renewed prominence from John McDowell’s (1994) influential lectures on
Mind and World. McDowell argued that contemporary philosophy has failed to negotiate safe
passage between two dangerous attractors. Quine and other empiricists invoke conceptions of
experience as “Given” that allow it no bearing upon our conceptual understanding. Davidson,
Rorty, and other pragmatists circumvent this Myth of the Given, but only by treating conceptual
spontaneity as merely internally coherent. If their views were correct, McDowell argues,
conceptual thought could only be a “frictionless spinning in a void,” utterly disconnected from
accountability to the world (1994, 67).
It is important not to conflate McDowell’s or Kant’s question with skepticism about the
justification of empirical knowledge. Before we can ask about the empirical justification of a
claim, we must understand what it claims. Most philosophers distinguish these issues of
conceptual articulation and empirical justification by a division of labor. They treat conceptual
articulation as an entirely linguistic or mathematical activity of developing and regulating
inferential relations among sentences or equations. Experimentation and observation then
address only the justification of the resulting claims. Quine (1953) succinctly expressed this
division of labor in “Two Dogmas of Empiricism.” His famous image depicted scientific
theory as a self-enclosed fabric or field that only encounters experience at its periphery. If
experience then conflicts with our theoretical predictions, we must go back to make internal
adjustments to our theories and try again. Yet that division of labor between internal conceptual
development and external empirical testing is the central target of McDowell’s criticism.
How might a philosophy of scientific practice contribute to a better response to
McDowell’s and Kant’s concerns? First, it transforms the problem by understanding the
sciences’ accountability to the world in terms of experimental and fieldwork practices rather
than perceptual receptivity. Second, I shall argue, this transformation then shows that conceptual
articulation is not merely a matter of spontaneous thought in language or mathematics, and thus
not merely intralinguistic; instead, experimental practice itself can contribute to the articulation
of conceptual understanding.
To develop this claim, I begin by asking you to think about a well-known remark by Ian
Hacking:
In nature there is just complexity, which we are remarkably able to analyze. We do so by
distinguishing, in the mind, numerous different laws. We also do so by presenting, in the
laboratory, pure, isolated phenomena. (1983, 226)
By “phenomena,” Hacking means events in the world rather than appearances to the mind, and
he claims that most phenomena are created in the laboratory rather than found in nature.
Experimental work does not simply strip away confounding complexities to reveal underlying
nomic simplicity; it creates new complex arrangements as indispensable background to any
foregrounded simplicity. Yet I think most philosophical readers have not taken Hacking’s
suggested parallel between phenomena and laws as modes of analysis sufficiently seriously. We
tend to think only laws or theories allow us to analyze and understand or explain nature’s
complex occurrences. Creating phenomena may help discern relevant laws or construct
illuminating theories, but the phenomena themselves can only indicate possible directions for analysis. Conceptual
development must take place “theoretically.” Yet I think this treatment of laboratory phenomena
as merely indicative means to the verbal or mathematical articulation of theory is mistaken. It is
not enough to acknowledge that experimentation also has its own ends. Experimental practice
can be integral rather than merely instrumental to achieving conceptual understanding.
Understanding the conceptual role of experimental practice requires us to look beyond
particular phenomena to consider the development and exploration of what Hans-Jörg
Rheinberger (1997) calls “experimental systems.” I long ago described such systems as
“microworlds: systems of objects constructed under known circumstances and isolated from
other influences so that they can be manipulated and kept track of, ... [allowing scientists to]
circumvent the complexity [with which the world more typically confronts us] by constructing
artificially simplified ‘worlds’” (1987, 101). Some illustrative experimental systems or
microworlds include the Morgan group’s system for mapping genetic mutations in Drosophila
melanogaster, the many setups in particle physics that direct a source of radiation toward a
shielded target and detector, or the work with alcohols and their derivatives that Ursula Klein
(2003) argued were the beginnings of experimental organic chemistry. These are not verbal,
mathematical or pictorial representations of some actual or possible situation in the world. They
are not even physical models, like the machine-shop assemblies that Watson and Crick
manipulated to discover 3-dimensional structures for DNA. They are instead novel, reproducible
arrangements of some aspect of the world.
Today, I consider a special class of experimental systems. Heidegger, whose writings
about science emphasize the practice of scientific research, forcefully characterized the role I am
attributing to these systems:
The essence of research consists in the fact that knowing establishes itself as a “forging-ahead” (Vorgang) within some realm of entities in nature or history. ... Forging-ahead,
here, does not just mean procedure (Verfahren), how things are done. Every forging-
ahead already requires a circumscribed domain in which it moves. And it is precisely the
opening up of such a domain that is the fundamental process in research. (1950, 71; 2002,
59, tr. modified)
What does it mean to open up a scientific domain, and how are such openings related to the
construction of experimental systems? Popular presentations of scientific progress often
emphasize the replacement of error and superstition by scientific knowledge. Yet in many areas
of scientific work, the very phenomena at issue were previously inaccessible. Earlier generations
could not be in error about these matters, because they never encountered them, and thus could
have little or nothing to say about them. The establishment of new experimental systems
opened new possibilities for conceptual articulation, where previously there was, in Hacking’s
apt phrase, “just complexity.” Some salient examples of domains opened by new experimental
systems include genetics, by the Morgan group’s Drosophila system correlating crossover
frequencies with variations in chromosomal cytology (Kohler 1994); quantitative temperature,
through the development of intercalibrated practices of thermometry, so nicely described by
Hasok Chang (2004); interstellar distances, through Leavitt’s and Shapley’s tracking of period-luminosity relations in Cepheid variables; the functional significance of intra-cellular structure
uncovered with ultracentrifuges and electron microscopes (Bechtel 1993, Rheinberger 1995); or
sub-atomic structure, first intimated by Rutherford’s targeting of gold leaf with beams of alpha
particles. Prior to the development of those experimental practices, the corresponding aspects
of the natural world lacked the manifest differences needed to sustain conceptual development.
What changed the situation was not just new kinds of data, or newly imagined ways of thinking
about things, but new interactions that articulate the world itself differently.
To understand this claim, we must recognize that experimental systems always have a
broader “representational” import. It is no accident that biologists speak of the key components
of their experimental systems as model organisms, and that scientists more generally speak of
experimental models. The cross-breeding of mutant strains of Drosophila with stock breeding
populations, for example, was neither interesting for its own sake, nor merely a peculiarity of one
species of Drosophila. The Drosophila system was instead understood, rightly, to show
something of fundamental importance about genetics more generally; indeed, I shall argue, it
constituted genetics as a distinct research field.
As created artifacts, laboratory phenomena and experimental systems have a distinctive
aim. Most artifacts, including the apparatus within an experimental system, are used to
accomplish some end. The end of an experimental system itself, however, is not what it does,
but what it shows. Experimental systems are novel re-arrangements of the world that allow
some features that are not ordinarily manifest and intelligible to show themselves clearly and
evidently. Sometimes such arrangements isolate and shield relevant interactions from
confounding influences. Sometimes they introduce signs or markers into the experimental field,
such as radioisotopes, genes for antibiotic resistance, or correlated detectors. Understanding this
aspect of experimentation requires that we reverse the emphasis from traditional empiricism:
what matters is not what the experimenter observes, but what the phenomenon shows.
Catherine Elgin (1991) develops this point by distinguishing the features or properties an
experiment exemplifies from those that it merely instantiates. In her example, rotating a
flashlight 90 degrees merely instantiates the constant velocity of light in different inertial
reference frames. The Michelson/Morley experiment exemplifies that constancy.[1] Elgin thereby
emphasizes the symbolic function of experimental performances, and suggests parallels between
their cognitive significance and that of paintings, novels, and other artworks. She claims that a fictional character such as Nora in A Doll’s House, for example, can strikingly exemplify a debilitating situation, which the lives of many actual women in conventional bourgeois marriages merely instantiate.

[1] Although I will not belabor the point here, it is relevant to my subsequent treatment of experimental systems that, strictly speaking, the Michelson/Morley experiment does not instantiate the constant velocity of light in different inertial frames, since the experiment is conducted in an accelerated rather than an inertial setting.
Elgin’s distinction between actual experiments and fictional constructions gives priority
to instantiation over exemplification. Nora’s life is fictional, and is therefore only
metaphorically constrained. Light within the Michelson interferometer, by contrast, really does
travel at constant velocities in orthogonal directions. The constancy of light’s velocity is already
‘there’ in the world, awaiting only the articulation of concepts that allow us to recognize it.
Unexemplified and therefore unconceptualized features of the world would then be like the
statue of Hermes that Aristotle thought exists potentially within a block of wood. Their
emergence awaits only the sculptor’s (or scientist’s) trimming away of extraneous surroundings.[2]
In retrospect, with a concept clearly in our grasp (or better, with us already in the grip of that concept), the presumption that it applies to already-extant features of the world is unassailable. Of course there were mitochondria, spiral galaxies, polypeptide chains and tectonic plates before anyone discerned them, or even conceived their possibility. Yet this retrospective standpoint, in which the concepts are already articulated and the only question is where they apply, crucially mis-locates important aspects of scientific research. In Kantian terms, researchers initially seek reflective rather than determinative judgments. Scientific research must articulate concepts with which the world can be perspicuously described and understood, rather than simply apply those already available. To be sure, conceptual articulation does not begin de novo. Yet in science, one typically recognizes such prior articulation as tentative and open-textured, at least in those respects that the research aims to explore.

[2] Aristotle, Metaphysics IX, ch. 6, 1048a. Hacking’s initial discussion of the creation of phenomena criticized just this conception of phenomena as implicit or potential components of more complex circumstances:

We tend to feel [that] the phenomena revealed in the laboratory are part of God’s handiwork, waiting to be discovered. Such an attitude is natural from a theory-dominated philosophy. ... Since our theories aim at what has always been true of the universe—God wrote the laws in His Book, before the beginning—it follows that the phenomena have always been there, waiting to be discovered. I suggest, in contrast, that the Hall effect does not exist outside of certain kinds of apparatus. ... The effect, at least in a pure state, can only be embodied by such devices. (Hacking 1983, 225-226)
When a domain of research has not yet been conceptually articulated, the systematic
character of experimental operations becomes especially important. Domain-constitutive systems
must have sufficient self-enclosure and internal complexity to allow relevant features to stand
out through their mutual interrelations. That scientific experimentation typically needs an
interconnected experimental system is now widely recognized in the literature.[3] Yet the
importance of experimental systematicity is still commonly linked to questions of justification.
Thus, Ludwik Fleck (1979) long ago claimed that, “To establish proof, an entire system of
experiments and controls is needed, set up according to an assumption or style and performed by
an expert” (Fleck 1979, 96, my emphasis). Epistemic justification was likewise the issue for
Hacking’s (1992) discussion of the “self-vindication” of the laboratory sciences. Their self-vindicating stability, he argued, is achieved in part by the mutually self-referential adjustment of
theories and data.
[3] Notable defenses of the systematic character of experimental systems and traditions include Rheinberger 1997, Galison 1987, 1997, Klein 2003, Kohler 1994, Chang 2004.

I am making a different claim: typically, new domains are opened to contentful conceptual articulation at all by creating systematically intraconnected “microworlds.”[4]
“Genes,” for example, changed from merely hypothetical posits to the locus of a whole field of
inquiry (“genetics”) by the Morgan group’s correlations of meiotic cross-over frequencies of
mutant traits with visible transformations in chromosomal cytology, in flies cross-bred with a
standardized breeding population. As a different example, Ursula Klein showed that carbon
chemistry likewise became a domain of inquiry, distinct from the merely descriptive study of
various organically-derived materials, through the systematic, conceptually articulated tracking
of ethers and other derivatives of alcohol (Klein 2003). Leyden jars and voltaic cells played
similar roles for electricity. What is needed to open a novel research domain is typically the
display of an intraconnected field of reliable differential effects: not merely creating phenomena,
but creating an experimental practice.
[4] Fleck, at least, was not unaware of the role of experimental systems in conceptual articulation, although he did not quite put it in those terms. One theme of his study of the Wassermann reaction was its connection to earlier vague conceptions of “syphilitic blood,” both in guiding the subsequent development of the reaction, and also thereby articulating more precisely the conceptual relations between syphilis and blood. He did not, however, explicitly connect the systematicity of experimental practice with its conceptual-articulative role. Hacking was likewise often concerned with conceptual articulation (especially in the papers collected in Hacking 2002 and 1999), but this concern was noticeably less evident in his discussions of laboratory science (e.g., Hacking 1983, ch. 12, 16; 1992).

This constitution of a scientific domain accounts for the conceptual character of the distinctions that function within the associated scientific field. Consider what it means to say that the Drosophila system developed initially in Morgan’s laboratory at Columbia was about genetics. We need to be careful here, for we cannot presume the identity and integrity of genetics as a domain. The word ‘gene’ predates Morgan’s work by several years, and the notion
of a particulate, germ-line “unit” of heredity emerged earlier from the work of Mendel, Darwin,
Weismann, Bateson and others. Yet the conception of genes as the principal objects of study
within the domain of genetics marks something novel. Prior conceptions of heredity did not and
could not distinguish genes from the larger processes of organismic development in which they
functioned. What the Drosophila system initially displayed, then, was a field of distinctively
genetic phenomena. The differential development of organisms became part of the experimental
apparatus that articulated genes, by connecting relative chromosomal locations, characteristic
patterns of meiotic crossover, and phenotypic outcomes.
What the Drosophila system thus did was to allow a much more extensive inferential
articulation of the concept of a gene. Concepts are marked by their availability for use in
contentful judgments, whose content is expressed inferentially.[5] For example, a central achievement of Drosophila genetics was the identification of phenotypic traits with chromosomally-located “genes.” Such judgments cannot simply correlate an attributed trait to what happens at a chromosomal location, because of their inferential interconnectedness. Consider the judgment in classical Drosophila genetics that the Sepia gene is not on chromosome 4.[6] This judgment does not simply withhold assent to a specific claim; it has the further content that either the Sepia gene has some other chromosomal locus, or that Sepia mutants vary in more (or less) than one “gene”. Such judgments, that is, indicate a more-or-less definite space of alternatives. Yet part of the content of the “simpler” claim that Sepia is on chromosome 3 is the consequence that it is not on chromosome 4. Any single judgment in this domain presupposes the intelligibility of an entire conceptual space of interconnected traits, loci, and genes (including the boundaries that delimit that space).[7]

[5] My emphasis upon inferential articulation as the definitive feature of conceptualization is strongly influenced by Brandom 1994, 2000, 2002, with some important critical adjustments (see Rouse 2002, ch. 5-7). Inferential articulation is not equivalent to linguistic expression, since conceptual distinctions can function implicitly in practice without being explicitly articulated in words at all. Scientific work is normally sufficiently self-conscious that most important conceptual distinctions are eventually marked linguistically. Yet the central point of this paper is to argue that the inferential articulation of scientific concepts must incorporate the systematic development of a domain of phenomena within which objects can manifest the appropriate conceptual differences. The experimental practices that open such a domain thereby make it possible to form judgments about entities and features within that domain, but the practices themselves already articulate “judgeable contents” prior to the explicit articulation of judgments.
[6] The distinctively revealing character of negative descriptions was brought home to me by Hanna and Harrison 2004, ch. 10.

[7] Classical-genetic loci within any single chromosome are especially cogent illustrations of my larger line of argument, since prior to the achievement of DNA sequencing, any given location was only identifiable by its relations to other loci on the same chromosome. The location of a gene was relative to a field of other genetic loci, which are in turn only given as relative locations.

To open such a conceptual space, experimental systems need not be typical or representative of the domain. Consider once more Drosophila melanogaster as an experimental organism. As the preeminent model system for classical genetics, Drosophila was quite atypical. As a human commensal, it is relatively cosmopolitan and genetically less-diversified than alternative model organisms. More important, Robert Kohler has shown that for D. melanogaster to function as a model system, its atypical features had to be artificially enhanced. Much of its residual “natural” genetic diversity had to be removed from experimental breeding
stocks (Kohler 1994, ch. 1, 3, 8). Drosophila is even more anomalous in its recently acquired
role as a model system for evolutionary-developmental biology. Drosophila is now the textbook
model for the development, developmental genetics, and evolution of animal body plans
generally (Carroll, et al. 2001, esp. ch. 2-4). Yet the long syncytial stage of Drosophila
development is extraordinary even among Arthropods. In fact, however, domain-constitutive
systems need not even instantiate the features they exemplify. The Michelson-Morley
experiment, after all, exemplifies what happens to light in inertial reference frames, but the
experiment itself is gravitationally accelerated rather than inertial.
Experimental systems mediate the empirical accountability of verbally articulated
concepts to the world, which allows the use of those concepts to be more than just McDowell’s
“frictionless spinning in a void” (1994, 66). Mary Morgan and Margaret Morrison (1999) have
compellingly characterized theoretical models as partially autonomous mediators between
theories and the world. I am claiming that scientific understanding is often doubly mediated;
experimental systems mediate between the kinds of models Morgan and Morrison describe, and
the circumstances to which scientific concepts ultimately apply. The explication of these models
within the microworld of an experimental system is what allows them to have intelligible
applications elsewhere. Moreover, in many cases, the experimental model has to come first. It
introduces relatively well-behaved circumstances that can be tractably modeled in other ways,
such as a Drosophila chromosome map.
To understand the significance of this claim, we need to ask what “well-behaved
circumstances” means here. Nancy Cartwright (1999, 49-59) has raised similar issues by talking
about mediating models in physics or economics as “blueprints for nomological machines.”
Nomological machines are arrangements and shielding of various components, so that their
capacities reliably interact to produce regular behavior. I want to expand her conception to
include not just regular behavior, but conceptually articulable behavior more generally.
I nevertheless worry about her metaphors of blueprints and machines. The machine
metaphor suggests an already determinate purposiveness, something the machine is a machine
for. With purposes specified, Cartwright’s normative language in discussing nomological
machines becomes straightforward: she speaks of successful operation, running properly, or of
arrangements that are fixed or stable enough. Yet where do the purposes and norms come from?
The need for such norms is the most basic reason to think about experimental systems as opening
research domains by mediating between theoretical models and worldly circumstances. They
help articulate the norms for circumstances to be “well-behaved,” and for nomological machines
(or experiments with them) to run “properly” or “successfully.” Scientific concepts, then, both
articulate and are accountable to norms of intelligibility, expressed in these notions of proper
behavior and successful functioning.
For theoretical models and the concepts they employ, Cartwright and Ronald Giere
(1988) seek to regulate their normativity in terms of either their empirical adequacy, or their
“resemblance” to real systems. In discussing the domain of the concept of ‘force’, for example,
Cartwright (1999, 28) claims that,
When we have a good-fitting molecular model for the wind, and we have in our theory ...
systematic rules that assign force functions to the models, and the force functions
assigned predict exactly the right motions, then we will have good scientific reason to
maintain that the wind operates via a force.
Giere in turn argues that theoretical models like those for a damped harmonic oscillator only
directly characterize fictional, abstract entities of which the models are strictly true, whose
relation to real systems is one of relevant similarity:
The notion of similarity between models and real systems ... immediately reveals—what talk about approximate truth conceals—that approximation has at least two dimensions:
approximation in respects, and approximation in degrees. (1988, 106)
For my concerns, however, considerations of similarity or empirical adequacy come too late.
What is at issue are the relevant respects of possible resemblance, or what differences in degree
are degrees of. These matters could be taken for granted in mechanics, because the relevant
experimental systems were long ago established and stabilized, and mutually adjusted with the
relevant idealized models. The pendula, springs, falling objects, and planetary trajectories that
constitute mechanics as a domain were mostly already in place.
That is not the case when scientists begin to formulate and explore a new domain of
phenomena. For example, Mendelian ratios of inheritance obviously predated Morgan, but
spatialized “linkages” between heritable traits were novel. The discovery that the white-eyed
mutation was a “sex-linked” trait was an anchoring point within the emerging field of mutations,
much as the freezing and boiling points of water helped anchor the field of temperature
differences. Yet as Chang (2004, ch. 1) has shown in the latter case, these initially “familiar”
phenomena could not be taken for granted; to serve as anchors for a conceptually articulated
space of temperature differences, the phenomena of boiling and freezing required canonical
specification. Such specification required practical mastery of techniques and circumstances as much as, or more than, explicit definition.[8] Indeed, my point is that practical and discursive
articulation of the phenomena had to proceed together. Likewise with the development and
refinement of the instruments through which such phenomena could become manifest, such as
thermometers for temperature differences or breeding stocks for trait-linkages.
[8] Chang (2004) points out that in the case of state changes in water, ironically, ordinary
“impurities” such as dust or dissolved air, and surface irregularities in its containers, helped
maintain the constancy of boiling or freezing points; removing the impurities and cleaning the
contact surfaces allowed water to be “supercooled” or “superheated.” My point still holds,
however, that canonical circumstances needed to be defined in order to specify the relevant
concept, in this case temperature.
Recognizing that domain-opening experimental systems help constitute the norms for
their own assessment may seem to raise the spectre of idealism. Do they merely stipulate
standards for the application of a concept, without being accountable to further normative
considerations? No. Indeed, that is why recognizing the role of experimental practice in
articulating scientific concepts and norms is especially important. Chang’s (2004) study of
thermometry practices illustrates one important reason why such norms are not merely
stipulated. There are many ways to produce regular and reliable correlates to changes in heat
under various circumstances. Much work went into developing mercury, alcohol, or air
thermometers along with their analogues at higher and lower temperatures. Yet it is not enough
just to establish a reliable, reproducible system for the thermal expansion or contraction of some
canonical substance, and use it to stipulate degrees of heat (or cold). The substantial variations
in measurement among different standard systems suggested a norm of temperature independent
of any particular measure, however systematic and reproducible it was. Such a norm, once it
was coherently articulated, introduced order into these variations by establishing a standard for
assessing its own correctness. That the development of a standard is itself normatively
accountable is clear from the possibility of failure: perhaps there would have been no coherent,
systematic way to correlate the thermal expansion of different substances within a single
temperature scale.
The most dramatic display of the defeasibility of experimental domain constitution,
however, comes when domain-constituting systems are abandoned or transformed by
constitutive failure, or forced re-conceptualization. Consider the abandonment in the 1950's of the Paramecium system as a model organism for microbial genetics.[9] Paramecium was dealt a double blow. Its distinctive advantages for the study of cytoplasmic inheritance became moot when the significance of supposed differences between nuclear and cytoplasmic inheritance dissolved. More important from my perspective, however, was the biochemical reconceptualization of genes. The identification of genes as enzyme-makers emerged from the study of biochemically-deficient mutants. Such mutations could only be displayed in organisms like Neurospora that could grow on a variable nutrient medium. Despite extensive effort, Paramecium would not grow on a biochemically controllable medium, and hence could not display auxotrophic mutations. In Elgin’s terms, the cytogenetic patterns in Paramecium could now only instantiate the distinctive manifestations of genes, and could no longer exemplify them.

[9] For detailed discussion, see Nanney 1983. Sapp (1987) sets this episode in the larger context of debates over cytoplasmic inheritance.
A different kind of failure occurs when the “atypical” features of an experimental system
become barriers to adequate conceptual articulation. For example, standardization of genetic
background made the D. melanogaster system the exemplary embodiment of chromosomal
genetics. Yet this very standardization blocked any display of population-genetic variations and
their significance for evolutionary genetics. Theodosius Dobzhansky had to adapt the
techniques of Drosophila genetics to a different species to display the genetic diversity of natural
populations (Kohler 1994, ch. 8). For a currently controversial example, Jessica Bolker (1995)
has argued that the very features that recommend the standard model organisms in
developmental biology may be systematically misleading. Laboratory work encourages using
organisms with rapid development and short generations; these features in turn correlate with
embryonic prepatterning and developmental canalization. The choice of experimental systems
thereby materially conceives development as a relatively self-contained process. A
reconceptualization of development as ecologically-mediated may therefore require its
exemplification in different experimental practices, which will likely employ different
organisms.
I have been arguing that attention to scientific practice provides important new resources
for philosophy of language and mind. Recurrent philosophical anxieties about the
groundlessness of conceptual thought are undercut once we recognize how scientific thinking is
embedded within experimental practice. Conceptual domains in science typically emerge from
new practical articulations of the world itself. Yet reconnecting the study of scientific practice to
philosophy more generally may also transform philosophy of science in constructive ways. Our
dominant cultural and philosophical images of science still place the retrospective justification of
scientific knowledge at the forefront. Attending to the prospective role of scientific research in
articulating the world conceptually and practically could replace this conception with a
provocative and promising new “scientific image.” Defending that claim, however, must be left
for another occasion.
Acknowledgements: This paper originated as a keynote address to the inaugural meeting of the
Society for the Philosophy of Science in Practice at the University of Twente in August 2007. I
thank the organizers of the Society for this opportunity, and the audience at the meeting for their
many helpful questions and comments.
REFERENCES
Aristotle 1941. Metaphysica. In The Basic Works of Aristotle. Ed. R. McKeon. New York:
Random House.
Bolker, Jessica 1995. Model Systems in Developmental Biology. BioEssays 17:451-455.
Brandom, Robert 1994. Making It Explicit. Cambridge: Harvard University Press.
______________ 2000. Articulating Reasons. Cambridge: Harvard University Press.
______________ 2002. Tales of the Mighty Dead. Cambridge: Harvard University Press.
Carroll, Sean, Grenier, Jennifer, and Weatherbee, Scott 2001. From DNA to Diversity. Malden,
MA: Blackwell Science.
Cartwright, Nancy 1999. The Dappled World. Cambridge: Cambridge University Press.
Chang, Hasok 2004. Inventing Temperature. Oxford: Oxford University Press.
Elgin, Catherine 1991. Understanding: Art and Science. In Philosophy and the Arts. Ed. P.
French, T. Uehling, and H. Wettstein, 196-209. Midwest Studies in Philosophy, vol. 16.
Notre Dame: University of Notre Dame Press.
Fleck, Ludwik 1979. Genesis and Development of a Scientific Fact. Ed. T. Trenn and R. Merton. Tr. F. Bradley and T. Trenn.
Chicago: University of Chicago Press.
Galison, Peter 1987. How Experiments End. Chicago: University of Chicago Press.
___________ 1997. Image and Logic. Chicago: University of Chicago Press.
Giere, Ronald 1988. Explaining Science. Chicago: University of Chicago Press.
Hacking, Ian 1983. Representing and Intervening. Cambridge: Cambridge University Press.
___________ 1992. The Self-Vindication of the Experimental Sciences. In Science as Practice
and Culture. Ed. A. Pickering, 29-64. Chicago: University of Chicago Press.
___________ 1999. The Social Construction of What? Cambridge: Harvard University Press.
___________ 2002. Historical Ontology. Cambridge: Harvard University Press.
Heidegger, Martin 1950. Holzwege. Frankfurt am Main: Klostermann. Tr. Julian Young, Off the Beaten Track. Cambridge: Cambridge University Press, 2002.
Kant, Immanuel 1929. Critique of Pure Reason. Tr. Norman Kemp Smith. London: Macmillan.
Klein, Ursula 2003. Experiments, Models, Paper Tools. Stanford: Stanford University Press.
Kohler, Robert 1994. Lords of the Fly. Chicago: University of Chicago Press.
McDowell, John 1994. Mind and World. Cambridge: Harvard University Press.
Morgan, Mary, and Morrison, Margaret, eds. 1999. Models as Mediators. Cambridge:
Cambridge University Press.
Nanney, David 1983. The Cytoplasm and the Ciliates. Journal of Heredity 74: 163-170.
Quine, Willard van Orman 1953. Two Dogmas of Empiricism. In From a Logical Point of View.
Cambridge: Harvard University Press.
Rheinberger, Hans-Jörg 1995. From Microsomes to Ribosomes: ‘Strategies’ of Representation,
1935-55. Journal of the History of Biology 28: 49-89.
___________________ 1997. Toward a History of Epistemic Things. Stanford: Stanford
University Press.
Rouse, Joseph 1987. Knowledge and Power. Ithaca: Cornell University Press.
____________ 2002. How Scientific Practices Matter. Chicago: University of Chicago Press.
Sapp, Jan 1987. Beyond the Gene. Oxford: Oxford University Press.