The Rhetoric of Orthographic Determinism in Chinese Script Debate
Ruel A. Macaraeg
Department of Linguistics, University of Texas at Arlington
Perhaps the greatest consideration in weighing the various proposals for
Chinese language planning concerns the relative merits afforded by different
types of writing systems – characters, syllabaries, and phonemic alphabets.
There is widespread belief among commentators in this debate that writing
systems exert a formative effect on perception, cognition, and ultimately
socio-cultural achievement, and the structural similarities of such beliefs to
linguistic determinism invite observers to describe this phenomenon as
“orthographic determinism.”
This discussion will make explicit these similarities by examining the form
and content found in deterministic arguments about writing systems, with an
emphasis on the absolutist nature of the rhetoric. An attempt will be made to
trace the development of deterministic thought along separate lines from early
linguistic studies into modern research and popular commentary. The discourse
on writing systems will be shown to revolve around a core of key assumptions,
regardless of the particular form of writing system advocated.
An understanding of orthographic determinism is essential to any
explanation for why certain arguments about Chinese language planning exert
powerful appeals despite a lack of supporting evidence. It is hoped that
recognition of the issues surrounding orthographic determinism will help move
the debate of Chinese script reform toward more substantive methodologies and
encourage more informed policy-making in the affected countries.
Introduction
We linguists who have directed our attention to the debate over Chinese script reform
are faced with a most frustrating situation. Typically, when governments seek informed
guidance on the formation of public policies relating to a specific issue, they consult experts
in that field. Those experts use their specialized knowledge and skills to formulate
objectives, plans of action, and methods of evaluation. Script reform is first and foremost a
linguistic issue, so one would expect linguists to be the driving force behind government
policies concerning scripts.
In practice, however, this is often not the case. Because language is such a powerful
marker of social, cultural, and political affiliations, linguists are hard-pressed to monopolize
the field. Anyone with ideological opinions will likely have opinions about language as
well, and this includes people with decision-making power over language and script
planning. Because language is something everyone uses, it is not always obvious that
expertise is needed to make informed decisions about language policies. Linguists, then,
are in the awkward position of having to justify their expertise to both governments and
their citizens on a subject whose level of importance would invite immediate deference in
other fields.
2004 台灣羅馬字國際研討會論文集 Tâi-oân Lô-má-jī Kok-chè Gián-thó-hōe Lūn-bûn-chi̍p
Script planning is indeed a subject of great importance. A well-designed script
encourages greater acquisition and retention of literacy, more efficient information
processing, and better communication within a population; these in turn affect a country’s
productivity and standard of living. With such high stakes, the consequences of inadequate
expertise in linguistics can be socially catastrophic. Yet this is the state of affairs with
virtually all governmental script planning today – linguistic issues are swept aside by more
salient and rhetorically useful social commentary and appeals to cultural and national
identity. Because of this, script policies do not keep pace with the needs of the
constituencies they are ostensibly designed to serve.
Chinese is the most significant of the many languages dealing with script reform issues
today. It constitutes the largest language community in the world, more than a billion
people, in numerous spoken varieties. All of these variants use a morpheme-based
character script which is acknowledged as extremely difficult to master (morphemes are an
open and infinite set, whereas the phonemes of alphabets and syllables of syllabaries are
closed and relatively small sets). Recognition of this fact led many progressive thinkers
during China’s late Imperial and early Republican years (late 19th – early 20th century) to
propose the reformation of Chinese script. The most radical of these were several systems
for representing Modern Standard Chinese in a Romanized script.
Romanization failed in both the People’s Republic of China and the Republic of China
on Taiwan. With a strong central government, relatively small population, and generally
high standard of living, Taiwan was able to enforce compulsory education and thus achieve
mass literacy in spite of the difficulties inherent in character script. The Mainland, lacking
these advantages, adopted an official intention to convert to Romanized script (pinyin) but in
practice only implemented a policy of character simplification, leaving the basic nature of
Chinese script intact.
With reported literacy rates now at all-time highs, advocates of Chinese character
scripts reject the idea that character scripts are less efficient than phonemic or syllabic
ones; to defend their continued use, they point instead to presumed advantages afforded
by the kinds of mental processes that characters promote. Alphabetic reformers, in
response, insist that phonemes are more effective in engaging these same mental processes.
Such arguments are non-linguistic, and are so broad as to be dubious by any academic
standard. Yet they appeal to non-linguists, both government officials and ordinary citizens,
on several levels. Ideas about the inherent superiority of cultural creations such as scripts
flatter people’s sense of cultural pride, a characteristic that governments have always found
useful for promoting national loyalty. They are generally free of the technical jargon
typical of linguistics, being instead in an idiom typical of popular science or history writing.
Perhaps most of all, these ideas are popular because they are oversimplified – they propose
that one factor alone can account for all the important differences observed in cultural
achievement.
Theories such as this, centered on the all-important influence of a single variable,
are called deterministic. Linguists are already familiar with their own particular brand of
determinism, the strong version of the Sapir-Whorf Hypothesis of linguistic relativity.
This theory of linguistic determinism asserts that characteristics of an individual’s native
language dictate the way in which he or she perceives the physical world. According to
Whorf (1956),
We dissect nature along lines laid down by our native languages. The categories
and types that we isolate from the world of phenomena we do not find there
because they stare every observer in the face; on the contrary, the world is
presented in a kaleidoscopic flux of impressions which has to be organized by our
minds--and this means largely by the linguistic systems in our minds (p. 213).
The shape of language structures imposes constraints on how sensory data are
interpreted, and by extension thought processes are conditioned by the linguistic structures
responsible for sorting out this data. Among these linguistic structures, Whorf saw
grammar as having the principal determinative role:
[U]sers of markedly different grammars are pointed by their grammars toward
different types of observations and different evaluations of externally similar acts
of observation, and hence are not equivalent as observers but must arrive at
somewhat different views of the world (1956:221).
Arguments assigning a primary determinative role to orthography rather than
grammar have been made which together constitute a clear orthographic analogy to
linguistic determinism, even if the term “orthographic determinism” has not hitherto been
coined to acknowledge them. Character-based scripts, embodying the opposite extreme
from phonemic alphabets along the scale of writing systems, have been an obvious target
upon which to focus deterministic attention.
While linguistic determinism has by now become obsolete as a theory, orthographic
determinism continues to thrive. It represents one of the many ways in which
pseudo-linguistic rhetoric diverts governmental and popular attention from the truly
meaningful aspects of Chinese script reform debate. In order for linguists to respond
effectively to this challenge, we must first become aware of what orthographic determinism
is – identify what its assumptions are, recognize how its arguments are formulated, and
understand the basis for its rhetorical appeal. Substantive discussion about script reform,
either in China or Taiwan, cannot proceed until we can educate the lay public to the fact that
orthographic determinism is primarily a rhetorical device rather than a viable research
methodology.
The following sections will describe orthographic determinism as it manifests on both
sides of the debate on Chinese script reform. We will then identify the common discursive
patterns shared by both sides and discuss their relevance toward the creation of a viable
research methodology into the measurement of script efficiency.
Character Determinism
Orthographic determinism has been used by both sides in the character/alphabet
debate and, paradoxically, both make the claim that the writing system they advocate
promotes greater analytic thought. The high performance of East Asian students in fields
such as mathematics and science is proverbial in the discourse on American education, and
popular wisdom commonly attributes this success to the same abilities seemingly fostered in
the acquisition and use of character writing – expansive memory, attention to visual detail,
and intuition about extended semantic derivations from minimal graphic cues.
In this sense, a large number of both Western and Eastern non-linguists (primarily
commentators on education and sociology) who have considered the influence of
character-based orthography on complex thought processes would subscribe to a form of
orthographic determinism favoring characters over alphabets. Various studies report
Chinese literates scoring higher on science (Flaugher 1971), verbal memory (Huang and Liu
1978), and mathematical skill (Stevenson, Chen, and Lee 1993).
An example of a language specialist making a positive deterministic claim about
character orthography can be found in Coulmas (1989). He reports having given a
cryptogram in an invented script and unspecified language to a class of graduate students
and receiving only one successful decipherment – from a Japanese:
When I gave this piece of text to a class of graduate students at Georgetown
University as an assignment with the intent that they should find out as many of
the systematic properties of the system, frequencies of occurrence and
distribution of letters, etc., I was quite surprised when one of the students
presented me with the complete decipherment the following week. […] I do
not consider it a coincidence that the successful decipherer was Japanese. Rather
it seems to me that the early exposure to minute graphical differences in writing
accompanying the acquisition of the Japanese writing system is an excellent
training for every visual task (p. 222, n.2).
Of course, visual attentiveness alone would not have been enough to decipher the
passage, and left unsaid is the assumed ability of this reader (and others) of Japanese to infer
the functional properties of graphemic features in the given passage. This requires the
rational “left-brain” thought processes of analysis and synthesis, and Coulmas implies that
such abilities are nurtured and sustained through character-based orthography.
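The first stage of the decipherment exercise Coulmas set his students – tabulating frequencies of occurrence and the distribution of symbols – can be sketched in a few lines. The cipher string and the function name `grapheme_profile` below are invented for illustration and do not come from Coulmas:

```python
from collections import Counter

def grapheme_profile(text):
    """Tabulate each symbol's raw frequency and relative share in a ciphertext.

    Before any guess about meaning can be made, a decipherer first
    establishes which symbols occur, how often, and in what proportions.
    """
    counts = Counter(ch for ch in text if not ch.isspace())
    total = sum(counts.values())
    # Map each symbol to (count, share of all non-space symbols),
    # ordered from most to least frequent.
    return {sym: (n, n / total) for sym, n in counts.most_common()}

# A toy "ciphertext" built from arbitrary symbols (purely illustrative).
cipher = "##@!# @@!# #@@!!"
for sym, (n, share) in grapheme_profile(cipher).items():
    print(sym, n, round(share, 2))
```

Such a profile says nothing about what the symbols mean; it only supplies the distributional facts from which hypotheses about the system's functional properties can then be formed.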
Alphabetic Determinism
The contrary position of orthographic determinism makes a comparable claim for
alphabets in the shaping of complex thought. Rather than focus on the success of
individuals or even groups of individuals, orthographic determinists of the phonemic
persuasion have tended to see the effects of writing systems in the respective achievements
of the civilizations which employ them; Western science takes a teleological place in their
arguments. According to this view, the cognitive abilities required to analyze phonemic
writing are foundational to the synthetic reasoning skills needed for an internalized
understanding of the scientific process. Without these abilities, true science – based on
inductive reasoning and empirical measurement – cannot exist in any literate community.
The implication, obviously, is that by discarding their character-based orthographies, East
Asia can make that crucial transition into a fully phonemic writing system and thence enact
a genuine scientific revolution, raising living standards and improving the general quality of
life there.
As with those who favor characters, many determinists on record favoring alphabets
are not linguists by training.1 Their views nevertheless lay bare assumptions also held by
their linguist partisans and are important in revealing the kinds of language-referenced
arguments that find resonance with non-specialist populations. Robert K. Logan’s Alphabet
Effect (1986) provides a book-length example. Drawing on the work of cultural
commentator Marshall McLuhan2, famous for his dictum “the medium is the message,”
Logan extends the idea to written language media, arguing that the implicit message of the
alphabetic medium is abstract, reductionist logic while that of the character medium is
concrete, holistic intuition. This “alphabet effect,” he writes,
is a subliminal phenomenon. There is more to using the alphabet than just
learning how to read and write. Using the alphabet, as we shall soon discover,
also entails the ability to: 1) code and decode, 2) convert auditory signals or
sounds into visual signs, 3) think deductively, 4) classify information, and 5) order
words through the process of alphabetization. These are the hidden lessons of
the alphabet that are not contained (or at least not contained to the same degree)
in learning the Chinese writing system. These are also the features of the use of
the phonetic alphabet that give rise to the alphabet effect (p. 21).
The deterministic implications of this alphabet effect are further elucidated:
Not only has the alphabet performed admirably as a tool for literacy, [sic] it has
also served as a model for classification. It has played an instrumental role in
1 For examples of some who are, see Havelock (1976), Ong (1982), de Kerckhove (1986), Goody
(1986), de Kerckhove and Lumsden eds. (1988), and Olson (1994).
2 McLuhan himself (1962) said: “Cultures can rise far above civilization artistically but without the
phonetic alphabet they remain tribal, as do the Chinese and Japanese” (p. 47).
the development of the logical style of analysis that is characteristic of the
Western way of thinking. Learning how to read and write with the alphabet has
brought us more than literacy and a model for classification. It has provided us
with a conceptual framework for analysis and has restructured our perceptions of
reality [emphasis added]. All of these effects take place independent of what we
read. The information that is coded is not important; it is the act of coding itself
that has been so influential and acted as a springboard for new ideas. Other
writing systems exist, but none have provided such fertile ground for abstract
ideas nor such an efficient tool for organizing information (pp. 18-19).
The remainder of the book develops a dichotomized view of European and Chinese
civilizations, the former gaining in material progress due to its rigorous and systematic
deconstruction of phenomena, the latter losing its way amidst mere experiential concepts.
Logan’s thesis is weakened by his misconception of characters as “ideograms” or
pictures-as-words, a misunderstanding shared by most people not familiar with the nature of
Chinese writing. Still, his general position was taken up recently by William C. Hannas
(2003), who is a linguist as well as a specialist in East Asian technology issues. The
Writing on the Wall: How Asian Orthography Curbs Creativity begins with an indictment of
technology transfer methods conducted by East Asian governments and corporations, the
result of these cultures’ inability to develop science indigenously. Labeling this “Asia’s
creativity problem,” he develops a similar deterministic hypothesis based on the effects of
orthography: “If writing encourages abstract thought, then it stands to reason that the more
abstract a writing system is, the more strongly it influences the development of abstract
thinking” (p. 151). The book becomes a manifesto for the Latinization of Chinese, Japanese,
and Korean writing, emphatically stated in its conclusion:
As long as character-based scripts remain the primary channel of literacy for East
Asians, the cognitive effects associated with these systems will weigh on their
users as before. […] How much simpler it would be if East Asians were to
accept what many of their own best thinkers have argued for nearly a century,
namely, that character-based writing, practically speaking, is bankrupt and should
be replaced with an alphabet. There is no reason why the alphabetic notations
now used in China, Japan, and Korea cannot assume a wider role in the
orthographic lives of East Asians. Allowing the two writing systems to vie in
representing the national language would eliminate transitional problems by
substituting user choice for government fiat. Animated by its proximity to
speech, the literary norm would adapt on its own to the needs of phonetic writing,
freeing east Asians from the anachronism of character-based writing and
removing this obstacle to creativity (p. 293).
Though less concerned with contrasting alphabets against Chinese characters than the
preceding works, one other recent book deserves mention because it was a national
bestseller in the United States: The Alphabet vs. the Goddess: The Conflict between Word and
Image by Leonard Shlain (1998). Here again the author is a non-linguist, but adds his
credentials as a medical researcher to the now-familiar hemispheric lateralization account to
explain behavioral and cultural aspects of Western civilization – in his case, negative aspects
such as the rise of religious dogmatism, patriarchy, and misogyny as well as the more
commonly cited positive aspects of science and technology.
Conclusion
In summing up the foregoing brief observations of orthographic determinism,
important rhetorical patterns become evident. Those who bring orthographic determinism to
the script choice debate – regardless of whether they support characters or alphabets – state
their arguments in terms of the script’s ability to foster higher-order thinking (analysis,
synthesis, and evaluation). These enhanced thinking skills, in turn, are credited with
achievement in intellectual endeavors, particularly science and technology.
Simultaneously, proponents on both sides seek to substantiate their positions with
reference to the hemispheric specialization (“left-brain” vs. “right-brain” thinking) of the
respective cognitive skills assumed to be activated by reading a particular script (e.g.,
Hardyck, Tzeng, and Wang 1978). Such evidence continues to be quoted in the most
current works making an orthographic deterministic case, despite the fact that brain
lateralization as a concept has been in decline for over a decade in other cognitive and
behavioral disciplines (Efron 1990).
As we said earlier, deterministic theories in general tend to be unsupported because
they are over-simplistic, attempting to account for too much with too little. Further, when
opposing sides use virtually identical arguments, it casts suspicion on the well-formedness of
the entire research perspective. Orthographic determinism fails on both these counts.
We are left to ask, then: Does orthographic determinism have anything of value to
contribute to the script reform debate in Chinese?
Upon reflection, we find that it does. Scholars arguing one deterministic position or
the other have gone to the trouble of accumulating evidence that different scripts
correlate with measurable differences along various perceptual and cognitive dimensions,
and their work offers the first suggestions that the Chinese script reform debate can and should
be quantified. Quantitative data are the foundation for controlled experimental study,
which is the only valid way in which different scripts can be compared with any degree of
confidence. So even as we turn away from orthographic determinism itself as a productive
avenue for research into Chinese script reform, we must acknowledge that it bears within it
the seeds of a new methodology – quantitative and experimental – that is better than any yet
applied to this important problem.
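The kind of controlled, quantitative comparison advocated here can be illustrated with a minimal sketch. The scores below are invented, and the helper `welch_t` computes a standard two-sample statistic chosen only for illustration; a real study would require matched subject groups, an operationalized literacy task, and a full experimental design:

```python
import statistics as st

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples with unequal variances.

    Illustrates how a hypothesized script effect becomes a testable,
    quantitative claim rather than a rhetorical one.
    """
    mean_a, mean_b = st.mean(sample_a), st.mean(sample_b)
    var_a, var_b = st.variance(sample_a), st.variance(sample_b)  # sample variances
    standard_error = (var_a / len(sample_a) + var_b / len(sample_b)) ** 0.5
    return (mean_a - mean_b) / standard_error

# Hypothetical scores on some reading task for two matched reader groups.
group_characters = [71, 75, 69, 74, 72, 70]
group_alphabet = [68, 73, 66, 70, 69, 67]
print(round(welch_t(group_characters, group_alphabet), 2))
```

The point is not the particular statistic but the discipline it imposes: any claimed cognitive effect of a script must be stated as a measurable difference that an experiment could, in principle, fail to find.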
References
Coulmas, Florian 1989. The Writing Systems of the World. New York: Blackwell.
De Kerckhove, Derrick 1986. Alphabetic literacy and brain processes. Visible
Language 20 (3), 274-293.
De Kerckhove, Derrick and Charles J. Lumsden eds. 1988. The Alphabet and the Brain:
The Lateralization of Writing. Berlin: Springer-Verlag.
Efron, Robert 1990. The Decline and Fall of Hemispheric Specialization. Hillsdale, NJ:
Lawrence Erlbaum Associates.
Flaugher, R. L. 1971. Patterns of Test Performance by High School Students of Four
Ethnic Identities. Princeton, NJ: Educational Testing Service.
Goody, Jack 1986. The Logic of Writing and the Organization of Society. Cambridge:
Cambridge University Press.
Hannas, William C. 2003. The Writing on the Wall: How Asian Orthography Curbs
Creativity. Philadelphia: University of Pennsylvania Press.
Hardyck, C., O. J. L. Tzeng, and W. S.-Y. Wang 1978. Cerebral lateralization of function
and bilingual decision processes: Is thinking lateralized? Brain and Language 5,
56-71.
Havelock, E. A. 1976. Origins of Western Literacy: Four Lectures Delivered at the
Ontario Institute for Studies in Education. Toronto: Ontario Institute for Studies in
Education.
Huang, J.T. and I. M. Liu 1978. Paired-associate learning proficiency as a function of
frequency count, meaningfulness, and imagery value in Chinese two-character ideograms.
Chinese Psychological Journal 20, 5-17.
Logan, Robert K. 1986. The Alphabet Effect. New York: William Morrow and Co.
McLuhan, Marshall 1962. The Gutenberg Galaxy: The Making of Typographical Man.
Toronto: University of Toronto Press.
Olson, David R. 1994. The World on Paper: The Conceptual and Cognitive Implications
of Writing and Reading. Cambridge: Cambridge University Press.
Ong, Walter J. 1982. Orality and Literacy: The Technologizing of the Word. London:
Methuen.
Shlain, Leonard 1998. The Alphabet vs. the Goddess: The Conflict between Word and
Image. New York: Penguin Arkana.
Stevenson, Harold, Chuansheng Chen, and Shin-Ying Lee 1993. Mathematics
achievement of Chinese, Japanese, and American children: Ten years later. Science 259,
53-58.
Whorf, Benjamin Lee 1956. Language, Thought and Reality. Cambridge: M.I.T. Press.
_____________________________________________________________
Ruel A. Macaraeg
Mailing Address: 1525 Chukka Drive #413, Arlington, TX 76012-6556 USA
Department of Linguistics, University of Texas at Arlington
Graduate Student
Phone: (214) 226-3641
Email: ram9413@exchange.uta.edu