Zellig Harris
Zellig Sabbetai Harris (October 23, 1909 - May 22, 1992) was an American linguist, mathematical syntactician, and methodologist of science. Originally a Semiticist, he is best known for his work in structural linguistics and discourse analysis and for the discovery of transformational structure in language, all achieved in the first 10 years of his career and published within the first 25. His contributions in the subsequent 35 years, including sublanguage grammar, operator grammar, and a theory of linguistic information, are perhaps even more remarkable.
Early Career
Harris was born in Balta, now in Odessa Oblast, Ukraine, and in 1913, at the age of four, came with his family to Philadelphia, Pennsylvania. A student in the Oriental Studies department, he received his bachelor's (1930), master's (1932), and doctoral (1934) degrees from the University of Pennsylvania. He began teaching at Penn in 1931 and went on to found the linguistics department there in 1946, the first such department in the country.
Relation to "Bloomfieldian" Structuralism
Harris's early publications brought him to the attention of Edward Sapir, who strongly influenced him and who came to regard him as his intellectual heir. Harris also greatly admired Leonard Bloomfield for his work and as a person. He did not formally study with either.
It is widely believed that Harris carried Bloomfieldian ideas of linguistic description to their extreme development: the investigation of discovery procedures for phonemes and morphemes, based on the distributional properties of these units. His Methods in Structural Linguistics (1951) is the definitive formulation of descriptive structural work as he had developed it up to about 1946. This book made him famous, but is sometimes misinterpreted, from a generativist point of view, as a synthesis of a "neo-Bloomfieldian school" of structuralism. In the late 1940s and the 1950s, he was viewed by his colleagues as a person exploring the consequences of pushing methodological principles right to the edge. He viewed his work as articulating methods for verifying that results, however reached, are validly derived from the data. This was in line with virtually all serious views of science at the time; Harris's methods corresponded to what Hans Reichenbach called "the context of justification," not to what he called "the context of discovery." He had no sympathy for the view that, to be scientific, a linguistic analyst must progress stepwise from phonetics, to phonemics, to morphology, and so on, without "mixing levels."
Fundamental to this is his recognition that phonemic contrast cannot be derived from distributional analysis of phonetic notations; rather, the fundamental data of linguistics are speaker judgments of phonemic contrast. He formulated a method of controlled experiment, called the pair test, in which informants distinguish repetition from contrast (Methods p. 32). It is probably accurate to say that phonetic data are regarded as fundamental in all other approaches to linguistics. For example, Chomsky (1964:78) "assume[s] that each utterance of any language can be uniquely represented as a sequence of phones, each of which can be regarded as an abbreviation for a set of features". Recognizing the primacy of speaker perceptions of contrast gave Harris's linguistic analyses a flexibility and creativity that others, lacking that foundation, labelled "game playing" and "hocus-pocus".
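The logic of the pair test can be sketched as follows. The trial format, the threshold, and the example data are illustrative assumptions, not details taken from Methods.

    # A minimal tally of pair-test judgments (illustrative only). Each trial
    # presents two spoken tokens: either a physical repetition of one utterance
    # or the two candidate utterances being tested. The informant answers
    # "same" or "different"; the 0.9 threshold is an invented convenience.
    from collections import Counter

    def pair_test(trials):
        """trials: list of (kind, judgment), kind in {"repetition", "candidate"},
        judgment in {"same", "different"}."""
        counts = Counter(trials)
        rep_total = counts[("repetition", "same")] + counts[("repetition", "different")]
        cand_total = counts[("candidate", "same")] + counts[("candidate", "different")]
        rep_same = counts[("repetition", "same")] / rep_total
        cand_diff = counts[("candidate", "different")] / cand_total
        # The pair contrasts phonemically for this speaker if repetitions are
        # reliably judged "same" and the candidate pair reliably "different".
        return rep_same >= 0.9 and cand_diff >= 0.9

    # Example: testing whether "bit" and "pit" contrast for one informant.
    trials = ([("repetition", "same")] * 9 + [("repetition", "different")]
              + [("candidate", "different")] * 10)
    print(pair_test(trials))   # True -> the two utterances contrast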
Major Contributions in the 1940s
Signal contributions summarized in Methods in Structural Linguistics include discontinuous morphemes, componential analysis of morphology and of long components in phonology, a substitution-grammar of phrase expansions that is related to immediate-constituent analysis, and above all a detailed specification of validation criteria for linguistic analysis. These criteria lend themselves to differing forms of presentation which have sometimes been taken as competing, but for Harris they are complementary, analogously to intersecting parameters in optimality theory. Consequently, Harris's way of working toward an optimal presentation for this purpose or that was often taken to be "hocus-pocus" with no expectation that there was any truth to the matter. The book includes the first formulation of generative grammar.
Among his most illuminating works in the 1940s are restatements of analyses by other linguists, done with the intention of bringing out the invariant properties of the phenomena. This anticipates later work on linguistic universals. His central methodological concern beginning with his earliest publications was to avoid obscuring the essential characteristics of language behind unacknowledged presuppositions, such as are inherent in conventions of notation or presentation.
Metalanguage and Notational Systems
Much later, he clarified the basis of this concern, observing that such hidden presuppositions are dependent upon prior knowledge of and use of language. Since the object of investigation is language itself, properties of language cannot be presupposed without question-begging. Natural language demonstrably contains its own metalanguage, in which we talk about language itself. Natural language cannot be based in a metalanguage external to that intrinsic metalanguage, and any dependence on a priori metalinguistic notions obscures an understanding of the true character of language. Notations for presentation of grammatical information must therefore be kept minimally complex and maximally transparent.
Linguistics as Applied Mathematics
Deriving from this insight, his aim was to constitute linguistics as a product of mathematical analysis of the data of language, an endeavor which he explicitly contrasted with attempts by others to treat language structure as a projection of language-like systems of mathematics or logic.
Linguistic Transformations and Linguistic Information
As early as 1939 he began teaching his students about linguistic transformations and the regularizing of texts in discourse analysis. This aspect of his extensive work in languages as diverse as Kota, Hidatsa, Cherokee, Modern Hebrew, and English did not begin to see publication until his "Culture and Style" and "Discourse Analysis" papers in 1952. Then, in a series of papers beginning with "Co-occurrence and Transformation in Linguistic Structure" (1957), he put formal syntax on an entirely new, generative basis.
Harris argued, following Sapir and Bloomfield, that semantics is included in grammar, not separate from it; form and information are two faces of the same coin. (In connection with the comment on presuppositions and metalanguage above, note that any specification of semantics other than that given in language can only be stated in a metalanguage external to language.) But grammar as so far developed could not yet treat individual word combinations, only word classes. A sequence or n-tuple of word classes (plus invariant morphemes, termed constants) specifies a subset of sentences that are formally alike. Harris investigated mappings from one such subset to another in the set of sentences. In linear algebra, a mapping that preserves linear combinations is called a transformation, and that is the term Harris introduced into linguistics.
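Schematically, and only as a gloss on the analogy rather than Harris's own notation:

    % In linear algebra, a transformation is a mapping that preserves
    % linear combinations:
    T(a\,x + b\,y) \;=\; a\,T(x) + b\,T(y)
    % Analogously, a linguistic transformation maps the subset of sentences
    % satisfying one sentence-form onto the subset satisfying another,
    % preserving an invariant over the satisfying word choices (the
    % co-occurrence and acceptability criteria discussed below):
    \varphi : \{\text{sentences satisfying form}_1\} \longrightarrow \{\text{sentences satisfying form}_2\}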
Noam Chomsky was Harris's student, beginning as an undergraduate in 1946. Because of this, some linguists and historians have questioned whether Chomsky's transformational grammar is as revolutionary as it is usually considered. The two scholars developed their concepts of transformation on different premises. Early on, Chomsky adapted Post production systems as a formalism for generating language-like symbol systems and used this as a notation for the presentation of immediate-constituent analysis. He called this phrase structure grammar. He then adapted it for the presentation of Harris's transformations, restated as operations mapping one phrase-structure tree to another. This led later to his redefinition of transformations as operations mapping an abstract deep structure into a surface structure.
Harris determined early that paraphrase is inadequate as a criterion for linguistic analysis. In the 1957 "Co-occurrence and Transformation" paper, his criterion for a transformational relationship between two sentence-forms was that inter-word co-occurrence restrictions should be preserved under the mapping: if two sentence-forms are transforms, then acceptable word choices for one also obtain for the other. Even while the 1957 publication was in press it was clear that preservation of word co-occurrence could not resolve certain problems, and in the 1965 "Transformational Theory" follow-up the criterion for transformation was the preservation of the relative acceptability of the satisfiers of each sentence-form so paired.
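These criteria can be rendered in a toy form as follows. The active/passive pair is used only as a familiar illustration; the sentence-forms, word choices, and acceptability scores are invented, not drawn from Harris's data.

    # Toy check of the 1965 criterion: two sentence-forms count as
    # transformationally related if they share satisfiers and the relative
    # acceptability of those satisfiers is preserved. All data are invented.
    ACTIVE  = "{N1} {V} {N2}"           # e.g. "the cat chased the mouse"
    PASSIVE = "{N2} was {V} by {N1}"    # e.g. "the mouse was chased by the cat"

    accept_active = {
        ("the cat",     "chased", "the mouse"): 1.0,
        ("the mouse",   "chased", "the cat"):   0.8,
        ("the theorem", "chased", "the cat"):   0.1,
    }
    accept_passive = {
        ("the cat",     "chased", "the mouse"): 0.9,
        ("the mouse",   "chased", "the cat"):   0.7,
        ("the theorem", "chased", "the cat"):   0.1,
    }

    def ranking(scores):
        """Order word-choice triples from most to least acceptable."""
        return sorted(scores, key=scores.get, reverse=True)

    def transformationally_related(a, b):
        """Same satisfiers, same relative acceptability ordering (schematic)."""
        return set(a) == set(b) and ranking(a) == ranking(b)

    print(transformationally_related(accept_active, accept_passive))  # True
    print(PASSIVE.format(N1="the cat", V="chased", N2="the mouse"))
    # -> "the mouse was chased by the cat"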
Harris's transformational analysis refined the word classes found in the 1946 "From Morpheme to Utterance" grammar of expansions. By recursively defining semantically more and more specific subclasses according to the combinatorial privileges of words, one may progressively approximate a grammar of individual word combinations. This progression became more direct and straightforward in the grammar of substring combinability that resulted from string analysis. Interword dependencies suffice to determine mappings in the set of sentences, with no need for hierarchies of abstract structures such as phrase structure grammar demands.
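A one-step caricature of this refinement, under the assumption that subclasses can be read off from the frames in which words are attested; the vocabulary and frames are invented.

    # Group words into subclasses by their combinatorial privileges: words
    # attested in exactly the same set of frames fall together. Real analysis
    # refines such subclasses recursively; this shows only a single pass.
    from collections import defaultdict

    attested = [                      # (word, frame) pairs; "_" marks the slot
        ("water", "the _ flowed"), ("water", "drank the _"),
        ("wine",  "the _ flowed"), ("wine",  "drank the _"),
        ("idea",  "the _ spread"), ("idea",  "grasped the _"),
        ("rumor", "the _ spread"), ("rumor", "grasped the _"),
    ]

    def subclasses(pairs):
        frames_of = defaultdict(set)
        for word, frame in pairs:
            frames_of[word].add(frame)
        groups = defaultdict(list)
        for word, frames in frames_of.items():
            groups[frozenset(frames)].append(word)
        return [sorted(g) for g in groups.values()]

    print(subclasses(attested))
    # -> [['water', 'wine'], ['idea', 'rumor']]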
Operator Grammar
Work on the set of transformations, factoring them into elementary sentence-differences as transitions in a derivational sequence, led to a partition of the set of sentences into two sublanguages: an informationally complete sublanguage with neither ambiguity nor paraphrase, vs. the set of its more conventional and usable paraphrases ("The Two Systems of Grammar: Report and Paraphrase" 1969). Morphemes in the latter may be present in reduced form, even reduced to zero; their fully explicit forms are recoverable by undoing deformations and reductions of phonemic shape that he termed "extended morphophonemics". Thence, in parallel with the generalization of linear algebra to operator theory, came Operator Grammar. Here at last is a grammar of the entry of individual words into the construction of a sentence. When the entry of an operator word on its argument word or words brings about the string conditions that a reduction requires, it may be carried out; most reductions are optional. Operator Grammar resembles predicate calculus, and has affinities with Categorial Grammar, but these are findings after the fact which did not guide its development or the research that led to it. Recent work by Stephen Johnson on formalization of operator grammar adapts the "lexicon grammar" of Maurice Gross for the complex detail of the reductions.
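The mechanism of operator entry and optional reduction can be caricatured as follows. The conjunction example and the zeroing of a repeated subject are chosen only as one familiar kind of reduction; the representation is not Harris's.

    # Schematic operator entry and optional reduction (illustrative only).
    def conjoin(s1, s2):
        """The operator 'and' entering on its two sentential arguments."""
        return f"{s1} and {s2}"

    def zero_repeated_subject(sentence):
        """Optional reduction: zero the repeated subject of the second
        conjunct, applicable only when the string condition holds, namely
        that both conjuncts begin with the same word."""
        left, _, right = sentence.partition(" and ")
        lw, rw = left.split(), right.split()
        if lw and rw and lw[0] == rw[0]:
            return " ".join(lw + ["and"] + rw[1:])
        return sentence          # condition not met: no reduction applies

    full = conjoin("sheep eat grass", "sheep sleep")
    print(full)                          # sheep eat grass and sheep sleep
    print(zero_repeated_subject(full))   # sheep eat grass and sleep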
Sublanguage and Linguistic Information
In his work on sublanguage analysis, Harris showed how the sublanguage for a restricted domain can have a pre-existent external metalanguage, expressed in sentences in the language but outside the sublanguage, something that is not available to language as a whole. In the language as a whole, restrictions on operator-argument combinability can only be specified in terms of relative acceptability, and it is difficult to rule out any satisfier of an attested sentence-form as nonsense. In technical domains, however, especially in sublanguages of science, metalanguage definitions of terms and relations restrict word combinability, and the correlation of form with meaning becomes quite sharp. The test and exemplification of this in The Form of Information in Science (1989) vindicates the Sapir-Whorf hypothesis to some degree. It also expresses Harris's lifelong interest in the further evolution or refinement of language in the context of problems of social amelioration (e.g., "A Language for International Cooperation" [1962], "Scientific Sublanguages and the Prospects for a Global Language of Science" [1988]), and in possible future developments of language beyond its present capacities.
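A toy rendering of the idea: in a restricted domain the word subclasses and their permitted combinations can be stated outright, so each sentence maps to a formula over subclasses. The vocabulary, classes, and permitted formula below are invented; the actual analyses in The Form of Information in Science concern an immunology corpus.

    # Map sentences of a toy "sublanguage" to word-class formulas and check
    # them against the combinations the sublanguage grammar permits.
    word_class = {
        "antibody": "A", "antibodies": "A", "antigen": "G", "antigens": "G",
        "binds": "V", "bind": "V", "neutralizes": "V", "neutralize": "V",
        "the": None, "an": None,     # grammatical words carry no class symbol
    }
    allowed_formulas = {("A", "V", "G")}   # invented for this toy grammar

    def formula(sentence):
        """Reduce a sentence to its sequence of word-class symbols."""
        classes = [word_class.get(w) for w in sentence.lower().split()]
        return tuple(c for c in classes if c is not None)

    def well_formed(sentence):
        return formula(sentence) in allowed_formulas

    print(formula("The antibody binds the antigen"))       # ('A', 'V', 'G')
    print(well_formed("The antibody binds the antigen"))   # True
    print(well_formed("The antigen binds the antibody"))   # False in this toy grammar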
Harris's linguistic work culminated in the companion books A Grammar of English on Mathematical Principles (1982) and A Theory of Language and Information (1991). Mathematical information theory concerns only quantity of information; here for the first time is a theory of information content. In the latter work, also, Harris ventured to propose at last what might be the "truth of the matter" in the nature of language, what is required to learn it, its origin, and its possible future development. His discoveries vindicate Sapir's recognition, long disregarded, that language is pre-eminently a social artifact.
Legacy
Harris's enduring stature derives from the remarkable unity of purpose that characterizes his oeuvre. His rigor and originality, as well as the richness of his scientific culture, allowed him to take linguistics to ever new stages of generality, often ahead of his time. He was always interested in the social usefulness of his work, and applications of it abound, ranging from medical informatics and translation systems to speech recognition and the automatic generation of text from data, as heard, for example, on automated weather radio broadcasts. Many workers continue to extend lines of research that he opened. His students in linguistics include, among many others, Joseph Applegate, Ernest Bender, Noam Chomsky, William Evan, Lila Gleitman, Michael Gottfried, Maurice Gross, James Higginbotham, Stephen B. Johnson, Aravind Joshi, Michael Kac, Edward Keenan, Richard Kittredge, Leigh Lisker, Fred Lukoff, Paul Mattick, James Munz, Bruce E. Nevin, Jean-Pierre Paillet, John R. Ross, Naomi Sager, Morris Salkoff, Thomas Ryckman, and William C. Watt.
He was also influential with many students and colleagues, though in a less public way, in work on the amelioration of social and political arrangements. His last book, The Transformation of Capitalist Society, which summarizes his findings, was published posthumously.
Works
A complete bibliography of Harris's writings is available. A selection of Harris's works follows:
- 1936. A Grammar of the Phoenician Language. Ph.D. dissertation. American Oriental Series, 8.
- 1939. Development of the Canaanite Dialects: An Investigation in Linguistic History. American Oriental Series, 16.
- 1946. "From Morpheme to Utterance". Language 22:3.161-183.
- 1951. Methods in Structural Linguistics
- 1962. String Analysis of Sentence Structure
- 1968. Mathematical Structures of Language
- 1970. Papers in Structural and Transformational Linguistics
- 1976. Notes du Cours de Syntaxe (in French)
- 1981. Papers on Syntax
- 1982. A Grammar of English on Mathematical Principles
- 1988. Language and Information (ISBN 0-231-06662-7)
- 1989. The Form of Information in Science: Analysis of an immunology sublanguage (ISBN 90-277-2516-0)
- 1991. A Theory of Language and Information: A Mathematical Approach (ISBN 0-19-824224-7)
- 1997. The Transformation of Capitalist Society (ISBN 0-8476-8412-1)
- 2002. "The background of transformational and metalanguage analysis." Introduction to The Legacy of Zellig Harris: Language and Information into the 21st Century: Vol. 1: Philosophy of science, syntax, and semantics, John Benjamins Publishing Company (CILT 228).
External links
- Zellig Harris Home Page
- Noam Chomsky: A Life of Dissent: Zellig Harris, Avukah, and Hashomer Hatzair
- Penn's Department of Asian & Middle Eastern Studies, successor to the Oriental Studies department
- Penn's Department of Linguistics (the first in the U.S.)
- A review of The Transformation of Capitalist Society