Syntax
From Wikipedia, the free encyclopedia
In linguistics, syntax (from Ancient Greek συν- syn-, "together", and τάξις táxis, "arrangement") is the study of the principles and rules for constructing sentences in natural languages. In addition to referring to the discipline, the term syntax is also used to refer directly to the rules and principles that govern the sentence structure of any individual language, as in "the syntax of Modern Irish". Modern research in syntax attempts to describe languages in terms of such rules. Many professionals in this discipline attempt to find general rules that apply to all natural languages. The term syntax is also sometimes used to refer to the rules governing the behavior of mathematical systems, such as logic, artificial formal languages, and computer programming languages.
Early history
Works on grammar were being written long before modern syntax came about; the Aṣṭādhyāyī of Pāṇini is often cited as an example of a pre-modern work that approaches the sophistication of a modern syntactic theory.[1] In the West, the school of thought that came to be known as "traditional grammar" began with the work of Dionysius Thrax.
For centuries, work in syntax was dominated by a framework known as grammaire générale, first expounded in 1660 by Antoine Arnauld in a book of the same title. This system took as its basic premise the assumption that language is a direct reflection of thought processes and therefore there is a single, most natural way to express a thought. That way, coincidentally, was exactly the way it was expressed in French.
The Port-Royal grammar modeled the study of syntax upon that of logic (indeed, large parts of the Port-Royal Logic were copied or adapted from the Grammaire générale[2]). Syntactic categories were identified with logical ones, and all sentences were analyzed in terms of "Subject – Copula – Predicate". Initially, this view was adopted even by the early comparative linguists such as Franz Bopp.
However, in the 19th century, with the development of historical-comparative linguistics, linguists began to realize the sheer diversity of human language and to question fundamental assumptions about the relationship between language and logic. It became apparent that there was no such thing as a most natural way to express a thought, and therefore logic could no longer be relied upon as a basis for studying the structure of language.
The central role of syntax within theoretical linguistics became clear only in the 20th century, which could reasonably be called the "century of syntactic theory" as far as linguistics is concerned. For a detailed and critical survey of the history of syntax in the last two centuries, see the monumental work by Graffi (2001).
Modern theories
There are a number of theoretical approaches to the discipline of syntax. Many linguists (e.g. Noam Chomsky) see syntax as a branch of biology, since they conceive of syntax as the study of linguistic knowledge as embodied in the human mind. Others (e.g. Gerald Gazdar) take a more Platonistic view, regarding syntax as the study of an abstract formal system.[3] Yet others (e.g. Joseph Greenberg) consider grammar a taxonomical device for reaching broad generalizations across languages. Some of the major approaches to the discipline are listed below.
Generative grammar
The hypothesis of generative grammar is that language is a structure of the human mind. The goal of generative grammar is to make a complete model of this inner language (known as i-language). This model could be used to describe all human language and to predict the grammaticality of any given utterance (that is, to predict whether the utterance would sound correct to native speakers of the language). This approach to language was pioneered by Noam Chomsky. Most generative theories (although not all of them) assume that syntax is based upon the constituent structure of sentences. Generative grammars are among the theories that focus primarily on the form of a sentence, rather than its communicative function.
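The idea of predicting grammaticality from explicit rules over constituent structure can be illustrated with a toy sketch; the rules and lexicon below are invented for the example, cover only a tiny fragment of English, and do not represent any particular generative theory:

```python
# Invented toy phrase-structure rules and lexicon (illustrative only).
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"], ["V"]],
}
LEXICON = {
    "the": "Det", "a": "Det",
    "dog": "N", "cat": "N",
    "chased": "V", "slept": "V",
}

def expansions(cat):
    """All terminal strings a category can expand to (finite grammar only)."""
    if cat in LEXICON.values():
        return [[w] for w, c in LEXICON.items() if c == cat]
    out = []
    for rhs in RULES[cat]:
        combos = [[]]
        for sym in rhs:
            combos = [c + seq for c in combos for seq in expansions(sym)]
        out.extend(combos)
    return out

def grammatical(sentence):
    """Predict whether the grammar generates this word string."""
    return sentence.split() in expansions("S")

print(grammatical("the dog chased a cat"))  # True
print(grammatical("dog the chased"))        # False
```

The brute-force enumeration only works because the toy grammar is finite; it is meant to show the logic of "generating" sentences, not a practical parser.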
Among the many generative theories of linguistics are:
- Transformational grammar (TG) (now largely out of date)
- Government and binding theory (GB) (common in the late 1970s and 1980s)
- Minimalism (MP) (the most recent Chomskyan version of generative grammar)
Other theories that find their origin in the generative paradigm are:
- Generative semantics (now largely out of date)
- Relational grammar (RG) (now largely out of date)
- Arc Pair grammar
- Generalized phrase structure grammar (GPSG; now largely out of date)
- Head-driven phrase structure grammar (HPSG)
- Lexical-functional grammar (LFG)
Categorial grammar
Categorial grammar is an approach that attributes the syntactic structure not to rules of grammar, but to the properties of the syntactic categories themselves. For example, rather than asserting that sentences are constructed by a rule that combines a noun phrase (NP) and a verb phrase (VP) (e.g. the phrase structure rule S → NP VP), in categorial grammar such principles are embedded in the category of the head word itself. Thus the syntactic category for an intransitive verb is a complex formula representing the fact that the verb acts as a functor which requires an NP as an input and produces a sentence-level structure as an output. This complex category is notated as (NP\S) instead of V. NP\S is read as "a category that searches to the left (indicated by \) for an NP (the element on the left) and outputs a sentence (the element on the right)". The category of a transitive verb is defined as an element that requires two NPs (its subject and its direct object) to form a sentence. This is notated as ((NP\S)/NP), which means "a category that searches to the right (indicated by /) for an NP (the object) and generates a function (equivalent to the VP) which is (NP\S), which in turn represents a function that searches to the left for an NP and produces a sentence".
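The combination scheme just described can be sketched in code. The lexicon entries and the greedy reduction loop below are illustrative simplifications invented for this example, not part of any published categorial grammar implementation:

```python
# Categories, following the notation in the text:
#   atomic categories are strings ("NP", "S");
#   ("\\", arg, res) seeks arg to its LEFT and yields res   (arg\res)
#   ("/", res, arg)  seeks arg to its RIGHT and yields res  (res/arg)
LEXICON = {
    "John":   "NP",
    "Mary":   "NP",
    "sleeps": ("\\", "NP", "S"),               # NP\S  (intransitive)
    "sees":   ("/", ("\\", "NP", "S"), "NP"),  # (NP\S)/NP  (transitive)
}

def combine(left, right):
    """Try forward then backward application; None if neither applies."""
    if isinstance(left, tuple) and left[0] == "/" and left[2] == right:
        return left[1]      # X/Y followed by Y yields X
    if isinstance(right, tuple) and right[0] == "\\" and right[1] == left:
        return right[2]     # Y followed by Y\X yields X
    return None

def parse(words):
    """Greedily combine adjacent categories (enough for this toy lexicon)."""
    cats = [LEXICON[w] for w in words]
    changed = True
    while changed and len(cats) > 1:
        changed = False
        for i in range(len(cats) - 1):
            c = combine(cats[i], cats[i + 1])
            if c is not None:
                cats[i:i + 2] = [c]
                changed = True
                break
    return cats

print(parse(["John", "sleeps"]))        # ['S']
print(parse(["John", "sees", "Mary"]))  # ['S']
```

A derivation reduces to the single category S exactly when the word string forms a sentence under these category assignments; "sees" first consumes its object NP to its right, producing NP\S, which then consumes the subject NP to its left.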
Tree-adjoining grammar is a categorial grammar that adds partial tree structures to the categories.
Dependency grammar
Dependency grammar is a different type of approach in which structure is determined by the relations (such as grammatical relations) between a word (a head) and its dependents, rather than being based on constituent structure. For example, syntactic structure is described in terms of whether a particular noun is the subject or agent of the verb, rather than describing the relations in terms of trees (one version of which is the parse tree) or some other structural system.
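The head–dependent idea can be sketched as a small data structure; the analysis of the example sentence below is an invented toy annotation, not drawn from any particular dependency theory:

```python
# Toy dependency analysis of "the dog chased a cat" (invented example).
sentence = ["the", "dog", "chased", "a", "cat"]

# (dependent_index, head_index, relation); the root has no head.
dependencies = [
    (0, 1, "det"),      # "the" depends on "dog"
    (1, 2, "subj"),     # "dog" is the subject of "chased"
    (2, None, "root"),  # "chased" is the root of the sentence
    (3, 4, "det"),      # "a" depends on "cat"
    (4, 2, "obj"),      # "cat" is the object of "chased"
]

def dependents_of(head_idx):
    """All words directly depending on the given head, with their relations."""
    return [(sentence[d], rel) for d, h, rel in dependencies if h == head_idx]

print(dependents_of(2))  # [('dog', 'subj'), ('cat', 'obj')]
```

Note that the whole structure is a set of labeled word-to-word links; no phrasal constituents such as NP or VP appear anywhere in the representation.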
A number of dependency-based theories of syntax have been developed.
Stochastic/probabilistic grammars/network theories
Theoretical approaches to syntax that are based upon probability theory are known as stochastic grammars. One common implementation of such an approach makes use of neural networks or connectionism.
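The stochastic idea can be illustrated with a toy probabilistic context-free grammar (PCFG), one common kind of stochastic grammar: each rule carries a probability, and a derivation's probability is the product of the probabilities of the rules it uses. The rules and probabilities below are invented, and lexical probabilities are omitted for brevity:

```python
# Invented PCFG rule probabilities; alternatives for each left-hand
# side sum to 1.0.
RULE_PROB = {
    ("S",  ("NP", "VP")): 1.0,
    ("NP", ("Det", "N")): 0.7,
    ("NP", ("Name",)):    0.3,
    ("VP", ("V", "NP")):  0.6,
    ("VP", ("V",)):       0.4,
}

def derivation_prob(rules_used):
    """Probability of a derivation = product of its rule probabilities."""
    p = 1.0
    for rule in rules_used:
        p *= RULE_PROB[rule]
    return p

# Derivation for a sentence like "Mary chased the dog":
deriv = [
    ("S", ("NP", "VP")),
    ("NP", ("Name",)),
    ("VP", ("V", "NP")),
    ("NP", ("Det", "N")),
]
print(round(derivation_prob(deriv), 4))  # 0.126
```

Under such a model, "grammaticality" becomes gradient rather than binary: a parser can rank competing analyses of an ambiguous sentence by derivation probability instead of merely accepting or rejecting them.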
Functionalist grammars
Functionalist theories, although attentive to form, are driven by explanations based on the function of a sentence (i.e. its communicative function). Some typical functionalist theories include:
- Functional grammar (Dik)
- Prague Linguistic Circle
- Systemic functional grammar
- Cognitive grammar
- Construction grammar (CxG)
- Role and reference grammar (RRG)
See also
Syntactic terms
- Adjective
- Adjunct
- Adverb
- Antecedent-contained deletion
- Appositive
- Article
- Aspect
- Auxiliary verb
- Case
- Clause
- Closed class word
- Comparative
- Complement
- Compound noun and adjective
- Conjugation
- Conjunction
- Dangling modifier
- Declension
- Determiner
- Differential object marking
- Dual (form for two)
- Expletive
- Function word
- Gender
- Gerund
- Infinitive
- Measure word (classifier)
- Modal particle
- Modifier
- Mood
- Movement paradox
- Noun
- Number
- Object
- Open class word
- Parasitic gap
- Part of speech
- Particle
- Person
- Personal pronoun
- Phrasal verb
- Phrase
- Plural
- Predicate (also verb phrase)
- Predicative (adjectival or nominal)
- Preposition
- Pronoun
- Restrictiveness
- Sandhi
- Sentence (linguistics)
- Singular
- Subject
- Superlative
- Tense
- Uninflected word
- Verb
- Voice
- Wh-movement
- Word order
Notes
- ^ Fortson IV, Benjamin W. (2004). Indo-European Language and Culture: An Introduction. Blackwell, 186. ISBN 1-4051-0315-9 (hb); 1-4051-0316-7 (pb). “[The Aṣṭādhyāyī] is a highly precise and thorough description of the structure of Sanskrit somewhat resembling modern generative grammar…[it] remained the most advanced linguistic analysis of any kind until the twentieth century.”
- ^ Arnauld, Antoine (1683). La logique, 5th ed., Paris: G. Desprez, 137. “Nous avons emprunté…ce que nous avons dit…d'un petit Livre…sous le titre de Grammaire générale.”
- ^ Ted Briscoe, 2 May 2001, Interview with Gerald Gazdar. Retrieved 2008-06-04.
References
- Brown, Keith; Jim Miller (eds.) (1996). Concise Encyclopedia of Syntactic Theories. New York: Elsevier Science. ISBN 0-08-042711-1.
- Carnie, Andrew (2006). Syntax: A Generative Introduction. Oxford: Wiley-Blackwell. ISBN 1405133848.
- Freidin, Robert; Howard Lasnik (eds.) (2006). Syntax, Critical Concepts in Linguistics. New York: Routledge. ISBN 0-415-24672-5.
- Graffi, Giorgio (2001). 200 Years of Syntax. A Critical Survey, Studies in the History of the Language Sciences 98. Amsterdam: Benjamins. ISBN 90-272-4587-8.
External links
- The syntax of natural language (Beatrice Santorini & Anthony Kroch, University of Pennsylvania)
- Various syntactic constructs used in computer programming languages
- The journal Syntax