'''Syntax''', originating from the [[Greek language|Greek]] words συν (''syn'', meaning "co-" or "together") and τάξις (''táxis'', meaning "sequence, order, arrangement"), can in linguistics be described as the study of the rules, or "patterned relations", that govern the way the words in a sentence come together. It concerns how different words (which, going back to [[Dionysios Thrax]], are categorized as [[noun]]s, [[adjective]]s, [[verb]]s, etc.) are combined into [[clause]]s, which, in turn, are combined into sentences.

There exist innumerable theories of ''formal syntax'', theories that have risen or fallen in influence over time. All theories of syntax share at least two features: first, they hierarchically group subunits into constituent units (phrases); second, they provide some system of rules to explain patterns of acceptability (grammaticality) and unacceptability (ungrammaticality). Most formal theories of syntax offer explanations of the systematic relationships between syntactic form and [[semantic]] meaning. One influential framework of [[semiotics]] was established by [[Charles W. Morris]] in his [[1938]] book ''Foundations of the Theory of Signs''. Within that study of signs, syntax is the first of three subfields: the study of the interrelation of signs. The second subfield is [[semantics]] (the study of the relation between signs and the objects to which they apply), and the third is [[pragmatics]] (the study of the relation between the sign system and its users).

In the framework of [[transformational-generative grammar]] (of which ''[[Government and binding|Government and Binding Theory]]'' and ''Minimalism'' are recent developments), the structure of a [[Sentence (linguistics)|sentence]] is represented by ''phrase structure trees'', otherwise known as ''phrase markers'' or ''tree diagrams''. Such trees provide information about the sentences they represent by showing how, starting from an initial category ''S'' (or, for [[ID/LP grammar]], ''Z''), the various [[syntactic categories]] (e.g. [[noun phrase]], [[verb phrase]], etc.) are formed.

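For illustration, the phrase marker of a simple sentence can be written down directly as a nested structure. The short Python sketch below is only an illustrative example (the sentence, the helper function and its output format are chosen here for exposition and are not drawn from any particular theory); it prints the constituent structure with indentation and notes the equivalent labelled-bracket form.
<pre>
# A sketch of a phrase marker for "the dog chased the cat" (an
# invented example sentence), represented as nested (label, children)
# tuples. S, NP, VP, Det, N and V are the conventional category labels.
tree = ("S",
        ("NP", ("Det", "the"), ("N", "dog")),
        ("VP", ("V", "chased"),
               ("NP", ("Det", "the"), ("N", "cat"))))

def show(node, depth=0):
    """Print the tree with indentation mirroring constituent structure."""
    if isinstance(node, tuple):
        label, *children = node
        print("  " * depth + label)
        for child in children:
            show(child, depth + 1)
    else:
        print("  " * depth + node)   # a terminal, i.e. a word

show(tree)
# Equivalent labelled-bracket form:
# [S [NP [Det the] [N dog]] [VP [V chased] [NP [Det the] [N cat]]]]
</pre>
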
There are various theories about how best to design grammars such that, by systematic application of their rules, one can arrive at every phrase marker in a language (and hence every sentence of the language). The most common are [[phrase structure grammar]]s and [[ID/LP grammar]]s, the latter having a slight explanatory advantage over the former.{{citation needed}}

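As a rough sketch of how such rule systems work, the toy phrase structure grammar below is written in Python purely for illustration; the rewrite rules and the tiny lexicon are invented for the example and are not taken from any published grammar. Expanding the initial category ''S'' systematically enumerates the sentences the grammar generates.
<pre>
import itertools

# A toy phrase structure grammar written as rewrite rules (invented for
# illustration).  Starting from the initial category S and systematically
# expanding non-terminals yields every sentence the grammar generates.
rules = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["dog"], ["cat"]],
    "V":   [["chased"], ["saw"]],
}

def expand(symbols):
    """Yield every string of words derivable from a sequence of symbols."""
    if not symbols:
        yield []
        return
    first, rest = symbols[0], symbols[1:]
    if first not in rules:                 # a terminal, i.e. a word
        for tail in expand(rest):
            yield [first] + tail
    else:                                  # a non-terminal: try each rule
        for production in rules[first]:
            for head in expand(production):
                for tail in expand(rest):
                    yield head + tail

for sentence in itertools.islice(expand(["S"]), 5):
    print(" ".join(sentence))
# "the dog chased the dog", "the dog chased the cat", ...
</pre>
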
[[Dependency grammar]] is a class of syntactic theories separate from generative grammar in which structure is determined by the relation between a word (a head) and its dependents. One difference from phrase structure grammar is that dependency grammar does not have phrasal categories. [[Algebraic syntax]] is a type of dependency grammar.

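As a simple illustration, a dependency analysis can be recorded as nothing more than a mapping from each word to its head, with no phrasal nodes at all. The sketch below is a textbook-style example only; the sentence and the particular head assignments are illustrative rather than the analysis of any specific dependency framework.
<pre>
# A sketch of a dependency analysis for "the dog chased the cat"
# (an invented example).  Each word points to its head; the finite verb
# is the root.  Note that there are no phrasal categories at all, only
# word-to-word relations.
words = ["the", "dog", "chased", "the", "cat"]
heads = {        # position of each word's head, None for the root
    0: 1,        # "the"     depends on "dog"    (determiner)
    1: 2,        # "dog"     depends on "chased" (subject)
    2: None,     # "chased"  is the root
    3: 4,        # "the"     depends on "cat"    (determiner)
    4: 2,        # "cat"     depends on "chased" (object)
}

for position, head in heads.items():
    if head is None:
        print(words[position], "is the root")
    else:
        print(words[position], "depends on", words[head])
</pre>
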
A modern approach to combining accurate descriptions of the grammatical patterns of language with their function in context is that of [[systemic functional grammar]], an approach originally developed by Michael A.K. Halliday in the 1960s and now pursued actively on all continents. Systemic functional grammar is related both to feature-based approaches such as [[head-driven phrase structure grammar]] and to the older functional traditions of European schools of linguistics, such as British Contextualism and the Prague School.

[[Tree adjoining grammar]] is a grammar formalism which has been used as the basis for a number of syntactic theories.

==''Syntax'' in computer science==

Another meaning of the term '''syntax''' has evolved in the field of [[computer science]], especially in the subfield of [[programming languages]], where the set of allowed [[reserved word]]s and their parameters, together with the correct ''word order'' in an [[expression]], is called the syntax of the language. This sense of the word can be applied to natural languages as well, which mark grammatical structure through devices such as word order or, in Latin's case, inflectional case endings.

In computer languages, syntax can be extremely rigid, as in the case of most assembly languages, or less rigid, as in languages that make use of "keyword" parameters that can be stated in any order. The structure of expressions can be represented by parse trees. The analysis of [[programming language]] syntax usually entails the transformation of a linear sequence of ''tokens'' (a token is akin to an individual word or punctuation mark in a natural language) into a hierarchical ''syntax tree'' ([[abstract syntax tree|abstract syntax trees]] are one convenient form of syntax tree).

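To make the distinction between the linear token sequence and the hierarchical syntax tree concrete, the sketch below uses Python (chosen here merely as a convenient example language) and its standard ''tokenize'' and ''ast'' modules to show both views of a small arithmetic expression.
<pre>
import ast
import io
import tokenize

source = "1 + 2 * 3"

# The linear sequence of tokens (the "words" of the programming language).
for tok in tokenize.generate_tokens(io.StringIO(source).readline):
    if tok.string.strip():
        print(tokenize.tok_name[tok.type], repr(tok.string))

# The hierarchical abstract syntax tree built from those tokens.
tree = ast.parse(source, mode="eval")
print(ast.dump(tree))
# roughly: Expression(body=BinOp(left=Constant(value=1), op=Add(),
#          right=BinOp(left=Constant(value=2), op=Mult(),
#                      right=Constant(value=3))))
</pre>
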
This process, called ''[[parsing]]'', is in some respects ''analogous to'' syntactic analysis in [[linguistics]]; in fact, certain concepts, such as the [[Chomsky hierarchy]] and [[context-free grammar|context-free grammars]], are common to the study of syntax in both linguistics and computer science.

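As one small illustration of that shared machinery, the sketch below hand-codes a recursive-descent recognizer in Python for the same sort of toy context-free grammar used above (''S'' → ''NP VP'', ''NP'' → ''Det N'', ''VP'' → ''V NP''); the lexicon is again invented, and real parsers in both linguistics and computer science are considerably more elaborate.
<pre>
# A hand-written recursive-descent recognizer for the toy context-free
# grammar  S -> NP VP,  NP -> Det N,  VP -> V NP  over an invented lexicon.
LEXICON = {"Det": {"the", "a"}, "N": {"dog", "cat"}, "V": {"chased", "saw"}}

def parse(tokens):
    pos = 0

    def word(category):
        """Consume one token if it belongs to the given lexical category."""
        nonlocal pos
        if pos == len(tokens) or tokens[pos] not in LEXICON[category]:
            return False
        pos += 1
        return True

    def np():                              # NP -> Det N
        return word("Det") and word("N")

    def vp():                              # VP -> V NP
        return word("V") and np()

    def s():                               # S -> NP VP
        return np() and vp()

    return s() and pos == len(tokens)

print(parse("the dog chased the cat".split()))   # True
print(parse("dog the chased cat the".split()))   # False
</pre>
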
==See also==
*[[Phrase]]
*[[Phrase structure rules]]
*[[x-bar syntax]]
*[[Syntactic categories]]
*[[Grammar]]
*[[Algebraic syntax]]

[[Category:Grammar]]
[[Category:Semiotics]]
[[Category:Syntax|*]]

{{wikipedia}}