Syntax

In linguistics, syntax is the study of the rules, or "patterned relations", that govern the way the words in a sentence come together. The term originates from the Greek words συν (syn, meaning "together") and ταξις (taxis, meaning "sequence" or "order"). Syntax concerns how different words (which, going back to Dionysios Thrax, are categorized as nouns, adjectives, verbs, etc.) are combined into clauses, which, in turn, are combined into sentences.


In semiotics

In the earliest framework of semiotics, established by C. W. Morris in his 1938 book Foundations of the Theory of Signs, syntax is defined as the first of the three subfields of the study of signs: syntax, the study of the interrelation of signs; semantics, the study of the relation between signs and the objects to which they apply; and pragmatics, the study of the relation between the sign system and its users.

In transformational-generative grammar

In the framework of transformational-generative grammar (of which Government and Binding Theory and Minimalism are recent developments), the structure of a sentence is represented by phrase structure trees, otherwise known as phrase markers or tree diagrams. Such trees provide information about the sentences they represent by showing how, starting from an initial category S (or, for ID/LP grammar, Z), the various syntactic categories (e.g. noun phrase, verb phrase, etc.) are formed.

There are various theories as to how best to construct grammars such that, by systematic application of the rules, one can arrive at every phrase marker in a language (and hence every sentence in the language). The most common are phrase structure grammars and ID/LP grammars, the latter having a slight explanatory advantage over the former. A small example is sketched below.
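
As a rough illustration, the following Python sketch uses the NLTK library (an external dependency assumed here; any chart parser over a context-free grammar would serve) to define a toy phrase structure grammar and derive the phrase marker for a sentence:

    import nltk

    # A toy phrase structure grammar: S expands to NP VP, and so on.
    grammar = nltk.CFG.fromstring("""
        S  -> NP VP
        NP -> Det N
        VP -> V NP
        Det -> 'the'
        N  -> 'dog' | 'cat'
        V  -> 'chased'
    """)

    # The chart parser applies the rules systematically to recover
    # every phrase marker the grammar assigns to the sentence.
    parser = nltk.ChartParser(grammar)
    for tree in parser.parse("the dog chased the cat".split()):
        print(tree)
        # (S (NP (Det the) (N dog)) (VP (V chased) (NP (Det the) (N cat))))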

In other grammars

Dependency grammar is a class of syntactic theories separate from generative grammar in which structure is determined by the relation between a word (a head) and its dependents. One difference from phrase structure grammar is that dependency grammar does not have phrasal categories. Algebraic syntax is a type of dependency grammar.
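
A sketch of the head-dependent relations that dependency grammar is built on can be produced with the spaCy library (an assumption; it also presumes the en_core_web_sm model has been downloaded). Note the absence of phrasal categories: each word is linked directly to another word.

    import spacy

    # Assumes: python -m spacy download en_core_web_sm
    nlp = spacy.load("en_core_web_sm")

    # Each token is attached to a head word via a labeled relation;
    # there are no NP/VP nodes, only word-to-word dependencies.
    for token in nlp("The dog chased the cat"):
        print(f"{token.text:<8} --{token.dep_}--> {token.head.text}")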

Tree-adjoining grammar is a grammar formalism which has been used as the basis for a number of syntactic theories.

See also: Phrase, Phrase structure rules, X-bar syntax, Syntactic categories, Grammar, Algebraic syntax

In computer science

The usage of syntax in computer science has evolved from its related usage in linguistics, especially in the subfield of programming language design. The set of allowed reserved words, their parameters, and the correct word order in an expression is called the syntax of the language. The ubiquitous syntax error results when the computer cannot find a valid interpretation, according to its preprogrammed rules of syntax, for the code it has been asked to run; frequently the cause is something as simple as a typo.
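
A minimal Python sketch shows the idea: compiling a malformed snippet triggers the interpreter's syntax check without running anything.

    # A missing colon after the condition violates Python's grammar.
    bad_source = "if x > 0 print(x)"

    try:
        compile(bad_source, "<example>", "exec")
    except SyntaxError as err:
        print(f"SyntaxError: {err.msg} (line {err.lineno})")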

In computer languages, syntax can be extremely rigid, as in the case of most assembler languages, or less rigid, as in languages that make use of "keyword" parameters that can be stated in any order.
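
Python's keyword arguments give a small illustration of this looser ordering (a sketch; the function and parameter names here are invented for the example):

    def connect(host, port=80, timeout=30):
        print(f"connecting to {host}:{port} (timeout={timeout}s)")

    # Positional arguments must follow the declared order...
    connect("example.org", 8080, 10)

    # ...but keyword parameters may be stated in any order.
    connect(timeout=10, host="example.org", port=8080)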

The analysis of programming language syntax usually entails the transformation of a linear sequence of tokens (a token is akin to an individual word or punctuation mark in a natural language) into a hierarchical syntax tree (abstract syntax trees are one convenient form of syntax tree).
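
Python's standard library exposes both stages, so a short sketch can show the linear token stream and the resulting abstract syntax tree side by side:

    import ast
    import io
    import tokenize

    source = "x = 1 + 2 * 3"

    # Stage 1: a linear sequence of tokens.
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        print(tok.type, repr(tok.string))

    # Stage 2: a hierarchical abstract syntax tree
    # (the indent argument requires Python 3.9+).
    print(ast.dump(ast.parse(source), indent=2))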

This process, called parsing, is in some respects analogous to syntactic analysis in linguistics; certain concepts, such as the Chomsky hierarchy and context-free grammars, are common to the study of syntax in both linguistics and computer science.
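
As a sketch of that connection, the following hand-written recursive-descent parser implements a tiny context-free grammar for additive expressions, building a nested tree out of a flat token list (the grammar and helper names are invented for the example):

    import re

    # Grammar (context-free):
    #   Expr -> Term ('+' Term)*
    #   Term -> NUMBER

    def tokenize(text):
        return re.findall(r"\d+|\+", text)

    def parse_expr(tokens, pos=0):
        node, pos = parse_term(tokens, pos)
        while pos < len(tokens) and tokens[pos] == "+":
            right, pos = parse_term(tokens, pos + 1)
            node = ("+", node, right)  # hierarchy emerges from flat tokens
        return node, pos

    def parse_term(tokens, pos):
        token = tokens[pos]
        if not token.isdigit():
            raise SyntaxError(f"expected a number, got {token!r}")
        return ("num", int(token)), pos + 1

    tree, _ = parse_expr(tokenize("1 + 2 + 3"))
    print(tree)  # ('+', ('+', ('num', 1), ('num', 2)), ('num', 3))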


The contents of this article are licensed from Wikipedia.org under the GNU Free Documentation License.