Linear grammar
In computer science, a linear grammar is a context-free grammar in which the right-hand side of every production contains at most one nonterminal symbol (variable). Linear grammars are therefore more general than regular grammars, which additionally restrict where that nonterminal may appear.
Two special cases of linear grammars are left-linear grammars (also called left regular grammars) and right-linear grammars (also called right regular grammars). The regular grammars are precisely the grammars that are either left-linear or right-linear.
A language can be generated by a left-linear grammar if and only if it is regular, and likewise a language is regular if and only if it can be generated by a right-linear grammar. However, a grammar that mixes left-linear and right-linear productions may generate a language that is not regular.
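For example, the grammar with productions S → aT, T → Sb and S → ε mixes right-linear and left-linear rules, yet it generates the language { a^n b^n : n ≥ 0 }, which is not regular.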
The equivalence between right-linear (or left-linear) grammars and regular languages can be shown by constructing, from a given right-linear (or left-linear) grammar, a finite automaton that accepts the same language, and conversely by constructing, from a given finite automaton, a grammar of the corresponding form.
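As a rough sketch of the right-linear direction of this construction (assuming, for simplicity, a grammar already in the normal form A → aB, A → a, or A → ε; the Python representation, function names, and the fresh accepting state below are illustrative choices, not a standard API):

```python
from collections import defaultdict

def right_linear_to_nfa(productions, start):
    """Build a nondeterministic finite automaton from a right-linear grammar
    given in the normal form A -> aB, A -> a, or A -> "" (epsilon).

    `productions` maps each nonterminal to a list of (terminal, nonterminal_or_None)
    pairs, e.g. {"S": [("a", "S"), ("b", None)]} for S -> aS | b.
    Returns (transitions, start_state, accepting_states).
    """
    FINAL = object()                 # fresh accepting state, distinct from every nonterminal
    transitions = defaultdict(set)   # (state, terminal) -> set of successor states
    accepting = {FINAL}

    for head, bodies in productions.items():
        for terminal, tail in bodies:
            if terminal == "" and tail is None:
                accepting.add(head)                       # A -> epsilon: state A itself accepts
            elif tail is None:
                transitions[(head, terminal)].add(FINAL)  # A -> a: read a, move to the accepting state
            else:
                transitions[(head, terminal)].add(tail)   # A -> aB: read a, move to state B

    return dict(transitions), start, accepting

def accepts(nfa, word):
    """Simulate the automaton on `word` (the construction above produces no epsilon-moves)."""
    transitions, start, accepting = nfa
    current = {start}
    for symbol in word:
        nxt = set()
        for state in current:
            nxt |= transitions.get((state, symbol), set())
        current = nxt
    return bool(current & accepting)

# Example: the right-linear grammar S -> aS | b generates the regular language a*b.
nfa = right_linear_to_nfa({"S": [("a", "S"), ("b", None)]}, "S")
print(accepts(nfa, "aaab"))  # True
print(accepts(nfa, "ba"))    # False
```

The states of the automaton are the nonterminals plus one fresh accepting state, and each production of the normal form contributes exactly one transition (or marks a state as accepting); the reverse direction reads each automaton transition back as a production of the same shape.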
Left-linear grammars
In left-linear grammars, also known as left regular grammars, all productions have one of the two forms:
A → B t* or A → t*
where A and B are any nonterminal symbols, and t* is any string of terminal symbols. That is, the left-hand side must consist of a single variable, and the right-hand side must consist of an optional single variable followed by any number of terminal symbols.
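For example, the left-linear grammar with productions

S → Ab, A → Aa, A → ε

generates the regular language a*b: repeated use of A → Aa builds up the block of a's, and S → Ab supplies the final b.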
Right-linear grammars
In right-linear grammars, also known as right regular grammars, all productions have one of the two forms:
A → t* B or A → t*
where A and B are any nonterminal symbols, and t* is any string of terminal symbols. That is, the left-hand side must consist of a single variable, and the right-hand side must consist of any number of terminal symbols optionally followed by a single variable.
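For example, the right-linear grammar with productions

S → aS, S → b

generates the same regular language a*b: each use of S → aS emits one a, and S → b ends the derivation with the final b.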