Left recursion
In computer science, left recursion is a special case of recursion in which a grammar nonterminal refers to itself, directly or indirectly, through the leftmost symbol of one of its productions.
A formal grammar that contains left recursion cannot be parsed by a naive recursive descent parser unless the left recursion is first eliminated. In contrast, left recursion is preferred for LALR parsers because it results in lower stack usage than right recursion.
Definition
"A grammar is left-recursive if we can find some non-terminal A which will eventually derive a sentential form with itself as the left-symbol."[http://www.cs.may.ie/~jpower/Courses/parsing/parsing.pdf#search='indirect%20left%20recursion' JPR02
Immediate left recursion
Immediate left recursion occurs in rules of the form
A → Aα | β
where α and β are sequences of nonterminals and terminals, and β does not start with A.
Example: The rule
Expr → Expr + Term
is immediately left-recursive. The recursive descent parser for this rule might look like:
function Expr() {
    Expr(); match('+'); Term();
}
and would fall into infinite recursion, since Expr() calls itself before consuming any input.
Indirect left recursion
Indirect left recursion in its simplest form could be defined as:
A → Bα | C
B → Aβ | D
possibly giving the derivation A ⇒ Bα ⇒ Aβα ⇒ ...
More generally, for the nonterminals A0, A1, ..., An, indirect left recursion can be defined as being of the form:
A0 → A1α1 | ...
A1 → A2α2 | ...
...
An → A0αn+1 | ...
where α1, α2, ..., αn+1 are sequences of nonterminals and terminals.
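For example (an illustrative grammar, not from the original article), in
A → B a | a
B → A b | b
the nonterminal A is indirectly left-recursive: A ⇒ B a ⇒ A b a.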
Removing left recursion
Removing immediate left recursion
The general algorithm to remove immediate left recursion follows. Several improvements to this method have been made, including the ones described in this paper.
For each rule of the form
A → Aα | β
where:
- A is a left-recursive nonterminal
- α is a sequence of nonterminals and terminals that is not null (α ≠ ε)
- β is a sequence of nonterminals and terminals that does not start with A.
replace the A-production by the production:
A → βA'
and create a new nonterminal:
A' → αA' | ε
This newly created symbol is often called the "tail", or the "rest".
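To make the transformation concrete, here is a minimal Python sketch (an illustration, not from the article); the grammar representation and the helper name remove_immediate_left_recursion are assumptions chosen for the example:

# A minimal sketch (hypothetical representation): a grammar maps each
# nonterminal to a list of productions, and a production is a list of
# grammar symbols; the empty list [] stands for epsilon.
def remove_immediate_left_recursion(grammar, a):
    """Rewrite A -> A alpha | beta  into  A -> beta A'  and  A' -> alpha A' | epsilon."""
    recursive = []       # the alpha parts of productions A -> A alpha
    non_recursive = []   # the beta productions, which do not start with A
    for production in grammar[a]:
        if production and production[0] == a:
            recursive.append(production[1:])    # alpha
        else:
            non_recursive.append(production)    # beta
    if not recursive:
        return            # A is not immediately left-recursive
    tail = a + "'"        # the new "tail" nonterminal A'
    grammar[a] = [beta + [tail] for beta in non_recursive]           # A  -> beta A'
    grammar[tail] = [alpha + [tail] for alpha in recursive] + [[]]   # A' -> alpha A' | epsilon

# Example: Expr -> Expr + Term | Term
grammar = {"Expr": [["Expr", "+", "Term"], ["Term"]]}
remove_immediate_left_recursion(grammar, "Expr")
# grammar is now:  Expr -> Term Expr'   and   Expr' -> + Term Expr' | epsilon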
Removing indirect left recursion
If the grammar has no ε-productions (no productions of the form A → ε) and is not cyclic (no derivations of the form A ⇒ ... ⇒ A for any nonterminal A), this general algorithm may be applied to remove indirect left recursion:
Arrange the nonterminals in some (any) fixed order A1, ..., An.
for i = 1 to n {
    for j = 1 to i – 1 {
        let the current Aj productions be
            Aj → δ1 | δ2 | ... | δk
        replace each production of the form Ai → Ajγ by
            Ai → δ1γ | δ2γ | ... | δkγ
    }
    remove direct left recursion for Ai
}
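As an illustration (again a sketch, not part of the article), this procedure can be written in Python on top of the remove_immediate_left_recursion helper from the previous sketch, using the same assumed grammar representation:

def remove_left_recursion(grammar, nonterminals):
    """Eliminate all left recursion, assuming no epsilon-productions and no cycles.

    nonterminals fixes the order A1, ..., An; grammar maps each nonterminal
    to a list of productions (lists of symbols).
    Uses remove_immediate_left_recursion from the previous sketch."""
    for i, a_i in enumerate(nonterminals):
        for j in range(i):
            a_j = nonterminals[j]
            new_productions = []
            for production in grammar[a_i]:
                if production and production[0] == a_j:
                    gamma = production[1:]
                    # Replace Ai -> Aj gamma by Ai -> delta gamma for
                    # every current Aj production Aj -> delta.
                    for delta in grammar[a_j]:
                        new_productions.append(delta + gamma)
                else:
                    new_productions.append(production)
            grammar[a_i] = new_productions
        # Finally, remove any direct left recursion on Ai.
        remove_immediate_left_recursion(grammar, a_i)

# Example of indirect left recursion:  A -> B a | a,   B -> A b | b
grammar = {"A": [["B", "a"], ["a"]], "B": [["A", "b"], ["b"]]}
remove_left_recursion(grammar, ["A", "B"])
# Result:  A -> B a | a,   B -> a b B' | b B',   B' -> a b B' | epsilon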
Pitfalls
The above transformations remove left recursion by creating a right-recursive grammar, but this changes the associativity of our rules: left recursion yields left associativity; right recursion yields right associativity. Example: We start out with the grammar:
Expr → Expr + Term | Term
Term → Factor
Factor → Int
After having applied the standard transformation to remove left recursion, we have the following grammar:
Expr → Term Expr'
Expr' → + Term Expr' | ε
Term → Factor
Factor → Int
Parsing the string 'a + a + a' with the first grammar in an LALR parser (which can recognize left-recursive grammars) would have resulted in the parse tree:
                Expr
              /  |   \
          Expr   +   Term
        /  |  \        |
    Expr   +  Term   Factor
     |         |        |
    Term     Factor    Int
     |         |
   Factor     Int
     |
    Int
This parse tree grows to the left, indicating that the '+' operator is left associative, representing (a + a) + a.
But now that we've changed the grammar, our parse tree looks like this:
 Expr
 /   \
Term   Expr'
 |    /  |  \
Factor + Term  Expr'
 |        |   /  |  \
Int    Factor + Term  Expr'
          |      |      |
         Int   Factor   ε
                 |
                Int
We can see that the tree grows to the right, representing a + (a + a). We have changed the associativity of our operator '+': it is now right-associative. While this is not a problem for addition, which gives the same value under either grouping, it would give a significantly different value if this were subtraction.
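To see this effect in code (a hypothetical sketch, not from the article), a straightforward recursive descent parser for the transformed grammar builds its results nested to the right:

# Minimal sketch of a recursive descent parser for the transformed grammar
#   Expr  -> Term Expr'
#   Expr' -> + Term Expr' | epsilon
# Tokens are assumed to be a simple list such as ['a', '+', 'a', '+', 'a'].
def parse_expr(tokens, pos=0):
    left, pos = parse_term(tokens, pos)
    return parse_expr_tail(tokens, pos, left)

def parse_expr_tail(tokens, pos, left):
    if pos < len(tokens) and tokens[pos] == '+':          # Expr' -> + Term Expr'
        term, pos = parse_term(tokens, pos + 1)
        rest, pos = parse_expr_tail(tokens, pos, term)    # the tail nests to the right
        return ('+', left, rest), pos
    return left, pos                                      # Expr' -> epsilon

def parse_term(tokens, pos):
    return tokens[pos], pos + 1                           # Term -> Factor -> Int

tree, _ = parse_expr(['a', '+', 'a', '+', 'a'])
print(tree)   # ('+', 'a', ('+', 'a', 'a'))  i.e. a + (a + a): right-associative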
The problem is that normal arithmetic requires left associativity. Several solutions are: (a) rewrite the grammar to be left recursive, or (b) rewrite the grammar with more nonterminals to force the correct precedence/associativity, or (c) if using YACC or Bison, there are operator declarations, %left, %right and %nonassoc, which tell the parser generator which associativity to force.