In computer science, left recursion is a special case of recursion. In terms of context-free grammars, a non-terminal r is left-recursive if the left-most symbol in any of r's alternatives either immediately (direct left recursion) or through some other non-terminal definitions (indirect/hidden left recursion) rewrites to r again.
"A grammar is left-recursive if we can find some non-terminal A which will eventually derive a sentential form with itself as the left-symbol."[1]
Immediate left recursion occurs in rules of the form

A → Aα | β

where α and β are sequences of nonterminals and terminals, and β doesn't start with A. For example, the rule

Expr → Expr + Term
is immediately left-recursive. The recursive descent parser for this rule might look like:
function Expr() {
    Expr();      // recurses immediately, before consuming any input
    match('+');
    Term();
}
and a recursive descent parser would fall into infinite recursion when trying to parse a grammar which contains this rule.
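The failure can be demonstrated directly. Below is a minimal Python sketch (the function names are hypothetical, mirroring the pseudocode above) of the same naive parser; expr calls itself before consuming any token, so no progress is ever made and the call stack overflows:

```python
def expr(tokens, pos=0):
    # Translating Expr -> Expr '+' Term directly: the first action is a
    # recursive call at the same position, so no token is ever consumed.
    pos = expr(tokens, pos)          # left-recursive call
    if pos < len(tokens) and tokens[pos] == '+':
        pos += 1
    return term(tokens, pos)

def term(tokens, pos):
    # Term is irrelevant here: execution never gets past the expr() call.
    return pos + 1

try:
    expr(['a', '+', 'a'])
except RecursionError:
    print("infinite left recursion: the call stack overflowed")
```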
Indirect left recursion in its simplest form could be defined as:

A → Bα
B → Aβ

possibly giving the derivation A ⇒ Bα ⇒ Aβα.

More generally, for the nonterminals A0, A1, ..., An, indirect left recursion can be defined as being of the form:

A0 → A1 α1
A1 → A2 α2
⋮
An → A0 α(n+1)

where α1, α2, ..., α(n+1) are sequences of nonterminals and terminals.
A formal grammar that contains left recursion cannot be parsed by an LL(k) parser or other naive recursive descent parser unless it is converted to a weakly equivalent right-recursive form. In contrast, left recursion is preferred for LALR parsers because it results in lower stack usage than right recursion. However, more sophisticated top-down parsers can implement general context-free grammars by use of curtailment. In 2006, Frost and Hafiz described an algorithm which accommodates ambiguous grammars with direct left-recursive production rules.[2] That algorithm was extended to a complete parsing algorithm accommodating indirect as well as direct left recursion in polynomial time, and generating compact polynomial-size representations of the potentially exponential number of parse trees for highly ambiguous grammars, by Frost, Hafiz and Callaghan in 2007.[3] The authors then implemented the algorithm as a set of parser combinators written in the Haskell programming language.[4]
The general algorithm to remove immediate left recursion follows. Several improvements to this method have been made, including the ones described in "Removing Left Recursion from Context-Free Grammars", written by Robert C. Moore.[5] For each rule of the form

A → Aα1 | ... | Aαn | β1 | ... | βm

where:

A is a left-recursive nonterminal,
each αi is a nonempty sequence of nonterminals and terminals, and
each βi is a sequence of nonterminals and terminals that does not start with A,

replace the A-production by the production:

A → β1 A' | ... | βm A'

And create a new nonterminal

A' → α1 A' | ... | αn A' | ε

This newly created symbol is often called the "tail", or the "rest".
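The transformation is mechanical, and can be sketched in a few lines of Python. The grammar encoding used here (a dict mapping each nonterminal to a list of alternatives, with [] standing for ε) is an assumption for illustration, not from the source:

```python
def remove_immediate_left_recursion(grammar):
    """Grammar: dict mapping a nonterminal to a list of alternatives,
    each alternative a list of symbols; [] stands for epsilon."""
    result = {}
    for a, alternatives in grammar.items():
        # Split alternatives into A -> A alpha (recursive) and A -> beta.
        alphas = [p[1:] for p in alternatives if p and p[0] == a]
        betas = [p for p in alternatives if not p or p[0] != a]
        if not alphas:                 # nothing to do for this nonterminal
            result[a] = alternatives
            continue
        tail = a + "'"                 # the new "tail" nonterminal A'
        result[a] = [beta + [tail] for beta in betas]
        result[tail] = [alpha + [tail] for alpha in alphas] + [[]]
    return result

grammar = {'Expr': [['Expr', '+', 'Term'], ['Term']]}
print(remove_immediate_left_recursion(grammar))
# {'Expr': [['Term', "Expr'"]], "Expr'": [['+', 'Term', "Expr'"], []]}
```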
As an example, consider the rule

Expr → Expr + Term | Term

This could be rewritten to avoid left recursion as

Expr → Term Expr'
Expr' → + Term Expr'
Expr' → ε

The last two rules happen to be equivalent to the slightly shorter form

Expr' → + Term Expr' | ε
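With the rewritten grammar, a recursive descent parser terminates, because each recursive step consumes a token first. A minimal Python sketch (Term is collapsed to a single 'a' token here, an assumption for brevity):

```python
def parse_expr(tokens, pos=0):
    # Expr -> Term Expr'
    pos = parse_term(tokens, pos)
    return parse_tail(tokens, pos)

def parse_tail(tokens, pos):
    # Expr' -> '+' Term Expr' | epsilon
    if pos < len(tokens) and tokens[pos] == '+':
        pos = parse_term(tokens, pos + 1)   # '+' is consumed before recursing
        return parse_tail(tokens, pos)
    return pos                              # epsilon alternative

def parse_term(tokens, pos):
    # Term collapsed to a single 'a' token for this sketch.
    if pos < len(tokens) and tokens[pos] == 'a':
        return pos + 1
    raise SyntaxError("expected 'a' at position %d" % pos)

# The whole input 'a + a + a' is consumed without unbounded recursion.
assert parse_expr(['a', '+', 'a', '+', 'a']) == 5
```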
If the grammar has no ε-productions (no productions of the form A → ε) and is not cyclic (no derivations of the form A ⇒ ... ⇒ A for any nonterminal A), this general algorithm may be applied to remove indirect left recursion:

Arrange the nonterminals in some (any) fixed order A1, ..., An.

for i = 1 to n {
    for j = 1 to i − 1 {
        let the current Aj productions be Aj → δ1 | ... | δk;
        replace each production of the form Ai → Aj γ by Ai → δ1 γ | ... | δk γ;
    }
    remove direct left recursion for Ai as described above;
}
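The steps above can also be sketched in Python. This is only a sketch (the grammar encoding and the primed-tail naming are assumptions, as before); it substitutes the productions of earlier nonterminals, then removes the direct left recursion that substitution exposes, assuming an ε-free, non-cyclic grammar:

```python
def remove_left_recursion(grammar, order):
    """grammar: dict nonterminal -> list of alternatives (lists of symbols).
    order: the fixed ordering A1 ... An of the nonterminals.
    Assumes no epsilon-productions and no cycles."""
    g = {nt: [list(p) for p in ps] for nt, ps in grammar.items()}
    for i, ai in enumerate(order):
        # Substitute productions of earlier nonterminals Aj (j < i).
        for aj in order[:i]:
            expanded = []
            for p in g[ai]:
                if p and p[0] == aj:
                    expanded.extend(d + p[1:] for d in g[aj])
                else:
                    expanded.append(p)
            g[ai] = expanded
        # Remove direct left recursion for Ai (as in the previous section).
        alphas = [p[1:] for p in g[ai] if p and p[0] == ai]
        betas = [p for p in g[ai] if not p or p[0] != ai]
        if alphas:
            tail = ai + "'"
            g[ai] = [b + [tail] for b in betas]
            g[tail] = [a + [tail] for a in alphas] + [[]]  # [] is epsilon
    return g

# A -> B x, B -> A y | z is indirectly left-recursive (A => Bx => Ayx).
g = remove_left_recursion({'A': [['B', 'x']],
                           'B': [['A', 'y'], ['z']]}, ['A', 'B'])
# No alternative may start with its own nonterminal afterwards.
assert all(not p or p[0] != nt for nt, ps in g.items() for p in ps)
```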
The above transformations remove left recursion by creating a right-recursive grammar; but this changes the associativity of our rules. Left recursion makes left associativity; right recursion makes right associativity. Example: We start out with the grammar:

Expr → Expr + Term | Term
Term → Term * Factor | Factor
Factor → ( Expr ) | Int
After having applied the standard transformations to remove left recursion, we have the following grammar:

Expr → Term Expr'
Expr' → + Term Expr' | ε
Term → Factor Term'
Term' → * Factor Term' | ε
Factor → ( Expr ) | Int
Parsing the string 'a + a + a' with the first grammar in an LALR parser (which can handle left-recursive grammars) results in the parse tree:
              Expr
            /  |   \
        Expr   +    Term
       /  |  \        |
   Expr   +   Term  Factor
    |          |      |
   Term     Factor   Int
    |          |
  Factor      Int
    |
   Int
This parse tree grows to the left, indicating that the '+' operator is left associative, representing (a + a) + a.
But now that we've changed the grammar, our parse tree looks like this:
      Expr
     /    \
  Term    Expr'
    |    /  |  \
Factor  +  Term  Expr'
    |       |   /  |  \
   Int  Factor +  Term  Expr'
            |       |     |
           Int   Factor   ε
                    |
                   Int
We can see that the tree grows to the right, representing a + (a + a). We have changed the associativity of our operator '+'; it is now right-associative. While this isn't a problem for the associativity of addition, it would produce a significantly different value if this were subtraction.
The problem is that normal arithmetic requires left associativity. Several solutions are: (a) rewrite the grammar to be left recursive, or (b) rewrite the grammar with more nonterminals to force the correct precedence/associativity, or (c) if using YACC or Bison, there are operator declarations, %left, %right and %nonassoc, which tell the parser generator which associativity to force.
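When hand-writing a recursive descent parser, option (a) is usually realized not with literal left recursion but with a loop that folds operands to the left. A minimal Python sketch (the token and tree representations are assumptions, not from the source):

```python
def parse_expr(tokens):
    """Parse 'a (+ a)*' into a left-associative tuple tree."""
    pos = 0

    def term():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    tree = term()
    while pos < len(tokens) and tokens[pos] == '+':
        pos += 1
        # Fold to the left: the tree built so far becomes the left operand.
        tree = ('+', tree, term())
    return tree

# 'a + a + a' groups as (a + a) + a, matching the left-recursive grammar.
assert parse_expr(['a', '+', 'a', '+', 'a']) == ('+', ('+', 'a', 'a'), 'a')
```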