Sequent calculus

Sequent calculus is, in essence, a style of formal logical argumentation where every line of a proof is a conditional tautology (called a sequent by Gerhard Gentzen) instead of an unconditional tautology. Each conditional tautology is inferred from other conditional tautologies on earlier lines in a formal argument according to rules and procedures of inference, giving a better approximation to the style of natural deduction used by mathematicians than David Hilbert's earlier style of formal logic where every line was an unconditional tautology. (This is the essence of the idea, but there are several over-simplifications here. For example, there may be non-logical axioms upon which all propositions are implicitly dependent. Then sequents signify conditional theorems in a first-order language rather than conditional tautologies.)

Sequent calculus is one of several extant styles of proof calculus for expressing line-by-line logical arguments.

Hilbert-style systems and Gentzen-style systems are the two main such styles, and natural deduction and sequent calculus are particular, distinct kinds of Gentzen-style systems. Hilbert-style systems typically have a very small number of inference rules, relying more on sets of axioms. Gentzen-style systems typically have very few axioms, if any, relying more on sets of rules.

Gentzen-style systems have significant practical and theoretical advantages compared to Hilbert-style systems. For example, both natural deduction and sequent calculus systems facilitate the elimination and introduction of universal and existential quantifiers so that unquantified logical expressions can be manipulated according to the much simpler rules of propositional calculus. In a typical argument, quantifiers are eliminated, then propositional calculus is applied to unquantified expressions (which typically contain free variables), and then the quantifiers are reintroduced. This very much parallels the way in which mathematical proofs are carried out in practice by mathematicians. Predicate calculus proofs are generally much easier to discover with this approach, and are often shorter. Natural deduction systems are more suited to practical theorem-proving. Sequent calculus systems are more suited to theoretical analysis.

Introduction

In proof theory and mathematical logic, sequent calculus is a family of formal systems sharing a certain style of inference and certain formal properties. The first sequent calculi, systems LK and LJ, were introduced in 1934/1935 by Gerhard Gentzen[1] as a tool for studying natural deduction in first-order logic (in classical and intuitionistic versions, respectively). Gentzen's so-called "Main Theorem" (Hauptsatz) about LK and LJ was the cut-elimination theorem,[2][3] a result with far-reaching meta-theoretic consequences, including consistency. Gentzen further demonstrated the power and flexibility of this technique a few years later, applying a cut-elimination argument to give a (transfinite) proof of the consistency of Peano arithmetic, in a surprising response to Gödel's incompleteness theorems. Since this early work, sequent calculi, also called Gentzen systems,[4][5][6][7] and the general concepts relating to them, have been widely applied in the fields of proof theory, mathematical logic, and automated deduction.

Hilbert-style deduction systems

One way to classify different styles of deduction systems is to look at the form of judgments in the system, i.e., which things may appear as the conclusion of a (sub)proof. The simplest judgment form is used in Hilbert-style deduction systems, where a judgment has the form

B\,

where B is any formula of first-order logic (or whatever logic the deduction system applies to, e.g., propositional calculus or a higher-order logic or a modal logic). The theorems are those formulae that appear as the concluding judgment in a valid proof. A Hilbert-style system needs no distinction between formulae and judgments; we make one here solely for comparison with the cases that follow.

The price paid for the simple syntax of a Hilbert-style system is that complete formal proofs tend to get extremely long. Concrete arguments about proofs in such a system almost always appeal to the deduction theorem. This leads to the idea of including the deduction theorem as a formal rule in the system, which happens in natural deduction.
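For instance, the deduction theorem for a Hilbert-style system is the meta-level statement that if B is derivable from the hypotheses \Gamma together with A, then A \rightarrow B is derivable from \Gamma alone; natural deduction turns this meta-level step into an ordinary inference rule.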

Natural deduction systems

In natural deduction, judgments have the shape

A_1, A_2, \ldots, A_n \vdash B

where the A_i's and B are again formulae and n\geq 0. Permutations of the A_i's are immaterial. In other words, a judgment consists of a list (possibly empty) of formulae on the left-hand side of a turnstile symbol "\vdash", with a single formula on the right-hand side.[8][9][10] The theorems are those formulae B such that \vdash B (with an empty left-hand side) is the conclusion of a valid proof. (In some presentations of natural deduction, the A_i's and the turnstile are not written down explicitly; instead a two-dimensional notation from which they can be inferred is used.)

The standard semantics of a judgment in natural deduction is that it asserts that whenever[11] A_1, A_2, etc., are all true, B will also be true. The judgments

A_1, \ldots, A_n \vdash B

and

\vdash (A_1 \land \cdots \land A_n) \rightarrow B

are equivalent in the strong sense that a proof of either one may be extended to a proof of the other.
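As a concrete instance of this equivalence, the judgment

A, A \rightarrow B \vdash B

(modus ponens, read as a judgment) corresponds in this sense to the formula

\vdash (A \land (A \rightarrow B)) \rightarrow B.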

Sequent calculus systems

Finally, sequent calculus generalizes the form of a natural deduction judgment to

A_1, \ldots, A_n \vdash B_1, \ldots, B_k,

a syntactic object called a sequent. The formulas on the left-hand side of the turnstile are called the antecedent, and the formulas on the right-hand side are called the succedent; together they are called cedents. Again, A_i and B_i are formulae, and n and k are nonnegative integers, that is, the left-hand side or the right-hand side (or neither or both) may be empty. As in natural deduction, theorems are those B where \vdash B is the conclusion of a valid proof. The empty sequent, having both cedents empty, is defined to be false.[12]

The standard semantics of a sequent is an assertion that whenever every  A_i is true, at least one B_i will also be true.[13] One way to express this is that a comma to the left of the turnstile should be thought of as an "and", and a comma to the right of the turnstile should be thought of as an (inclusive) "or". The sequents

A_1, \ldots, A_n \vdash B_1, \ldots, B_k

and

\vdash (A_1 \land\cdots\land A_n)\rightarrow(B_1 \lor\cdots\lor B_k)

are equivalent in the strong sense that a proof of either one may be extended to a proof of the other.
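As an illustration of this classical reading, the validity of a propositional sequent can be checked by brute force over truth assignments. The following Python sketch does exactly that; the tuple encoding of formulas and the function names are illustrative choices, not standard notation.

    from itertools import product

    # A propositional formula is a variable name (str) or a tuple:
    #   ('not', A), ('and', A, B), ('or', A, B), ('imp', A, B)

    def evaluate(formula, valuation):
        """Evaluate a formula under a truth assignment (dict from variable to bool)."""
        if isinstance(formula, str):
            return valuation[formula]
        op = formula[0]
        if op == 'not':
            return not evaluate(formula[1], valuation)
        if op == 'and':
            return evaluate(formula[1], valuation) and evaluate(formula[2], valuation)
        if op == 'or':
            return evaluate(formula[1], valuation) or evaluate(formula[2], valuation)
        if op == 'imp':
            return (not evaluate(formula[1], valuation)) or evaluate(formula[2], valuation)
        raise ValueError("unknown connective: %r" % (op,))

    def variables(formula):
        """Collect the propositional variables occurring in a formula."""
        if isinstance(formula, str):
            return {formula}
        return set().union(*(variables(sub) for sub in formula[1:]))

    def sequent_valid(antecedent, succedent):
        """A sequent is valid iff, under every assignment making all antecedent
        formulas true, at least one succedent formula is true."""
        vs = set()
        for f in antecedent + succedent:
            vs |= variables(f)
        vs = sorted(vs)
        for values in product([False, True], repeat=len(vs)):
            valuation = dict(zip(vs, values))
            if all(evaluate(f, valuation) for f in antecedent) and \
                    not any(evaluate(f, valuation) for f in succedent):
                return False
        return True

    print(sequent_valid(['A', ('imp', 'A', 'B')], ['B']))    # True: modus ponens as a sequent
    print(sequent_valid([('or', 'A', 'B')], ['A', 'B']))     # True: the comma on the right reads as "or"
    print(sequent_valid(['A', 'B'], [('and', 'A', 'B')]))    # True: the comma on the left reads as "and"
    print(sequent_valid([], []))                             # False: the empty sequent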

At first sight, this extension of the judgment form may appear to be a strange complication — it is not motivated by an obvious shortcoming of natural deduction, and it is initially confusing that the comma seems to mean entirely different things on the two sides of the turnstile. However, in a classical context the semantics of the sequent can also (by propositional tautology) be expressed either as

\vdash \neg A_1 \lor \neg A_2 \lor \cdots \lor \neg A_n \lor B_1 \lor B_2 \lor\cdots\lor B_k

(at least one of the As is false, or one of the Bs is true) or as

\vdash \neg(A_1 \land A_2 \land \cdots \land A_n \land \neg B_1 \land \neg B_2 \land\cdots\land \neg B_k)

(it cannot be the case that all of the As are true and all of the Bs are false). In these formulations, the only difference between formulae on either side of the turnstile is that one side is negated. Thus, swapping left for right in a sequent corresponds to negating all of the constituent formulae. This means that a symmetry such as De Morgan's laws, which manifests itself as logical negation on the semantic level, translates directly into a left-right symmetry of sequents — and indeed, the inference rules in sequent calculus for dealing with conjunction (∧) are mirror images of those dealing with disjunction (∨).

Many logicians feel that this symmetric presentation offers a deeper insight into the structure of the logic than other styles of proof system, where the classical duality of negation is not as apparent in the rules.

Distinction between natural deduction and sequent calculus

Gentzen asserted a sharp distinction between his single-output natural deduction systems (NK and NJ) and his multiple-output sequent calculus systems (LK and LJ). He wrote that the intuitionistic natural deduction system NJ was somewhat ugly.[14] He said that the special role of the excluded middle in the classical natural deduction system NK is removed in the classical sequent calculus system LK.[15] He said that the sequent calculus LJ gave more symmetry than natural deduction NJ in the case of intuitionistic logic, as also in the case of classical logic (LK versus NK).[16] Then he said that in addition to these reasons, the sequent calculus with multiple succedent formulas is intended particularly for his principal theorem ("Hauptsatz").[17]

Origin of word "sequent"

The word "sequent" is taken from the word "Sequenz" in Gentzen's 1934 paper.[1] Kleene makes the following comment on the translation into English: "Gentzen says 'Sequenz', which we translate as 'sequent', because we have already used 'sequence' for any succession of objects, where the German is 'Folge'."[18]

The system LK

This section introduces the rules of the sequent calculus LK (standing for logistischer klassischer Kalkül, "logistic classical calculus"), as introduced by Gentzen in 1934.[19] A (formal) proof in this calculus is a sequence of sequents, where each of the sequents is derivable from sequents appearing earlier in the sequence by using one of the rules below.

Inference rules

The following notation will be used:

- A and B denote formulae of first-order predicate logic (one may also restrict attention to propositional logic),
- \Gamma, \Delta, \Sigma, and \Pi are finite, possibly empty sequences of formulae, called contexts,
- t denotes an arbitrary term,
- x and y denote variables,
- A[t/x] denotes the formula obtained by substituting the term t for every free occurrence of the variable x in the formula A, where t is assumed to be free for x in A (i.e., no variable occurring in t becomes bound by the substitution).

Axiom:

  \cfrac{\qquad}{A \vdash A} \quad (I)

Cut:

  \cfrac{\Gamma \vdash \Delta, A \qquad A, \Sigma \vdash \Pi}{\Gamma, \Sigma \vdash \Delta, \Pi} \quad (\mathit{Cut})

Left logical rules:

  \cfrac{\Gamma, A \vdash \Delta}{\Gamma, A \land B \vdash \Delta} \quad ({\land}L_1)

  \cfrac{\Gamma, B \vdash \Delta}{\Gamma, A \land B \vdash \Delta} \quad ({\land}L_2)

  \cfrac{\Gamma, A \vdash \Delta \qquad \Sigma, B \vdash \Pi}{\Gamma, \Sigma, A \lor B \vdash \Delta, \Pi} \quad ({\lor}L)

  \cfrac{\Gamma \vdash A, \Delta \qquad \Sigma, B \vdash \Pi}{\Gamma, \Sigma, A \rightarrow B \vdash \Delta, \Pi} \quad ({\rightarrow}L)

  \cfrac{\Gamma \vdash A, \Delta}{\Gamma, \lnot A \vdash \Delta} \quad ({\lnot}L)

  \cfrac{\Gamma, A[t/x] \vdash \Delta}{\Gamma, \forall x A \vdash \Delta} \quad ({\forall}L)

  \cfrac{\Gamma, A[y/x] \vdash \Delta}{\Gamma, \exists x A \vdash \Delta} \quad ({\exists}L)

Right logical rules:

  \cfrac{\Gamma \vdash A, \Delta}{\Gamma \vdash A \lor B, \Delta} \quad ({\lor}R_1)

  \cfrac{\Gamma \vdash B, \Delta}{\Gamma \vdash A \lor B, \Delta} \quad ({\lor}R_2)

  \cfrac{\Gamma \vdash A, \Delta \qquad \Sigma \vdash B, \Pi}{\Gamma, \Sigma \vdash A \land B, \Delta, \Pi} \quad ({\land}R)

  \cfrac{\Gamma, A \vdash B, \Delta}{\Gamma \vdash A \rightarrow B, \Delta} \quad ({\rightarrow}R)

  \cfrac{\Gamma, A \vdash \Delta}{\Gamma \vdash \lnot A, \Delta} \quad ({\lnot}R)

  \cfrac{\Gamma \vdash A[y/x], \Delta}{\Gamma \vdash \forall x A, \Delta} \quad ({\forall}R)

  \cfrac{\Gamma \vdash A[t/x], \Delta}{\Gamma \vdash \exists x A, \Delta} \quad ({\exists}R)

Left structural rules:

  \cfrac{\Gamma \vdash \Delta}{\Gamma, A \vdash \Delta} \quad (\mathit{WL})

  \cfrac{\Gamma, A, A \vdash \Delta}{\Gamma, A \vdash \Delta} \quad (\mathit{CL})

  \cfrac{\Gamma_1, A, B, \Gamma_2 \vdash \Delta}{\Gamma_1, B, A, \Gamma_2 \vdash \Delta} \quad (\mathit{PL})

Right structural rules:

  \cfrac{\Gamma \vdash \Delta}{\Gamma \vdash A, \Delta} \quad (\mathit{WR})

  \cfrac{\Gamma \vdash A, A, \Delta}{\Gamma \vdash A, \Delta} \quad (\mathit{CR})

  \cfrac{\Gamma \vdash \Delta_1, A, B, \Delta_2}{\Gamma \vdash \Delta_1, B, A, \Delta_2} \quad (\mathit{PR})

Restrictions: In the rules ({\forall}R) and ({\exists}L), the variable y (the eigenvariable) must not occur free in the lower sequent, i.e. not in \Gamma, not in \Delta, and not in the quantified formula. A simpler, more restrictive formulation requires that y not appear anywhere in the respective lower sequent.

An intuitive explanation

The above rules can be divided into two major groups: logical and structural ones. Each of the logical rules introduces a new logical formula either on the left or on the right of the turnstile \vdash. In contrast, the structural rules operate on the structure of the sequents, ignoring the exact shape of the formulae. The two exceptions to this general scheme are the axiom of identity (I) and the rule of (Cut).

Although stated in a formal way, the above rules allow for a very intuitive reading in terms of classical logic. Consider, for example, the rule ({\land}L_1). It says that, whenever one can prove that \Delta can be concluded from some sequence of formulae that contains A, then one can also conclude \Delta from the (stronger) assumption that A \land B holds. Likewise, the rule ({\lnot}R) states that, if \Gamma and A suffice to conclude \Delta, then from \Gamma alone one can either still conclude \Delta or A must be false, i.e. {\lnot}A holds. All the rules can be interpreted in this way.

For an intuition about the quantifier rules, consider the rule ({\forall}R). Of course, concluding that \forall{x} A holds just from the fact that A[y/x] is true is not in general possible. If, however, the variable y is not mentioned elsewhere (i.e. it can still be chosen freely, without influencing the other formulae), then one may assume that A[y/x] holds for any value of y, and hence that \forall{x} A holds. The other rules should then be fairly straightforward.
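To see why this restriction matters, note that without it one could pass from the derivable sequent p(y) \vdash p(y) to

p(y) \vdash \forall x \, p(x)

by ({\forall}R), which is not a valid sequent: that some particular y has the property p does not mean that everything does. The side condition blocks this step, since y occurs free in the antecedent of the lower sequent.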

Instead of viewing the rules as descriptions for legal derivations in predicate logic, one may also consider them as instructions for the construction of a proof for a given statement. In this case the rules can be read bottom-up; for example, ({\land}R) says that, to prove that A \land B follows from the assumptions \Gamma and \Sigma, it suffices to prove that A can be concluded from \Gamma and B can be concluded from \Sigma, respectively. Note that, given some antecedent, it is not clear how this is to be split into \Gamma and \Sigma. However, there are only finitely many possibilities to be checked since the antecedent by assumption is finite. This also illustrates how proof theory can be viewed as operating on proofs in a combinatorial fashion: given proofs for both A and B, one can construct a proof for A∧B.

When looking for some proof, most of the rules offer more or less direct recipes of how to do this. The rule of cut is different: It states that, when a formula A can be concluded and this formula may also serve as a premise for concluding other statements, then the formula A can be "cut out" and the respective derivations are joined. When constructing a proof bottom-up, this creates the problem of guessing A (since it does not appear at all below). The cut-elimination theorem is thus crucial to the applications of sequent calculus in automated deduction: it states that all uses of the cut rule can be eliminated from a proof, implying that any provable sequent can be given a cut-free proof.

The second rule that is somewhat special is the axiom of identity (I). The intuitive reading of this is obvious: every formula proves itself. Like the cut rule, the axiom of identity is somewhat redundant: the completeness of atomic initial sequents states that the rule can be restricted to atomic formulas without any loss of provability.
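The bottom-up reading can be turned into a simple decision procedure for the propositional fragment. The following Python sketch is illustrative rather than a rendering of Gentzen's rules verbatim: it uses the well-known cut-free, context-sharing ("invertible") variants of the logical rules, so that weakening and contraction never need to be applied explicitly, and its initial sequents are atomic ones with extra context (the generalized axiom \Gamma, A \vdash A, \Delta discussed under the variants below). The formula encoding matches the earlier semantic sketch; all names are ad hoc choices.

    # Propositional formulas: a variable name (str) or ('not', A), ('and', A, B),
    # ('or', A, B), ('imp', A, B) -- the same encoding as in the earlier sketch.

    def provable(antecedent, succedent):
        """Backward proof search for a propositional sequent, using invertible,
        context-sharing rule variants and atomic initial sequents."""
        # Initial sequent: some atom occurs on both sides.
        left_atoms = {f for f in antecedent if isinstance(f, str)}
        right_atoms = {f for f in succedent if isinstance(f, str)}
        if left_atoms & right_atoms:
            return True
        # Otherwise pick a compound formula and apply its rule bottom-up.
        for i, f in enumerate(antecedent):
            if not isinstance(f, str):
                rest = antecedent[:i] + antecedent[i + 1:]
                op = f[0]
                if op == 'not':
                    return provable(rest, succedent + [f[1]])
                if op == 'and':
                    return provable(rest + [f[1], f[2]], succedent)
                if op == 'or':
                    return provable(rest + [f[1]], succedent) and provable(rest + [f[2]], succedent)
                if op == 'imp':
                    return provable(rest, succedent + [f[1]]) and provable(rest + [f[2]], succedent)
                raise ValueError("unknown connective: %r" % (op,))
        for i, f in enumerate(succedent):
            if not isinstance(f, str):
                rest = succedent[:i] + succedent[i + 1:]
                op = f[0]
                if op == 'not':
                    return provable(antecedent + [f[1]], rest)
                if op == 'and':
                    return provable(antecedent, rest + [f[1]]) and provable(antecedent, rest + [f[2]])
                if op == 'or':
                    return provable(antecedent, rest + [f[1], f[2]])
                if op == 'imp':
                    return provable(antecedent + [f[1]], rest + [f[2]])
                raise ValueError("unknown connective: %r" % (op,))
        # Only atoms remain and none is shared: the sequent is not provable.
        return False

Because these rule variants are invertible, the search may commit to any compound formula it finds, and it terminates since every step strictly reduces the number of connectives.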

Observe that all rules have mirror companions, except the ones for implication. This reflects the fact that the usual language of first-order logic does not include the "is not implied by" connective \not\leftarrow that would be the De Morgan dual of implication. Adding such a connective with its natural rules would make the calculus completely left-right symmetric.

Example derivations

Here is the derivation of " \vdash A \lor \lnot A ", known as the Law of excluded middle (tertium non datur in Latin), written as a sequence of sequents in which each sequent follows from earlier ones by the indicated rule.

    
  1. A \vdash A   (I)
  2. \vdash \lnot A, A   (\lnot R), from 1
  3. \vdash A \lor \lnot A, A   (\lor R_2), from 2
  4. \vdash A, A \lor \lnot A   (PR), from 3
  5. \vdash A \lor \lnot A, A \lor \lnot A   (\lor R_1), from 4
  6. \vdash A \lor \lnot A   (CR), from 5
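Using the proof-search sketch from the intuitive-explanation section (same illustrative encoding), this sequent is indeed recognized as provable:

    print(provable([], [('or', 'A', ('not', 'A'))]))   # True: \vdash A \lor \lnot A
    print(provable([], ['A']))                         # False: a bare atom is not a theorem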
     
   

Next is the proof of a simple fact involving quantifiers, the sequent \exists y ( \forall x \, p(x,y) ) \vdash \forall x ( \exists y \, p(x,y) ). Note that the converse is not true, and its falsity can be seen when attempting to derive it bottom-up, because an existing free variable cannot be used in substitution in the rules (\forall R) and (\exists L).

    
  1. p(x,y) \vdash p(x,y)   (I)
  2. \forall x ( p(x,y) ) \vdash p(x,y)   (\forall L), from 1
  3. \forall x ( p(x,y) ) \vdash \exists y ( p(x,y) )   (\exists R), from 2
  4. \exists y ( \forall x ( p(x,y) ) ) \vdash \exists y ( p(x,y) )   (\exists L), from 3
  5. \exists y ( \forall x ( p(x,y) ) ) \vdash \forall x ( \exists y ( p(x,y) ) )   (\forall R), from 4
     
   

For something more interesting we shall prove  (A \rightarrow (B \lor C)) \rightarrow (((B \rightarrow \lnot A) \land \lnot C) \rightarrow \lnot A) . It is straightforward to find the derivation, which exemplifies the usefulness of LK in automated proving.

    
  1. A \vdash A   (I)
  2. \vdash \lnot A, A   (\lnot R), from 1
  3. \vdash A, \lnot A   (PR), from 2
  4. B \vdash B   (I)
  5. C \vdash C   (I)
  6. B \lor C \vdash B, C   (\lor L), from 4 and 5
  7. B \lor C \vdash C, B   (PR), from 6
  8. B \lor C, \lnot C \vdash B   (\lnot L), from 7
  9. \lnot A \vdash \lnot A   (I)
  10. B \lor C, \lnot C, (B \rightarrow \lnot A) \vdash \lnot A   (\rightarrow L), from 8 and 9
  11. B \lor C, \lnot C, ((B \rightarrow \lnot A) \land \lnot C) \vdash \lnot A   (\land L_1), from 10
  12. B \lor C, ((B \rightarrow \lnot A) \land \lnot C), \lnot C \vdash \lnot A   (PL), from 11
  13. B \lor C, ((B \rightarrow \lnot A) \land \lnot C), ((B \rightarrow \lnot A) \land \lnot C) \vdash \lnot A   (\land L_2), from 12
  14. B \lor C, ((B \rightarrow \lnot A) \land \lnot C) \vdash \lnot A   (CL), from 13
  15. ((B \rightarrow \lnot A) \land \lnot C), B \lor C \vdash \lnot A   (PL), from 14
  16. ((B \rightarrow \lnot A) \land \lnot C), (A \rightarrow (B \lor C)) \vdash \lnot A, \lnot A   (\rightarrow L), from 3 and 15
  17. ((B \rightarrow \lnot A) \land \lnot C), (A \rightarrow (B \lor C)) \vdash \lnot A   (CR), from 16
  18. (A \rightarrow (B \lor C)), ((B \rightarrow \lnot A) \land \lnot C) \vdash \lnot A   (PL), from 17
  19. (A \rightarrow (B \lor C)) \vdash (((B \rightarrow \lnot A) \land \lnot C) \rightarrow \lnot A)   (\rightarrow R), from 18
  20. \vdash ((A \rightarrow (B \lor C)) \rightarrow (((B \rightarrow \lnot A) \land \lnot C) \rightarrow \lnot A))   (\rightarrow R), from 19
     
   

These derivations also emphasize the strictly formal structure of the sequent calculus. For example, the logical rules as defined above always act on a formula immediately adjacent to the turnstile, which is why the permutation rules are needed. Note, however, that this is in part an artifact of the presentation, in the original style of Gentzen. A common simplification involves the use of multisets of formulas in the interpretation of the sequent, rather than sequences, eliminating the need for an explicit permutation rule. This corresponds to shifting commutativity of assumptions and derivations outside the sequent calculus, whereas LK embeds it within the system itself.

Structural rules

The structural rules deserve some additional discussion.

Weakening (W) allows the addition of arbitrary elements to a sequence. Intuitively, this is allowed in the antecedent because we can always restrict the scope of our proof (if all cars have wheels, then it's safe to say that all black cars have wheels); and in the succedent because we can always allow for alternative conclusions (if all cars have wheels, then it's safe to say that all cars have either wheels or wings).

Contraction (C) and Permutation (P) assure that neither the order (P) nor the multiplicity of occurrences (C) of elements of the sequences matters. Thus, instead of sequences, one could also consider sets.

The extra effort of using sequences, however, is justified since part or all of the structural rules may be omitted. Doing so, one obtains the so-called substructural logics.
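As a small illustration (an ad hoc sketch, not part of any standard library), the left structural rules can be viewed as bookkeeping operations on an antecedent represented as a Python list; under a set reading, contraction and permutation change nothing, which is why they can be dropped in that case.

    def weaken_left(antecedent, formula):
        """(WL): add an arbitrary formula to the antecedent."""
        return antecedent + [formula]

    def contract_left(antecedent, formula):
        """(CL): merge two occurrences of a formula into one."""
        assert antecedent.count(formula) >= 2, "contraction needs a duplicate"
        result = list(antecedent)
        result.remove(formula)  # removes one occurrence only
        return result

    def permute_left(antecedent, i):
        """(PL): swap the adjacent formulas at positions i and i+1."""
        result = list(antecedent)
        result[i], result[i + 1] = result[i + 1], result[i]
        return result

    # Under a set reading, contraction and permutation are invisible:
    assert set(contract_left(['A', 'A', 'B'], 'A')) == {'A', 'B'}
    assert set(permute_left(['A', 'B'], 0)) == {'A', 'B'}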

Properties of the system LK

This system of rules can be shown to be both sound and complete with respect to first-order logic, i.e. a statement A follows semantically from a set of premises \Gamma (\Gamma \vDash A) iff the sequent \Gamma \vdash A can be derived by the above rules.[20]

In the sequent calculus, the rule of cut is admissible. This result is also referred to as Gentzen's Hauptsatz ("Main Theorem").[2][3]

Variants

The above rules can be modified in various ways:

Minor structural alternatives

There is some freedom of choice regarding the technical details of how sequents and structural rules are formalized. As long as every derivation in LK can be effectively transformed to a derivation using the new rules and vice versa, the modified rules may still be called LK.

First of all, as mentioned above, the sequents can be viewed as consisting of sets or multisets. In this case, the rules for permuting and (when using sets) contracting formulae are obsolete.

The rule of weakening becomes admissible when the axiom (I) is changed so that any sequent of the form \Gamma , A \vdash A , \Delta can be concluded. This means that A proves A in any context. Any weakening that appears in a derivation can then be performed right at the start. This may be a convenient change when constructing proofs bottom-up.

Independently of these choices, one may also change the way in which contexts are split within the rules: in the cases ({\land}R), ({\lor}L), and ({\rightarrow}L), the left context is somehow split into \Gamma and \Sigma when going upwards. Since contraction allows for the duplication of these, one may instead assume that the full context is used in both branches of the derivation. By doing this, one ensures that no important premises are lost in the wrong branch. Using weakening, the irrelevant parts of the context can be eliminated later.
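For example (a standard reformulation rather than Gentzen's original rule), the context-sharing version of ({\land}R) reads

  \cfrac{\Gamma \vdash A, \Delta \qquad \Gamma \vdash B, \Delta}{\Gamma \vdash A \land B, \Delta}

and is interderivable with the rule given above: one direction instantiates both premises with the same context and then contracts, the other pads the two premises to a common context by weakening.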

Absurdity

One can introduce \bot, the absurdity constant representing false, with the axiom:


  \cfrac{\qquad}{\bot \vdash}

Or if, as described above, weakening is to be an admissible rule, then with the axiom:


  \cfrac{\qquad}{\Gamma, \bot \vdash \Delta}

With \bot, negation can be subsumed as a special case of implication, via the definition \neg A \iff A \to \bot.
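For example, with this definition the rule ({\lnot}R) becomes derivable: from \Gamma, A \vdash \Delta, weakening (WR) gives \Gamma, A \vdash \bot, \Delta, and then ({\rightarrow}R) yields \Gamma \vdash A \rightarrow \bot, \Delta, i.e. \Gamma \vdash \lnot A, \Delta. Similarly, ({\lnot}L) is obtained from the \bot axiom together with ({\rightarrow}L).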

Substructural logics

Main article: Substructural logic

Alternatively, one may restrict or forbid the use of some of the structural rules. This yields a variety of substructural logic systems. They are generally weaker than LK (i.e., they have fewer theorems), and thus not complete with respect to the standard semantics of first-order logic. However, they have other interesting properties that have led to applications in theoretical computer science and artificial intelligence.

Intuitionistic sequent calculus: System LJ

Surprisingly, some small changes in the rules of LK suffice to turn it into a proof system for intuitionistic logic.[21] To this end, one has to restrict to sequents with at most one formula on the right-hand side, and modify the rules to maintain this invariant. For example, ({\lor}L) is reformulated as follows (where C is an arbitrary formula):


  \cfrac{\Gamma, A \vdash C \qquad \Sigma, B \vdash C}{\Gamma, \Sigma, A \lor B \vdash C} \quad ({\lor}L)

The resulting system is called LJ. It is sound and complete with respect to intuitionistic logic and admits a similar cut-elimination proof. This can be used in proving disjunction and existence properties.
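For example, the LK derivation of \vdash A \lor \lnot A given above is not an LJ derivation: already its second sequent, \vdash \lnot A, A, has two formulas in the succedent, which the intuitionistic restriction forbids; and indeed the law of excluded middle is not an intuitionistic theorem.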

In fact, the only two rules in LK that need to be restricted to single-formula consequents are ({\to}R) and (\neg R)[22] (and the latter can be seen as a special case of the former, via \bot as described above). When multi-formula consequents are interpreted as disjunctions, all of the other inference rules of LK are actually derivable in LJ, while the offending rule is


  \cfrac{\Gamma, A \vdash B \or C}{\Gamma \vdash (A \to B) \or C}

This amounts to the propositional formula (A \to (B \lor C)) \to ((A \to B) \lor C), a classical tautology that is not constructively valid.

Notes

  1. Gentzen 1934, Gentzen 1935.
  2. Curry 1977, pp. 208–213, gives a 5-page proof of the elimination theorem. See also pages 188, 250.
  3. Kleene 2009, p. 453, gives a very brief proof of the cut-elimination theorem.
  4. Curry 1977, pp. 189–244, calls Gentzen systems LC systems. Curry's emphasis is more on theory than on practical logic proofs.
  5. Kleene 2009, pp. 440–516. This book is much more concerned with the theoretical, metamathematical implications of Gentzen-style sequent calculus than applications to practical logic proofs.
  6. Kleene 2002, pp. 283–312, 331–361, defines Gentzen systems and proves various theorems within these systems, including Gödel's completeness theorem and Gentzen's theorem.
  7. Smullyan 1995, pp. 101–127, gives a brief theoretical presentation of Gentzen systems. He uses the tableau proof layout style.
  8. Curry 1977, pp. 184–244, compares natural deduction systems, denoted LA, and Gentzen systems, denoted LC. Curry's emphasis is more theoretical than practical.
  9. Suppes 1999, pp. 25–150, is an introductory presentation of practical natural deduction of this kind. This became the basis of System L.
  10. Lemmon 1965 is an elementary introduction to practical natural deduction based on the convenient abbreviated proof layout style System L based on Suppes 1999, pp. 25–150.
  11. Here, "whenever" is used as an informal abbreviation "for every assignment of values to the free variables in the judgment"
  12. Buss 1998, p. 10
  13. For explanations of the disjunctive semantics for the right side of sequents, see Curry 1977, pp. 189–190, Kleene 2002, pp. 290, 297, Kleene 2009, p. 441, Hilbert & Bernays 1970, p. 385, Smullyan 1995, pp. 104–105 and Gentzen 1934, p. 180.
  14. Gentzen 1934, p. 188. "Der Kalkül NJ hat manche formale Unschönheiten." English translation: "The calculus NJ has certain formal imperfections."
  15. Gentzen 1934, p. 191. "In dem klassischen Kalkül NK nahm der Satz vom ausgeschlossenen Dritten eine Sonderstellung unter den Schlußweisen ein [...], indem er sich der Einführungs- und Beseitigungssystematik nicht einfügte. Bei dem im folgenden anzugebenden logistischen klassischen Kalkül LK wird diese Sonderstellung aufgehoben." English translation: "In the classical calculus NK, the law of excluded middle occupied a special position among the modes of inference [...], in that it did not fit into the system of introductions and eliminations. In the logistic classical calculus LK presented below, this special position is removed."
  16. Gentzen 1934, p. 191. "Die damit erreichte Symmetrie erweist sich als für die klassische Logik angemessener." English translation: "The symmetry thereby attained proves to be more appropriate for classical logic."
  17. Gentzen 1934, p. 191. "Hiermit haben wir einige Gesichtspunkte zur Begründung der Aufstellung der folgenden Kalküle angegeben. Im wesentlichen ist ihre Form jedoch durch die Rücksicht auf den nachher zu beweisenden 'Hauptsatz' bestimmt und kann daher vorläufig nicht näher begründet werden." English translation: "With this we have indicated some of the considerations motivating the setting up of the following calculi. In essence, however, their form is determined by regard for the 'Hauptsatz' to be proved later, and can therefore not be justified in more detail for the time being."
  18. Kleene 2002, p. 441.
  19. Gentzen 1934, pp. 190–193.
  20. Kleene 2002, p. 336, wrote in 1967 that "it was a major logical discovery by Gentzen 1934–5 that, when there is any (purely logical) proof of a proposition, there is a direct proof. The implications of this discovery are in theoretical logical investigations, rather than in building collections of proved formulas."
  21. Gentzen 1934, p. 194, wrote: "Der Unterschied zwischen intuitionistischer und klassischer Logik ist bei den Kalkülen LJ und LK äußerlich ganz anderer Art als bei NJ und NK. Dort bestand er in Weglassung bzw. Hinzunahme des Satzes vom ausgeschlossenen Dritten, während er hier durch die Sukzedensbedingung ausgedrückt wird." English translation: "The difference between intuitionistic and classical logic is in the case of the calculi LJ and LK of an extremely, totally different kind to the case of NJ and NK. In the latter case, it consisted of the removal or addition respectively of the excluded middle rule, whereas in the former case, it is expressed through the succedent conditions."
  22. Sara Negri and Jan von Plato, Structural Proof Theory (Cambridge University Press, 2001).

References
