Espresso heuristic logic minimizer

The Espresso logic minimizer is a widely used[1] computer program using heuristic and specific algorithms for efficiently reducing the complexity of digital electronic gate circuits. Espresso was originally developed at IBM; Richard L. Rudell later published the variant Espresso-MV in 1986 under the title "Multiple-Valued Logic Minimization for PLA Synthesis".[2] Espresso has inspired many derivatives.

Introduction

Digital electronic equipment has become part of everyday life. Electronic devices are embedded in all kinds of appliances, from coffee makers to automobiles. All such devices are composed of numerous blocks of digital circuits, the combination of which performs the required task. As a result, the efficient implementation of logic functions in the form of logic gate circuits has become a key economic factor in the success of many contemporary industrial products.

Designing digital logic circuits

All digital systems are composed of two elementary functions: memory elements for storing information and combinational logic gate circuits for transforming that information. State machines, such as counters, are nothing but a combination of memory elements and combinational gate circuits. Since memory elements are standard components selected from a limited set, designing digital functions essentially comes down to implementing the combinational gate circuits for the basic building blocks and interconnecting all these building blocks.

The implementation of gate circuits is generally referred to as Logic Synthesis. It can in principle be carried out by hand, but usually a formal method is applied by computer. This article briefly summarizes the design methods for combinational gate circuits.

The starting point for the design of a logic gate circuit is its desired functionality, derived from the analysis of the system as a whole of which the gate circuit is to form part. The description can be stated in some algorithmic form or as logic equations, but may also be summarized in the form of a table. The example below shows part of such a table for a 7-segment display driver that translates the binary code for the value of a decimal digit into the signals that cause the respective segments of the display to light up.

  Digit  Code      Segments  A-G
    0    0000      1 1 1 1 1 1 0          -A-
    1    0001      0 1 1 0 0 0 0         F   B
    2    0010      1 1 0 1 1 0 1         |-G-|
    3    0011      1 1 1 1 0 0 1         E   C
    .    ....      . . . . . . .          -D-
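
Such a table can be supplied directly to Espresso in its PLA file format. The fragment below is a sketch of the first four rows of the table above; the input labels b3..b0 and the output labels a..g are chosen here for illustration and are not part of the original table.

  .i 4
  .o 7
  .ilb b3 b2 b1 b0
  .ob a b c d e f g
  0000 1111110
  0001 0110000
  0010 1101101
  0011 1111001
  .e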

The implementation process starts with a logic minimization phase, described below, which simplifies the function table by combining the separate terms into larger ones containing fewer variables. For example, the rows for the digits 0 (0000) and 2 (0010) both light segment A and differ in only one input bit, so for that segment they can be combined into the single term 00-0.

Next, the minimized result may be split up into smaller parts by a factorization procedure and is eventually mapped onto the available basic logic cells of the target technology. This operation is commonly referred to as Logic Optimization.[3]

Classical minimization methods

Minimizing Boolean functions by hand using the classical Karnaugh maps is a laborious, tedious and error-prone process. It is practical only for up to four input variables and unsuited for more than six, while sharing product terms among multiple output functions is even harder to carry out.[4] Moreover, the method does not lend itself to automation in the form of a computer program. Since modern logic functions are generally not constrained to such a small number of variables, and since the cost as well as the risk of making errors is prohibitive for the manual implementation of logic functions, the use of computers became indispensable.

The first alternative method to become popular was the tabular method developed by Quine and McCluskey. Starting with the truth table for a set of logic functions, a set of prime implicants is composed by combining the minterms for which the functions are active (the ON-cover) or for which the function value is irrelevant (the DC-cover). Finally, a systematic procedure is followed to find the smallest set of prime implicants with which the output functions can be realised.[5][6]
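
The central step of the tabular method is the repeated pairwise combination of implicants that differ in exactly one variable. The following minimal sketch in Python (not part of any particular tool) illustrates that step on the segment-A minterms of the 7-segment table above, using '-' for an eliminated variable:

  def combine(a, b):
      """Merge two implicants written over {'0', '1', '-'}.

      Returns the merged implicant if they differ in exactly one
      position where one has '0' and the other '1'; otherwise None.
      """
      diff = [i for i in range(len(a)) if a[i] != b[i]]
      if len(diff) == 1 and '-' not in (a[diff[0]], b[diff[0]]):
          i = diff[0]
          return a[:i] + '-' + a[i + 1:]
      return None

  # Segment A is lit for the digits 0, 2 and 3 shown above.
  print(combine('0000', '0010'))   # '00-0'  (digits 0 and 2)
  print(combine('0010', '0011'))   # '001-'  (digits 2 and 3)
  print(combine('0000', '0011'))   # None    (two variables differ)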

Although this Quine-McCluskey algorithm is very well suited to implementation in a computer program, the result is still far from efficient in terms of processing time and memory usage. Adding a variable to the function will roughly double both of them, because the truth table length increases exponentially with the number of variables. A similar problem occurs when increasing the number of output functions of a combinational function block. As a result, the Quine-McCluskey method is practical only for functions with a limited number of input variables and output functions.

Espresso algorithm

A radically different approach to this issue is followed in the ESPRESSO algorithm, developed by Brayton et al. at the University of California, Berkeley.[7] Rather than expanding a logic function into minterms, the program iteratively manipulates "cubes" representing the product terms in the ON-, DC- and OFF-covers. Although the minimization result is not guaranteed to be the global minimum, in practice it is very closely approximated, while the solution is always free from redundancy. Compared to the other methods, this one is essentially more efficient, reducing memory usage and computation time by several orders of magnitude. Its name reflects the quick way of making a cup of fresh coffee. There is hardly any restriction on the number of variables, output functions and product terms of a combinational function block. In general, for example, tens of variables with tens of output functions are readily dealt with.
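
In this cube representation a product term is written as a string over {'0', '1', '-'}, one character per input variable, with '-' marking a variable that does not occur in the term. The sketch below (a minimal illustration under that convention, not ESPRESSO's actual data structures) shows two elementary cube operations on which such algorithms rely, containment and intersection:

  def covers(a, b):
      """True if cube a contains cube b: every position of a is '-' or equal to b's."""
      return all(x == '-' or x == y for x, y in zip(a, b))

  def intersect(a, b):
      """Intersection of two cubes; None if they are disjoint (a '0' meets a '1')."""
      out = []
      for x, y in zip(a, b):
          if x == '-':
              out.append(y)
          elif y == '-' or x == y:
              out.append(x)
          else:
              return None
      return ''.join(out)

  print(covers('00-0', '0010'))      # True:  00-0 contains the minterm 0010
  print(intersect('00--', '--10'))   # '0010'
  print(intersect('00--', '11--'))   # None: the cubes are disjoint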

The input for ESPRESSO is a function table of the desired functionality; the result is a minimized table describing either the ON-cover or the OFF-cover of the function, depending on the selected options. By default the product terms will be shared as much as possible among the several output functions, but the program can be instructed to handle each of the output functions separately. This allows for efficient implementation in two-level logic arrays such as a PLA (Programmable Logic Array) or a PAL (Programmable Array Logic).
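
As a sketch of how the program is typically invoked (assuming an espresso executable built from the Berkeley sources is available on the PATH, and using the hypothetical file name 7seg.pla from the sketch above), the minimized cover is simply written to standard output in the same PLA format:

  import subprocess

  # Run espresso on a PLA file and capture the minimized table it prints.
  result = subprocess.run(
      ["espresso", "7seg.pla"],
      capture_output=True, text=True, check=True,
  )
  print(result.stdout)   # minimized product terms, again in PLA format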

The ESPRESSO algorithm proved so successful that it has been incorporated as a standard logic function minimization step into virtually every contemporary logic synthesis tool. For implementing a function in multi-level logic, the minimization result is optimized by factorization and mapped onto the available basic logic cells in the target technology, whether this concerns an FPGA (Field Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit).

Software

Minilog

Minilog is a logic minimization program exploiting the ESPRESSO algorithm. It is able to generate a two-level gate implementation for a combinational function block with up to 40 inputs and outputs, or for a synchronous state machine with up to 256 states. It is part of the Publicad educational design package, which can be downloaded from http://pico1.e.ft.fontys.nl/publicad.html (free Publicad toolkit including the Minilog logic minimization program, © W.M.J. de Valk).

References

  1. ^ J.P. Hayes, Digital Logic Design, Addison Wesley, 1993
  2. ^ R.L. Rudell, Multiple-Valued Logic Minimization for PLA Synthesis, ERL Technical Report ERL-86-65, University of California, Berkeley, 1986. http://www.eecs.berkeley.edu/Pubs/TechRpts/1986/ERL-86-65.pdf
  3. ^ G. De Micheli, Synthesis and Optimization of Digital Circuits, McGraw-Hill Science Engineering, 1994
  4. ^ D. Lewin, Design of Logic Systems, Van Nostrand (UK), 1985
  5. ^ R.H. Katz, Contemporary Logic Design, The Benjamin/Cummings Publishing Company, 1994
  6. ^ P.K. Lala, Practical Digital Logic Design and Testing, Prentice Hall, 1996
  7. ^ R.K. Brayton, A. Sangiovanni-Vincentelli, C. McMullen, G. Hachtel, Logic Minimization Algorithms for VLSI Synthesis, Kluwer Academic Publishers, 1984