Logic simulation
Logic simulation is the use of a computer program to simulate the operation of a digital circuit. It is the primary tool used for verifying the logical correctness of a hardware design, and in many cases it is the first activity performed in the process of taking a hardware design from concept to realization. Modern hardware description languages are both simulatable and synthesizable: designing hardware today largely amounts to writing a program in a hardware description language, and performing a simulation is simply running that program. When the program (or model) runs correctly, one can be reasonably assured that the logic of the design is correct, at least for the cases that have been tested in the simulation.
Levels of abstraction
Because simulation is a general technique, a hardware design can be simulated at a variety of levels of abstraction. Often it is useful to simulate a model at several levels of abstraction in the same simulation run. The commonly used levels of abstraction are gate level, register transfer level (RTL), and behavioral (or algorithmic) level. However, it is possible to incorporate lower levels like transistor level or even lower physical levels as well as higher levels such as transaction levels or domain-specific levels.
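As a simple illustration of these levels, the sketch below models the same 1-bit full adder in two ways: first as an interconnection of primitive gates (the gate-level view), and then purely in terms of its arithmetic result (the behavioral view). The example is written in Python for compactness rather than in a hardware description language, and the circuit and function names are invented for this illustration.

    # Illustrative sketch: one full adder described at two levels of abstraction.

    def full_adder_gate_level(a, b, cin):
        """Gate-level view: the adder as an interconnection of primitive gates."""
        s1 = a ^ b            # XOR gate
        total = s1 ^ cin      # XOR gate
        c1 = a & b            # AND gate
        c2 = s1 & cin         # AND gate
        cout = c1 | c2        # OR gate
        return total, cout

    def full_adder_behavioral(a, b, cin):
        """Behavioral view: only the arithmetic result matters, not the gate structure."""
        total = a + b + cin
        return total & 1, (total >> 1) & 1

    # Both descriptions should agree on every input combination.
    for a in (0, 1):
        for b in (0, 1):
            for cin in (0, 1):
                assert full_adder_gate_level(a, b, cin) == full_adder_behavioral(a, b, cin)

The behavioral model says nothing about how the adder is built, only what it computes; the gate-level model fixes the structure that a synthesis tool or a gate-level simulation would work with.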
Advantages of logic simulation
Simulation is the key activity in the design verification process. That is not to say that it is an ideal process. It has some very positive attributes:
- It is a natural way for the designer to get feedback about their design. Because it is just running a program – the design itself – the designer interacts with it using the vocabulary and abstractions of the design. There is no layer of translation to obscure the behavior of the design.
- The level of effort required to debug and then verify the design is proportional to the maturity of the design. That is, early in the design’s life, bugs and incorrect behavior are usually found quickly. As the design matures, it takes longer to find the errors. This is beneficial early in the design process. It becomes more problematic later.
- Simulation is completely general. Any hardware design can be simulated. The only limits are time and computer resources.
Limitations of logic simulation
On the negative side, simulation has two drawbacks, one of which is glaring:
- There is (usually) no way to know when you are done. It is not feasible to completely test, via simulation, all possible states and inputs of any non-trivial system.
- Simulation can take an inordinately large amount of computing resources, since typically it uses a single processor to reproduce the behavior of many (perhaps millions of) parallel hardware processes.
Every design project must answer the question “have we simulated enough to find all the bugs?” and every project manager has taped out his design knowing that the truthful answer to that question is either “no” or “I don’t know”. It is this fundamental problem with simulation that has caused so much effort to be spent looking for both tools to help answer the question and formal alternatives to simulation.
Code coverage, functional coverage and logic coverage tools have all been developed to help gauge the completeness of simulation testing. None is a complete solution, though they all help. Formal alternatives have been less successful: just as in the general software world, where proving programs correct has remained intractable, formal methods for verifying hardware designs are still not general enough to replace simulation. That is not surprising, since it is fundamentally the same problem.
The second drawback motivates most of the research and development in simulation. Simulation is always orders of magnitude slower than the system being simulated: if a hardware system runs at 1 GHz, a simulation of that system might run at 10-1000 Hz, depending on the level of the simulation and the size of the system. That is a slowdown of 10^6 to 10^8. Consequently, many people have spent a lot of time and effort finding ways to speed up logic simulation.
Event simulation versus cycle simulation
Event simulation allows the design to contain simple timing information – the delay needed for a signal to travel from one place to another. During simulation, signal changes are tracked in the form of events. A change at a certain time triggers an event after a certain delay. Events are sorted by the time at which they will occur, and when all events for a particular time have been handled, the simulated time is advanced to the time of the next scheduled event. How fast an event simulation runs therefore depends on the number of events to be processed (the amount of activity in the model).
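The following sketch shows, in highly simplified form, how such an event-driven kernel can work: pending events are kept in a queue ordered by time, and processing an event re-evaluates only the gates driven by the signal that changed, which may in turn schedule new events after their propagation delays. The sketch is written in Python for brevity rather than taken from any real simulator, and the class and method names (EventSimulator, schedule, AndGate, and so on) are invented for this illustration.

    import heapq

    class EventSimulator:
        def __init__(self):
            self.time = 0
            self.queue = []      # pending (time, signal, value) events, ordered by time
            self.values = {}     # current value of every signal
            self.fanout = {}     # signal name -> gates to re-evaluate when it changes

        def schedule(self, when, signal, value):
            heapq.heappush(self.queue, (when, signal, value))

        def connect(self, signal, gate):
            self.fanout.setdefault(signal, []).append(gate)

        def run(self):
            while self.queue:
                when, signal, value = heapq.heappop(self.queue)
                self.time = when                       # advance simulated time
                if self.values.get(signal, 0) == value:
                    continue                           # value did not change: nothing to do
                self.values[signal] = value
                for gate in self.fanout.get(signal, []):
                    gate.evaluate(self)                # may schedule events after a delay

    class AndGate:
        """A 2-input AND gate with a propagation delay."""
        def __init__(self, a, b, out, delay):
            self.a, self.b, self.out, self.delay = a, b, out, delay

        def evaluate(self, sim):
            new_value = sim.values.get(self.a, 0) & sim.values.get(self.b, 0)
            sim.schedule(sim.time + self.delay, self.out, new_value)

    sim = EventSimulator()
    gate = AndGate("a", "b", "y", delay=2)
    sim.connect("a", gate)
    sim.connect("b", gate)
    sim.schedule(0, "a", 1)     # drive input 'a' high at time 0
    sim.schedule(5, "b", 1)     # drive input 'b' high at time 5
    sim.run()
    print(sim.values)           # 'y' becomes 1 at time 5 + 2 = 7

Because an event whose value matches the signal's current value is discarded without further work, the cost of the simulation tracks the amount of activity in the model rather than its total size.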
In cycle simulation, it is not possible to specify delays. A cycle-accurate model is used, and every gate is evaluated in every cycle. Cycle simulation therefore runs at a constant speed, regardless of activity in the model. Optimized implementations may take advantage of low model activity to speed up simulation by skipping the evaluation of gates whose inputs did not change.
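A correspondingly simplified cycle-based sketch is shown below: all combinational logic is evaluated once per clock cycle and every state element is updated together at the cycle boundary, with no notion of intra-cycle delay. Again the sketch is in Python rather than from any real tool, and the toy circuit (a toggle flip-flop feeding an AND gate) is invented purely for illustration.

    # Illustrative cycle-based simulation: every gate is evaluated every cycle.

    def simulate_cycles(num_cycles, enable):
        q = 0                         # toggle flip-flop state
        trace = []
        for cycle in range(num_cycles):
            # Combinational logic: evaluated in full, once per cycle.
            d = q ^ 1                 # next-state logic for the toggle flip-flop
            y = q & enable            # an AND gate on the flip-flop output
            trace.append((cycle, q, y))
            # Clock edge: all state elements update simultaneously.
            q = d
        return trace

    for cycle, q, y in simulate_cycles(4, enable=1):
        print(f"cycle {cycle}: q={q} y={y}")

The work per cycle is fixed by the size of the circuit, not by how much of it is actually switching, which is why cycle simulation runs at a roughly constant speed.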
While event simulation can provide some feedback regarding signal timing, it is not a replacement for static timing analysis. In comparison to event simulation, cycle simulation tends to be faster, to scale better, and to be better suited for hardware acceleration/emulation. However, chip design trends point to event simulation gaining relative performance due to activity factor reduction in the circuit (due to techniques such as clock gating and power gating, which are becoming much more commonly used in an effort to reduce power dissipation). In these cases, since event simulation only simulates necessary events, performance may no longer be a disadvantage relative to cycle simulation. Event simulation also has the advantage of greater flexibility, handling design features that are difficult to handle with cycle simulation, such as asynchronous logic and incommensurate clocks. Due to these considerations, almost all commercial logic simulators have an event-based capability, even if they primarily rely on cycle-based techniques.
Summary
Considering both its advantages and disadvantages, logic simulation is an effective tool for verifying the correctness of a hardware design. Despite its drawbacks, it remains the first choice for checking a design before fabrication, and its value is well established.
References
- Electronic Design Automation for Integrated Circuits Handbook, by Lavagno, Martin, and Scheffer, ISBN 0-8493-3096-3: a survey of the field of EDA. The above summary was derived, with permission, from Volume I, Chapter 16, "Digital Simulation", by John Sanguinetti.
See also
In addition, there are hundreds of articles on various technical details of logic simulation. These are normally presented at conferences such as the Design Automation Conference (DAC) and the International Conference on Computer-Aided Design (ICCAD), along with many smaller conferences. The main journal in the field is IEEE Transactions on Computer-Aided Design. Most of these journals and conference proceedings are published by the IEEE or the ACM. The IEEE online library and the ACM digital library can be searched, and abstracts viewed, free of charge; downloading the full text requires purchase, society membership, or a site license, which many schools and companies already have.
Free logic simulators
- Digital Works – digital logic simulator (Windows, runs on Linux with wine)
- CEDAR Logic Simulator – digital logic simulator (Windows)
- Icarus Verilog – open-source simulation and synthesis tool for Linux
- ghdl – VHDL simulator