Stack machine

In computer science, a stack machine is a model of computation in which the computer's memory takes the form of one or more stacks. The term also refers to an actual computer implementing or simulating the idealized stack machine.

The term can also refer to a real or simulated machine whose instruction set consists of memory-reference instructions, which push values from memory locations onto a stack or pop values from the stack into memory locations, and arithmetic instructions, which operate on the values at the top of the stack and replace them with the result. Programs written for such a stack machine generally have higher code density than equivalent programs written for other instruction sets.
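As a purely illustrative sketch (the PUSH, ADD and MUL opcode names are hypothetical and not taken from any particular machine), the following Python fragment evaluates (2 + 3) * 4 the way such a stack machine would: values are pushed, and each arithmetic instruction pops its operands from the top of the stack and pushes the result, needing no operand addresses of its own.

# Minimal sketch of stack-machine evaluation (hypothetical PUSH/ADD/MUL opcodes).
# Arithmetic instructions carry no operand field: they work on the top of the stack.
def run(program):
    stack = []
    for op, *arg in program:
        if op == "PUSH":              # push an immediate (or a value loaded from memory)
            stack.append(arg[0])
        elif op == "ADD":             # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":             # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# (2 + 3) * 4  ->  PUSH 2, PUSH 3, ADD, PUSH 4, MUL
print(run([("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",)]))   # -> 20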

Stack model of computation

A stack machine with 1 stack is the same as a deterministic pushdown automaton (DPDA), which is a relatively weak computation model. (Deterministic pushdown automata are strictly weaker than nondeterministic pushdown automata (NPDA), which are in turn strictly weaker than Turing machines.)

A stack machine with multiple stacks, on the other hand, is equivalent to a Turing machine. For example, a 2-stack machine can emulate a Turing machine by using one stack for the portion of the tape to the left of the machine's current head position and the other stack for the portion to the right.
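The tape-splitting idea can be sketched as follows in Python (the blank symbol and helper names are invented for illustration): the left stack holds the cells to the left of the head with the nearest cell on top, while the right stack holds the cell under the head on top, followed by everything to its right.

# Sketch: representing a Turing-machine tape with two stacks.
# left:  cells to the left of the head, nearest cell on top
# right: the cell under the head on top, then everything to its right
BLANK = "_"                            # hypothetical blank symbol

def read(right):
    return right[-1] if right else BLANK

def write(right, symbol):
    if right:
        right[-1] = symbol
    else:
        right.append(symbol)

def move_right(left, right):
    left.append(right.pop() if right else BLANK)

def move_left(left, right):
    right.append(left.pop() if left else BLANK)

# Example: head over "b" in tape ...a[b]c...  (stacks grow toward the end of the list)
left, right = ["a"], ["c", "b"]
write(right, "B")                      # overwrite the current cell
move_right(left, right)                # head now over "c"
print(read(right), left)               # -> c ['a', 'B']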

Stack machine instruction sets

Machines with a stack-based instruction set can have one or more stacks. Many stack machines use two stacks: a data stack, used for operations on data, and a return stack, which holds the return addresses of procedure calls. There has been very little research on using more than two stacks.
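A minimal sketch of the two-stack arrangement, again with invented opcode names (PUSH, ADD, CALL, RET, HALT) rather than those of any real machine: operands live on the data stack, while CALL and RET use the return stack to save and restore the program counter.

# Sketch of a two-stack machine: a data stack for operands and a
# return stack for procedure return addresses (opcode names are hypothetical).
def execute(program):
    data, rstack, pc = [], [], 0
    while pc < len(program):
        op, *arg = program[pc]
        pc += 1
        if op == "PUSH":
            data.append(arg[0])
        elif op == "ADD":
            b, a = data.pop(), data.pop()
            data.append(a + b)
        elif op == "CALL":             # save the return address on the return stack
            rstack.append(pc)
            pc = arg[0]
        elif op == "RET":              # resume at the saved address
            pc = rstack.pop()
        elif op == "HALT":
            break
    return data

# main: push 1 and 2, call the routine at address 4, halt
program = [("PUSH", 1), ("PUSH", 2), ("CALL", 4), ("HALT",),
           ("ADD",), ("RET",)]         # routine at address 4
print(execute(program))                # -> [3]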

A machine using processor registers for operands can easily simulate a stack machine. Such a simulation is sometimes called a virtual stack machine. The advantage of a stack-based instruction set architecture over one using registers is that instructions are smaller, since there is no need to specify operand addresses. This generally leads to considerably smaller compiled programs. Stack machines, however, tend to execute instructions more slowly than register-based machines when the CPU can execute instructions faster than they can be delivered from memory. This is because intermediate results are always stored to and retrieved from the stack, which is located in external memory, during any calculation.
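The code-density argument can be made concrete with a toy comparison; the byte counts below assume invented encodings (1 byte for a stack arithmetic instruction, 2 bytes for a stack load/store carrying an address, 4 bytes for a three-operand register instruction) and are not taken from any real instruction set.

# Toy code-density comparison for x = (a + b) * (c + d), under assumed
# (not real-ISA) encodings: stack arithmetic = 1 byte, stack load/store =
# 2 bytes (opcode + address), register instruction = 4 bytes each.
stack_code = ["LOAD a", "LOAD b", "ADD", "LOAD c", "LOAD d", "ADD", "MUL", "STORE x"]
stack_bytes = sum(2 if " " in ins else 1 for ins in stack_code)   # loads/stores carry an address

register_code = ["LOAD r1, a", "LOAD r2, b", "ADD r1, r1, r2",
                 "LOAD r3, c", "LOAD r4, d", "ADD r3, r3, r4",
                 "MUL r1, r1, r3", "STORE r1, x"]
register_bytes = 4 * len(register_code)

print(stack_bytes, register_bytes)     # -> 13 32 under these assumed encodings

Both versions use the same number of instructions; the stack version is smaller only because its arithmetic instructions omit the operand fields, which is where the density advantage comes from.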

Commercial implementations of stack machines generally include a small set of special-purpose, fixed-function registers for addressing the enclosing contexts. Such a design is arguably not a "pure" stack machine, but it makes a stack machine CPU entirely suitable for general-purpose computing.

Examples of commercial use of a stack machine include the Burroughs large systems architecture and the Java virtual machine, a widely used virtual stack machine.

Note that the Burroughs architecture combines a stack machine with tagged memory: a few bits in every memory word describe the data type of the operand. With tagged memory, fewer opcodes are needed; for example, a single "add" instruction works for any combination of integer and floating-point operands. Needing fewer opcodes means the entire instruction set can fit into a smaller opcode field, reducing the total instruction width.
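The single polymorphic "add" can be sketched as tag dispatch (the tag values and word layout here are invented and are not the actual Burroughs encoding): the instruction inspects the tag bits of its operands and chooses integer or floating-point addition accordingly.

# Loose sketch of tag-dispatched arithmetic (tag values are invented,
# not the actual Burroughs encoding): a single ADD works for either type.
INT, FLOAT = 0, 1                      # hypothetical tag bits

def tagged(value):
    return (FLOAT if isinstance(value, float) else INT, value)

def add(x, y):
    tag_x, val_x = x
    tag_y, val_y = y
    if FLOAT in (tag_x, tag_y):        # any float operand -> floating-point add
        return tagged(float(val_x) + float(val_y))
    return tagged(val_x + val_y)       # otherwise integer add

print(add(tagged(2), tagged(3)))       # -> (0, 5)
print(add(tagged(2), tagged(0.5)))     # -> (1, 2.5)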
