Workstation

From Wikipedia, the free encyclopedia

Sun SPARCstation 1+, with a 25 MHz RISC processor, from the early 1990s

A workstation, such as a Unix workstation, RISC workstation or engineering workstation, is a high-end desktop or deskside microcomputer designed for technical applications. Workstations are intended primarily to be used by one person at a time, although they can usually also be accessed remotely by other users when necessary.

Workstations usually offer higher performance than is normally seen on a personal computer, especially with respect to graphics, processing power, memory capacity and multitasking ability.

Workstations are often optimized for displaying and manipulating complex data such as 3D mechanical designs, engineering simulation results, and mathematical plots. Consoles consist at a minimum of a high-resolution display, a keyboard and a mouse, but often support multiple displays, and workstations may use a server-level processor. For design and advanced visualization tasks, specialized input hardware such as a graphics tablet or a SpaceBall can be used. Workstations have classically been the first part of the computer market to offer advanced accessories and collaboration tools such as videoconferencing capability.

Following the performance trends of computers in general, today's average personal computer is more powerful than the top-of-the-line workstations of one generation before. As a result, the workstation market is becoming increasingly specialized, since many complex operations that formerly required high-end systems can now be handled by general-purpose PCs. However, workstations are designed and optimized for situations requiring considerable computing power, where they tend to remain usable while traditional personal computers quickly become unresponsive.

Differing design philosophies between personal computers and technical workstations

SGI O2 Workstation
Sony NEWS, with two 25 MHz 68030 processors, from the early 1990s

Workstations were a popular type of computer for engineering, science and graphics throughout the 1980s and 1990s. They ultimately came to be associated with RISC CPUs, but earlier were commonly based on Motorola 68000 series microprocessors.

Workstations have followed a different evolutionary path than personal computers. They were originally derived from lower cost versions of minicomputers such as the VAX line, which in turn had been designed to offload smaller compute tasks from the very expensive mainframe computers of the time. They rapidly adopted 32-bit single-chip microprocessors, as opposed to the more expensive multi-chip processors prevalent in early minis. Later generation workstations used 32-bit and 64-bit RISC processors, which offered higher performance than the CISC processors used in personal computers.

Workstations also ran the same multiuser/multitasking operating systems that minicomputers used, most commonly Unix. Finally, they used networking to connect to larger computers for engineering analysis and design visualization. The much lower costs relative to minicomputers and mainframes allowed greater overall productivity for many companies that relied on powerful computers for technical computing work, since individual users now each had a machine to themselves for small to medium size tasks, thereby freeing up larger computers for batch jobs.

Personal computers, in contrast to workstations, were not designed to bring minicomputer performance to an engineer's desktop, but rather were initially intended for hobbyist/home use or office productivity applications; price sensitivity was a primary consideration. The first personal computers used 8-bit single-chip microprocessors, especially the MOS Technology 6502 and Zilog Z80, in the early days of the Apple II, Atari 800, Commodore 64 and TRS-80. The introduction of the IBM PC in 1981, based on Intel's x86 processor design, eventually changed the industry, with most desktop computers that were not PC clones falling by the wayside. Apple was the lone holdout, opting instead to move first to the Motorola 68000 and then to the PowerPC processor line, but as of 2006 it too has migrated to x86-based systems. This de facto standardization means that some software dating back over 20 years can still run on current PCs, although operating system variations can often make it difficult to run software tied to a specific earlier OS release.

PC operating systems were originally single-tasking; early OSes such as CP/M, TRS-DOS, Apple DOS and MS-DOS could run only one program at a time. Both Microsoft Windows and Apple's operating systems evolved to support cooperative and then pre-emptive multitasking; in addition, Unix and Unix-like operating systems for PCs have been on the market for some time.

Examples of the first workstations

The Xerox Alto, the first computer to use a graphical user interface with a mouse, and the origin of Ethernet.

Perhaps the first computer that might qualify as a "workstation" was the IBM 1620, a small scientific computer designed to be used interactively by a single person sitting at the console. It was introduced in 1959. One peculiar feature of the machine was that it lacked any actual arithmetic circuitry. To perform addition, it required a memory-resident table of decimal addition rules. This saved on the cost of logic circuitry, enabling IBM to make it inexpensive. The machine was code-named CADET, which some people waggishly claimed meant "Can't Add, Doesn't Even Try". Nonetheless, it rented initially for $1000 a month.
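The table-lookup idea can be sketched briefly. The following Python fragment is purely illustrative, assuming a simple digit-list representation; it does not model the 1620's actual decimal memory layout or instruction set, only the principle of replacing adder circuitry with a precomputed table:

    # Illustrative sketch of table-lookup addition in the spirit of the IBM 1620:
    # digit sums come from a precomputed table in memory, not from adder circuitry.
    # (Hypothetical helper names; not a model of the 1620's architecture.)
    ADD_TABLE = [[((a + b) % 10, (a + b) // 10) for b in range(10)] for a in range(10)]

    def add_decimal(x_digits, y_digits):
        """Add two equal-length, least-significant-first decimal digit lists by table lookup."""
        result, carry = [], 0
        for a, b in zip(x_digits, y_digits):
            s, c1 = ADD_TABLE[a][b]      # look up the digit sum and its carry
            s, c2 = ADD_TABLE[s][carry]  # fold in the incoming carry
            result.append(s)
            carry = c1 or c2             # at most one of the two lookups carries
        if carry:
            result.append(carry)
        return result

    # 478 + 694 = 1172, with digits stored least-significant first
    assert add_decimal([8, 7, 4], [4, 9, 6]) == [2, 7, 1, 1]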

In 1965, IBM introduced the IBM 1130 scientific computer, which was meant as the successor to the 1620. Both of these systems came with the ability to run programs written in Fortran and other languages. Both the 1620 and the 1130 were built into roughly desk-sized cabinets. Both were available with add-on disk drives, printers, and both paper-tape and punched-card I/O. A console typewriter for direct interaction was standard on each.

Early examples of workstations were generally dedicated minicomputers; a system designed to support a number of users would instead be reserved exclusively for one person. A notable example was the PDP-8 from Digital Equipment Corporation, regarded as the first commercial minicomputer. The first computers specifically designed for one user (and thus workstations in the modern sense of the term) were the Lisp machines developed at MIT around 1974. Other early examples include the famous Xerox Star (1981) and the less well-known Three Rivers PERQ (1979).

In the early 1980s, new participants in this field included Apollo Computer and Sun Microsystems, which created Unix-based workstations built around the Motorola 68000 processor. Meanwhile, DARPA's VLSI project created several spinoff graphics products as well, notably the SGI 3130, and Silicon Graphics' range of machines that followed. It was not uncommon to differentiate the target markets for the products, with Sun and Apollo considered to be network workstations, while the SGI machines were graphics workstations.

Workstations tended to be very expensive, typically several times the cost of a standard PC and sometimes costing as much as a new car. However, minicomputers sometimes cost as much as a house. The high expense usually came from using costlier components that ran faster than those found at the local computer store, as well as the inclusion of features not found in PCs of the time, such as high-speed networking and sophisticated graphics. Workstation manufacturers also tend to take a "balanced" approach to system design, making certain to avoid bottlenecks so that data can flow unimpeded between the many different subsystems within a computer. Additionally, workstations, given their more specialized nature, tend to have higher profit margins than commodity-driven PCs.

The systems that come out of workstation companies often feature SCSI or Fibre Channel disk storage systems, high-end 3D accelerators, single or multiple 64-bit processors, large amounts of RAM, and well-designed cooling. Additionally, the companies that make the products tend to have very good repair/replacement plans. However, the line between workstation and PC is increasingly blurred as the demand for fast computers, networking and graphics has become common in the consumer world, allowing workstation manufacturers to use "off the shelf" PC components and graphics solutions rather than proprietary in-house technology. Some "low-cost" workstations are still expensive by PC standards, but offer binary compatibility with higher-end workstations and servers made by the same vendor. This allows software development to take place on low-cost (relative to the server) desktop machines.

There have been several attempts to produce a workstation-like machine specifically for the lowest possible price point as opposed to performance. One approach is to remove local storage and reduce the machine to the processor, keyboard, mouse and screen. In some cases, these "diskless nodes" would still run a traditional OS and perform computations locally, with storage on a remote server; in other cases, the local device would fill a niche much closer to a terminal than a computer, displaying tasks executing on the remote server. These approaches are intended not just to reduce the initial system purchase cost, but lower the total cost of ownership by reducing the amount of administration required per user.

This approach was actually first attempted as a replacement for PCs in office productivity applications, with the 3Station by 3Com as an early example; in the 1990s, X terminals filled a similar role for technical computing. Sun has also introduced "thin clients", most notably its Sun Ray product line. However, traditional workstations and PCs continue to drop in price, which tends to undercut the market for products of this type.

What makes a workstation?

Consumer products such as PCs (and even game consoles) today use components that are often at or near the cutting edge of technology; this makes the decision of whether or not to purchase a workstation much less clear-cut for many organizations than it had been in the past. Sometimes these systems are still required, but many places opt for the less expensive, if more fault-prone, PC-level hardware.

It is instructive to take a detailed look at the history of specific technologies that once differentiated workstations from personal computers. The modern reader might be amused at what was considered the target for a high-end workstation in the early 1980s, the so-called "3M computer": a megabyte of memory, a megapixel display (roughly 1000×1000), and "megaFLOPS" compute performance (at least one million floating-point operations per second).[1] As limited as this seems today, it was at least an order of magnitude beyond the capacity of the personal computer of the time; the original 1981 IBM PC had 16 KB of memory, a text-only display, and floating-point performance of around 1 kiloFLOPS (30 kiloFLOPS with the optional 8087 math coprocessor). Other desirable features not found in desktop computers at that time included networking, graphics acceleration, and high-speed internal and peripheral data buses.

(Another goal was to bring the price for such a system down under a "megapenny", that is, less than $10,000; this was not achieved until the late 1980s.)
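As a rough check of the "order of magnitude" claim, the 3M targets can be compared directly with the IBM PC figures quoted above (a back-of-the-envelope calculation in Python, using only the numbers given in this section):

    # Back-of-the-envelope comparison of the "3M" target with the 1981 IBM PC,
    # using only the figures quoted in the text above.
    target = {"memory_bytes": 10**6, "pixels": 1000 * 1000, "flops": 10**6}
    ibm_pc = {"memory_bytes": 16 * 1024,  # 16 KB base configuration
              "pixels": 0,                # text-only display
              "flops": 1_000}             # ~1 kiloFLOPS without the 8087

    print(target["memory_bytes"] / ibm_pc["memory_bytes"])  # ~61x the memory
    print(target["flops"] / ibm_pc["flops"])                # 1000x the floating-point rate
    print(target["flops"] / 30_000)                         # still ~33x with the optional 8087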

The more widespread adoption of these technologies into mainstream PCs was a direct factor in the decline of the workstation as a separate market segment:

  • High-performance CPUs: while RISC in its early days (early 1980s) offered roughly an order-of-magnitude performance improvement over CISC processors of comparable cost, one particular family of CISC processors, Intel's x86, always had the edge in market share and the economies of scale that this implied. By the mid-1990s, x86 CPUs had achieved performance on a par with RISC (albeit at the cost of greater chip complexity), relegating the latter to niche markets for the most part.
  • Hardware support for floating-point operations: optional on the original IBM PC; it remained on a separate chip for Intel systems until the 80486DX processor. Even then, x86 floating-point performance continued to lag behind other processors due to limitations in its architecture. Today even low-priced PCs have performance in the gigaFLOPS range, but higher-end systems are still preferred for floating-point intensive tasks.
  • Large memory configurations: PCs were originally limited to 640 KB of memory until the 1982 introduction of the 80286 processor; early workstations provided access to several megabytes of memory. Even after PCs broke the 640 KB limit, special programming techniques were required to address significant amounts of memory (see the addressing sketch after this list), as opposed to 32-bit processors such as SPARC, which provided straightforward access to nearly their entire 4 GB address range. 64-bit workstations and servers supporting an address range far beyond 4 GB have been available since the mid-1990s, a technology just beginning to appear in the desktop and server PC market in the mid-2000s.
  • Operating system: early workstations ran the Unix operating system (OS), a Unix-like variant, or an equivalent such as VMS. The PC CPUs of the time had limitations in memory capacity and memory-access protection, making them unsuitable for OSes of this sophistication, but this, too, began to change in the late 1980s as PCs with 32-bit CPUs and integrated MMUs became widely affordable.
  • High-speed networking (10 Mbit/s or better): 10 Mbit/s network interfaces were commonly available for PCs by the early 1990s, although by that time workstations were pursuing even higher networking speeds, moving to 100 Mbit/s, 1 Gbit/s and 10 Gbit/s. However, economies of scale and the demand for high-speed networking in even non-technical areas have dramatically decreased the time it takes for newer networking technologies to reach commodity price points.
  • Large displays (17"-21") with high screen resolutions: common among PCs by the late 1990s.
  • High-performance 3D graphics hardware: this started to become increasingly popular in the PC market around the mid-to-late 1990s, mostly driven by computer gaming.
  • High-performance/high-capacity data storage: early workstations tended to use proprietary disk interfaces until the emergence of the SCSI standard in the mid-1980s. Although SCSI interfaces soon became available for PCs, they were comparatively expensive and tended to be limited by the speed of the PC's ISA peripheral bus (although SCSI did become standard on the Apple Macintosh). SCSI is an advanced controller interface that is particularly good where the disk has to cope with multiple requests at once. This makes it well suited to servers, but its benefits to desktop PCs, which mostly run single-user operating systems, are less clear. These days, with desktop systems acquiring more multi-user capabilities (and the increasing popularity of Linux), the disk interface of choice is Serial ATA, which offers throughput comparable to SCSI at a lower cost.
  • Extremely reliable components: this may remain the distinguishing feature of a workstation today. Although most technologies implemented in modern workstations are also available at lower cost in the consumer market, finding good components and making sure they work compatibly with each other is a great challenge in workstation building. Because workstations are designed for high-end tasks such as weather forecasting, video rendering, and game design, they are expected to run under full load, non-stop, for hours or even days without issue. Any off-the-shelf components can be used to build a workstation, but the lifespans of such components under such rigorous conditions are questionable. For this reason, almost no workstations are built by their customers; instead they are purchased from a vendor such as BOXX, EUROCOM, Hewlett-Packard, IBM, Sun Microsystems, SGI or Dell.
  • Tight integration between the OS and the hardware: workstation vendors both design the hardware and maintain the Unix operating system variant that runs on it. This allows for much more rigorous testing than is possible with an operating system such as Windows, which requires third-party hardware vendors to write compliant drivers that are stable and reliable. In addition, minor variations in hardware, such as timing or build quality, can affect the reliability of the overall machine. Workstation vendors are able to ensure both the quality of the hardware and the stability of the operating-system drivers by validating them in-house, which leads to a generally much more reliable and stable machine.
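The "special programming techniques" mentioned in the memory item above refer mainly to real-mode segment:offset addressing on early x86 PCs. The following Python sketch of the address arithmetic is illustrative only, but it shows why the scheme topped out at roughly 1 MB while contemporary 32-bit workstation processors simply used a flat address:

    # Sketch of real-mode x86 segmented addressing: a 16-bit segment and a 16-bit
    # offset combine into a 20-bit physical address, reaching at most about 1 MB.
    def real_mode_address(segment: int, offset: int) -> int:
        """Physical address = segment * 16 + offset (both 16-bit values)."""
        assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
        return (segment << 4) + offset

    # Top of DOS "conventional memory":
    print(hex(real_mode_address(0x9FFF, 0x000F)))  # 0x9FFFF, just under 640 KB
    # Highest address reachable at all in this scheme:
    print(hex(real_mode_address(0xFFFF, 0xFFFF)))  # 0x10FFEF, barely over 1 MB
    # A flat 32-bit processor such as SPARC uses a single 32-bit address,
    # covering the full 4 GB range with no segment arithmetic.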

These days, workstations have changed greatly. Since many of the components are now the same as those used in the consumer market, the price differential between workstations and consumer PCs is correspondingly much narrower than it once was. For example, some low-end workstations use CISC-based processors such as the Intel Pentium 4 or AMD Athlon 64 as their CPUs. Higher-end workstations still use more sophisticated CPUs such as the AMD Opteron, IBM POWER, MIPS or Sun's UltraSPARC, and run a variant of Unix, delivering a reliable workhorse for computing-intensive tasks. (PA-RISC and Alpha CPUs are still sold in workstations but are excluded from the list above as they are nearing end-of-life.)

Some workstations are designed for use with only one specific application, such as AutoCAD, Avid Xpress Studio HD or 3D Studio Max. To ensure compatibility with the software, purchasers usually ask for certification from the software vendor. The certification process raises the workstation's price considerably, but for professional purposes reliability is more important than cost.

Workstation-class PCs

A significant segment of the desktop market is computers expected to perform as workstations, but using PC operating systems and components. PC component manufacturers will often segment their product line, marketing premium components that are functionally similar to the cheaper "consumer" models but offer a higher level of robustness and/or performance. Notable examples are the Xeon and Opteron CPUs and the Quadro line of video processors.

A workstation class PC may have some of the following features:

  • support for ECC (error-correcting) memory (see the sketch after this list)
  • a larger number of memory sockets which use registered (buffered) modules
  • multiple processors
  • multiple displays
  • run a "business" or "professional" operating system version


List of workstations and manufacturers

Note that many of these are extinct.

Footnote

  1. RFC 782 defined the workstation environment more generally as hardware and software dedicated to serving a single user, while also providing for the use of additional shared resources.


This article was originally based on material from the Free On-line Dictionary of Computing, which is licensed under the GFDL.