<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<title>WorkStation</title>
</head>
<body>
<div style="background-color:orange;">
<marquee behavior="scroll" direction="left"><font size="6">WorkStation</font></marquee>
</div>
<img src = "imgs\201109141101224e701902dd400.png" align = "left" width = "250" height= "250">
<p>A workstation is a type of computer used for engineering applications (CAD/CAM), desktop publishing, software development, and other applications that require a moderate amount of computing power and relatively high-quality graphics capabilities. Workstations generally come with a large, high-resolution graphics screen, a large amount of RAM, built-in network support, and a graphical user interface. Most workstations also have a mass storage device such as a disk drive, but a special type of workstation, called a diskless workstation, comes without one. The most common operating systems for workstations are UNIX and Windows NT. Like personal computers, most workstations are single-user computers. However, workstations are typically linked together to form a local-area network, although they can also be used as stand-alone systems.</p>
<p>A workstation is a high-end microcomputer designed for technical or scientific applications. Intended primarily to be used by one person at a time, workstations are commonly connected to a local area network and run multi-user operating systems. The term workstation has also been used to refer to a mainframe computer terminal or to a PC connected to a network.</p>
<p>Although both are microcomputers, workstations offered higher performance than desktop computers, especially with respect to CPU, graphics, memory capacity, and multitasking capability. They are optimized for the visualization and manipulation of different types of complex data, such as 3D mechanical design, engineering simulation (e.g. computational fluid dynamics), animation and rendering of images, and mathematical plots. A typical setup consists of a high-resolution display, a keyboard, and a mouse at a minimum, but may also offer multiple displays, graphics tablets, 3D mice (devices for manipulating 3D objects and navigating scenes), and so on. Workstations were the first segment of the computer market to present advanced accessories and collaboration tools.</p>
<p>Presently, the workstation market is highly commoditized and is dominated by large PC vendors such as Dell and HP, selling Microsoft Windows or GNU/Linux systems running on Intel Xeon or AMD Opteron processors. Alternative Unix-based platforms are provided by Apple Inc.</p>
<h3>Decline of workstations</h3>
<img src = "imgs\576px-SgiOctane.jpg" align = "RIGHT" width = "250" height= "250">
<p><em>An SGI Octane graphics workstation (1997–2000)</em></p>
<p>In the early 1980s, a high-end workstation had to meet the three Ms. The so-called "3M computer" had a Megabyte of memory, a Megapixel display (roughly 1000×1000), and "MegaFLOPS" compute performance (at least one million floating-point operations per second).[a] As limited as this seems today, it was at least an order of magnitude beyond the capacity of the personal computer of the time; the original 1981 IBM PC had 16 KB of memory, a text-only display, and floating-point performance of around 1 kiloFLOPS (30 kiloFLOPS with the optional 8087 math coprocessor). Other desirable features not found in desktop computers at that time included networking, graphics acceleration, and high-speed internal and peripheral data buses.</p>
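<p>As a rough, back-of-the-envelope comparison using the figures above (the exact ratios depend on which IBM PC configuration is assumed):</p>
<pre>
Memory:          1 MB (3M target)   vs. 16 KB (IBM PC)          ≈ 64×
Floating point:  1 MFLOPS           vs. ~1 kFLOPS (no 8087)     ≈ 1000×
                                    vs. ~30 kFLOPS (with 8087)  ≈ 33×
Display:         ~1,000,000 pixels (1000×1000) vs. text-only output
</pre>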
<p>Another goal was to bring the price of such a system down under a "Megapenny", that is, less than $10,000; this was not achieved until the late 1980s, and many workstations, particularly mid-range and high-end models, still cost anywhere from $15,000 to more than $100,000 throughout the early to mid-1990s.</p>
<p>The more widespread adoption of these technologies into mainstream PCs was a direct factor in the decline of the workstation as a separate market segment:</p>
<ul>
<li>High-performance CPUs: while RISC in its early days (early 1980s) offered roughly an order-of-magnitude performance improvement over CISC processors of comparable cost, one particular family of CISC processors, Intel's x86, always had the edge in market share and the economies of scale that this implied. By the mid-1990s, some x86 CPUs had achieved performance on a par with RISC in some areas, such as integer performance (albeit at the cost of greater chip complexity), largely relegating RISC to even higher-end markets.</li>
<li>Hardware support for floating-point operations: optional on the original IBM PC and kept on a separate chip for Intel systems until the 80486DX processor. Even then, x86 floating-point performance continued to lag behind other processors due to limitations in its architecture. Today even low-price PCs have performance in the gigaFLOPS range, but higher-end systems are still preferred for floating-point-intensive tasks.</li>
<li>Large memory configurations: PCs (i.e. IBM compatibles) were originally limited to 640 KB of memory (not counting bank-switched "expanded memory") until the 1982 introduction of the 80286 processor; early workstations provided access to several megabytes of memory. Even after PCs broke the 640 KB limit with the 80286, special programming techniques were required to address significant amounts of memory until the 80386, whereas other 32-bit processors such as SPARC provided straightforward access to nearly their entire 4 GB (2<sup>32</sup>-byte) memory address range. 64-bit workstations and servers supporting an address range far beyond 4 GB have been available since the early 1990s, a capability that only began to appear in the PC desktop and server market in the mid-2000s.</li>
<li>Operating system: early workstations ran the Unix operating system (OS), a Unix-like variant, or an equivalent such as VMS. The PC CPUs of the time had limitations in memory capacity and memory-access protection, making them unsuitable for OSes of this sophistication, but this, too, began to change in the late 1980s as PCs with the 32-bit 80386 and its integrated paged MMU became widely affordable.</li>
<li>High-speed networking (10 Mbit/s or better): 10 Mbit/s network interfaces were commonly available for PCs by the early 1990s, although by that time workstations were pursuing even higher networking speeds, moving to 100 Mbit/s, 1 Gbit/s, and 10 Gbit/s. However, economies of scale and the demand for high-speed networking in even non-technical areas have dramatically decreased the time it takes for newer networking technologies to reach commodity price points.</li>
<li>Large displays (17" to 21") with high resolutions and high refresh rates were common among PCs by the late 1990s, although they had been rare in the late 1980s and early 1990s.</li>
<li>High-performance 3D graphics hardware: this started to become increasingly popular in the PC market around the mid-to-late 1990s, mostly driven by computer gaming, although workstation cards offered better quality, sometimes at the expense of raw performance.</li>
<li>High-performance, high-capacity data storage: early workstations tended to use proprietary disk interfaces until the emergence of the SCSI standard in the mid-1980s. Although SCSI interfaces soon became available for PCs, they were comparatively expensive and tended to be limited by the speed of the PC's ISA peripheral bus (although SCSI did become standard on the Apple Macintosh). SCSI is an advanced controller interface that is particularly good where the disk has to cope with multiple requests at once. This makes it well suited to servers, but its benefits to desktop PCs, which mostly run single-user operating systems, are less clear. These days, with desktop systems acquiring more multi-user capabilities (and with the increasing popularity of Linux), the disk interface of choice is Serial ATA, which offers throughput comparable to SCSI at a lower cost.</li>
<li>Extremely reliable components: together with multiple CPUs with larger caches and error-correcting memory, this may remain the distinguishing feature of a workstation today. Although most technologies implemented in modern workstations are also available at lower cost in the consumer market, finding good components and making sure they work together compatibly is a great challenge in workstation building. Because workstations are designed for demanding tasks such as weather forecasting, video rendering, and game design, these systems are expected to run under full load, non-stop, for hours or even days without issue. Any off-the-shelf components can be used to build a workstation, but the reliability of such components under such rigorous conditions is uncertain. For this reason, almost no workstations are built by their customers; instead they are purchased from a vendor such as Hewlett-Packard, IBM, Sun Microsystems, SGI, Apple, or Dell.</li>
<li>Tight integration between the OS and the hardware: workstation vendors both design the hardware and maintain the Unix operating system variant that runs on it. This allows for much more rigorous testing than is possible with an operating system such as Windows, which relies on third-party hardware vendors to write compliant, stable, and reliable drivers. Minor variations in hardware, such as timing or build quality, can also affect the reliability of the overall machine. Workstation vendors are able to ensure both the quality of the hardware and the stability of the operating system drivers by validating these things in-house, which leads to a generally much more reliable and stable machine.</li>
</ul>
<p>Since the turn of the millennium, the definition of "workstation" has blurred to some extent. Many of the components used in lower-end "workstations" are now the same as those used in the consumer market, and the price differential between lower-end workstations and consumer PCs can be narrower than it once was (in certain parts of the high-end consumer market, such as the "enthusiast" gaming market, it can be difficult to tell what qualifies as a "desktop PC" and what qualifies as a "workstation"). For example, some low-end workstations use CISC-based processors such as the Intel Core or AMD Phenom II or FX as their CPUs. Higher-end workstations still use more sophisticated CPUs such as modern iterations of the Intel Xeon, AMD Opteron, IBM POWER, or Sun UltraSPARC, and typically run a variant of Unix, often allowing these machines to focus extensively on one area of expertise. In another instance, the Nvidia GeForce 256 graphics card spawned the Quadro, which had the same GPU but different driver support and certifications tailored to the requirements of CAD applications, and which retailed for a much higher price; many therefore used the GeForce as a "poor man's" workstation card, since the hardware was largely as capable and could be soft-modded to unlock features nominally exclusive to the Quadro.[1]</p>
<br>
<div style="text-align:center;"><a href="main_sub.html"><input type="button" value="Back"></a></div>
</body>
</html>