The Race Is On! But What's In A Name?


By: W. David Young and Jo Ann C. Carland

CISC or RISC? Terms long familiar to the computer industry are now gaining significance in the popular press as users become more knowledgeable. The concepts of CISC (Complex Instruction Set Computers) lie at the very heart of our existing systems. Complex instruction sets give us the flexibility to perform a multitude of applications on the same architectural configuration, yet that flexibility has a down side as well: it typically trades against speed because of the many permutations of instructions possible. On the other side is the speed touted by the RISC (Reduced Instruction Set Computers) chip manufacturers.

What does the future hold for microcomputing? Most of us wonder. Should we purchase now or wait to see what the market will bring next month? Intel Corporation has released its new 80486 chip, which purports to be a “mainframe on a chip,” and has announced the production of its 80586 chip. Intel sounds ready to take on not only microcomputer companies but mainframe manufacturers as well.

We are also hearing about RISC technology. It too is becoming a catchword in the popular press. The antagonists on the television program, “A Man Called Hawk,” were trying to sell flawed RISC chips for guided missile and radar systems not long ago. Has RISC become a household word almost before the systems have been implemented on a large scale? Which would you prefer: new cutting edge technology or tried and true concepts?

An Historical Perspective

The year is 1990 and everyone in the market for a personal computer has more choices than ever before. Historically, personal computer buyers had to decide between a computer made by Apple or an IBM compatible using a microprocessor produced by Intel Corporation.

Early microcomputers used the Intel 8080 microprocessor, developed in 1974. These micros had power equivalent to an IBM 704, which had been developed twenty years earlier. CP/M machines introduced in 1977 used the Zilog Z80 microprocessor and had power commensurate with the IBM 7094 released in 1962. Since the first IBM PC was introduced in 1981, great advances have been achieved in the hardware. The first IBM PC utilized the Intel 8088 chip, which had the same capacity as a DEC PDP-11/70 produced in 1975. In 1984, IBM unveiled its PC/AT, which employed the Intel 80286 microprocessor and produced the power of a VAX 11/780 minicomputer. Intel progressed with the 80386 chip, creating personal computers exhibiting the power of a VAX 8600. Intel has now developed the 80486 microprocessor [1].

Every step on this ladder of processing power has seen microcomputers harness the capabilities of previous-generation mainframes. However, software has not kept pace with the hardware. Most software utilizes the technology of previous microprocessors and has not been able to link the practice of the art to the state-of-the-art.

The new 80486 microprocessor comprises over one million transistors. Personal computers incorporating this chip will be on the market in 1990. These machines are expected to have the processing power of an IBM 3090 mainframe, with the capability of running dozens of programs concurrently and the power to make the machines more user friendly [1].

Intel has already announced its next generation of microprocessors, the 80586, which will contain four million transistors. Projections are for the 80686 to appear in 1995/1996 with 22 million transistors and the 80786 by the year 2000 with 100 million transistors, running at 250 Megahertz and occupying only one square inch [4]. Parallel processing and speed are being forecast.

Yet on the negative side, the “486” machines will have a starting price of approximately $20,000, and no software currently exists that utilizes the full capabilities of the new chip [1]. The same will initially be true of later technologies as they come on line. Can you wait, or should you make the investment?

Computer buyers have no way of accurately predicting when the “486” machines will be affordable. Computers using 80386 technology have been on the market over two years and are still much more expensive than the “286” machines. Software is also a major concern for the “486.” Currently, the “386” can run the existing software, but the computer buyer can only guess at how much time will have to elapse before software truly incorporates the new technology [1]. The same phenomenon can be forecast for the “486” and its successors as well.

The RISC Technology

A different kind of microprocessor is being championed by some industry insiders, who consider the RISC chip a better alternative to the 80486. RISC technology was actually formulated more than thirty years ago by Seymour Cray of supercomputer fame, but a RISC computer was not built until 1982, when graduate students at the University of California at Berkeley undertook the job [3]. The RISC approach is the opposite in design philosophy of the CISC technology employed by other processors. CISC technology uses many complex instructions to enable the computer to perform a variety of tasks and to have coding available for almost any possibility.

A RISC design offers speed by coding only those instructions which will be used for a particular function. The fewer the instructions to be processed, the faster the speed. Complex instructions require more coding, thus reducing the speed [2].

A RISC microprocessor will operate much faster than the 80486 because unnecessary instructions are not included. CISC microprocessors such as the “486” contain numerous instructions to deal with every possible problem. Time is needed to sort through these instructions even though a given instruction is used only when needed; in fact, half of all the instructions are only rarely used. If only the necessary instructions are implemented, as in RISC technology, the microprocessor can run much faster, although it will not be able to perform as many functions without additional programming [1].
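The trade-off can be illustrated with a toy sketch (the “instructions” below are invented for illustration, not drawn from any real instruction set): a single complex CISC-style block-copy instruction versus the same task expressed as a loop of simple RISC-style load, store, add, and branch steps.

```python
# Toy illustration of the CISC/RISC trade-off. Neither function models a
# real processor; both simply copy n bytes within a flat "memory" list.

def cisc_copy(mem, src, dst, n):
    """One complex, microcoded-style instruction: a whole block copy."""
    mem[dst:dst + n] = mem[src:src + n]

def risc_copy(mem, src, dst, n):
    """The same task as a sequence of simple operations."""
    i = 0
    while i < n:              # branch: test loop condition
        byte = mem[src + i]   # load one byte
        mem[dst + i] = byte   # store one byte
        i += 1                # add: advance the index
    # Four simple operations per byte instead of one complex instruction,
    # but each operation is trivial to decode and easy to run at a
    # high clock rate -- the essence of the RISC argument.

mem = list(b"hello world") + [0] * 16
cisc_copy(mem, 0, 11, 5)
risc_copy(mem, 0, 16, 5)
assert mem[11:16] == mem[16:21] == list(b"hello")
```

The CISC version does more per instruction; the RISC version does less per instruction but asks less of the hardware at each step.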

Two companies have developed their own RISC microprocessors. Sun Microsystems was originally thought to have developed the industry standard with its SPARC chip, which received great support from such giants as AT&T. Motorola then made serious inroads by introducing its 88000 RISC chip, which consistently performs better than the SPARC chip. The 88000 has been compared to a miniature CDC 7600 mainframe.

Even Intel is getting into the act. It is producing a chip which is capable of running in a RISC mode or a CISC mode. This chip is designed for speed and compatibility [3].

RISC technology is also being explored by Hewlett-Packard, which has announced that by the mid-1990s it will have a single-processor workstation with performance levels in excess of 50 MIPS (million instructions per second), floating-point performance of 12 to 16 megaflops, and operating speeds of 60 to 90 megahertz, using submicron CMOS (Complementary Metal Oxide Semiconductor) fabrication techniques [6].

MIPS Computer Systems of Sunnyvale, California, is also exploring RISC technology, using ECL (Emitter-Coupled Logic) circuitry and promising 55 MIPS performance with floating-point performance of 13.3 coded double-precision Linpack megaflops. This system, the RC3260, will contain 32 megabytes of main memory, expandable to 256 megabytes; a dual SMD disk controller; one 655-megabyte SMD hard drive; and a SCSI controller, and will cost $150,000 [5]. These systems are expected to be compatible with earlier versions and, therefore, able to take advantage of existing applications.

Adapting Applications Software

RISC microprocessors do have a negative aspect, though: computers using these chips cannot run the existing CISC-oriented software. However, RISC advocates feel that the majority of vendors will be willing to alter their software in order to take advantage of the greater operating speeds, and these changes are being made easier by new software automation tools [1]. With the structured programming techniques used today, the process of application development itself has increased in speed, as mentioned in a previous Computer Corner. But what about existing programs which were developed before structure became the watchword and are still being used?

Whether software will be available for the RISC technology, in the applications the consumer needs and within a reasonable period, is the real question. The RISC concept is thirty years old, while some versions of the Intel 8086 technology are less than ten years old, yet most applications software has been developed for CISC technology.

Computer buyers face a tough decision when choosing between machines using CISC microprocessors and machines using RISC microprocessors. Both technologies are sound, but each has drawbacks. It will probably be years before software that truly captures the potential of the technology will be readily available for either type of machine, especially those utilizing the RISC chip. Perhaps a reasonable alternative would be a hybrid approach or a combination of CISC and RISC depending upon the purpose of the application.

Price will also remain a factor. Actually, the computer buyer of today would need a crystal ball to make a perfect decision. Yet most of us lack that instrument, so we must rely on our ability to weigh the advantages and disadvantages of each technology relative to the particular situation and try to make the best decision based on our present and future needs.

Is there a RISC in your future? Depending upon the application and purpose, RISC may well be the technology to choose.


References

1. Brandt, R. & Port, O. (with Hof, R. D.) “Intel: The Next Revolution”, Business Week, September 1988, 74-79.

2. Brink, J. & Spillman, R. Computer Architecture and VAX Assembly Language Programming. Menlo Park, California: Benjamin Cummings Publishing Company, 1987.

3. Horvitz, P. “What Makes the RISC Chip So Fast?”, PC Review, September, 1988, 12-17.

4. Marshall, M. “Intel 80586 to Contain 4 Million Transistors”, Infoworld, August 28, 1989, 1, 3.

5. Marshall, M. “MIPS Speedster Runs at 55 MIPS”, Infoworld, November 27, 1989, 56.

6. Marshall, M. “HP Unwraps Plans for Future RISC Chips”, Infoworld, December 11, 1989, 1, 3.

David Young is a graduate student who recently finished his MBA at Western Carolina University. He plans to pursue his PhD in Marketing in the fall. He has served as a graduate assistant to the MBA director and has taught computer-related topics at Southwestern Community College.

Jo Ann C. Carland is an Assistant Professor of Computer Information Systems at Western Carolina University. She holds a Certificate in Data Processing from the Institute for Certification of Computer Professionals, an AB from Meredith College, an MAEd from Western Carolina University and a PhD from the University of Georgia. She is active in consulting and has authored numerous publications in the area of microcomputers and entrepreneurship.

W. Kay Turpin is taking over the task of Computer Corner Editor beginning with this issue of the PM Network. She is an Instructor in the Computer Information Systems Department at Western Carolina University. She received her B.S. in Computer Science from California State University in 1982 and her M.S. in Computer Science from the University of Arkansas in 1984. Since coming to Western Carolina, she has been very interested in teaching effectiveness and received the “Accessible Professor Award” in 1989 from the Special Services area and students. She teaches primarily programming courses such as FORTRAN, COBOL and Advanced COBOL, as well as MIS topics.

February 1990


