What is a Real Computer?

Ed Grochowski

Posted 7-26-2013

As a computer hobbyist since the late 1970s, I have always been aware of the differences between the computer that I owned (a "toy computer") and the computer that I wanted (a "real computer").

Such differences continued to exist even as technology improved: just when the computer that I owned gained the capabilities of a "real computer", real computers would charge ahead with the next advancement.

Here is a look back at the ever-higher bar for what constitutes a real computer.


High-Level Language

My first computer was a homebrew Z80 machine with 16 kilobytes of memory. I designed and built this computer in 1978.

Programmed in machine language by entering hexadecimal opcodes into the computer's memory, the computer was decidedly hard to use. I thought to myself, "A real computer would be programmed in a high-level language". I learned a lot about computer programming by writing BASIC interpreters for this machine.


Pre-emptive Multitasking Operating System

My next awareness of a "real computer" came from being exposed to Unix Version 7 running on the DEC PDP-11/70 in my undergraduate classes at the University of California, Berkeley. Here was a computer that could run multiple jobs at once, albeit extremely slowly. Memory management hardware kept the jobs isolated even if one crashed.
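
That isolation is easy to demonstrate. As an illustrative sketch (assuming a present-day Unix-like system rather than the hardware of that era), the short C program below creates a second process with fork() and deliberately crashes it; the parent keeps running because each process has its own address space.

/* Illustrative sketch (modern Unix, not the original PDP-11): the child
 * process crashes on purpose, but the parent is unaffected because each
 * process lives in its own address space. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = fork();            /* create a second, independent process */
    if (pid == 0) {
        volatile int *p = NULL;
        *p = 42;                   /* child crashes with a memory fault */
        _exit(EXIT_FAILURE);       /* never reached */
    }

    int status;
    waitpid(pid, &status, 0);      /* parent waits, unaffected by the crash */
    if (WIFSIGNALED(status))
        printf("child killed by signal %d; parent still running\n",
               WTERMSIG(status));
    return 0;
}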

"This was a real computer", I thought to myself. It would be many years before I would own a computer that ran a pre-emptive multitasking operating system.


32-bit Word Size

The previous computers were limited by a 16-bit word size and corresponding 64 kilobyte address space. 65,536 is not a lot of anything, by computer standards. By the early 1980s, I found myself saying that a real computer would have a 32-bit word size, enough to address four billion bytes of memory.
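
As a quick illustration of that arithmetic, the short C program below prints the two address-space sizes quoted above and the 65,536-fold growth between them.

/* Illustrative arithmetic: bytes reachable with 16-bit vs. 32-bit addresses. */
#include <stdio.h>

int main(void)
{
    unsigned long long addr16 = 1ULL << 16;  /* bytes reachable with 16 bits */
    unsigned long long addr32 = 1ULL << 32;  /* bytes reachable with 32 bits */

    printf("16-bit address space: %llu bytes (64 KB)\n", addr16);
    printf("32-bit address space: %llu bytes (about 4 GB)\n", addr32);
    printf("growth factor: %llux\n", addr32 / addr16);  /* 65,536x */
    return 0;
}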

The real computer would take the form of the VAX-11/780 running BSD Unix, used in many of my classes at Berkeley. I would approximate this with my next home computer, built around a Motorola 68000 microprocessor. Free from the limitations of the 64 kilobyte address space, the 68000 had a useful lifespan of over a decade.


Bit-Mapped Graphics and Fixed Storage

Peripherals greatly improved during the 1980s.

"A real computer would have bit-mapped graphics and a hard-disk drive", I thought to myself. By 1985, costs had come down to the point where I could own both.


The PC Becomes a Real Computer

Although Intel had been shipping microprocessors with a 32-bit word size and memory management since 1985, personal computer operating systems would take a decade to catch up. With Microsoft's introduction of Windows 95, the PC finally ran a 32-bit, pre-emptive multitasking operating system. The PC had become a real computer.

The combination of 32-bit word size and pre-emptive multitasking proved to be extremely long-lived. Architectures with a 32-bit word size appeared in the mainframes of the 1960s, the minicomputers of the 1970s, and the microcomputers of the 1980s, and they lasted well into the mid-2000s in the PC market. Such machines are still used today in handheld devices. Four billion bytes provides enough capacity for many computing tasks.


64-bit Word Size and Symmetric Multiprocessor

Since memory densities double every two years, it was only a matter of time before 32 bits were no longer enough. The crossover occurred in the 1990s in mainframe and RISC architectures. By the mid-2000s, I found myself sitting in front of my PC and thinking "A real computer would have a 64-bit word size".

Fortunately, microprocessors had been leading the advancement of computer technology for many years. It was not long before I owned a PC with a 64-bit word size in 2010. 64-bit architectures provide enough addressability to remain useful throughout my expected lifetime.

The 2010 PC was also a symmetric multiprocessor with a whopping four CPU cores. No longer limited to time-slicing between multiple tasks or sub-tasks (threads), the multiprocessor could run them simultaneously. The transition from uniprocessors to multiprocessors opened up tremendous potential for future performance improvements.
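
As a minimal sketch of that shift (assuming a POSIX threads environment, not any particular machine), the C program below runs four threads, one per core of the 2010 PC, doing independent work at the same time instead of taking turns on a single CPU.

/* Illustrative sketch: four threads (one per core) run independent work
 * simultaneously rather than being time-sliced on one CPU.
 * Assumes POSIX threads; build with: cc -pthread threads.c */
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4

static void *worker(void *arg)
{
    long id = (long)arg;
    double sum = 0.0;
    for (long i = 1; i <= 100000000L; i++)   /* independent busy work */
        sum += 1.0 / i;
    printf("thread %ld done (sum = %f)\n", id, sum);
    return NULL;
}

int main(void)
{
    pthread_t t[NTHREADS];

    for (long i = 0; i < NTHREADS; i++)      /* one thread per core */
        pthread_create(&t[i], NULL, worker, (void *)i);
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(t[i], NULL);            /* wait for all to finish */
    return 0;
}

On a four-core machine, all four threads finish in roughly the time a single thread would take alone, which is the performance potential that the transition to multiprocessors opened up.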

Here was a real computer by any historical measure: a 64-bit word size, tens of billions of operations per second, 12 billion bytes of memory, and two trillion bytes of storage. The 2010-vintage PC had 5-6 orders of magnitude greater speed and storage capacity than the Z80 machine I started with in 1978. Such a machine would seem able to tackle any computing problem. Or can it?


What is the Next Real Computer?

A defining characteristic of computer technology is that computing workloads grow over time. Computing problems come in all sizes, and there is no limit to the amount of computing that one would like to do. Increases in speed and capacity are put to use by having the computer do new things.

Process billions of pixels? No problem.

Process a week's worth of audio? No problem.

Such tasks would have seemed unimaginable only two decades earlier.
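
To put rough numbers on those two examples, the back-of-the-envelope calculation below uses assumed parameters chosen for illustration (12-megapixel photos and CD-quality stereo audio).

/* Back-of-the-envelope sizes for the two examples above, using assumed
 * parameters (12-megapixel photos, CD-quality stereo audio). */
#include <stdio.h>

int main(void)
{
    /* A few hundred camera photos already reach billions of pixels. */
    long long pixels = 500LL * 12000000LL;            /* 500 photos x 12 MP */
    long long pixel_bytes = pixels * 3;               /* 3 bytes per pixel  */

    /* One week of CD-quality audio: 44,100 samples/s, stereo, 16-bit. */
    long long audio_bytes = 44100LL * 2 * 2 * 7 * 24 * 3600;

    printf("pixels: %lld (%.1f GB uncompressed)\n", pixels, pixel_bytes / 1e9);
    printf("one week of audio: %.1f GB uncompressed\n", audio_bytes / 1e9);
    return 0;
}

Both totals land in the tens-to-hundred-gigabyte range uncompressed, comfortably within the two trillion bytes of storage mentioned earlier.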

I am looking forward to new applications made possible by ever-increasing computer speeds and capacities.

My future real computer will likely look like the large servers of today, with many tens of CPU cores, or even a small data center with hundreds of machines. It is only a matter of time before advances in technology make that amount of computing smaller and more affordable.