In July 1958, at Texas Instruments, Jack St. Clair Kilby proposed the creation of a monolithic device (an integrated circuit) on a single piece of semiconductor. Then, in September, he produced the first integrated circuit, containing five components on a piece of germanium that was half an inch long and thinner than a toothpick.
1959. The following year, Fairchild Semiconductor filed a patent for the planar process of manufacturing transistors. This process made commercial production of transistors possible and led, within two years, to Fairchild's introduction of the first commercial integrated circuit. Soon transistors were small enough to fit into hearing aids that sat in the ear, and into pacemakers. Companies such as Sony made transistors operate at higher frequencies and over wider temperature ranges. Eventually they became so small and robust that many of them could be placed on a single piece of silicon. These were referred to as microchips, and they started the microelectronics industry. The first two companies to develop integrated circuits were Texas Instruments and Fairchild Semiconductor. At Fairchild, Robert Noyce constructed an integrated circuit with components connected by aluminum lines on a silicon-oxide surface layer on a plane of silicon. He then went on to lead one of the most innovative companies in the world: the Intel Corporation.
After the success of the first commercial transistorized computer (the IBM 7090/7094), IBM went on to develop the first automatic mass-production facility for transistors. Another important system of the time was the RCA 501, the first computer to support COBOL. FORTRAN was fine for engineers and scientists, as its syntax looked much like mathematics, but it was of little use for business applications, so COBOL was developed. Its adoption as a standard business language owed much to the US government stating that it would not buy a computer that could not handle COBOL. At the time, 1960, only the UNIVAC II and the RCA 501 could support it.
1961. The second generation of computers arrived in 1961, when the great innovator, Fairchild Semiconductor, released the first commercial integrated circuit. Over the next two years, significant advances were made in the interfaces to computer systems. One of the first was by Teletype, who produced the Model 33 keyboard and punched-tape terminal. It was a classic design and became a standard part of most of the available systems. Another advance came from Douglas Engelbart, who developed the mouse pointing device for computers, for which he would later receive a patent. Then, in 1963, DEC sold their first minicomputer, to Atomic Energy of Canada. DEC would become one of the main competitors to IBM, but would eventually fail after dismissing the growth of the personal computer market.
The Great Mainframes (System/360, Cray, and PDP)
1964. The year saw the birth of IBM's System/360, an innovative system that integrated many models into a single series. The system cost $5 billion to develop, but would dominate the market for many years to come. A great strength was compatibility across the whole series: a program written for one computer in the series would run on any other in the series (something that was extremely rare at the time). The System/360 started the third generation of computers, which used integrated circuits rather than discrete transistors. The innovation continued, in 1970, with the release of the System/370, which used semiconductor memories.
A new company called Control Data Corporation (CDC), with their chief designer Seymour Cray (who went on to found Cray Research), was one of the first to compete successfully with IBM. Their first product was the CDC 6600, which was not compatible with the System/360 but had a much higher specification. This type of computer appealed to organizations that could develop their own software for their own specialist applications. As IBM's computer was aimed at the mainframe market, a new term had to be found for a computer with a higher specification than a mainframe, and so the term supercomputer was coined. From then on, IBM had lost the high-end computer market to CDC and, in coming years, to Cray.
For IBM this was not a great loss of market, as they were mainly interested in the mainstream business market, which did not have the same requirements for high performance as specialist markets such as weapons research or product design. It was, though, a great embarrassment to IBM, as the CDC computer was produced by a team of only 14 engineers and four programmers, yet outperformed the best IBM computer by a factor of three. From then on, IBM led the industry in terms of customers, not technology. IBM, though, sailed close to the laws of the land when they announced that they were building a computer that would be more powerful than the CDC 6600. Orders for the CDC 6600 immediately dried up, but the new IBM computer never appeared, leading to another antitrust suit.
The production of transistors increased, and each year brought a significant decrease in their size. Gordon Moore, in 1965, plotted the growth in the number of transistors that could be fitted onto a single microchip, and found that the number that could be fitted onto an integrated circuit approximately doubled every 18 months. This is now known as Moore's law, and it has been surprisingly accurate ever since. In 1964, Texas Instruments received a patent for the integrated circuit.
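The doubling arithmetic behind Moore's law is easy to sketch. The following is a minimal illustration (not from the original text) that projects transistor counts forward from the Intel 4004's 2,300 transistors of 1971, using the 18-month doubling period quoted above; the function name and starting point are chosen here purely for illustration.

```python
def moores_law(start_count, start_year, year, months_per_doubling=18):
    """Projected transistor count, doubling every `months_per_doubling` months."""
    doublings = (year - start_year) * 12 / months_per_doubling
    return start_count * 2 ** doublings

# Project forward from the 4004 (2,300 transistors in 1971).
for year in (1971, 1974, 1980, 1989):
    print(year, round(moores_law(2300, 1971, year)))
```

Three years is two doublings, so the 1974 projection is simply 2,300 × 4 = 9,200; over decades the exponential growth dominates, which is why the law reshaped the industry's planning.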
The Sperry UNIVAC 1108 (UNIVAC being a name that was synonymous with computing for many years) and the RCA Spectra 70 were two of the first computers to use integrated circuits. RCA continued the success of the RCA 501 by releasing the RCA Spectra 70, which was software compatible with the System/360 but cost about 40% less, as RCA did not have the development costs that IBM had faced. In 1967, however, IBM again showed their leadership in the computer industry by developing the first floppy disk.
At the time, there were only three main ways of writing computer programs (machine code, FORTRAN or COBOL), all of which were difficult for inexperienced users. So, in 1964, John Kemeny and Thomas Kurtz at Dartmouth College developed the BASIC (Beginner's All-purpose Symbolic Instruction Code) programming language. It was a great success, although it was not used much in 'serious' applications until Microsoft developed Visual Basic, which used BASIC as a foundation language but enhanced it with an excellent development system (and, of course, with integrated Microsoft Windows support). Many of the first personal computers used BASIC as a standard programming language, and for a while BASIC looked as if it would become the standard programming language of the future. Unfortunately, it suffered from many of the same problems as FORTRAN, such as allowing variables to be used without being properly declared.
The Minicomputer and Intel
1968. In the mid-1960s, all of the computers on the market were very expensive to purchase (approximately $1,000,000) and maintain, but they were great computing workhorses. They were so expensive that most companies had to lease their computer systems, as they could not afford to purchase them. As IBM clung happily to their mainframe market, several new companies were working away to erode their share. DEC was the first, with their minicomputer, but the PC companies of the future would finally overtake them. The beginning of IBM's loss of market share can be traced to the development of the microprocessor, and to one company: Intel. The slide began in 1968, when Robert Noyce and Gordon Moore left Fairchild Semiconductor and joined up with Andy Grove to found the Intel Corporation. As Robert Noyce was well respected in the electronics industry for his development of the first integrated circuit with more than one transistor, Arthur Rock, a venture capitalist, provided the required start-up finance.
At the same time, IBM scientist John Cocke and others completed a prototype scientific computer called the ACS, which used some RISC (reduced instruction set computer) concepts. Unfortunately, the project was cancelled because it was not compatible with IBM's System/360 computers. Meanwhile, several people were proposing the idea of a computer-on-a-chip, and International Research Corp. was the first to develop the required architecture, modeled on an enhanced DEC PDP-8/S concept. Wayne Pickette also proposed to Fairchild Semiconductor that they should develop a computer-on-a-chip, but was turned down. So he went to work for IBM, where he designed the controller for Project Winchester, which had an enclosed flying-head disk drive.
In the same year, Douglas C. Engelbart, of the Stanford Research Institute, demonstrated the concept of computer systems using a keyboard, a keypad, a mouse and windows at the Joint Computer Conference in San Francisco’s Civic Center. He also demonstrated the use of a word processor, a hypertext system and remote collaboration. His keyboard, mouse and windows concept has since become the standard user interface to computer systems.
1969. In 1969, the computer industry was still dominated by IBM with their 32-bit System/360, but newer computers, especially the RCA Spectra 70 (16-bit) and the DEC PDP-8 (12-bit), started to eat away at IBM's main market, as they now incorporated integrated circuits. IBM had expected the System/360 to remain in the market for up to 10 years, but it was obvious that they had to release a new system that incorporated the latest integrated circuits, especially in terms of memory. Thus IBM released the impressive System/370. It effectively destroyed much of the competition, so much so that over the next six years IBM were served with ten federal antitrust lawsuits. In the computer recession at the beginning of the 1970s, only DEC (with their PDP range, which had an innovative architecture) and Data General (with their Nova, which had an innovative design built on a single circuit board packaged in a single box, just as many of today's computers are designed) thrived.
In 1969, Hewlett-Packard diversified away from their strong market in electronic test equipment into the world of digital electronics with the world's first desktop scientific calculator: the HP 9100A. At the time, the electronics industry was producing cheap pocket calculators, and this development led, in turn, to affordable computers, when the Japanese company Busicom commissioned Intel to produce a set of between eight and 12 integrated circuits (ICs) for its new desktop calculator. Instead of designing a complete set of ICs, Ted Hoff at Intel designed an integrated circuit chip that could receive instructions and perform simple functions on data. For the design, Intel produced a set of ICs that could be programmed to perform different tasks. These were the first microprocessors, and soon Intel (short for Integrated Electronics) produced a general-purpose 4-bit microprocessor: the 4004. A key step for Intel was being granted the rights to license these designs, to which, to Intel's great relief, Busicom agreed.
The Birth of the Microprocessor
The Intel 4004 microprocessor caused a revolution in the electronics industry, as previous electronic systems had a fixed functionality; with this processor, the functionality could be programmed in software. Amazingly, by today's standards, it could handle only 4 bits of data at a time (a nibble), contained 2,300 transistors, had 46 instructions and allowed 4kB of program code and 640 bytes of data. From this humble start, the PC has since evolved using Intel microprocessors. Intel had previously been an innovative company, having produced the first commercial static RAM (which uses six transistors for each stored bit), the first DRAM (dynamic memory, which uses only one transistor for each stored bit) and the first EPROM (which allows a program to be written to a device and retained until it is erased).
In the same year, Intel announced a 1 Kbit RAM chip, a significant increase in capacity over previously produced memory chips. Around the same time, one of Intel's major partners (and, as history has shown, competitors), Advanced Micro Devices (AMD) Incorporated, was founded. It was started when Jerry Sanders and seven others left Fairchild Semiconductor (the incubator of the electronics industry, which produced many spin-off companies).
At the same time, the Xerox Corporation, whose core business was still very much paper based, gathered a team at the Palo Alto Research Center (PARC) and gave them the objective of creating 'the architecture of information.' This would lead to many of the great developments in computing, including personal distributed computing, the graphical user interface, the first commercial mouse, bit-mapped displays, Ethernet, client/server architecture, object-oriented programming, laser printing and many of the basic protocols of the Internet. Few research centers have ever been as creative and forward thinking as PARC was over those years. A key to this creativity was that the researchers were not told what to do; it was left to the creativity of the people involved to develop their own ideas. Many organizations expect that lavish funding for research and development will automatically produce new ideas and products, but nothing can ever replace creativity.
1970. The most successful computer company of the 1970s was DEC who, after developing their PDP (Programmed Data Processor) range through the 1960s, from the PDP-1 to the PDP-10, marked the new decade with a computer that, for many, offered excellent value for money with good performance: the classic PDP-11. It sold for $10,000, and went on to sell over 600,000 units (a revenue of $6,000,000,000, which at the time was a massive amount). Many research organizations and technical departments could not afford the equivalent IBM system, and did not feel properly equipped unless they had a PDP-11. DEC looked unstoppable, and many predicted that they would eventually overtake the mighty IBM. But it was IBM themselves who would halt DEC's success in a single blow, with the IBM PC.
At the time, a computer recession hit, and many of the leading companies struggled against the might of the System/370, the excellent architecture of the PDP-11, and the innovative design of the Data General Nova (soon to become the Advanced Nova). New companies also started to appear, such as Memorex and Ampex, which did not build whole computers but specialized in specific elements of them, such as storage devices and memory components. Most of these companies targeted the System/360 and System/370, producing plug-compatible components that could be easily inserted into existing systems. In fact, it became possible to build a whole System/360 or System/370 without requiring any parts from IBM. This was a worrying trend for the computer manufacturers, as a small, specialized company can often innovate faster than a large general-purpose one. This proved to be the case in future years, as companies emerged that specialized in processor development (Intel), networking (Cisco Systems) and disk storage (Seagate).
1971. In 1971, Gary Boone of Texas Instruments filed a patent application for a single-chip computer, and the microprocessor was released in November. In the same year, Intel delivered the 4004 microprocessor to Busicom. When released, the basic specification of the 4004 was:
- Data bus: 4-bit
- Clock speed: 108kHz
- Price: $200
- Speed: 60,000 operations/second
- Transistors: 2,300
- Silicon: 10 µm process technology, 3 mm × 4 mm die
- Addressable memory: 640 bytes
Intel then developed an EPROM (erasable programmable read-only memory), which could be used with the 4004 and allowed programs to be stored permanently, greatly enhancing the development cycles of microprocessor products. This development would have a major effect on computer manufacturers, such as IBM and DEC, as smaller companies could now use standard processors to build a small computer system without having to buy an expensive mainframe computer.
A significant event, which went unnoticed by most people in the computer industry, occurred when Bill Gates and Paul Allen signed an agreement with Computer Center Corporation to report bugs in PDP-10 software in exchange for computer time. They called their new company the Lakeside Programming Group, to give it some credibility.
Other, more noticeable, events at the time included:
- UNIX. Ken Thompson at AT&T’s Bell Laboratories wrote the first version of the UNIX operating system (on a DEC PDP-7). One of the main objectives of UNIX was to produce a portable operating system, one that could be moved onto different computer systems without modifying its source code. In future years, Dennis Ritchie, with Brian Kernighan, would develop the C programming language, so that the operating system could be compiled with a C compiler, which meant that it could be easily transported onto other systems. For a while, in the 1980s, UNIX looked as if it would become the de facto standard operating system, before Microsoft Windows trumped it with the excellent Windows 95, and then again with Windows NT.
- LASER PRINTERS. Gary Starkweather at Xerox used a laser beam, along with the standard photocopying process, to produce a laser printer.
- COMPUTER KIT. The National Radio Institute introduced the first computer kit, for $503.
- MICROCOMPUTER. Texas Instruments developed the first microcomputer-on-a-chip, which contained over 15,000 transistors. They released the TMS1000 one-chip microcomputer, which had 1kB of ROM, 32 bytes of RAM and a simple 4-bit processor. Later, in 1973, Intel filed a patent application for a memory system for a multichip digital computer.
- FLOPPY DISK. IBM introduced the memory disk, or floppy disk, which was an 8-inch floppy plastic disk coated with iron oxide.
- WORD PROCESSOR. Wang Laboratories introduced the Wang 1200 word processor system. For a while, Wang could do little wrong, as their systems sold as quickly as they could be manufactured. They were perfect for the market and appealed to typists, as they fitted onto a desk and could easily correct mistakes without resorting to correction fluid. They could also save work, recall it and, of course, re-edit it. The market eventually collapsed when the PC incorporated word processing, while also storing much more information and connecting easily to a whole range of printers.
- PASCAL. Niklaus Wirth invented the Pascal programming language, as BASIC and FORTRAN had long been known for producing unstructured programs, with lots of GOTOs and RETURNs. Wirth's main aim was to develop a programming language that could be used to teach good, modular programming practices. It was quickly accepted for its clean, pseudocode-like syntax (pseudocode being a near-English format), and it still survives today, though it has struggled against C/C++ (mainly because of their popularity in UNIX and in Microsoft Windows development) and Java (because of its integration with the Internet); it lives on within Borland Delphi, an excellent Microsoft Windows development system.
- PORTABLE COMPUTER. At Xerox PARC, Alan Kay proposed that Xerox should build a portable personal computer called the Dynabook, which would be the size of an ordinary notebook; unfortunately, the PARC management did not support it. In future years, companies such as Toshiba and Compaq would fully exploit the idea. PARC eventually chose to develop the Alto personal computer.
- INTERNET. ARPAnet, the network that would eventually provide the model for the Internet, had reached 15 nodes and 23 hosts. The might of DEC showed in that the original ARPAnet architecture used nine PDP-10s and only five System/360s (and an ILLIAC-IV, just for the record).
1973. In 1973, the model for future computer systems was illustrated at Xerox's PARC, when the Alto workstation was demonstrated with a bit-mapped screen (showing the Cookie Monster from Sesame Street). The following year at Xerox, Bob Metcalfe demonstrated the Ethernet networking technology, which was destined to become the worldwide standard for local area networks. The networking technique was far from perfect, as computers contended with each other for access to the network, but it was cheap and simple, and it worked relatively well. Since then it has successfully fought off other networking techniques, especially the IBM-preferred Token Ring technology.
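The contention just described is the heart of Ethernet's access method: a station transmits, and if two transmit at once they both detect the collision and retry after a random delay that grows with each failure ("binary exponential backoff"). The following toy simulation is an assumption-laden sketch of that principle only, not of Xerox's or the later IEEE 802.3 implementation; the slotted model, station count, and function names are invented here for illustration.

```python
import random

def backoff_slots(attempt, max_exponent=10):
    """Pick a random wait in [0, 2**k - 1] slots, where k grows with each collision."""
    k = min(attempt, max_exponent)
    return random.randrange(2 ** k)

def simulate(stations=4, seed=1):
    """Return the order in which stations win the channel in a simple slotted model."""
    random.seed(seed)
    attempts = {s: 0 for s in range(stations)}   # collisions seen per station
    ready = {s: 0 for s in range(stations)}      # slot at which each next transmits
    order, slot = [], 0
    while ready:
        txers = [s for s, t in ready.items() if t <= slot]
        if len(txers) == 1:                      # channel free: transmission succeeds
            s = txers[0]
            order.append(s)
            del attempts[s], ready[s]
        elif len(txers) > 1:                     # collision: all back off randomly
            for s in txers:
                attempts[s] += 1
                ready[s] = slot + 1 + backoff_slots(attempts[s])
        slot += 1
    return order

print(simulate())
```

The design choice this illustrates is why the scheme was "cheap and simple": no central arbiter is needed, and the random, growing delays let the stations sort themselves out statistically rather than by any fixed schedule.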
Also in 1973, before the widespread acceptance of PC-DOS, the future for personal computer operating systems looked to be CP/M (control program/monitor), which was written by Gary Kildall of Digital Research. One of his first applications of CP/M was on the Intel 8008, and then on the Intel 8080. At the time, computers based on the 8008 started to appear, such as the Scelbi-8H, which cost $565 and had 1kB of memory.
IBM was also innovating at the time, creating a cheap floppy disk drive and the IBM 3340 hard disk unit (a Winchester disk), which had a recording head that sat on a cushion of air, 18 millionths of an inch above the platter. The disk had four platters, each 8 inches in diameter, giving a total capacity of 70MB.
A year later (1974) at IBM, John Cocke produced a high-reliability, low-maintenance computer called the ServiceFree. It was one of the first computers in the world to use RISC technology, and it operated at the unbelievable speed of 80 MIPS. Most computers at the time were rated at a small fraction of one MIPS, and the ServiceFree was over 50 times faster than IBM's fastest mainframe. The project was eventually cancelled, as a competing project named 'Future Systems' was consuming much of IBM's resources.
— Next — The 8-bit Microprocessors