On July 18, 1968, Robert Noyce, Gordon Moore, and Andrew Grove of Fairchild Semiconductor founded NM Electronics, later known as Intel. The company's first property was purchased in Santa Clara, California.
1970: The Intel 1103
1971: The Intel 4004
In November 1971 Intel announced the first microprocessor: the Intel 4004 four-bit central processor logic chip (U.S. Patent #3,821,715), designed by Intel engineers Federico Faggin, Marcian Edward "Ted" Hoff, Stanley Mazor and Masatoshi Shima. The size of "a little fingernail," the 4004 contained about 2,300 transistors and delivered more computing power than the ENIAC, which had occupied a large room.
"The Crucial Role Of Silicon Design In The Invention Of The Microprocessor (A Testimonial from Federico Faggin, designer of the 4004 and developer of its enabling technology)
"Every time there is a new and important invention, there are many people who claim to be its inventor. This is also the case for the microprocessor. What, then, are the criteria to determine what the invention is and who invented it? What exactly is the microprocessor, and what is novel about it?
"The microprocessor is the central processing unit (CPU) of a general-purpose electronic computer implemented in a single integrated circuit. The Intel 4004 was unquestionably the world’s first commercial microprocessor. No one had commercialized a single-chip CPU prior to Intel. There are people, however, who claim to have built CPUs in more than one chip before the 4004, although they were never commercialized as chip-sets but were used only in proprietary equipment. For example, Raymond Holt claims to have built with his team a three-chip microprocessor in 1969 for the US Navy’s F-14A; Lee Boysel of Four Phase Systems Inc., claims that he and his team created the first microprocessor, which was incorporated as part of a system, in 1969. Although their contributions were remarkable, their CPU implementation, not being a single chip, was not a microprocessor.
"Why is one chip so much different or better than two or three chips? If we accept to call a microprocessor a three-chip implementation of a CPU, then why shouldn’t a four or five-chip implementation be also called a microprocessor? Pretty soon it would be impossible to distinguish a microprocessor from a CPU board built with conventional components! A single chip is important not only because of its simplicity and elegance, but because a one-chip CPU is the irreducible minimum for a CPU, thus optimizing all the critical requirements of size, speed, cost and energy consumption. The microprocessor changed the world of computing exactly because it reduced to an absolute minimum the size, cost and energy consumption of a CPU while maximizing its speed.
"The existence of multiple-chip CPU realizations predating the 4004 indicates that the critical contribution of the 4004 in the industry was its implementation in a single chip rather than in multiple chips. This fact places much emphasis on the fundamental role played by the chip design that enabled the integration of the 4004 in a single chip, more than on its architecture. Simple CPU architectures requiring two to three thousand transistors – the same number of transistors used in the 4004 – were generally known in 1968-1969; however, it was not possible to integrate all those transistors in a single chip with the MOS technology available at that time.
"The primary reason for the appearance of the microprocessor in 1971, rather than a few years later and possibly from other companies, was the existence of the MOS Silicon Gate Technology (SGT). With the silicon gate technology, twice as many transistors could be integrated in the same chip size as with conventional metal-gate MOS technology, using the same amount of energy, and with a speed advantage of about 4:1. This technology, originally developed by Federico Faggin at Fairchild Semiconductor in 1968, had also been adopted by Intel. In 1970, only Fairchild and Intel had been able to master the SGT. The 4004 could be integrated and made to function in a single chip not only because of Faggin’s intimate understanding of the silicon gate technology and his skills as a chip designer, but also because of all the additional technological and circuit innovations he created to make it possible (a new methodology for random logic using silicon gate, the bootstrap load, the buried contact, the power-resettable flip-flop – U.S. Patent #3,753,011 – and a new flip-flop design used in a novel static MOS shift register).
"There is a very specific and quite striking example showing that the chip design, more than its architecture, was the key to the creation of the microprocessor – the CPU used in the Datapoint 2200 terminal. This CPU was conceived in 1969 by Computer Terminal Corporation (CTC); in 1971 Texas Instruments attempted to integrate it in a single chip as a custom project commissioned by CTC. Described in the press in mid-1971, only a few months after the 4004's completion, this chip never functioned and was never commercialized. In early 1972, exactly the same CPU that Texas Instruments had failed to make viable was integrated at Intel (assigned to Hal Feeney, under Faggin’s supervision) using the silicon gate technology and the CPU design methodology created by Federico Faggin. This CPU became the Intel 8008 microprocessor, first commercialized in April 1972. The 8008 was about half the size of Texas Instruments' chip, and it worked perfectly" (http://www.intel4004.com/hyatt.htm, accessed 12-02-2013).
1972: The Intel 8008
In April 1972 Intel introduced the 8008, the first 8-bit microprocessor. With an external 14-bit address bus that could address 16 KB of memory, it became the CPU for the first commercial, non-calculator personal computers: the US SCELBI kit and the pre-built French Micral N and Canadian MCM/70. Its instruction set originated with the Datapoint 2200 terminal.
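A quick sanity check on the bus-width arithmetic above (an illustrative sketch, not from the source; the variable names are mine):

```python
# A 14-bit address bus selects one of 2**14 distinct byte locations.
address_bits = 14
addressable_bytes = 2 ** address_bits
print(addressable_bytes)            # 16384 bytes
print(addressable_bytes // 1024)    # 16, i.e. 16 KB
```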
"Originally known as the 1201, the chip was commissioned by Computer Terminal Corporation (CTC) to implement an instruction set of their design for their Datapoint 2200 programmable terminal. As the chip was delayed and did not meet CTC's performance goals, the 2200 ended up using CTC's own TTL based CPU instead. An agreement permitted Intel to market the chip to other customers after Seiko expressed an interest in using it for a calculator" (Wikipedia article on Intel 8008, accessed 12-02-2013).
1974: The Intel 8080
In April 1974 Intel released the 8080 eight-bit microprocessor, considered by many to be the first general-purpose microprocessor. It featured 4,500 transistors and about ten times the performance of its predecessors. Within a year the 8080 was designed into hundreds of different products, including the MITS Altair 8800 designed by H. Edward Roberts.
"The 8080 also changed how computers were created. When the 8080 was introduced, computer systems were usually created by computer manufacturers such as Digital Equipment Corporation, Hewlett Packard, or IBM. A manufacturer would produce the entire computer, including processor, terminals, and system software such as compilers and operating system. The 8080 was actually designed for just about any application except a complete computer system. Hewlett Packard developed the HP 2640 series of smart terminals around the 8080. The HP 2647 was a terminal which ran BASIC on the 8080. Microsoft would market as its founding product the first popular programming language for the 8080, and would later acquire DOS for the IBM-PC" (Wikipedia article on Intel 8080, accessed 12-02-2013).
1978: The Intel 8086
1979: The Intel 8088
On July 1, 1979 Intel introduced the 8088 microprocessor, a low-cost version of the 8086 using an eight-bit external bus instead of the 16-bit bus of the 8086, allowing the use of cheaper and fewer supporting logic chips. It was the processor used in the original IBM PC.
1985: The Intel 386
In 1985 Intel introduced the 32-bit 386 microprocessor. It featured 275,000 transistors, more than 100 times as many as the first Intel microprocessor, the 4004, developed in 1971.
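A quick check of the "more than 100 times" comparison (a sketch using the commonly cited transistor counts; figures for early chips vary slightly by source):

```python
# Commonly cited transistor counts (sources differ slightly on early chips).
transistors_4004 = 2300     # Intel 4004 (1971)
transistors_386 = 275000    # Intel 386 (1985)
ratio = transistors_386 / transistors_4004
print(round(ratio, 1))      # about 119.6, i.e. more than 100x
```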
(This entry was last revised on 01-18-2015.)