Intel Corporation, American manufacturer of
semiconductor computer circuits. It is headquartered in Santa Clara, Calif. The company’s name comes from “
integrated
electronics.”
Intel was founded in July 1968 by American engineers
Robert Noyce and
Gordon Moore. Unlike the archetypal
Silicon Valley start-up business with its fabled origins in a youthful founder’s garage,
Intel opened its doors with $2.5 million in funding arranged by Arthur Rock, the American financier who coined the term
venture capitalist.
Intel’s founders were experienced, middle-aged technologists who had established reputations. Noyce was the coinventor in 1959 of the
silicon integrated circuit when he was general manager of
Fairchild Semiconductor, a division of Fairchild Camera and Instrument. Moore was the head of research and development at Fairchild Semiconductor. Immediately after founding
Intel, Noyce and Moore recruited other Fairchild employees, including Hungarian-born American businessman
Andrew Grove. Noyce, Moore, and Grove served as chairman and chief executive officer (CEO) in succession during the first three decades of the company’s history.
Intel’s initial products were
memory chips, including the world’s first metal oxide semiconductor (MOS) memory chip, the 1101, which did not sell well. However, its sibling, the
1103, a one-kilobit dynamic
random-access memory (
DRAM) chip, was successful and the first chip to store a significant amount of information. It was purchased first by the American technology company
Honeywell Incorporated in 1970 to replace the core memory technology in its computers. Because DRAMs were cheaper and used less power than core memory, they quickly became the standard memory devices in computers worldwide.
Following its DRAM success,
Intel became a public company in 1971. That same year
Intel introduced the erasable programmable read-only memory (
EPROM) chip, which was the company’s most successful product line until 1985. Also in 1971
Intel engineers Ted Hoff, Federico Faggin, and Stan Mazor invented the
4004, a general-purpose four-bit chip that was the first single-chip microprocessor, under contract to the Japanese
calculator manufacturer Nippon Calculating Machine Corporation, which let
Intel retain all rights to the technology.
Not all of
Intel’s early endeavours were successful. In 1972 management decided to enter the growing digital
watch market by purchasing Microma. But Intel had no real understanding of consumers and sold the watchmaking company in 1978 at a loss of $15 million. In 1974 Intel controlled 82.9 percent of the DRAM chip market, but, with the rise of foreign semiconductor companies, the company’s market share dipped to 1.3 percent by 1984. By that time, however, Intel had shifted away from memory chips to focus on its microprocessor business: in 1972 it produced the 8008, an eight-bit
central processing unit (CPU); the
8080, which was 10 times faster than the 8008, came two years later; and in 1978 the company built its first 16-bit microprocessor, the 8086.
In 1981 the American computer manufacturer
International Business Machines (IBM) chose Intel’s 16-bit
8088 to be the CPU in its first mass-produced
personal computer (PC). Intel also provided its microprocessors to other manufacturers that made PC “
clones” that were compatible with IBM’s product. The IBM PC and its clones ignited the demand for desktop and portable computers. IBM had contracted with a small firm in Redmond, Wash.,
Microsoft Corporation, to provide the disk operating system (DOS) for its PC. Eventually Microsoft supplied its
Windows operating system for IBM PCs and their clones; the combination of Windows
software and Intel chips earned these machines the nickname “
Wintel,” and they have dominated the market since their inception.
Of the many microprocessors Intel has produced, perhaps the most important was the
80386, a 32-bit chip released in 1985 that started the company’s commitment to make all future microprocessors backward-compatible with previous CPUs. Application developers and PC owners could then be assured that software that worked on older Intel machines would run on the newest models.
With the introduction of the
Pentium microprocessor in 1993, Intel left behind its number-oriented product naming conventions for trademarked names for its microprocessors. The Pentium was the first Intel chip for PCs to use parallel, or superscalar, processing, which significantly increased its speed. It had 3.1 million
transistors, compared with the 1.2 million transistors of its predecessor, the 80486. Combined with Microsoft’s
Windows 3.x
operating system, the much faster Pentium chip helped spur significant expansion of the PC market. Although businesses still bought most PCs, the higher-performance Pentium machines made it possible for consumers to use PCs for multimedia and graphical applications, such as the games
Doom and
Wing Commander, that required more processing power.
[Image: Moore’s law. Credit: Encyclopædia Britannica, Inc.]
Intel’s business strategy relied on making newer microprocessors dramatically faster than previous ones to entice buyers to upgrade their PCs. One way to accomplish this was to manufacture chips with vastly more
transistors in each device. For example, the 8088 found in the first IBM PC had 29,000 transistors, while the 80386 unveiled four years later included 275,000, and the
Core 2 Quad introduced in 2008 had more than 800,000,000 transistors. The
Tukwila, scheduled for production in 2010, is designed to have 2,000,000,000 transistors. This growth in transistor count became known as
Moore’s law, named after company cofounder Gordon Moore, who observed in 1965 that the transistor count on a silicon chip would double approximately annually; he revised it in 1975 to a doubling every two years.
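A short calculation makes the doubling rule concrete. The sketch below projects a transistor count under Moore’s revised law and compares it with the counts quoted above; the release years (1979 for the 8088, 1985 for the 80386) are assumptions added for illustration, not figures from this article.

```python
# Sketch: project transistor counts under Moore's law (doubling every
# two years, per Moore's 1975 revision) and compare with the counts
# quoted in the text. Release years are assumptions for illustration.

def moores_law(count0, year0, year1, period=2.0):
    """Projected transistor count after (year1 - year0) years of doubling."""
    return count0 * 2 ** ((year1 - year0) / period)

# 8088 (assumed 1979): 29,000 transistors -> projection for 1985
projected = moores_law(29_000, 1979, 1985)
print(f"Projected 1985 count: {projected:,.0f}")  # 232,000
print("Actual 80386 count:   275,000")
```

The projection (three doublings, so an eightfold increase) lands in the same order of magnitude as the 80386’s actual 275,000 transistors, which is the sense in which Moore’s observation held.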
In order to increase consumer brand awareness, in 1991 Intel began subsidizing computer advertisements on the condition that the ads included the company’s “Intel Inside” label. Under the cooperative program, Intel set aside a portion of the money that each computer manufacturer spent annually on Intel chips, from which Intel contributed half the cost of that company’s print and
television ads during the year. Although the program directly cost Intel hundreds of millions of dollars each year, it had the desired effect of establishing Intel as a conspicuous brand name.
[Image: An Intel® Pentium® 4 processor (detail of die photo) contains more than 40 million … Credit: © Intel Corporation]
Intel’s famed technical prowess was not without mishaps. Its greatest mistake was the so-called “
Pentium flaw,” in which an obscure flaw in the CPU’s floating-point unit caused certain division operations to return incorrect results. Company engineers discovered the problem after the product’s release in 1993 but decided to keep quiet and fix the problem in updates to the chip. However, mathematician Thomas Nicely of Lynchburg College in Virginia also discovered the flaw. At first Grove (then CEO) resisted requests to recall the product, but when IBM announced it would not ship computers with the CPU, Intel issued a recall that cost the company $475 million.
Although bruised by the Pentium fiasco, the combination of Intel technology with Microsoft software continued to crush the competition. Rival products from the semiconductor company
Advanced Micro Devices (AMD), the wireless communications company
Motorola, the computer workstation manufacturer
Sun Microsystems, and others rarely threatened Intel’s market share. As a result, the Wintel duo consistently faced accusations of being
monopolies. In 1999 a U.S. district court found Microsoft to be a monopolist in an antitrust suit brought by the
Department of Justice, while in 2009 the
European Union fined Intel $1.45 billion for alleged monopolistic actions. Also in 2009, Intel paid AMD $1.25 billion to settle a long-running legal dispute in which AMD accused Intel of pressuring PC makers not to use AMD’s chips.
By the mid-1990s Intel had expanded beyond the chip business. Large PC makers, such as IBM and
Hewlett-Packard, were able to design and manufacture Intel-based computers for their markets. However, Intel wanted other, smaller PC makers to get their products and, therefore, Intel’s chips to market faster, so it began to design and build “
motherboards” that contained all the essential parts of the computer, including graphics and networking chips. By 1995 the company was selling more than 10 million motherboards to PC makers, about 40 percent of the overall PC market. By the early 21st century, however, the Taiwan-based manufacturer ASUSTeK had surpassed Intel as the leading maker of PC motherboards.
By the end of the century, Intel and Intel-compatible chips from companies such as AMD were found in virtually every PC except
Apple Inc.’s Macintosh, which had used CPUs from Motorola since 1984.
Craig Barrett, who succeeded Grove as Intel CEO in 1998, was able to close that gap. In 2005 Apple CEO
Steven Jobs shocked the industry when he announced that future Apple PCs would use Intel CPUs. As a result, with the exception of some high-performance servers and
mainframes, Intel and Intel-compatible microprocessors can be found in virtually every PC.
Paul Otellini succeeded Barrett as Intel’s CEO in 2005.
Jane Shaw replaced Barrett as chairman in 2009, when the company was ranked 61st on the
Fortune 500 list of the largest American companies.