A key figure in the history of electronics, Dr. Moore predicted in 1965 that the number of components on a computer chip would double every year for the next decade, and by the mid-1970s he had revised the forecast to a doubling every two years. This observation — that computing power grows exponentially as costs drop — was dubbed Moore’s Law, and it became a standard the industry successfully met for decades.
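As a rough illustration (not drawn from the article itself), the doubling rule can be written as N(t) = N₀ · 2^((t − t₀)/2), where N₀ is the transistor count in a base year t₀. The short sketch below applies that toy formula, using the 2,300-transistor figure for Intel’s 1971 microprocessor cited later in this article as the starting point; the projection is a simplified model, not historical data.

```python
# A toy model of Moore's Law: transistor counts doubling every two years.
# The 1971 base figure (2,300 transistors) is the Intel 4004 count cited
# in the article; later values are projections of the rule, not real data.

def projected_transistors(year, base_year=1971, base_count=2_300,
                          doubling_period=2):
    """Count predicted by doubling every `doubling_period` years."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001):
        print(year, f"{projected_transistors(year):,.0f}")
```

Under this rule, ten years means five doublings, so the projected count multiplies by 32 each decade — which is why even modest-sounding doubling periods produce the enormous growth the article describes.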
Making computers smaller, faster, and cheaper means integrating more and more circuitry onto slivers of silicon. It was in a 1965 magazine article that Dr. Moore made his signature prediction, envisioning integrated circuits “leading to such wonders as home computers — or at least terminals connected to a central computer — automatic controls for automobiles and personal portable communication devices.”
Moore’s Law became a driving force in computer technology for the next half century. “It created Silicon Valley,” Carver Mead, a retired California Institute of Technology computer scientist who coined the phrase “Moore’s Law,” told The Associated Press on the law’s 40th anniversary.
“Innovation in electronics is as much about vision as tinkering, and Gordon Moore saw the future better than anyone in the last 50 years,” said Michael S. Malone, author of “The Intel Trinity,” a 2014 history of the company. “The industry did not merely measure its performance by Moore’s Law. It designed and targeted its goals based on it, making the law a self-fulfilling prophecy.”
Intel led the rapid development. In 1971, it introduced the first so-called general-purpose programmable processor — the microprocessor — the brain of a computer on a single chip. It contained 2,300 transistors on a 12-square-millimeter piece of silicon, a fraction of the size of a thumbnail.
“We really are the revolutionaries in the world today — not the kids with long hair and beards who were breaking up the schools a few years ago,” Dr. Moore told a reporter at the time. (Today Intel, still an industry leader, can fit about 1.2 billion transistors in the same space.)
Dr. Moore knew that increasing computer power by packing more transistors into smaller chips would eventually run up against the laws of physics, as the size of an atom limits how narrow the paths through which electrons travel in silicon can become. But he cautioned against predicting “the end of progress,” because scientists, he said, would continue to find novel solutions.
“Every time someone declares that Moore’s Law is dead, there is some progress,” Malone said.
Dr. Moore founded Intel in 1968 with the physicist Robert Noyce. The two had earlier co-founded Fairchild Semiconductor in 1957, together with six other colleagues. Among Fairchild’s many inventions, two stand out as revolutionizing computing, and Dr. Moore played a significant role in each.
One was a chemical printing process that made it possible to manufacture computer chips in batches instead of one at a time. The other, Noyce’s idea, was to put not just one transistor — the on-off switch of computing — on a patch of silicon, but several, with wiring to connect them: the integrated circuit, forerunner of the microprocessor Intel would later build. (Jack Kilby, a scientist at Texas Instruments, simultaneously and independently invented the integrated circuit.)
Integrated circuits, and the means to mass-produce them, set the pace for the scientific and corporate race that Moore’s Law described.
Fairchild, headquartered southeast of San Francisco, did not offer stock options to its employees, and many scientists left to form new companies. Companies named “Fairchildren” include Advanced Micro Devices, National Semiconductor, LSI Logic and Intel.
The exodus from Fairchild transformed the orchards of the surrounding countryside into Silicon Valley, a mecca for high-tech start-ups. An exhibit at the Computer History Museum in Mountain View features a “family tree” of dozens of Valley companies with roots in Fairchild.
In a 2015 interview for the Chemical Heritage Foundation, Dr. Moore said: “Every time we got a new product idea, we had several spinoffs. Most of the companies here today can trace their lineage back to Fairchild. That’s where the engineer-entrepreneur really got started.”
At Intel, Dr. Moore focused on moving products quickly from the drawing board to the customer. He fostered an entrepreneurial mindset and streamlined practices that have become essential characteristics of Silicon Valley.
“When we set up Intel,” Dr. Moore told PBS talk show host Charlie Rose, “We specifically didn’t set up a separate lab. We told the development department to do their job properly in the manufacturing facility. … So we removed a step.”
Arthur Rock, who raised the initial funding for Intel and served as its first chairman, told Fortune magazine in 1997 that Dr. Moore was a great scientist who “set his eyes on one goal more than anyone else and made everyone get there.” In contrast, Noyce, Intel’s first chief executive, “had a stroke of genius, but he couldn’t stick to anything,” Rock said.
Dr. Moore succeeded Noyce as chief executive in 1975. He and his own hard-driving successor, Andrew S. Grove, later made the critical decision to refocus the company on chips that processed information (logic chips) rather than chips that stored information (memory chips). It proved a multibillion-dollar success for Intel.
A Friend’s Chemistry Set
Gordon Earle Moore was born on January 3, 1929 in San Francisco. He grew up in Pescadero, California, a farming community in San Mateo County. His father was an assistant county sheriff, and his mother helped run his family’s general store.
He was 10 when his family moved to Redwood City, not far from Menlo Park and Palo Alto. A neighborhood friend received a chemistry set for Christmas and invited young Gordon over to blow things up.
“Most people who knew me then would have described me as quiet,” he once joked, “except for the bombs.”
Dr. Moore, the first in his family to attend college, graduated from the University of California at Berkeley in 1950 with a bachelor’s degree in chemistry. Four years later, he earned a doctorate in chemistry from the California Institute of Technology and began working at Johns Hopkins University’s Applied Physics Laboratory in Laurel, Maryland.
In 1956, the physicist William Shockley recruited Dr. Moore to the Shockley Semiconductor Laboratory near Stanford University. That year, Shockley and two other scientists won the Nobel Prize in Physics for their work at Bell Laboratories, including the invention of the transistor. A compact, reliable way to regulate electricity, the transistor would replace bulky, easily broken vacuum tubes in computers and other devices.
Within a year, Shockley’s overbearing management style — and tendency to claim other people’s work as his own — prompted Dr. Moore and seven other scientists to bolt.
The “traitorous eight,” as Shockley called them, sought to be hired as a group to study and develop semiconductors. They were rejected by more than twenty companies. Finally, Sherman Fairchild, an inventor whose father had helped found IBM, invested $1.5 million to start Fairchild Semiconductor with the renegade engineers.
Fairchild’s successes were so numerous that by the time the company outgrew its first facility, Dr. Moore wrote in an essay that the tiles on the ceiling of the coffee room “were covered with the imprints of all these champagne corks.”
After an administrative shakeup at Fairchild, Dr. Moore partnered with Noyce to found Intel. He stepped down as chief executive in 1987 and was named chairman emeritus a decade later. He relinquished that role in 2006.
Dr. Moore was a past board chair of the Institute of Electrical and Electronics Engineers and of Caltech. His honors include the National Medal of Technology, awarded in 1990. A decade later, he and his wife, the former Betty Whitaker, created a foundation with an endowment of more than $6 billion to support grants for conservation, scientific research and education.
Besides his wife, whom he married in 1950, survivors include his sons, Kenneth and Steven, and four grandchildren.
Because of his stature in Silicon Valley, Dr. Moore was often called upon to make predictions about the future of science and technology. Having once dismissed the idea of the personal computer as “something of a joke,” he liked to say that he was not well suited to the role.
“The importance of the Internet surprised me,” he told The New York Times in 2015. “It seemed like just another little communications network that would solve some problem. I didn’t realize that it was going to open up a whole universe of new opportunities, and it certainly has. I wish I had predicted that.”