Moore’s Law Shows Its Age
50-year-old axiom hits limits amid
the many new steps needed to turn silicon wafers into the latest computer
chips.
By Don Clark in the Wall Street Journal
Silicon Valley pioneer Gordon Moore laid out a bold prediction 50 years ago.
Engineers would cram twice as many transistors on tiny squares of silicon every
year or so, producing more and more power in ever-smaller machines.
His extrapolation, known as Moore’s
Law, has been one of the most enduring precepts of the technology industry,
foretelling the revolutionary emergence of personal computers, mobile phones, Web
servers and network routers. Each generation of chips usually brought more
performance at a lower cost.
But Moore’s Law is hitting some
painful limits.
The design and testing of a
chip with the latest technology now
costs $132 million, up 9% from the previous top-of-the-line chip, estimates
International Business Strategies Inc., a consulting firm in Los Gatos, Calif.
A decade ago, designing such an advanced chip cost just $16 million. Meanwhile,
some companies for the first time are unable to reduce the cost of each tiny
transistor.
The changes are triggered partly by
the many new processing steps needed to turn silicon wafers into the latest
computer chips. Circuitry for the latest chips has a width of 14 nanometers, or 14
billionths of a meter, which enables manufacturers to squeeze hundreds of
millions more transistors on a chip than they could in the past. But designing
products that use so many more components takes lots of time and money.
While companies say they likely can
keep shrinking the size of silicon chips for another decade or so, that work is
bringing diminishing financial returns. Some chip designers already are
limiting their use of the newest technology to high-end products where
performance is more important than cost.
“We are being very careful,” said
Henry Samueli, co-founder, chairman and chief technology officer at Broadcom Corp.,
which is based in Irvine, Calif., and makes chips for about half the world’s
tablets and smartphones. “The price of these chips is going up dramatically.”
Micron Technology
Inc. Chief Executive Mark Durcan adds:
“There will be smaller and smaller pieces of the market that will pay for the
improvement.” Micron, of Boise, Idaho, makes flash memory chips, which are used
in smartphones, digital cameras and tablets to store photos.
Mr. Moore was director of the
research and development laboratories at Fairchild
Semiconductor,
a unit of Fairchild Camera and Instrument
Corp. and a seminal Silicon Valley startup, when Electronics magazine published his predictions on April 19, 1965, under the headline: “Cramming more
components onto integrated circuits.”
He extrapolated that the number of
components on a single silicon chip would double every year from about 60 to as
many as 65,000 by 1975. In 1975, he adjusted the formula to predict a doubling
every two years.
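Expressed as simple arithmetic (an illustrative sketch, not part of Mr. Moore's article), annual doubling from roughly 60 components lands close to his 1975 figure:

```python
# Illustrative arithmetic only: Moore's 1965 extrapolation treated as annual doubling.
components_1965 = 60                     # rough starting count cited above
doublings = 1975 - 1965                  # one doubling per year for a decade

components_1975 = components_1965 * 2 ** doublings
print(components_1975)                   # 61,440 -- on the order of the 65,000 projected

# Under the 1975 revision (doubling every two years), the same decade
# would allow only five doublings:
print(components_1965 * 2 ** (doublings // 2))   # 1,920
```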
Fairchild sold its first transistors
for $150 each, but prices for the company and its rivals dropped year after
year in the wake of Mr. Moore’s projection.
Semiconductor giant Intel Corp.’s Core i5 microprocessors include 1.3 billion
transistors, each costing $0.00000014, or a penny per 70,000 transistors,
according to the Santa Clara, Calif., company.
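The per-transistor figure implies the "penny per 70,000" shorthand directly; here is a back-of-the-envelope check (the arithmetic only, not Intel's accounting):

```python
# Back-of-the-envelope check of the per-transistor cost quoted above.
cost_per_transistor = 0.00000014          # dollars, as cited for the Core i5
print(round(0.01 / cost_per_transistor))  # ~71,429 transistors per penny

# At that rate, 1.3 billion transistors cost roughly:
print(1.3e9 * cost_per_transistor)        # ~182 dollars' worth of transistors per chip
```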
Mr. Moore, now 86 years old, didn’t
use the words “Moore’s Law” in his article, but they took hold as an axiom in
Silicon Valley and as general shorthand for just about any kind of progress,
technological or otherwise.
“I googled ‘Moore’s Law’ and I
googled ‘Murphy’s Law’ and ‘Moore’ beats ‘Murphy’ by at least two to one,” he
said in an interview conducted by Intel in January.
Mr. Moore co-founded Intel three
years after his 1965 prediction and retired as the company’s chairman in 1997.
He now lives in Hawaii. Mr. Moore couldn’t be reached, and Intel said he wasn’t
available to comment.
At first, Moore’s Law was largely a
yardstick for chip engineers. It gradually became a competitive imperative,
spurring companies to relentlessly innovate.
Until the mid-2000s, Moore’s Law
helped chip makers boost a key aspect of computing performance known as
operating frequency, or clock speed. But higher clock speeds generated too much
heat and consumed too much power as the market shifted to portable computing
devices.
That led to a strategic shift by
Intel and other chip makers, which are using tricks like changing the shape of
transistors to make them switch faster and use less energy.
But the industry’s costs keep
rising, with new chip-fabrication plants costing as much as $10 billion. Cost
pressures led International Business Machines Corp. last
year to pay $1.5 billion to
another company to take over its semiconductor
operations.
Companies that can afford to keep
pushing Moore’s Law are finding it increasingly hard to keep up the pace.
Intel’s introduction of 14-nanometer technology was two quarters late because
of delays in reducing manufacturing defects.
Intel said it is confident that the
process will result in greater savings per transistor than past advances. “It
was a little bit harder but has got us to a better place in the end,” said Mark
Bohr, a senior Intel fellow who helps lead development of production processes.
More transistors provide the biggest
benefits when tasks can be broken up and tackled by many processor cores at
once. Nvidia Corp. chips render ultrarealistic images on computer screens by
simultaneously painting colors on thousands of pixels.
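A minimal sketch of why that kind of workload scales with core count: each pixel's color can be computed independently, so the work spreads across however many processors are available (an illustrative toy example, not a model of Nvidia's actual rendering pipeline):

```python
# Illustrative sketch of an "embarrassingly parallel" pixel workload,
# not a model of any real GPU pipeline.
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 640, 480

def shade(pixel_index: int) -> int:
    """Toy per-pixel shading: each pixel is computed independently of the others."""
    x, y = pixel_index % WIDTH, pixel_index // WIDTH
    return (x * y) % 256  # stand-in for a real color calculation

if __name__ == "__main__":
    # More cores means more pixels shaded at the same time.
    with ProcessPoolExecutor() as pool:
        frame = list(pool.map(shade, range(WIDTH * HEIGHT), chunksize=4096))
    print(len(frame), "pixels shaded")
```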
Some big computer users are going
beyond such chips to try other new designs, since smaller transistors alone
aren’t boosting computing speeds enough.
“Moore’s Law is not having the same
effect on the rate of gains we are seeing,” said Gordon MacKean, a senior
director of Google Inc.’s hardware platforms team.
Some makers of data-storage chips
are taking more dramatic steps. Producers of chips called NAND flash memory
used in smartphones and an increasing number of computers have decided to stop
shrinking transistors, worried that smaller circuitry won’t store data
reliably.
Instead, they plan to stack circuits
in three dimensions—32 or 48 layers per chip—rather than on a flat square of
silicon to keep boosting the capacity of their devices.
Micron and Intel expect to produce
so-called 3-D NAND chips that initially store as much as 384 gigabits of data, or
three times the capacity of conventional memory chips.
Later this year, Intel expects to
deliver a chip for specialized applications with eight billion transistors, or about
133 million times as many components as the chips Mr. Moore wrote about when he made his projection.
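The multiple comes straight from dividing the two transistor counts (again, just the arithmetic implied by the figures above):

```python
# Division behind the "133 million times" comparison above.
transistors_today = 8_000_000_000   # the specialized chip Intel expected to ship
components_1965 = 60                # roughly what a leading 1965 chip held
print(transistors_today / components_1965)   # ~1.33e8, about 133 million
```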