What is Accelerating Change?
In both universal and human history, there is a special subset of physical processes (charted, for example, by Carl Sagan's Cosmic Calendar at the universal scale and by Gordon Moore's Law of IC Transistor Density at the human scale) that have continually increased both the speed and the efficiency of their change. Continually accelerating systems are able to accomplish more with fewer resources; as a result, they avoid the normal limits to exponential growth. Over the 20th century, several areas of computational and technological capacity have continuously accelerated, even independent of economic recession, driven primarily by powerful new physical and economic efficiencies discovered by physicists and engineers working at small scales. Even more interestingly, looking ahead we can see no near-term limit to several of these accelerating physical and technological efficiencies of the microcosm.
A combination of scientific discovery and human innovation has continually removed temporary barriers to this accelerating computational and technological advance. In the mid-1990s, the International
Technology Roadmap for Semiconductors (ITRS) consortium
noted that depositing metal on silicon for integrated circuit
production would run into a miniaturization block circa 2005.
Then in 2001, a University of Massachusetts team learned how
to deposit metal as a supercritical fluid, rather than as
a gas or liquid, sidestepping the roadblock. In 2003, Intel's Andy Grove noted that gate leakage current had become a significant problem in the miniaturization of silicon semiconductors. Meanwhile, Lucent researchers discovered that hafnium oxide exhibits 1,000 times less leakage, making it one of several contenders expected to keep Moore's Law healthy for many years to come.
Even when
we consider what has been called an approaching "Moore's
Law limit" (circa 2015) to chip miniaturization, when
gate sizes are so small that electrons can no longer be kept
from spontaneously "quantum tunnelling" between
neighboring circuits, we realize this will simply move us
into an era of system miniaturization, rather than circuit
miniaturization. This latter process is already well under
way (e.g., systems-on-a-chip: cellphone-on-a-chip, GPS-on-a-chip,
etc.). In other words, as future chips become minimum-sized
and fully reconfigurable commodities, development of massively
modular (both highly parallelized and differentiated) computing
systems (e.g., Danny Hillis and his Connection
Machine) will become economically feasible. Today's modestly parallel computer architectures (e.g., graphics render farms, distributed computing, and early grid computing) portend tomorrow's massively parallel, irreducibly complex, and biologically inspired platforms.
Such "horizontal acceleration" must remain subdued
until the exponential economies realized through today's "vertical
(miniaturization) acceleration" reach at least a temporary
plateau.
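To make that "horizontal acceleration" idea concrete, here is a minimal sketch in Python: once per-chip gains plateau, throughput grows by adding commodity processors in parallel, as in today's render farms and grids. The workload and the render_tile helper are hypothetical illustrations, not any particular farm's API.

```python
# Horizontal scaling sketch: capacity grows by adding processors,
# not by shrinking them. Each tile is an independent unit of work,
# so N workers can process roughly N tiles at once.
import time
from multiprocessing import Pool

def render_tile(tile_id: int) -> int:
    """Hypothetical stand-in for one independent work unit
    (e.g., one tile of a frame in a graphics render farm)."""
    return sum(i * i for i in range(200_000))  # simulated compute load

if __name__ == "__main__":
    tiles = list(range(64))
    for workers in (1, 4, 8):  # "adding chips" rather than shrinking them
        start = time.perf_counter()
        with Pool(workers) as pool:
            pool.map(render_tile, tiles)
        print(f"{workers:2d} workers: {time.perf_counter() - start:.2f}s")
```

On a multicore machine the wall-clock time falls roughly in proportion to the worker count, which is the economic logic behind massively modular systems.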
We live in a world where fat-fingered 21st-century humans have learned such miracles as how to create multi-million-mirror MEMS devices (e.g., optical switching arrays), how to teleport light, and how to run quantum computing algorithms on a single trapped calcium ion. How long can this continue? Will we continue to make astounding hardware discoveries in the microcosm? (Note, for example, this 2004 optoelectronics advance that provides a millionfold greater conversion efficiency than previous solutions.)
Rolf Landauer and others have noted that there is no minimum physical energy of computation: only the irreversible erasure of information carries a fundamental thermodynamic cost, and logically reversible operations can in principle be performed with arbitrarily little energy. We are beginning to see, and may eventually utilize, physical structure as far down as the Planck scale, the minimum dimensions of space and time as revealed in modern physical theory. Seth Lloyd has estimated that his "ultimate laptop," a one-kilogram computer running at the maximum rate physics allows, would operate at black hole-level energy densities. Today's Pentium chips already have several orders of magnitude greater energy densities than any living system on Earth.
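A worked note on Landauer's bound (standard thermodynamics, not from this article): the floor applies only to erasing a bit, and at room temperature it is almost vanishingly small.

```latex
% Landauer's principle: minimum energy to erase one bit at temperature T.
E_{\mathrm{erase}} \ge k_B T \ln 2
  \approx \left(1.381 \times 10^{-23}\,\mathrm{J/K}\right)
          \times 300\,\mathrm{K} \times 0.693
  \approx 2.9 \times 10^{-21}\,\mathrm{J}\ \text{per bit}
```

Reversible (information-preserving) operations carry no such floor, which is why computation itself has no fundamental minimum energy.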
We have
entered an era of continual surprise. Many serious observers
now expect the capacities and intelligence of our information,
sensing, storage, and communications technologies to continue
their stunning rate of progress for as far as we can see into
this new century. This profound growth in computational capacity
will predictably enable a host of new products and services
that are presently impossible. If microprocessor capacity, for example, continues to double approximately every 18 months, as Moore's Law describes, then continuing this process another 15 years will yield another 1,000X greater technological capacity (ten doublings, since 2^10 = 1,024). What new emergences
will this enable? Even more surprisingly, in a special subset
of areas (such as graphics processors), computational capacity
doubling has been progressing even faster than every 18 months,
for at least eight years.
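The arithmetic behind that 1,000X figure, sketched in Python; the 12-month doubling period used below for graphics processors is an illustrative assumption, not a measured constant.

```python
# Compound doubling: capacity multiplies by 2^(years / doubling_period).
def capacity_multiple(years: float, doubling_period_years: float) -> float:
    """Total capacity growth after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

# Moore's Law pace: 18-month doublings over 15 years -> 10 doublings.
print(capacity_multiple(15, 1.5))  # 1024.0, i.e., roughly 1,000X

# Assumed faster pace for graphics processors (12-month doublings,
# for illustration) over the eight years cited above.
print(capacity_multiple(8, 1.0))   # 256.0
```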
Which
coming applications, enabled by accelerating change, have
the greatest strategic importance? Which will be the most
useful and enduring, and why? How can we best promote their
balanced development?
Understanding
accelerating change requires a new way of thinking. Gaining
foresight with regard to the meaning, implications, risks,
and opportunities of accelerating technological change has
become both our greatest lever for moving the world and our
most fundamental educational priority. Join us as together
we improve our collective insight at Accelerating Change, held at Stanford each year.
[For more,
see "Understanding
the Accelerating Rate of Change," Ray Kurzweil
and Chris Meyer, 2003.]