Computing experts at Sandia
National Laboratories have launched an effort to help discover what computers
of the future might look like, from next-generation supercomputers to systems
that learn on their own — new machines that do more while using less energy.
For decades, the computer industry operated under Moore’s Law, named for Intel
Corp. co-founder Gordon Moore, who in 1965 postulated it was economically
feasible to improve the density, speed and power of integrated circuits
exponentially over time. But speed has plateaued, the energy required to run
systems is rising sharply, and the industry cannot indefinitely continue to cram more
transistors onto chips.
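The exponential trend Moore described can be sketched numerically. In the snippet below, the two-year doubling period and the starting transistor count are illustrative assumptions, not figures from this article:

```python
# Illustrative sketch only: Moore's Law is commonly paraphrased as transistor
# density doubling roughly every two years. The doubling period and initial
# count here are assumed for illustration.

def projected_transistors(initial: int, years: float,
                          doubling_period: float = 2.0) -> float:
    """Project a transistor count under an assumed exponential doubling trend."""
    return initial * 2 ** (years / doubling_period)

# Starting from 2,300 transistors (the Intel 4004 of 1971) and assuming
# doubling every 2 years, 40 years of growth yields a ~million-fold increase,
# on the order of a few billion transistors.
print(projected_transistors(2_300, 40))
```

The same arithmetic run in reverse shows why the trend cannot continue indefinitely: each doubling shrinks feature sizes toward physical limits.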
The plateauing of Moore’s Law is
driving up energy costs for modern scientific computers to the point that, if
current trends hold, more powerful future supercomputers would become
impractical due to enormous energy consumption. Solving that conundrum will
require new computer architecture that reduces energy costs, which are
principally associated with moving data, Leland said. Eventually, computing
will also need new technology that uses less energy at the transistor
device level. Sandia is well positioned to work on future computing technology
due to its broad and long history in supercomputers, from architecture to
algorithms to applications.