Life After Moore's Law

Tagged: Moore's Law, Technology
Source: forbes.com
Posted: 3 years 51 weeks ago

It's time for the computing industry to take the leap into parallel processing.

For the past four decades, explosive gains in computing power have contributed to unprecedented progress in innovation, productivity and human welfare. But that progress is now threatened by the unthinkable: an end to the gains in computing power.

We have reached the limit of what is possible with one or more traditional, serial central processing units, or CPUs. It is past time for the computing industry--and everyone who relies on it for continued improvements in productivity, economic growth and social progress--to take the leap into parallel processing.

Reading this essay is a serial process--you read one word after another. But counting the number of words, for example, is a problem best solved with parallelism: give each paragraph to a different person, and the work gets done far more quickly. So it is with computing--an industry that grew up with serial processing and now faces a stark choice between innovation and stagnation.
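To make that analogy concrete, here is a minimal sketch in Python (my illustration of the idea, not code from the article): each worker process counts the words in one paragraph, and the per-paragraph totals are summed at the end.

    from multiprocessing import Pool

    def count_words(paragraph):
        # Each worker counts one paragraph on its own; the serial step of
        # reading word after word happens in parallel across workers.
        return len(paragraph.split())

    if __name__ == "__main__":
        paragraphs = [
            "Reading this essay is a serial process.",
            "Counting its words is an easy problem to split up.",
            "Give each paragraph to a different worker and sum the results.",
        ]
        with Pool() as pool:
            counts = pool.map(count_words, paragraphs)  # one paragraph per task
        print(sum(counts))  # prints 28 for the sample paragraphs above

The same pattern--split the data, work on the pieces independently, combine the results--is the basic shape parallel programs take on multicore CPUs and GPUs.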

The backdrop to this issue is a paper written by Gordon Moore, the co-founder of Intel (INTC). Published 45 years ago this month, the paper predicted that the number of transistors on an integrated circuit would double each year (later revised to a doubling every 18 months). That forecast laid the groundwork for a second prediction: that doubling the number of transistors would also double CPU performance every 18 months.
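For a sense of the scale implied by that rate (a back-of-the-envelope illustration, not a figure from the article), doubling every 18 months compounds to roughly a billion-fold increase over the 45 years since the paper appeared:

    # Compounding the predicted doubling every 18 months over 45 years
    # (an illustration of the arithmetic, not data from the article).
    years = 45
    doubling_period = 1.5  # years
    growth = 2 ** (years / doubling_period)
    print(f"{growth:.2e}")  # ~1.07e+09, i.e. about a billion-fold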

This bold prediction became known as Moore's Law. And it held true through the 1980s and '90s--fueling productivity growth throughout the economy, transforming manufacturing, services, and media industries, and enabling entirely new businesses such as e-commerce, social networking and mobile devices.

Moore's paper also contained another prediction that has received far less attention over the years. He projected that the amount of energy consumed by each unit of computing would decrease as the number of transistors increased. This enabled computing performance to scale up while the electrical power consumed remained constant. This power scaling, in addition to transistor scaling, is needed to scale CPU performance...
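The arithmetic behind that power scaling--often referred to as Dennard scaling--can be sketched roughly as follows (a simplified illustration, not the article's own derivation): dynamic power per transistor is proportional to capacitance times voltage squared times switching frequency, and classical scaling shrinks capacitance and voltage fast enough that packing in more, faster transistors leaves total chip power roughly constant.

    # Simplified Dennard-scaling arithmetic (an illustration, not from the article).
    # Dynamic power per transistor ~ C * V**2 * f.  One classical scaling step
    # shrinks C and V by 1/k, raises f by k, and fits k**2 more transistors in
    # the same area, so total chip power stays roughly constant.
    def chip_power(c, v, f, transistors):
        return c * v**2 * f * transistors

    k = 1.4  # roughly a 0.7x linear shrink per generation
    before = chip_power(c=1.0, v=1.0, f=1.0, transistors=1.0)
    after = chip_power(c=1.0 / k, v=1.0 / k, f=k, transistors=k**2)
    print(before, after)  # both ~1.0: more performance at the same power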