Why IEEE Is Involved In Improving Computing

There are many reasons why a multidisciplinary approach is needed

13 March 2013
Photo: Elie Track

Elie Track, an IEEE senior member, is cochair of the IEEE Rebooting Computing Working Group, along with IEEE Fellow Tom Conte.

“Rebooting Computing” is a term coined by IEEE Life Fellow Peter Denning as part of his National Science Foundation-sponsored initiative to revamp computing education. My IEEE colleague Tom Conte arrived at the same term independently during our brainstorming and preparation for the IEEE Future Directions initiative to rethink the computer, “from soup to nuts.” We sought Peter’s permission, which he granted, to repurpose the term for this new initiative.

With so many activities already geared toward improving specific aspects of computing, it is fair to ask: Why IEEE? Why a new initiative? What’s different? The drivers are twofold: (1) to focus the target specifications and clarify the desired goals of the next generation of computers, and (2) to ensure a multidisciplinary approach that avoids a monochromatic vision in which an important improvement in one area contributes little because of limitations in the others.

IEEE is the ideal forum to catalyze the effort because it has, in its societies and councils, all the fields needed for a comprehensive approach. Our kick-off to ensure such inclusion consists of seeking community input through this blog, a new website to be launched in a few weeks, and a workshop following the IEEE International Electron Devices Meeting in Washington, D.C., in December. The workshop’s focus is to define the best paths for achieving ultra-high performance (a quantum leap targeting computing power up to the exaflops scale) together with energy efficiency, two goals that have heretofore been contradictory. Combining them is indispensable to meeting future computing needs.

As data centers and centralized computing platforms grow in importance, letting individuals and smaller computing stations tap into their power and holding the key to scientific, security, and community computing needs (such as weather and earthquake prediction), the case for higher performance becomes compelling. Yet without addressing energy efficiency, such large systems simply cannot be deployed: current projections, validated by programs at DARPA and other agencies, suggest that a single such system would require nearly the output of a nuclear power plant, and the environmental impact of power-hungry systems on that scale could be devastating. There may be more than one path to the desired result, but whatever the path, the answer will involve a combination of solutions addressing different aspects of computing: device technology, circuits, architecture, memory, software, user interface, and more. A multidisciplinary approach will be essential.
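
To make the power argument concrete, here is a rough back-of-the-envelope sketch in Python. An exaflops machine performs 10^18 floating-point operations per second; the 2 GFLOPS/W efficiency figure below is an illustrative assumption roughly in line with the most efficient systems of this era, not a number taken from the DARPA programs cited above.

```python
# Back-of-the-envelope estimate of the power an exaflops-class system would
# draw at early-2010s efficiency levels. The efficiency figure is an
# illustrative assumption, not data from the article or from DARPA.

TARGET_FLOPS = 1e18               # 1 exaflops = 10^18 floating-point ops/s
ASSUMED_GFLOPS_PER_WATT = 2.0     # assumption: ~2 GFLOPS/W, 2013-era leaders

power_watts = TARGET_FLOPS / (ASSUMED_GFLOPS_PER_WATT * 1e9)
print(f"Estimated power draw: {power_watts / 1e6:.0f} MW")
# Prints roughly 500 MW -- on the order of a large nuclear reactor's output
# (about 1 GW), before accounting for cooling and other overhead.
```

Under this assumption, closing the gap to a deployable power budget requires a dramatic improvement in energy per operation, which is why the initiative treats performance and efficiency as a single combined target.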

In addition, a number of disruptive technologies must be considered to ensure the best path is selected, from superconducting processors fulfilling some of the core tasks, to quantum computing, to innovative architectures. The December Rebooting Computing workshop will be the first meeting of experts from all these areas and will give them a platform to debate and propose which path combinations they recommend. The cross-fertilization achieved in these exchanges should lead to new programs that sponsors can then adopt to make “Rebooting Computing” a reality.

What is your take on this topic? What do you think your own area of expertise can contribute? Do you agree with the goal and our approach? Do you want to ensure that a specific area is identified as a key constituency in this effort? Comments, suggestions, and requests are welcome in the commenting area below. I very much look forward to an open and fruitful discussion.

*This article has been corrected from an earlier version.

