Two engineers who have each had a lasting impact on artificial intelligence were honored this year with IEEE medals. The annual awards are presented for contributions or leadership in IEEE fields of interest. The medals were presented on 18 June at the IEEE Honors Ceremony, at Gotham Hall in New York City.
Christos H. Papadimitriou, professor of electrical engineering and computer science at the University of California, Berkeley, received the 2016 IEEE John von Neumann Medal. IBM sponsors the award. Papadimitriou was recognized for “providing a deeper understanding of computational complexity and its implications for approximation algorithms, artificial intelligence, economics, database theory, and biology.”
He conducted pioneering research on computational complexity as a tool for understanding the limits of computation and for solving problems across the sciences, forging connections and collaborations between computer science and other disciplines.
He has been a key player in understanding nondeterministic polynomial total search problems—computational challenges in which solutions, though guaranteed to exist, might be difficult to find. He has been influential in developing algorithmic game theory, which involves the convergence of computer science and economic theory.
Papadimitriou defined the “price of anarchy,” which measures the degree of equilibrium inefficiency in a game and is important for quantifying loss due to the unpredictable behavior of selfish agents within networks such as the Internet. He has demonstrated how computational complexity can be applied to biology and other natural processes.
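The idea behind the price of anarchy can be illustrated with a standard two-road routing example from the game theory literature (not taken from the article): selfish drivers all crowd onto the road whose latency grows with traffic, even though splitting the traffic would lower everyone's average delay. A minimal Python sketch, with the latency functions chosen purely for illustration:

```python
# Price-of-anarchy sketch: one road has constant latency 1, the other
# has latency equal to the fraction of traffic using it.

def average_cost(p):
    """Average latency when fraction p of traffic takes the variable road."""
    return p * p + (1 - p) * 1.0

# At Nash equilibrium every selfish driver takes the variable road,
# since its latency never exceeds 1, so p = 1.
nash_cost = average_cost(1.0)

# The social optimum splits traffic; a grid search finds the best split
# (the minimum of p^2 + (1 - p) is at p = 0.5, giving cost 0.75).
optimal_cost = min(average_cost(p / 1000) for p in range(1001))

# Price of anarchy: worst equilibrium cost divided by optimal cost.
price_of_anarchy = nash_cost / optimal_cost  # 4/3 for this game
```

Here a ratio of 4/3 means selfish routing makes average delay one-third worse than coordinated routing, which is the kind of loss the measure quantifies.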
Geoffrey Hinton, professor emeritus of computer science at the University of Toronto and a researcher at Google, received the 2016 IEEE/RSE James Clerk Maxwell Medal. The award is sponsored by IEEE. Hinton was cited for “pioneering and sustained contributions to machine learning, including developments in deep neural networks.”
He is an authority on machine learning. In the 1980s he helped find a way to apply the backpropagation algorithm, which computes how a network’s error changes as its weights and biases are adjusted, to multilayer neural networks. He also has applied the algorithm to speech and visual object recognition, fraud detection, plant monitoring, and automated check verification.
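As a rough illustration of what backpropagation does, the sketch below trains a single sigmoid neuron by gradient descent; the data, learning rate, and iteration count are invented for the example and are far simpler than anything in Hinton's work:

```python
# Toy backpropagation: one sigmoid neuron fit to two made-up points.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

data = [(0.0, 0.0), (1.0, 1.0)]  # (input, target) pairs, hypothetical
w, b = 0.5, 0.0                  # weight and bias
lr = 1.0                         # learning rate, chosen arbitrarily

for _ in range(2000):
    for x, y in data:
        a = sigmoid(w * x + b)          # forward pass: prediction
        # Backward pass: the chain rule gives the gradient of the
        # squared error 0.5 * (a - y)^2 with respect to w and b.
        delta = (a - y) * a * (1 - a)
        w -= lr * delta * x
        b -= lr * delta
```

The backward pass is the heart of the algorithm: it propagates the error back through the neuron to tell each weight and bias how to change.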
His early work on the Boltzmann machine—a network of symmetrically connected, neuronlike units that make stochastic decisions about whether to be on or off—introduced many concepts that have remained at the forefront of neural network learning. Boltzmann machines were initially considered too slow for widespread application. As computing power increased, however, Hinton was able to develop one with faster training properties.
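The stochastic on/off decision described above can be sketched in a few lines: each unit turns on with a probability determined by the weighted input from the units it is connected to. The three-unit network and its symmetric weights below are invented for illustration:

```python
# Boltzmann-machine sketch: stochastic binary units updated by Gibbs sampling.
import math
import random

def unit_probability(state, weights, i, temperature=1.0):
    """Probability that unit i switches on, given the other units' states."""
    energy_gap = sum(weights[i][j] * state[j]
                     for j in range(len(state)) if j != i)
    return 1.0 / (1.0 + math.exp(-energy_gap / temperature))

def gibbs_step(state, weights, temperature=1.0):
    """One sweep of stochastic on/off updates over all units."""
    for i in range(len(state)):
        state[i] = 1 if random.random() < unit_probability(
            state, weights, i, temperature) else 0
    return state

# Three symmetrically connected units (weights[i][j] == weights[j][i]).
weights = [[0, 2, -1],
           [2, 0, 1],
           [-1, 1, 0]]
state = gibbs_step([1, 0, 1], weights)
```

Repeating such sweeps lets the network settle toward low-energy configurations, which is what makes training the original Boltzmann machine slow and why faster-training variants mattered.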
Hinton’s work in deep learning has revolutionized the field of machine learning, especially affecting machine-vision applications including image classification, medical diagnostics, law enforcement, computer gaming, and enhanced vehicle safety.
Read about the other innovators who received IEEE’s top awards this year.
This article is part of our June 2016 special issue on artificial intelligence.