Rosalind Picard Develops Specs that See Through You

High-tech glasses analyze a person’s emotions during conversation

21 October 2011

Photo: MIT

Imagine you’re recounting the climax of last night’s big sporting event to a co-worker. He nods and smiles. But does he really share your enthusiasm, or is he actually bored out of his mind?


A pair of high-tech eyeglasses developed by IEEE Fellow Rosalind Picard, a professor of media arts and sciences, and other researchers could shed light on such questions.


Picard and her Affective Computing Group at the MIT Media Lab have spent more than five years developing a prototype that relies on a built-in camera and software to track the movements of a conversation partner’s face and analyze facial expressions. It then tries to determine what that person is feeling.


Put them on, start talking to someone, and pay attention to the small LED visible only to you. If you see red, it might mean the person you’re talking to is uninterested or disagrees with what you’re saying.


The glasses might be especially helpful to people with autism who find it difficult to read social cues.


The team’s work was featured in the September issue of New Scientist magazine.


READING BETWEEN THE LINES

Picard and researchers Rana el Kaliouby and Jacky Lee decided to develop the glasses as a mobile platform for software el Kaliouby had created. The software interprets feature points on the face and maps them to facial expressions.


“To use the software, users need to wear the glasses so the camera frames the other person’s face but doesn’t obscure their own vision,” Picard says.


The team developed a narrow strip containing a tiny webcam that can sit atop a standard pair of eyeglasses. A wire connects the strip to a small computer, which analyzes the images and matches the facial expressions to different emotions found in a database. Based on that analysis, the LED glows red, yellow, or green depending on how the conversation is going.


“Its colors are modeled after a traffic signal,” Picard explains. “It turns green when the camera detects that the face in front of you is smiling, nodding, and pointed directly at you while you’re speaking, looking interested.” The LED turns yellow if the person has a confused expression, and red if the person loses interest in the conversation, signaled when his or her face tilts up at an angle or turns away from the wearer.
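The traffic-light logic Picard describes could be sketched as a simple mapping from detected cues to an LED color. This is a hypothetical illustration, not the team's actual software; in the real system those cues would be inferred from facial feature points rather than passed in as booleans:

```python
# Hypothetical sketch of the traffic-light LED logic described above.
# In the real glasses, these cues come from facial-feature tracking;
# here they are simply boolean inputs for illustration.

def led_color(smiling: bool, nodding: bool, facing_wearer: bool,
              confused: bool, face_turned_away: bool) -> str:
    """Map detected conversational cues to a traffic-light color."""
    if face_turned_away:          # disinterest: face tilts up or turns away
        return "red"
    if confused:                  # puzzled expression
        return "yellow"
    if smiling and nodding and facing_wearer:
        return "green"            # engaged, interested listener
    return "yellow"               # ambiguous cues default to caution

# Example: an attentive, smiling listener yields a green light.
print(led_color(smiling=True, nodding=True, facing_wearer=True,
                confused=False, face_turned_away=False))  # green
```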


The glasses are still in the development phase. Picard says the software needs to be modified to compensate for the wearer’s head movements, for example. Although the group hasn’t formally tested the apparatus, several people with autism and Asperger’s syndrome have expressed interest in owning a pair.


“It is a serious handicap to not be able to tell whether your boss is pleased or displeased,” Picard notes, “or whether that girl you’re talking to is taking an interest in what you’re saying.”


AFFECTIVE COMPUTING

The glasses are the research group’s latest development in a line of devices designed to measure stress, interest, and emotion. The devices use wearable sensors to measure people’s physiological changes as they perform certain activities. Their first device, developed in the early 1990s, required the subject to wear sticky electrodes and wires as well as a sensor carried in a satchel dangling from the shoulder.


Since then the MIT group has designed smaller, more stylish devices, like the Q sensor, a wireless biosensor about the size of a coin. It measures emotional arousal through skin conductance, a form of electrical activity that increases during states of excitement, attention, or anxiety, and decreases when the wearer is bored or relaxed. The sensor is typically worn inside a wristband, but it also comes with sticky electrodes so it can be placed anywhere on the body.
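The skin-conductance principle behind the Q sensor can be illustrated with a minimal sketch: flag moments when conductance rises sharply, since increases accompany excitement, attention, or anxiety. The threshold and the sample values below are made up for illustration and are not Affectiva's algorithm:

```python
# Illustrative sketch only: detect sudden rises in skin conductance,
# the signal the Q sensor measures. Threshold value is an assumption.

def arousal_events(conductance_us, rise_threshold=0.05):
    """Return indices where skin conductance (microsiemens) jumps by
    more than rise_threshold between consecutive samples."""
    events = []
    for i in range(1, len(conductance_us)):
        if conductance_us[i] - conductance_us[i - 1] > rise_threshold:
            events.append(i)
    return events

# Made-up readings: two sharp rises suggest moments of arousal.
samples = [2.00, 2.01, 2.02, 2.20, 2.21, 2.50]
print(arousal_events(samples))  # [3, 5]
```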


The Q sensor is now being used by a variety of professionals, Picard says, including doctors who want to measure autonomic changes in their patients related to medication and treatment, as well as researchers who want to measure a patient’s anxiety levels and study post-traumatic stress disorder, autism, and other conditions.


The data gathered from users is kept secure because access to the information is carefully controlled, she says. A person's identity is never stored, and Q sensor users must opt in before their results are shared with researchers.


“I think it’s really important that privacy not be an afterthought but an important consideration in the earliest phase of design,” she says.


THE RELUCTANT LEADER

When Picard began her career, studying human emotion was not on her agenda. After earning a bachelor’s degree in electrical engineering from Georgia Tech in 1984 and a master’s degree in electrical engineering and computer science from MIT in 1986, she joined AT&T Bell Laboratories in Holmdel, N.J. As a member of the technical staff, she helped develop new computer architectures for image processing. In 1991 she earned a doctoral degree in electrical engineering and computer science at MIT, and she joined the MIT Media Lab.


“As a woman in engineering, I worked hard to be taken as the serious researcher I am,” she says. “Twenty years ago, studying emotion was not even on the radar for most engineers—it was considered irrational. If you were smart, you didn’t want to have anything to do with it.”


But she realized, after years of studying the human brain, that emotion is an important aspect of artificial intelligence and computer engineering. She says colleagues who were working to create architectures to replicate the decision-making ability of the human cortex kept running into the same problem: They couldn’t re-create what motivates people to make choices or perform certain tasks.


“I started to realize the vital role that emotion-processing components of the brain, like the limbic system, were playing in rational decision-making, perception, and more,” she says.


In 2001 she published her first journal article on affective computing, “Toward Machine Emotional Intelligence: Analysis of Affective Physiological State,” in IEEE Transactions on Pattern Analysis and Machine Intelligence. And in 2009, she and el Kaliouby founded Affectiva, a company in Waltham, Mass., that produces emotion-reading technology such as the Q sensor and Affdex, a program that uses a webcam to help marketers determine how people react to different brands and commercials.


“It’s exciting to see that some of our inventions are becoming products that can help people learn,” Picard says.

