Sensor Implants for Everywhere in the Body

IEEE Fellow works on electronics to monitor nerves, muscles, and organs

11 November 2016

IEEE Fellow Jan M. Rabaey and researchers at the University of California, Berkeley, made headlines recently by building the first wireless, dust particle–size sensors that could eventually be implanted in the human body to monitor nerves, muscles, and organs.

The sensors, known as neural dust motes, have so far been implanted in rats’ muscles and peripheral nerves. The motes rely on ultrasound projected into the body for power and to read out measurements. Ultrasound, already widely used in hospitals, can penetrate nearly anywhere in the body. The journal Neuron in August published an article describing the researchers’ work: “Wireless Recording in the Peripheral Nervous System With Ultrasonic Neural Dust.”

The scientific codirector of the Berkeley Wireless Research Center and chair of the electrical engineering division at UC-Berkeley, Rabaey has been with the university since 1987. His background is primarily in integrated circuits and systems, including wireless systems such as the low-power interfaces he describes in the Neuron article. In 1995 he was elevated to IEEE Fellow for "contributions in design synthesis and concurrent architectures for signal processing applications."

For the past decade, he has been collaborating with IEEE Senior Member Jose Carmena and other IEEE colleagues, including Senior Members Michel Maharbiz and Elad Alon, on wireless brain-machine interfaces (BMIs). Carmena, a professor of electrical engineering and neuroscience at UC-Berkeley, is codirector of its Center for Neural Engineering and Prostheses. Carmena, a coauthor of the article on the neural dust, is also cochair of IEEE Brain. Rabaey is an active member of the initiative and has been presenting his BMI research at IEEE Brain workshops, including one to be held in December at Columbia University.

The Institute asked him about his work and about his concept of a human intranet.

What is a neural dust mote?

Think of the dust mote as a measurement device tapping into the body’s electric fields. A 1-millimeter cube, about the size of a large grain of sand, the mote contains a piezoelectric crystal that converts ultrasound vibrations projected to it from outside the body into electricity to power a tiny, on-board transistor placed in contact with a nerve or muscle fiber. As natural electrical activity in the nerve varies, it changes the current passing through the transistor, thus providing a read-out mechanism for the nerve’s signal.

To send the information back out of the body, the mote uses the same ultrasound. The external transducer first sends ultrasound vibrations to power the mote and then listens for the returning echo as the vibrations bounce back. The changing current through the transistor alters the piezoelectric crystal's mechanical impedance, thereby modulating the amplitude of the echo received by the external ultrasound transducer. The slight change in received signal strength allows the receiver to determine the voltage sensed by the mote.

Right now the mote is larger than the researchers would like. Once it's shrunk to 50 microns on a side, small enough to be inserted into the brain or central nervous system, the sensor could identify changes as they occur in the human body or during a particular physical activity. Based on that information, a physician, or even the person being monitored, could stimulate a certain body part, such as a peripheral nerve, or a part of the brain. For example, the mote could be implanted in the brain of a paraplegic to enable control of a computer or a robotic arm.

What is a human intranet?

A human intranet would be a network of sensors and actuators that connects points in your brain and your body. The electronics could be in the form of neural motes, or they could be in wearables attached to clothing or integrated into garments. Electronics could also be worn on top of the skin or inserted under the skin, like an electronic tattoo. The sensors would continuously monitor our state of health—what is going on in our bodies, and how well, or not so well, we are performing activities.

Marathon runners, for example, could continuously measure their heart rate, their glucose level, and the oxygen their muscles receive. Sensors would collect that information and send it wirelessly to an outside network with a central computer, where it would be processed and converted into a form that says how the body is performing. That information could also be sent back to the runner. Even better, the processing could be performed on a local processor, avoiding the need to transmit the data.

We already have one of these computational systems: the brain. It gets signals from different nerves, processes them, and decides how to act on them. This natural computer could react to the implanted sensors. That is where BMIs come in. They would connect our biological brain to the electronics that will be on us. BMIs allow for a tighter integration between these auxiliary devices and how we operate as humans. Combine these electronics with the brain, and we have what I call the human intranet. With it, every part of the body could be monitored, rather than just through a single wearable on, say, the wrist. People could see early signs of wellness or illness, which would be great to know.

All the activity taking place today with augmented reality and other ways to enhance ourselves with technology is already moving toward real-time monitoring of the body. However, the true human intranet is still many years away.

Who could benefit from the human intranet?

People with severe medical problems, such as paraplegics, and those who push the limits of their bodies, such as athletes and those in the military. They are going to drive this. Also some artists willing to experiment with their bodies in the interest of their art. The extra sensing capability could expand the scope of what they’re doing.

In the long term, the human intranet will benefit everyone, because it will allow the able-bodied to augment themselves with technology and enhance their interactions with the world around them.

This article is part of our November 2016 special issue on technologies for the brain.