When IEEE Member Aydogan Ozcan began his career as a Harvard Medical School research faculty member in 2006, he noticed a glaring gap in the product lines of lightweight, portable, and inexpensive devices meant for health care in remote regions of developing nations.
“I saw all sorts of miniature diagnostic gadgets designed to fit into pockets, including microfluidic chips for cytometry [cell counting] or biomedical sensing. But one type was missing: microscopes,” he says. “Most of these gadgets need microscopes to detect or quantify the biological processes occurring at the microscale. A beautifully small device that has to sit on a big microscope is only a half-solution.” Was it possible to condense microscopes as well?
That became the focus of his research when he arrived at the University of California, Los Angeles, as an assistant professor of electrical engineering in 2007. There, Ozcan started the Ozcan Research Group at the UCLA School of Engineering to create photonics-based telemedicine tools. He wanted to develop a technology that would turn a cellphone into a microscope. The phone would then transmit the images it senses over its wireless network to a lab for diagnosis.
His invention is both lightweight and inexpensive. The light source and sensor chip together weigh less than 45 grams and occupy only a few cubic centimeters. Even a phone camera chip with a resolution as low as 1 megapixel, which costs less than US $1, can produce viable images. The microscope uses the same sensor already installed in the cellphone's camera.
Ozcan says that when his product goes to market next year, he hopes it will help health-care workers in remote areas more easily screen blood, water, and urine for anemia, tuberculosis, HIV, and malaria, as well as for other bacteria and parasites.
His device has earned him eight of the 14 awards he has picked up over the course of his career. Four of them came this year, including a Bill & Melinda Gates Foundation Grand Challenges Explorations grant, National Geographic Magazine's Emerging Explorer Award, and a National Science Foundation CAREER Award for his work on biophotonics.
“The major challenge to developing the cellphone microscope was designing robust and fast computational codes to make up for the lack of optical components, such as objective lenses, found in regular microscopes,” says Ozcan, who calls his device the Ultra-Compact Lensfree Telemedicine Microscope.
“We’re using cellphone functionality to put microscopes into the hands of health-care workers in communities that lack medical resources,” he says. “This is especially useful because it’s estimated that close to 90 percent of the world’s population will own at least one cellphone by 2015.”
The Ozcan Research Group at UCLA has received financial support from the Office of Naval Research, the National Science Foundation, the National Institutes of Health, the Air Force Office of Scientific Research, the Vodafone Americas Foundation, and the Gates Foundation. Two years ago, Ozcan founded Microskia in Los Angeles to commercialize the technology. He says he wants to raise US $5 million to $10 million during the next four years to fine-tune his device's computational and hardware capabilities and to create a more user-friendly interface.
The mobile microscope works by capturing the holographic shadows of cells and microorganisms, which are then sent over the cellphone network to a medical center, where the shadows are processed into images understandable to the human eye. First, a transparent chamber holding the fluid sample is placed directly on the cellphone's camera sensor. A green LED then illuminates the cells or microorganisms in the fluid, and their holographic shadows are captured and digitally recorded by the complementary metal-oxide-semiconductor (CMOS) sensor chip installed in the phone. The holographic shadow of each type of cell has a unique identifying texture.
The shadows recorded by the phone are sent to a remote PC running Ozcan’s codes, which convert the shadows into microscopic images within seconds. Doctors at the center can use the images, along with the patient’s demographic information, for diagnoses, which can be communicated back to remote health-care workers.
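The reconstruction step described above, turning a recorded holographic shadow back into a recognizable image, is commonly done in lensfree microscopy by numerically back-propagating the hologram to the object plane. The sketch below illustrates one standard approach, the angular-spectrum method; it is a simplified illustration of the general technique, not Ozcan's actual codes, and the pixel pitch, wavelength, and propagation distance are assumed values chosen only for the example.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field a distance z (meters)
    using the angular-spectrum method; dx is the pixel pitch."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)          # spatial frequencies (cycles/m)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)           # shape (ny, nx)
    # Free-space transfer function; evanescent components are masked out.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2j * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(kz * z) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Toy hologram: a dark disk (a "cell shadow") on a bright background.
n, dx, lam = 256, 2.2e-6, 520e-9           # assumed: 2.2-um pixels, green LED
yy, xx = np.mgrid[:n, :n]
hologram = np.ones((n, n))
hologram[(xx - n // 2) ** 2 + (yy - n // 2) ** 2 < 15 ** 2] = 0.2

# Back-propagate the measured amplitude toward the object plane
# (negative z); the result's squared magnitude is the viewable image.
recon = angular_spectrum_propagate(np.sqrt(hologram), lam, dx, z=-800e-6)
image = np.abs(recon) ** 2
```

In a real system the reconstruction also has to cope with the missing phase of the hologram and with sensor noise, which is where the "robust and fast computational codes" Ozcan mentions come in; the transform-based propagation itself is what lets a PC produce an image in seconds.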
Ozcan earned a bachelor’s degree in electrical engineering in 2000 from Bilkent University in Ankara, in his native Turkey, and a master’s and Ph.D. in EE from Stanford University in 2005. The following year, he joined the research faculty of the Harvard Medical School Wellman Center for Photomedicine to work on applying photonics technology to biomedical imaging, planting the seeds for his current work.
His multidisciplinary approach to engineering problems is emblematic of his definition of engineering and IEEE's role in it. "Electronics, electrical engineering, photonics, physics, chemistry, materials science, and bioengineering all have the same goal," he says. "And being a member of IEEE is about more than just electronics. It's about how to find cutting-edge solutions to make life simpler and better."