Picture this: in a horrible accident, a farmer in a remote area of northern Manitoba, Canada, has his arm torn off by a hay baler. Immediate surgery is needed if he is not to lose the arm. His local hospital is equipped to handle the procedure, but its surgeons don’t have experience in reattaching limbs.
A qualified surgeon is available at a hospital in Chicago. Without even needing to scrub up, she steps into the hospital’s remote surgery suite, takes hold of the controls, and performs the operation via a robot that the surgeons in Manitoba have set up. The doctor in Chicago doesn’t merely consult on the surgery; she actually performs it.
The controls in her suite allow her to manipulate the remote robot. But just as important, they also provide her with tactile feedback so that what she holds in her hand feels just like the scalpel or other instrument she would use in a real operating room. If the robot-controlled scalpel strikes bone, the device she holds suddenly becomes harder to push, as would a real scalpel. Depending on the angle at which it strikes the bone, it may even twist in her grip.
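The resistance the surgeon feels can be produced by a standard haptic-rendering idea: when the virtual tool penetrates a surface, the device pushes back with a spring-like force proportional to the penetration depth. The sketch below illustrates that penalty-based approach; the function name and stiffness value are illustrative assumptions, not details of any actual surgical system.

```python
# Minimal sketch of penalty-based force feedback, a common haptic
# rendering technique. Names and constants are illustrative, not
# taken from any particular telesurgery device.

def contact_force(tool_depth_mm: float, stiffness_n_per_mm: float = 2.0) -> float:
    """Return the resistive force in newtons for a tool at the given
    penetration depth below a virtual surface.

    In free space (depth <= 0) there is no force; once the tool
    penetrates, the device pushes back proportionally, which is what
    makes virtual bone feel suddenly "hard" in the operator's hand.
    """
    if tool_depth_mm <= 0.0:
        return 0.0
    return stiffness_n_per_mm * tool_depth_mm  # spring-like resistance
```

A stiffer `stiffness_n_per_mm` makes the virtual material feel harder; real systems run this kind of calculation hundreds of times per second so the force tracks the tool smoothly.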
Far-fetched? Not so. Remote surgery has not yet been performed on human beings, but it is possible and will in all probability become a reality within our lifetime.
The key to accomplishing the feat lies in the rapidly growing field of haptics, which studies sensing and manipulating objects and environments through touch. The goal is to build devices that do for the sense of touch what television and radio do for vision and hearing: provide a way to deliver a sensory experience through a man-made device, according to IEEE Member J. Edward Colgate, professor of mechanical engineering at Northwestern University in Evanston, Ill. Haptics has been around for about two decades, but progress has been hampered by its interdisciplinary nature, Colgate says. It requires the cooperation of experts in such diverse areas as neurology, applied psychology, robotics, human-computer interaction, control systems engineering, and communications.
NEW PUBLICATION
To foster the growth of the field, the IEEE is introducing a new journal, Transactions on Haptics, cosponsored by the IEEE Computer, Robotics and Automation, and Consumer Electronics societies. The journal covers the gamut of work in haptics, from fundamental research on human tactile perception to the latest commercial applications. The first issue is expected in September, and it will come out quarterly thereafter, according to Colgate, the editor in chief.
Haptics is not a new field for the IEEE. In 2002, the Symposium on Haptic Interfaces for Virtual Environments and Teleoperator Systems (popularly known as the Haptics Symposium) became part of the IEEE Virtual Reality Conference. The next Haptics Symposium takes place on 13 and 14 March in Reno, Nev.
Haptics applications aren’t just a dream; some are already in use. Motorola’s RAZR2 cellphones use vibrating haptic feedback in conjunction with their touch-screen displays and side buttons. Video games use haptic feedback—to transmit the “feel of the road” through the steering wheels of virtual vehicles, for example. Several car models have a haptic knob on their radio that operates smoothly when acting as a volume control but feels as though it is clicking into place when functioning as a tuning knob.
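The dual-personality knob works because its feel is generated in software: a motor applies torque as a function of knob angle and current mode. A common textbook way to render "clicks" is a restoring torque that pulls the knob toward evenly spaced detent positions. The sketch below assumes that sinusoidal detent model; it is an illustration, not the algorithm used in any specific car.

```python
import math

# Illustrative sketch of a software-defined haptic knob: smooth in
# volume mode, "clicky" in tuning mode. The sinusoidal detent torque
# is a common textbook model, not any manufacturer's actual code.

def knob_torque(angle_rad: float, mode: str, detents: int = 20,
                strength: float = 0.05) -> float:
    """Torque (N*m) the knob's motor should apply at the given angle."""
    if mode == "volume":
        return 0.0  # smooth: no resistance while adjusting volume
    # Tuning mode: a restoring torque pulls the knob toward the
    # nearest of `detents` evenly spaced click positions per turn.
    return -strength * math.sin(detents * angle_rad)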
More than 10 years ago, a team at the Biomimetics and Dextrous Manipulation Laboratory at Stanford University developed a haptic paddle to help mechanical engineering students understand the behavior of dynamic systems by letting them feel the vibration, damped oscillations, and other movements described by the equations they were studying. The approach succeeded. “It was evident the students were understanding the concepts [encapsulated by the equations] for the first time,” says IEEE Member Allison Okamura, who was then at Stanford but is now a professor of mechanical engineering at Johns Hopkins University, in Baltimore, where she has refined the haptic paddles.
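The equations the students were studying describe systems like a damped mass-spring oscillator, and the paddle let them feel its solutions directly. As a rough illustration of what such a paddle renders, the sketch below simulates m·x'' + b·x' + k·x = 0 with a simple semi-implicit Euler step; all parameter values are illustrative assumptions, not the paddle's actual design.

```python
# Sketch of the damped mass-spring dynamics a haptic paddle can let
# students feel. Parameters are illustrative; the semi-implicit Euler
# step is one simple way to integrate m*x'' + b*x' + k*x = 0.

def simulate(x0=1.0, v0=0.0, m=0.1, b=0.4, k=10.0, dt=0.001, steps=5000):
    """Return the paddle position over time for a damped oscillator."""
    x, v = x0, v0
    trace = []
    for _ in range(steps):
        a = (-b * v - k * x) / m  # acceleration from spring and damper
        v += a * dt               # semi-implicit Euler: update velocity,
        x += v * dt               # then position with the new velocity
        trace.append(x)
    return trace
```

With these values the damping ratio is 0.2, so the system is underdamped: the position swings back and forth with shrinking amplitude, exactly the damped oscillation the students could feel in their hands.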
Colgate is working on variable friction, which allows the sensing of textures. With the addition of Peltier-effect devices, which change temperature with applied voltage, temperature sensation can be added. A goal of his work is to develop systems that present shape and texture information to multiple bare fingertips. That could help medical professionals learn to detect tumors via palpation, or even do so at a distance.