More than a billion people around the world have some sort of disability, and it’s important to create technologies to help them. That was the idea behind many of the technologies on display at the CSUN (California State University, Northridge) Assistive Technology Conference, held from 1 to 3 March in San Diego. Here are three of the more impressive devices.
ASSISTANT FOR THE BLIND
Say you’re walking to meet a friend at a restaurant, and the normal route you take is blocked by construction. A sighted person could navigate around the obstacle, but if you’re blind, your task is much more difficult.
That’s where Aira, a remote personal assistant service, could come in handy. Aira customers receive a pair of smartglasses equipped with a microphone and camera. When help is needed, the wearer simply taps a button on the glasses to contact a human assistant who may be sitting hundreds of kilometers away. The representative uses a laptop to connect to the glasses’ camera, which feeds a live view of the wearer’s surroundings.
Once connected, the representative can describe the surroundings to the wearer, track her location on a map, and guide her through city streets and transit systems. Representatives also can help with everyday tasks, like grocery shopping, reading restaurant menus, and picking out clothing. Erich Manser, who is legally blind and an expert on accessibility technologies for the blind, said at the conference that he sometimes asks Aira representatives to give him the play-by-play of his daughter’s soccer games.
Customers of the San Diego–based startup can subscribe to Aira for a set number of minutes each month (with the company promising its representatives won’t disconnect when the customer runs out of minutes). Monthly plans range from US $89 to $329. Along with the smartglasses, customers receive an AT&T personal Mi-Fi device that helps them connect to the Internet wherever they are. The smartglasses can be paired with the customer’s earbuds or headphones.
A STEADY SPOON
A simple task such as eating a bowl of soup can be a messy ordeal for people whose hands shake uncontrollably due to essential tremor or Parkinson’s disease. That’s what motivated a startup in South San Francisco, Calif., to design the Liftware Steady, a utensil handle with sensors and motors that cancel out a customer’s tremor and cut down on spills. The startup, originally called Lift Labs, was acquired by Google in 2014 and is now part of Verily Life Sciences.
Sensors in the Liftware Steady handle detect hand motions, and its small computer distinguishes an involuntary tremor from the intended movement of the hand. Three attachments are available: a tablespoon, a soup spoon, and a fork. To stabilize the utensil and hold it steady, the computer directs two motors in the handle to move the attachment in the direction opposite to a detected tremor. The handle, with a charger and a spoon attachment, is $195. The soup spoon and fork are $35 each.
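The control loop described above can be sketched in a few lines. This is a hypothetical illustration, not Verily’s actual algorithm: deliberate hand motion is slow (typically under about 2 Hz), while tremor from essential tremor or Parkinson’s falls roughly in the 4-to-12-Hz band, so a simple low-pass estimate of the intended motion leaves the tremor as a residue the motors can counteract.

```python
# Hypothetical sketch of tremor cancellation. An involuntary tremor is
# much faster than deliberate hand motion, so a low-pass filter can
# estimate the intended movement; what remains is treated as tremor,
# and the motors are commanded to move the utensil the opposite way.

def cancel_tremor(samples, alpha=0.1):
    """Split a stream of 1-D hand-position samples into intended motion
    and tremor, returning the counter-motion commands for the motors.

    alpha: smoothing factor of the exponential moving average
    (lower = smoother estimate of the intended motion).
    """
    intended = samples[0]
    commands = []
    for x in samples:
        intended += alpha * (x - intended)  # low-pass: deliberate motion
        tremor = x - intended               # high-frequency residue
        commands.append(-tremor)            # move opposite the tremor
    return commands
```

A steady hand produces near-zero commands, while a rapidly oscillating input produces commands that swing opposite to the oscillation. A real device would run a loop like this at high frequency on each axis, with a properly tuned digital filter in place of the simple moving average.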
In December the company released the Liftware Level, which allows people with mobility issues from, for example, cerebral palsy to hold a spoon or other utensil at any angle without spilling its contents.
The batteries in both devices last for at least an hour of continuous use on a single charge, and the attachments are dishwasher-safe.
In December an anonymous donor gave 24 of the devices to Ability Now, an organization in Oakland, Calif., that provides assistive technologies and support services to people in the Bay Area with developmental and physical disabilities. Thanks to the high-tech utensil, one recipient told Verily that, for the first time in years, she felt comfortable enough to eat Christmas dinner in the same room as her family.
AN APP THAT LISTENS
It’s not just startups producing assistive technologies. Microsoft, through its Garage incubator program, is developing Hearing AI, a smartphone app for the deaf and hard of hearing. The app, which interprets sounds through artificial intelligence, vibrates when the alarm on a smoke detector or carbon monoxide detector goes off. Another feature applies deep learning to convert speech to text, and vice versa, making it easier for the user to communicate with the hearing world.
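Detecting an alarm tone does not necessarily require deep learning; a classic signal-processing trick suffices for a fixed-pitch beep. The sketch below is a hypothetical illustration (Hearing AI’s actual detector is not public): the Goertzel algorithm measures the power at a single frequency, such as the roughly 3-kHz tone of many smoke alarms, and the app could trigger the phone’s vibration when that power crosses a threshold.

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Power at one frequency bin, computed with the Goertzel
    algorithm -- a lightweight way to check whether a fixed-pitch
    tone is present in a block of audio samples."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)  # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def alarm_detected(samples, sample_rate=8000, tone_hz=3000, threshold=1e4):
    """Hypothetical trigger: vibrate if the ~3 kHz alarm tone is loud.
    The sample rate, tone frequency, and threshold are assumptions."""
    return goertzel_power(samples, sample_rate, tone_hz) > threshold
```

For a pure tone at the target frequency, the power grows with the square of the block length, so even a modest threshold cleanly separates an alarm from silence or speech at other pitches.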
The app also uses augmented reality to help people visualize the sounds around them. Users simply point their smartphone at the scene before them and see animations overlaid on it. An animation will pulsate, for example, to the rhythm of a pop song playing in the room.
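Driving an animation from sound can be as simple as tracking loudness over time. As a hypothetical sketch (not Microsoft’s implementation), the per-chunk root-mean-square amplitude of the microphone signal yields an envelope that rises and falls with the music, which could scale the size or brightness of the on-screen animation.

```python
import math

def amplitude_envelope(samples, chunk=1024):
    """Per-chunk RMS loudness values that could drive the pulse of an
    on-screen animation (hypothetical sketch; the chunk size of 1024
    samples is an assumption)."""
    env = []
    for i in range(0, len(samples), chunk):
        block = samples[i:i + chunk]
        rms = math.sqrt(sum(s * s for s in block) / len(block))
        env.append(rms)
    return env
```

Each value in the envelope corresponds to a few milliseconds of audio, so mapping it to an animation frame by frame makes the overlay appear to beat along with the song.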
The app, still being tested, is so far available only for iOS smartphones.
This article is part of our June 2017 special issue on assistive tech.