Imagine all sorts of data (videos, news articles, images, and podcasts) about your surroundings presented to you through your smartphone. Point your phone in one direction and you see a video about breaking news happening right around the corner. Aim it at the museum you're passing and read about the Vermeer exhibit featured that day. Point it in another direction and read Twitter dispatches sent by passersby.
That isn't the distant future; it is already happening, thanks to the work of IEEE Member Blair MacIntyre and other developers of mobile augmented reality (AR). "Mobile AR" refers to overlaying media (graphics, pictures, video, and sound) onto your view of the world around you through a device such as a smartphone.
MacIntyre, an associate professor of interactive computing and director of Georgia Tech's Augmented Environments Lab, has spent more than a decade there working on AR. He established the lab to advance the technology in 1999 with funding from Motorola, Qualcomm, Turner Broadcasting, and others.
For the past few years he has been focusing on two key AR areas: smartphone apps and gaming.
MacIntyre and his team at the lab, including IEEE Member Alex Hill, developed an open-source mobile AR browser called Argon, which is sponsored by Alcatel-Lucent and expected to be released in the Apple app store soon. Once downloaded to a smartphone, Argon organizes the surrounding information for display. That information comes from app developers who create channels: streams of location-coded content, delivered from a Web server, that users can view on their smartphones. Argon tracks the phone's position and orientation using technologies such as GPS, accelerometers, orientation sensors, and compasses. Other tracking technologies, based on computer vision, will be added soon, MacIntyre says, and services supporting the discovery of nearby channels can be provided by Web service creators.
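To make the idea of location-coded channels concrete, here is a minimal sketch of how a channel server might pick which items to send a phone based on its reported position. This is purely illustrative: the item names, coordinates, and functions are invented for this example and are not Argon's actual API or data format.

```python
import math

# Hypothetical channel content: each item is tagged with a lat/lon.
# (Names and coordinates are invented for illustration.)
CHANNEL_ITEMS = [
    {"title": "Vermeer exhibit", "lat": 33.7490, "lon": -84.3880},
    {"title": "Breaking news video", "lat": 33.7495, "lon": -84.3885},
    {"title": "Stadium tour", "lat": 33.7553, "lon": -84.4006},
]

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters (haversine formula)."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby_items(user_lat, user_lon, radius_m=200):
    """Return channel items within radius_m of the user's position."""
    return [item for item in CHANNEL_ITEMS
            if distance_m(user_lat, user_lon,
                          item["lat"], item["lon"]) <= radius_m]
```

A phone standing near the first two items would receive only those, while the more distant third item is filtered out; the browser then decides how to render what the server returns.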
"Faster and more powerful phones with amazing processors have made AR applications possible," he says.
One of the first widely available AR apps was Google Sky Map, which overlays information about the stars, constellations, and planets when you point your phone at the sky. Another, called Layar, brings up information about local stores, restaurants, and businesses.
Argon is different from Layar because it "is the first real AR browser, in analogy to a Web browser," MacIntyre says. "Layar requires developers to specify AR content in a fixed format. Argon's use of Web standards gives channel developers much more control and possibilities."
Several channels have been created that show such content as nearby tweets and Flickr images.
MacIntyre is also focusing on products that merge the creative virtual environment of a video game with the social aspects of a board game.
"Instead of each player staring at a screen, the players can sit around a table, running an AR app on their smartphones that overlays computer graphics on a game board that sits on the table, letting them interact with the game—and each other," MacIntyre says.
He and his students are working on AR games in the lab's Qualcomm-supported Augmented Reality Game Studio. They're collaborating with Tony Tseng, a professor at the Savannah College of Art and Design, in Georgia, and his students from the Interactive Design and Game Development program. MacIntyre is also working on AR games at his start-up, Aura Interactive. One game recently developed in collaboration with Mattel and Qualcomm is an AR version of Mattel's Rock 'em Sock 'em Robots game, in which players control little boxers.
The AR games rely on a smartphone app and a physical game board designed with trackable features. The app uses the image-based targeting implemented in Qualcomm's free SDK for Android phones, an AR technique in which the pattern of small high-contrast corner features determines the position and orientation of the phone's camera relative to the image on the game board. The app analyzes the corner features on the board and then overlays graphics, such as the robot boxers, in the smartphone display. Players can then control the robots by touching options on their screens.
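The geometry behind that kind of tracking can be sketched simply. A pinhole camera projects known 3-D points on the board to 2-D pixels given the camera's pose; the tracker solves the inverse problem, recovering pose from the detected corner pixels. The code below shows only the forward projection, with illustrative numbers (focal length, pose) that are not from Qualcomm's SDK.

```python
# Sketch of the relationship image-based tracking inverts: a pinhole
# camera maps known board points to pixels, given the camera's pose.
# All values here are illustrative, not from any real SDK.

def project(point3d, rotation, translation,
            focal=800.0, cx=320.0, cy=240.0):
    """Project a 3-D board point (meters) to pixel coordinates.

    rotation: 3x3 matrix (list of rows) from board to camera frame.
    translation: board origin expressed in the camera frame.
    """
    # Transform into the camera frame: Xc = R * X + t
    xc = [sum(rotation[i][j] * point3d[j] for j in range(3))
          + translation[i]
          for i in range(3)]
    # Pinhole projection with focal length in pixels, principal point (cx, cy)
    u = focal * xc[0] / xc[2] + cx
    v = focal * xc[1] / xc[2] + cy
    return u, v

# Camera looking straight down at the board from 0.5 m (identity rotation).
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 0.5]

# A board corner 10 cm to the right of the board origin lands at (480, 240).
u, v = project([0.1, 0.0, 0.0], R, t)
```

In a real tracker this projection is fitted to many detected corners at once to solve for the rotation and translation, which is what lets the app draw the robot boxers locked to the board.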
MacIntyre and his researchers have developed a variety of AR board games over the years, including ARhrrrr! (in which players kill virtual zombies) and Arf (which allows people to play with a virtual dog on a table). Recently the AR Game Studio created Spintopia, which lets kids create three-dimensional dynamic spiral patterns.
One of the key challenges in advancing AR applications is to improve the tracking so smartphones can more accurately pull up geographically related content.
"The GPS in your smartphone can tell you in the best case where you are within 1 to 10 meters," MacIntyre says. "But in a city with tall buildings, the accuracy can drop to 40 meters because the device can't 'see' enough GPS satellites for a reasonably accurate position fix. If the phone only knows where it is within 40 meters, it's pretty much impossible to point it across the street and expect to overlay information about the buildings there."
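A quick back-of-the-envelope calculation shows why that accuracy gap matters. If the position fix is off sideways by the GPS error, the overlay is rotated away from its target by an angle that grows with the error. The street width and error figures below are illustrative.

```python
import math

# Back-of-the-envelope check of the accuracy problem described above.
# Distances are illustrative.

def overlay_offset_deg(position_error_m, target_distance_m):
    """Worst-case angular error when the position fix is off sideways."""
    return math.degrees(math.atan2(position_error_m, target_distance_m))

# Building across a ~20 m street, with a 40 m urban-canyon GPS error:
# the label can point more than 60 degrees away from the building.
bad = overlay_offset_deg(40, 20)

# With 1 m accuracy, the same overlay is off by under 3 degrees.
good = overlay_offset_deg(1, 20)
```

At 40 m of error the label could land on an entirely different block, which is exactly the failure MacIntyre describes.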
Until tracking technology improves—something MacIntyre says can be done by combining the image-tracking technology he is using for games with GPS and orientation sensors in the phones—users won't see apps that overlay content directly on the buildings in front of them.
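One simple way such a combination can work, hedged as a sketch rather than MacIntyre's actual method: a complementary filter leans on a smooth, locally precise source (such as image tracking) while slowly pulling the estimate toward an absolute but noisy one (such as GPS). All numbers and names below are invented for illustration.

```python
# Illustrative sketch (not MacIntyre's actual method): a complementary
# filter blending a smooth local estimate with an absolute fix.

def fuse(local_estimate, absolute_fix, alpha=0.9):
    """Blend estimates; higher alpha trusts the smooth local source more."""
    return alpha * local_estimate + (1 - alpha) * absolute_fix

# The local tracker starts 10 m off; repeated absolute fixes at the
# true position (100 m along the street) gradually correct it.
estimate = 90.0
for _ in range(50):
    estimate = fuse(estimate, 100.0)
```

Real systems also fuse orientation and handle sensor dropouts, but the principle is the same: the precise-but-drifting source supplies smoothness, the absolute source supplies correctness.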
MacIntyre says he is confident such apps will be only the beginning. "As with many technologies, it's often difficult to imagine the possibilities for the future," he says. "Back when the Web was only for sharing documents and pictures for scientific research, no one could have predicted Facebook, Twitter, and Amazon. AR has an exciting future."
To see a video on MacIntyre's AR work, visit http://ieeetv.ieee.org/ieeetv-specials/georgia-tech-augmented-reality-lab.