A New Tool to Aid Urban Rescue Teams

This system makes it easier to locate people and things

2 May 2014

Dealing with disasters and emergencies in the middle of crowded cities is nothing if not challenging. Multiple agencies are often involved, and their rescue operations must be carefully coordinated, especially when time is of the essence. Whatever the problem—a building fire, bus accident, kidnapping, or earthquake—lives could be saved if rescue teams could see a map of the area that pinpointed the location of both victims and first responders.

Some 13 researchers from various universities in Finland, including IEEE members Riku Jäntti and Heikki Koivo, believe they can provide just that in what they call a Mobile Urban Situation Awareness System—or MUSAS, for short. Their paper, “Localization Services for Online Common Operational Picture and Situation Awareness,” is available from IEEE Access, the free online, open-access megajournal.

Jäntti and Koivo are professors at Aalto University, in Helsinki, Finland. Senior Member Jäntti is an associate professor of communications engineering and head of the department of communications and networking at the university’s School of Electrical Engineering. Koivo, an IEEE Life Senior Member, is a professor emeritus there. Other researchers are also from Aalto, as well as from the National Defence University, the University of Vaasa, and the VTT Technical Research Centre. The group also had cooperation from Prairie View A&M University, in Texas; Tokyo University; the University of Utah, in Salt Lake City; and Yonsei University, in Seoul, South Korea.

“Knowing where things are and combining several sources of information enables context-aware data gathering, analysis, and decision making, and aids in situation awareness,” they write.

Their system combines information from any number of devices, including handhelds, to provide an overview of an event, and transmits it all to a central location. Information can come from cameras, wireless sensor networks, wireless LANs, mapping software, wearable wireless sensors, and remote-controlled robots.

THE BIG PICTURE

First responders, as well as those in law enforcement and the military, are trained for situation awareness—the act of making decisions based on understanding what they see, assigning meaning to it, and then predicting what might happen next. Situation awareness is especially important when information is coming in quickly and poor decisions could lead to a loss of life.

Data is key to forming what the researchers call a common operational picture (COP). The MUSAS COP includes information about the location of people and things. It uses wireless sensor networks and wireless LANs to constantly map the affected area using localization information from sources such as GPS, the victims’ mobile devices, signals from cell towers, and acoustic sensing. It overlays the ever-changing information on a map of the surroundings or even on building blueprints or floor plans. The MUSAS can operate independently of lighting, temperature, humidity, and other ambient conditions, and can even be accessed through the walls of buildings.
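The paper does not spell out how the different position fixes are merged, but one common way to combine several noisy location estimates, such as a GPS fix, a cell-tower estimate, and a Wi-Fi estimate, is inverse-variance weighting. The sketch below is a generic illustration of that idea, not the researchers' actual method; the function and parameter names are invented for this example.

```python
def fuse_positions(fixes):
    """Combine (x, y, variance) position fixes from different sources
    (e.g., GPS, cell tower, Wi-Fi) by inverse-variance weighting,
    so that more reliable sources pull the estimate harder.
    Illustrative only -- not the fusion rule used in MUSAS."""
    wsum = sum(1.0 / var for _, _, var in fixes)
    x = sum(px / var for px, _, var in fixes) / wsum
    y = sum(py / var for _, py, var in fixes) / wsum
    return x, y
```

With two equally uncertain fixes the result is simply their midpoint; shrinking one fix's variance moves the fused estimate toward it.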

The system also enables a command center to see the situation on large displays, while those on the scene can view a scaled-down version on a handheld Android device. Maps, blueprints, and floor plans can be marked up on the screen with notes and symbols. A sharing subsystem allows remote members of the rescue team not only to view but also to transmit new information.

TRACKING THE TEAM

During an emergency, knowing where victims are located is critical, but so is knowing where the first responders are. While GPS can track people outdoors, locating someone indoors is far more difficult, because a receiver inside a building cannot establish a line of sight to the GPS satellites.

The MUSAS researchers believe they have solved that problem by outfitting the rescue team with a formidable set of devices: wireless sensors embedded with a microcontroller-based computing unit for running localization algorithms, together with an inertial measurement unit containing a 3-D gyroscope, a magnetometer, and acceleration sensors for inertial navigation, plus a data transmitter. The inertial navigation unit works by estimating a person's step length from acceleration data gathered from footwear. The sensors also track each team member's physical state, such as whether the person is walking, standing, ascending, or descending, along with the intensity of each activity.
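The step-and-heading approach described above is often called pedestrian dead reckoning. The sketch below illustrates the basic idea under simplifying assumptions of my own (a fixed acceleration threshold for step detection and a constant step length); the paper's actual algorithm estimates step length adaptively from the foot-mounted acceleration data.

```python
import math

def detect_steps(accel_norms, threshold=11.0):
    """Count steps as upward crossings of an acceleration-magnitude
    threshold in m/s^2. Real systems add filtering and peak detection;
    the threshold here is an illustrative assumption."""
    steps = []
    for i in range(1, len(accel_norms)):
        if accel_norms[i - 1] < threshold <= accel_norms[i]:
            steps.append(i)
    return steps

def dead_reckon(start, headings, step_length=0.7):
    """Advance the position one detected step at a time along the
    heading (radians, from gyroscope/magnetometer) at each step."""
    x, y = start
    track = [(x, y)]
    for h in headings:
        x += step_length * math.cos(h)
        y += step_length * math.sin(h)
        track.append((x, y))
    return track
```

Because each step's error compounds, practical systems periodically correct the dead-reckoned track against other position sources, which is exactly why MUSAS fuses several kinds of localization data.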

MOBILE ROBOT

For mapping and reconnaissance missions, the MUSAS also has mobile robots with both localization and mapping capabilities. The remote-controlled robot is deployed to gather information, much like an exploring scout, so as not to risk human lives. It can build a metric map and is central to creating a common frame of reference for the system. Weighing about 100 kilograms, the robot is outfitted with sensors, a laser range finder, a camera, a server, and a communications system. Data from the laser range finder, images from the camera, the calculated position, and the constructed map of the area are sent to a central operator to help build the common operational picture.
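A metric map like the one the robot builds is commonly represented as an occupancy grid, in which each cell records whether it is blocked. The fragment below shows only the simplest part of such a mapper, marking the cell at each laser-beam endpoint as occupied; it is an illustrative assumption of mine, not the robot's actual mapping software, which would also trace free space and align successive scans.

```python
import math

def update_grid(grid, pose, ranges, angles, cell=0.5):
    """Mark laser-scan endpoints as occupied in a dict-based
    occupancy grid. pose = (x, y, heading in radians); ranges and
    angles come from the laser range finder. Hypothetical sketch."""
    x, y, th = pose
    for r, a in zip(ranges, angles):
        ex = x + r * math.cos(th + a)   # endpoint of the beam
        ey = y + r * math.sin(th + a)
        grid[(int(ex // cell), int(ey // cell))] = 1  # occupied cell
    return grid
```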

MORE USES

Future plans for MUSAS include making the system more robust by using a distributed server, building a 3-D model of the target area, and improving how information is viewed on the handheld devices. The researchers are also working to expand the system from a single site to hundreds of sites and buildings.

“This means that another command level will need to be added that looks at the picture covering the entire area,” says Koivo.

Koivo adds that the group has been collecting hundreds of voice samples spoken in about 30 languages in order to build a voice data bank to help identify the speakers involved in emergencies.

This article has been corrected from an earlier version.
