Media Release

September 13, 2007

UniSA puts rescue workers in the safety zone

Imagine being a firefighter on the front line of a fire zone, a wall of fire raging around you and no clear way to turn. Suddenly a giant hand comes down from the sky and points you directly to a safety zone.

Researchers at the University of South Australia have developed a world-first technology that gives crews inside a command control centre a bird’s-eye view of what is happening within a fire or search and rescue area. It also lets them communicate directly with people in the field, using visual cues to better manage rescue missions and control fires safely.

Outdoor crews are equipped with mobile augmented reality (AR) systems, each comprising a backpack computer and a head-mounted display, that let them see life-sized 3D objects in real time and experience the interactions as though they were appearing from the sky above.

Exchanging information between command control centres and people in the field is typically done verbally over radio. With a large number of people in the field, managing this communication becomes complex.

The technology created by researchers at UniSA’s Wearable Computer Laboratory includes a tabletop display system in which a ceiling-mounted projector overlays the table with satellite images of an outdoor environment from a top-down perspective, along with the exact positions of people in the field in real time.
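
As a rough illustration of that positioning step, showing field crews on the top-down map comes down to projecting GPS fixes onto the satellite image on the table. The short Python sketch below shows one simple way this could be done; the function name, map bounds, image size and linear projection are assumptions for illustration rather than details of the UniSA system.

    # Illustrative sketch only: mapping a GPS fix from a field crew member
    # onto a top-down tabletop map. Names and the linear mapping are
    # assumptions, not the actual UniSA implementation.

    def gps_to_table_pixels(lat, lon, map_bounds, table_size_px):
        """Linearly map a latitude/longitude fix into tabletop pixel coordinates.

        map_bounds: (south, west, north, east) of the satellite image on the table.
        table_size_px: (width, height) of the projected image in pixels.
        """
        south, west, north, east = map_bounds
        width, height = table_size_px
        x = (lon - west) / (east - west) * width
        # Pixel rows grow downwards while latitude grows upwards, so invert.
        y = (north - lat) / (north - south) * height
        return x, y

    # Example: a crew member reported at -34.9285, 138.6007 on a map of
    # central Adelaide projected as a 1024 x 768 tabletop image.
    print(gps_to_table_pixels(-34.9285, 138.6007,
                              (-34.95, 138.58, -34.90, 138.62),
                              (1024, 768)))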

Indoor operators can simply use their hands to point to objects on the tabletop or place physical props on the table surface to guide outdoor crews to a particular area, direct them to perform certain tasks, or relay further instructions. The objects and gestures are captured by cameras around the outside of the table.

PhD researcher Aaron Stafford describes the tabletop as a miniature movie set, with a blue screen around the tabletop that makes it easier to track what the cameras see. “We use maps of the area with the locations of outdoor users measured using GPS. We capture props and gestures on the tabletop, and a computer attached to each of the cameras processes all of the data, and sends 3D images wirelessly to outdoor crews equipped with mobile AR systems,” Stafford said.
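
The blue screen Stafford describes is what lets each camera’s computer separate hands and props from the table background before the data is sent to outdoor AR clients. The sketch below shows a minimal chroma-key step of that kind in Python; the threshold, function name and synthetic test frame are assumptions for illustration only.

    # Minimal per-camera sketch, assuming a simple chroma-key step: the blue
    # screen around the table makes anything placed on it (hands, props) stand
    # out, so a silhouette mask can be extracted from each camera frame.
    import numpy as np

    def extract_silhouette(frame_rgb, blue_dominance=1.3):
        """Return a boolean mask of pixels that are NOT blue-screen background.

        frame_rgb: H x W x 3 uint8 image from one tabletop camera.
        A pixel counts as background when its blue channel clearly dominates
        both red and green.
        """
        frame = frame_rgb.astype(np.float32) + 1.0   # avoid divide-by-zero
        r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
        background = (b > blue_dominance * r) & (b > blue_dominance * g)
        return ~background   # True where a hand or prop is visible

    # Example with a synthetic 480 x 640 "mostly blue" frame:
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    frame[..., 2] = 200                       # blue screen
    frame[200:280, 300:340] = (180, 140, 90)  # a prop on the table
    mask = extract_silhouette(frame)
    print(mask.sum(), "foreground pixels")    # roughly the prop's area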

He found that natural hand gestures used to perform tasks such as navigating around a terrain map significantly reduced the number of verbal expressions required to relay information to people in the field.

“Control centre operators can circle an area and say ‘don’t go here’, or write ‘danger’ on a Post-it note and put it in the area of danger to alert crews. When the danger passes, the prop can be simply removed from the table. Operators can also point and drag their fingers to indicate a path to follow or a boundary to avoid crossing,” Stafford said.

Using a trackball mouse, indoor operators can also scroll the map on the table, similar to flying across the landscape, to oversee operations over a very large area where scaling the whole area down to the size of the table would make features on the map impractical to view.
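
The idea is to keep the map at a readable scale and move a viewport across it instead of shrinking everything to fit the table. A minimal sketch of that panning behaviour is shown below; the names, sizes and clamping logic are assumptions for illustration only.

    # Illustrative sketch of pan-versus-scale: trackball motion shifts a
    # fixed-scale viewport across the full satellite mosaic, clamped to its edges.

    def pan_viewport(viewport, dx_px, dy_px, map_size_px):
        """Shift the visible window by a trackball delta, clamped to the map edges.

        viewport: (left, top, width, height) in map pixels.
        """
        left, top, width, height = viewport
        map_w, map_h = map_size_px
        left = min(max(left + dx_px, 0), map_w - width)
        top = min(max(top + dy_px, 0), map_h - height)
        return (left, top, width, height)

    # Example: a 1024 x 768 window over a 10000 x 10000-pixel satellite mosaic.
    view = (4000, 4000, 1024, 768)
    view = pan_viewport(view, 250, -120, (10000, 10000))
    print(view)   # (4250, 3880, 1024, 768)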

“An indoor operator could place a toy car on the table with a ‘follow me’ note, and then use the trackball mouse to drive the car across the landscape to guide field crews to their destination. The 3D geometry of objects such as toy cars or animals placed on the surface appears life-sized and realistic-looking to outdoor crews,” Stafford said.

“Updates from each camera can be transmitted over the network at five frames per second to the mobile computers, giving outdoor crews continued access to the visual hull from any viewpoint, even when the network fails.”
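
One way to read that last point is that the mobile AR system keeps the most recent visual-hull data it has received, so rendering can continue from cached data if the wireless link drops. The sketch below illustrates that caching idea in Python; the class, field names and staleness threshold are assumptions for illustration, not the UniSA implementation.

    # Sketch of client-side caching on the mobile AR system: hold the latest
    # update from each tabletop camera so the outdoor user keeps seeing the
    # 3D imagery from any viewpoint even when no new packets arrive.
    import time

    class VisualHullCache:
        def __init__(self, stale_after=5.0):
            self.latest = {}              # camera id -> (timestamp, silhouette data)
            self.stale_after = stale_after

        def on_update(self, camera_id, silhouette):
            """Called up to ~5 times per second per camera while the network is up."""
            self.latest[camera_id] = (time.time(), silhouette)

        def current_hull_inputs(self):
            """Return the freshest silhouette per camera, flagging stale ones."""
            now = time.time()
            return {cam: {"data": data, "stale": now - ts > self.stale_after}
                    for cam, (ts, data) in self.latest.items()}

    # Example: two cameras report, then the network drops; cached data remains usable.
    cache = VisualHullCache()
    cache.on_update("cam0", b"...mask bytes...")
    cache.on_update("cam1", b"...mask bytes...")
    print(cache.current_hull_inputs().keys())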

The research is being undertaken by Stafford under the supervision of Adjunct Senior Research Fellow Dr Wayne Piekarski and associate supervisor Professor Bruce Thomas from the Wearable Computer Laboratory at the University of South Australia.


Contact for interview

Media contact
