Teaching robots to navigate - we built a zoo

21st September 2020

QUT researchers built a virtual zoo to test the ability of robots to navigate real-world environments, by seeing if a robot could track down the king of the jungle using symbols and an abstract map.

Dr Ben Talbot working on how robots can use navigation cues.


In the study Robot Navigation in Unseen Spaces using an Abstract Map, published in the journal IEEE Transactions on Cognitive and Developmental Systems, Dr Ben Talbot and a team of researchers, including Dr Feras Dayoub, Professor Peter Corke and Professor Gordon Wyeth, considered how robots might be able to navigate using the same navigation cues that people subconsciously use in built environments.

The researchers programmed an Adept GuiaBot robot with the ability to interpret signs and directional information using the abstract map, a novel navigation tool they developed to help robots navigate in unseen spaces.


Dr Talbot, from the QUT Centre for Robotics, said that in large-scale outdoor settings like roadways, GPS navigation could help robots and machines find their way, but when it came to navigating built environments like university campuses and shopping complexes, robots typically still relied on a pre-determined map or on trial and error to find their way around.

“As a human, if you come to a university campus for the first time, you don’t say ‘I can’t do it because I don’t have a map, or I’m not carrying a GPS’,” Dr Talbot said.

“You would look around for human navigation cues and use that symbolic spatial information to help guide you to find a location.”

Dr Talbot said people relied on a range of navigation cues when getting around an unfamiliar location, such as directional signs, verbal descriptions, sketch maps, and asking people to point them in the right direction.

He said the key for people navigating an unknown space was that their imagination of the space can guide them, without requiring them to have already experienced it.


The first time a person goes to a zoo, for instance, they are likely to turn up with the expectation that animals are located in logical groupings such as shared habitats. If they find koalas, for instance, they might expect to find kangaroos nearby. Similarly, they wouldn’t expect to find penguins in the desert-themed area.

“It’s this concept of when you’ve never been somewhere, you see information about a place and imagine some kind of map or spatial representation of what you think the place may look like,” Dr Talbot said.

“Then when you get there and see extra information, you adjust your map to improve its accuracy or navigational utility.

“But robots typically can’t navigate with this concept of a fuzzy map that doesn’t have any concreteness to it. This was the big contribution introduced in the abstract map; it allowed the robot to navigate using imaginations of spaces rather than requiring prior direct perception.”

Instead of explicit distances and directions that might be used by robots in navigating a rigidly structured environment like a warehouse, the GuiaBot scanned QR-like codes as it travelled around the virtual zoo to read directional information, such as “the African safari is past the information desk”.
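As an illustrative sketch only (the function and pattern names below are hypothetical, not the authors' code), a directional cue like the one above can be parsed into a symbolic triple and used to place an "imagined" position for an unseen place relative to a known landmark, roughly in the spirit of the abstract map:

```python
import re

# Hypothetical sketch: parse a directional cue of the form
# "<figure> is <relation> <reference>" into a symbolic triple.
CUE_PATTERN = re.compile(
    r"(?P<figure>.+?) is (?P<relation>past|near|left of|right of) (?P<reference>.+)"
)

def parse_cue(text):
    """Extract the symbolic spatial information from one sign."""
    m = CUE_PATTERN.match(text.strip().rstrip("."))
    if not m:
        return None
    return (m.group("figure"), m.group("relation"), m.group("reference"))

def imagine_position(relation, ref_xy, heading=(1.0, 0.0), guess_dist=5.0):
    """Place a rough *imagined* coordinate for the unseen figure,
    to be corrected later as the robot observes more of the space."""
    x, y = ref_xy
    dx, dy = heading
    if relation == "past":       # beyond the reference, along the current heading
        return (x + guess_dist * dx, y + guess_dist * dy)
    if relation == "near":       # co-located until better information arrives
        return (x, y)
    if relation == "left of":    # perpendicular to the heading, to the left
        return (x - guess_dist * dy, y + guess_dist * dx)
    if relation == "right of":   # perpendicular to the heading, to the right
        return (x + guess_dist * dy, y - guess_dist * dx)
    return ref_xy

cue = parse_cue("the African safari is past the information desk")
# cue == ("the African safari", "past", "the information desk")
```

The key property this sketch shares with the approach described in the article is that the placement is only a guess: a fuzzy prior that the robot revises whenever it perceives new information, rather than a fixed coordinate it must trust.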

While the robot proved slightly quicker than people who navigated the space using the same directional information, Dr Talbot said the study's limited sample size meant the robot-versus-person comparison should not be over-emphasised.

“It’s a meaningful result in that we can say performance was comparable to that of humans, but it’s not at a stage yet where we could drop a human and robot on a real campus and they could compete,” Dr Talbot said.

“We would like to eat away at some of those differences in the future though.”

An Adept GuiaBot robot using an abstract map to navigate.


Dr Talbot said one of the challenges in creating robots that can navigate unfamiliar spaces was enabling them to judge the significance of the visual imagery they observe, and whether it is relevant to the navigation task at hand.

For example, a robot might see a sign on the outside of a building and correctly interpret a word such as “library” as a label for the place, but then misinterpret an “I love New York” shirt as labelling its current location as New York.

Dr Talbot said enhancing robotic navigation with abstract maps would allow robots to better navigate spaces such as schools, hospitals, offices and even zoos. This ability, he said, was a transition to robots becoming more useful in other built-up environments.

The research was supported by the Australian Research Council’s Discovery Projects Funding Scheme.

Media contact:

Rod Chester, QUT Media, 07 3138 9449, rod.chester@qut.edu.au

After hours: Rose Trapnell, 0407 585 901, media@qut.edu.au
