For social robots and service robots to be able to operate effectively 'out of the box' in indoor environments, they must be able to understand the context of those environments without necessarily developing or having access to a complex map.
The aim of this project is to develop a visual classifier for the types of rooms encountered in a typical home (bedroom, bathroom, kitchen, living room, laundry, etc.), medical facility (hallway, ward, waiting area, treatment room), commercial office (personal office, open-plan seating, meeting room, kitchen), or school.
We will use the Pepper robot from Softbank Robotics as a test and demonstration platform. The aim is for Pepper to become capable of recognising the function of a room it enters and of building a labelled map of a new indoor environment.
This project is a component of the Vision-Enabled Humanoid Robotics project, which is funded by the Queensland Government. That broader project aims to enhance future humanoid robotic platforms, helping robots see their environment and interact dynamically with humans.
You will develop a dataset of classified images of indoor location types, and implement algorithms to recognise and label unfamiliar indoor locations.
The project will result in a demonstration of the Pepper robot exploring a new indoor environment, and classifying the rooms and spaces it encounters.
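To illustrate the recognise-and-label step described above, the sketch below shows one simple way a room label could be assigned: classify each camera frame against stored per-class feature prototypes, then label the room by majority vote over the frames captured inside it. This is a toy illustration only; the feature representation, class list, and nearest-prototype scheme are assumptions for demonstration, not project specifications (a real system would likely use a learned image classifier).

```python
from collections import Counter

# Example label set; the actual project would define its own classes.
ROOM_CLASSES = ["bedroom", "bathroom", "kitchen", "living_room", "hallway"]

def classify_frame(histogram, prototypes):
    """Nearest-prototype classification: compare a frame's feature
    vector (e.g. a colour histogram) against one stored prototype
    per room class, returning the closest class label."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda label: sq_dist(histogram, prototypes[label]))

def label_room(frames, prototypes):
    """Label a room by majority vote over all frames captured in it,
    smoothing over individual misclassified frames."""
    votes = Counter(classify_frame(f, prototypes) for f in frames)
    return votes.most_common(1)[0][0]
```

For example, with two toy prototype vectors `{"kitchen": [1.0, 0.0], "bedroom": [0.0, 1.0]}`, a sequence of frames mostly resembling the kitchen prototype would yield the label `"kitchen"` even if one frame is misclassified.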
Skills and experience
Excellent programming skills in Python or C++.
Basic knowledge of image processing, machine learning and AI.
Previous experience in robotics is not essential, as the necessary skills are expected to be developed during the project.
Contact the supervisor for more information.