Augmented reality (AR), or mixed reality, has become a mature technology with practical applications in manufacturing, retail, navigation and entertainment.
We're interested in using AR to support human-robot interaction. In this project, you'll investigate how a human can use AR to better understand how a robot perceives the world and to understand the robot's intentions.
You'll implement an augmented reality application that projects the robot's sensor data into the AR space, making the robot's perception of its environment directly visible to a human observer.
This sensor data can include:
- laser scan data
- 3D point clouds
- 3D object models from semantic Simultaneous Localisation and Mapping (SLAM)
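As a rough sketch of the first step, a planar laser scan arrives as a list of range readings in polar form and must be converted into Cartesian points before it can be transformed into the AR device's coordinate frame. The function below is a minimal illustration, not part of any particular AR SDK; the parameter names mirror the fields of a typical laser scan message, and the invalid-reading threshold is an assumption.

```python
import math

def laser_scan_to_points(ranges, angle_min, angle_increment, range_max):
    """Convert a planar laser scan (polar ranges) into Cartesian points
    in the robot's frame, ready for transformation into the AR frame.

    Illustrative sketch: readings of 0 or beyond range_max are treated
    as invalid and dropped, which is a simplifying assumption.
    """
    points = []
    for i, r in enumerate(ranges):
        if not (0.0 < r < range_max):  # drop invalid/out-of-range beams
            continue
        angle = angle_min + i * angle_increment
        # The scan lies in the robot's horizontal plane, so z = 0
        points.append((r * math.cos(angle), r * math.sin(angle), 0.0))
    return points

# Example: three beams at -45 deg, 0 deg, +45 deg
pts = laser_scan_to_points([1.0, 2.0, 1.5], -math.pi / 4, math.pi / 4, 10.0)
```

The resulting point list could then be rendered in AR as a point overlay or a swept outline of nearby obstacles.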
In addition, you can visualise the robot's intentions in the AR space, so that a human co-worker can better understand what the robot wants to do next. Examples can include navigation goals and paths, or grasp points.
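Visualising intentions such as a planned path requires expressing waypoints from the robot's map frame in the AR device's frame. Assuming, for illustration, that the two frames are related by a planar rigid transform (a yaw rotation plus a translation), the conversion is a few lines; in practice the transform would come from a calibration or localisation step, which is not shown here.

```python
import math

def transform_path(waypoints, yaw, translation):
    """Map 2D navigation waypoints from the robot's map frame into the
    AR device's frame under an assumed planar rigid transform.

    yaw: rotation about the vertical axis (radians)
    translation: (tx, ty) offset between the frame origins
    """
    c, s = math.cos(yaw), math.sin(yaw)
    tx, ty = translation
    # Rotate each waypoint, then translate it into the AR frame
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in waypoints]

# Example: a two-waypoint path, frames offset by 90 degrees and 2 m in x
path_ar = transform_path([(0.0, 0.0), (1.0, 0.0)], math.pi / 2, (2.0, 0.0))
```

The transformed waypoints could then be drawn in AR as a line strip on the floor plane, showing a co-worker where the robot intends to drive.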
You'll gain understanding and technical expertise in operating and programming AR devices. You'll understand how robots represent their world using different data structures.
Your experiments will give us valuable insights into which methods are effective. Your AR application will help our researchers develop better scene understanding algorithms for mobile and stationary robots.
Skills and experience
You should have a strong background in programming (Python, C/C++) and software development.
You should have a strong understanding of computer vision fundamentals.
You may be able to apply for a research scholarship in our annual scholarship round.
Contact the supervisor for more information.