Develop and implement a non-contact real-time respiratory monitoring technique, using low cost 3D camera technology, and explore its use in abdominal-thoracic cancer sites

Study level

PhD

Master of Philosophy

Honours

Vacation research experience scheme

Topic status

We're looking for students to study this topic.

Supervisors

Dr Andrew Fielding
Position
Senior Lecturer
Division / Faculty
Science and Engineering Faculty
Dr Ajay Pandey
Position
Senior Lecturer
Division / Faculty
Science and Engineering Faculty

Overview

Consumer-grade depth-sensing technology has in recent years become widely available. A number of vendors have developed similar technologies, for example Microsoft Kinect™ (now discontinued), Intel's RealSense™, the Asus Xtion depth sensor, and the Qualcomm Spectra ISP platform, now in its second generation.

These systems use camera technology that measures the distance to a surface; rapid image acquisition then enables real-time detection of the surface's position and motion. An infra-red transmitter and sensor use a time-of-flight method to construct a depth map (in mm) of the field of view, based on a manufacturer distance calibration, at frame rates of up to 30 Hz.
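As a minimal sketch of how such a depth stream might be accessed, the snippet below uses Intel's pyrealsense2 Python wrapper to acquire depth frames at 30 Hz and read back distances in metres. The stream resolution, the number of frames, and the pixel queried are illustrative choices, not project specifications.

    import numpy as np
    import pyrealsense2 as rs

    # Configure a 640x480 depth stream at 30 Hz (illustrative settings).
    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    profile = pipeline.start(config)

    # Scale factor that converts raw 16-bit depth units to metres.
    depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()

    try:
        for _ in range(300):  # roughly 10 s of frames at 30 Hz
            frames = pipeline.wait_for_frames()
            depth_frame = frames.get_depth_frame()
            if not depth_frame:
                continue

            # Full depth map in metres as a NumPy array (rows x cols).
            depth_m = np.asanyarray(depth_frame.get_data()) * depth_scale

            # Distance to the centre pixel, e.g. a point on a phantom surface.
            centre = depth_frame.get_distance(320, 240)
            print(f"centre depth: {centre:.3f} m")
    finally:
        pipeline.stop()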

The depth resolution of these technologies improves with each iteration, making them excellent candidates for clinical applications where patient position monitoring is important.

This project will investigate the use of Intel RealSense™ camera technology for monitoring patient position during radiotherapy treatments.

Research activities

The research will involve the use of 'phantoms' that physically mimic patient geometries and motion, imaged with the Intel RealSense™ system over a range of motion amplitudes and frequencies typical of patient breathing. A sketch of the kind of surface-trace analysis involved is given below.
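As an illustration, the sketch below simulates a one-dimensional breathing trace (a sinusoidal surface displacement with assumed values of 10 mm amplitude and 0.25 Hz, about 15 breaths per minute) sampled at the camera frame rate, then recovers its amplitude and dominant frequency with a Fourier transform. In practice the trace would come from the measured phantom surface rather than a simulation.

    import numpy as np

    # Assumed, illustrative parameters.
    fs = 30.0                      # camera frame rate (Hz)
    t = np.arange(0, 60.0, 1.0 / fs)
    amplitude_mm = 10.0            # breathing motion amplitude (mm)
    f_breath = 0.25                # breathing frequency (Hz), ~15 breaths/min

    # Simulated surface displacement trace with a little sensor noise.
    trace = amplitude_mm * np.sin(2 * np.pi * f_breath * t)
    trace += np.random.normal(0.0, 0.5, size=t.shape)

    # Remove the mean and take a one-sided FFT to find the dominant frequency.
    spectrum = np.abs(np.fft.rfft(trace - trace.mean()))
    freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
    peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

    est_amplitude = (trace.max() - trace.min()) / 2.0
    print(f"estimated breathing frequency: {peak:.2f} Hz")
    print(f"estimated motion amplitude:    {est_amplitude:.1f} mm")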

A real-time depth map of the field of view, measured by the depth sensors, will be used to build a 3D measurement of the moving phantom surface during radiation delivery.

One or more Intel RealSense™ depth sensors will be used to create a 3D point cloud or surface representation of the patient region of interest.
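A minimal sketch of generating such a point cloud from a single depth frame is shown below, using the point-cloud utility in Intel's pyrealsense2 wrapper. It assumes a depth stream has been started as in the earlier snippet, and the filtering of zero-depth (invalid) pixels is an illustrative choice.

    import numpy as np
    import pyrealsense2 as rs

    pipeline = rs.pipeline()
    pipeline.start()  # default depth stream configuration

    try:
        frames = pipeline.wait_for_frames()
        depth_frame = frames.get_depth_frame()

        # De-project every depth pixel into camera-space (x, y, z) coordinates.
        pc = rs.pointcloud()
        points = pc.calculate(depth_frame)

        # Reinterpret the vertex buffer as an (N, 3) float32 array in metres.
        xyz = np.asanyarray(points.get_vertices()).view(np.float32).reshape(-1, 3)

        # Discard invalid pixels, which are returned with zero depth.
        xyz = xyz[xyz[:, 2] > 0]
        print(f"point cloud with {xyz.shape[0]} valid points")
    finally:
        pipeline.stop()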

Correctly referencing and aligning this 3D point cloud data with the existing simulation computed tomography (CT) scans will enable the patient surface to be mapped and registered. Intra-fraction tissue movement will then be tracked by identifying the regions of the measured surface that agree with the CT scans and those that deviate from the simulation data. One possible registration approach is sketched below.
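The project does not prescribe a registration method; as one plausible approach, the sketch below rigidly aligns the camera point cloud to a surface extracted from the planning CT using iterative closest point (ICP) from the open-source Open3D library. The input files, the 5 mm correspondence distance, and the identity initial guess are all illustrative assumptions.

    import numpy as np
    import open3d as o3d

    # Illustrative inputs: (N, 3) arrays of points in metres.
    camera_xyz = np.load("camera_surface.npy")      # hypothetical file: measured surface
    ct_surface_xyz = np.load("ct_surface.npy")      # hypothetical file: CT external surface

    source = o3d.geometry.PointCloud()
    source.points = o3d.utility.Vector3dVector(camera_xyz)
    target = o3d.geometry.PointCloud()
    target.points = o3d.utility.Vector3dVector(ct_surface_xyz)

    # Rigid point-to-point ICP with a 5 mm correspondence distance and an
    # identity transform as the initial guess (both illustrative choices).
    result = o3d.pipelines.registration.registration_icp(
        source, target, 0.005, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())

    print("fitness:", result.fitness)        # fraction of matched points
    print("RMSE (m):", result.inlier_rmse)   # residual after alignment
    print(result.transformation)             # 4x4 camera-to-CT transform

Points that remain far from the CT surface after such an alignment would then flag intra-fraction motion of the corresponding region.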

Outcomes

Expected outputs include:

  • A thesis
  • 1-3 journal papers, depending on time-frame and scope
  • System for monitoring patient motion during radiotherapy treatments

Skills and experience

Ideally you will have a background in physics, computer science, engineering, mathematics or a similar discipline, and have:
  • Data analysis and software development skills.
  • Image processing experience.
  • Experimental skills.
  • Excellent communication skills.

Scholarships

You may be able to apply for a research scholarship in our annual scholarship round.

Annual scholarship round


Contact

Contact the supervisor for more information.