Understanding the environment through sound
Professor Paul Roe summarises large volumes of sound recorded in nature through visual representations. Using numbers and colours as identifiers on spectrograms, bird calls and other wildlife sounds can be identified at specific times and places, allowing analysis of endangered species and biodiversity.
Australian Research Council grant to build acoustic observatory
A QUT-led research collaboration between ecologists, biologists and computer scientists will see an observatory established to analyse Australia’s fragile and mega-diverse environment through sounds detected and recorded by sensors installed at more than 400 locations around the country.
Connecting people across distances through daily routines
Professor Margot Brereton has developed a messaging kettle that lights up a lava-lamp-style device when a connected kettle in another location is turned on. The device keeps people in touch while they live apart: turning on the kettle reminds people of each other, and they can also exchange written messages through a connected device.
Helping patients to communicate post-stroke
Dr Bernd Ploderer has developed StrokeAssist, an app that allows stroke survivors to communicate when their ability to speak and understand has been impaired. Its functionality allows users to express their needs to medical staff, schedule medical appointments, and track mood and pain during recovery.
Our discipline brings together a diverse team of experts who deliver world-class education and achieve breakthroughs in research.
Explore our staff profiles to discover the amazing work our researchers are contributing to.
The graduate showcase for our Bachelor of Games and Interactive Environments is attended by industry professionals and exhibits the best of the polished, published games created by our final-year students.
- big data
- digital games and player experience
- e-science (environmental monitoring)
- Information and Communications Technologies for Development (ICT4D)
- interaction design
- participatory design.
Our research analyses people and their behaviours, particularly in engaging situations and contexts.
We’re interested in understanding people’s perceptions, their behaviours and their experiences as they engage with video games and interactive technology.
We apply our research to gamification, play and new interaction mechanisms to motivate and engage children, people with diverse abilities, and people from different socio-economic and cultural backgrounds.
This project aims to foster do-it-yourself practices among people from low socio-economic backgrounds by understanding existing practices at four diverse makerspaces, and enabling people to co-design technological prototypes that fit their own needs.
This project will make our culture more inclusive, harness the strengths of people from low socio-economic backgrounds, increase their community engagement, and raise their economic prospects.
This project aims to visualise and analyse big sound data, to detect patterns of animal and bird calls at different temporal and spatial scales. Eco-acoustics is important for scaling environmental monitoring, but because the resulting big sound data is opaque and its fully automated analysis is intractable, human-computer methods are needed to interpret the data.
The project expects to deliver multi-scale sound visualisation, end-user analytic tools, and annotation and management methods, so that people can monitor the environment with insight and accuracy.
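To illustrate the kind of analysis involved, the sketch below (not the project's actual pipeline, and written under assumed parameters) turns a recording into a spectrogram and flags time windows whose acoustic energy stands out from the background, a crude first step towards detecting animal calls in long recordings. The function name and threshold are illustrative only.

```python
# Minimal sketch: spectrogram-based detection of loud events in audio.
# Assumes scipy and numpy; window size and threshold are illustrative.
import numpy as np
from scipy.signal import spectrogram

def detect_loud_windows(audio, sample_rate, threshold_db=-20.0):
    """Return start times (in seconds) of spectrogram columns whose
    total energy is within `threshold_db` of the loudest column."""
    freqs, times, power = spectrogram(audio, fs=sample_rate, nperseg=1024)
    column_db = 10 * np.log10(power.sum(axis=0) + 1e-12)
    column_db -= column_db.max()  # normalise: loudest column = 0 dB
    return times[column_db > threshold_db]

# Synthetic example: 2 s of quiet noise with a 4 kHz "call" from 0.5-0.7 s.
rate = 16000
t = np.arange(0, 2.0, 1 / rate)
audio = 0.01 * np.random.default_rng(0).standard_normal(t.size)
call = (t >= 0.5) & (t < 0.7)
audio[call] += np.sin(2 * np.pi * 4000 * t[call])

hits = detect_loud_windows(audio, rate)
# The flagged windows should cluster around the injected 0.5-0.7 s call.
```

A real eco-acoustic pipeline would operate per frequency band rather than on total energy, so that a quiet call in a distinctive band is not drowned out by broadband noise.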
This project aims to co-design a social, interactive and visual internet search interface for people with intellectual disability. Enabling choice and independence is key to the new National Disability Insurance Scheme, but people with intellectual disability are effectively excluded from much of the web.
This project will investigate ways to access and provide information using technologies such as interactive avatars, virtual worlds and trusted social support. Expected outcomes are new search interface technology, a theoretical framework and new web accessibility guidelines.