Andy the anatomical sidekick turning Lauren’s frown upside down

16th August 2019

Humans are irrational. At least, a robot would think so… for now.

PhD researcher Lauren Fell is working on a framework that will help robots understand how humans make decisions based on facial cues.

“Human judgements are difficult to explain probabilistically because things like cognitive biases make us unpredictable,” Lauren said.

“We don’t really know exactly how we decide to trust or not trust someone when we first see their face, for example.”

Lauren is applying quantum cognition to social perception – using the mathematical framework of quantum theory to model cognitive phenomena, specifically how we perceive faces.

She is even building herself a sidekick to help – Andy.

“‘Andy’ is an anatomical robot I started building as a side project, but I now think he will be useful for my research,” Lauren said.

Andy is a prototype anatomical robot face with silicone muscles and air valves, designed to mimic human facial expressions.

At Robotronica 2019, QUT’s free robotics festival at the Gardens Point campus, Lauren and Andy taught visitors how to make soft muscles they could take home.

“It was actually just an idea I had one day – I was looking into pneumatic muscles for a freelance project I was working on and thought it might be interesting to make a robot with all the same muscles as a human face.

“When the call for expressions of interest came through for Robotronica late last year, that gave me the motivation to actually try it out, and I’ve been working on it ever since.”

Lauren completed her undergraduate degree in psychology – not robotics – and taught herself how to make soft robotics using code, silicone moulding, air valves and 3D printing.

She also learned about facial anatomy and which muscles were used in different expressions, such as smiling.

“I started by learning how to design the muscles themselves. The construction of the muscles using silicone and 3D printing was actually the harder part of Andy’s development,” Lauren said.

Giving Andy facial expressions was trial and error, according to Lauren.

“Coding was not that complex. I just needed to make sure air was getting to the right muscle so it expanded and contracted the way a real muscle would.”
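That kind of control logic can be sketched as a simple mapping from expressions to facial muscles, and from muscles to the valve channels that inflate them. This is a minimal illustration only – the muscle groupings follow standard facial anatomy, but the channel numbers, names and structure are invented for this sketch, not taken from Andy’s actual code:

```python
# Hypothetical sketch: route air to the right muscles for an expression.
# Valve channel numbers are invented; muscle names follow facial anatomy.

# Map each facial muscle to the pneumatic valve channel that inflates it.
VALVE_CHANNELS = {
    "zygomaticus_major_left": 0,   # pulls the left mouth corner up (smiling)
    "zygomaticus_major_right": 1,  # pulls the right mouth corner up
    "frontalis": 2,                # raises the eyebrows (surprise)
    "corrugator_supercilii": 3,    # draws the brows together (frowning)
}

# Map each expression to the set of muscles that produce it.
EXPRESSIONS = {
    "smile": ["zygomaticus_major_left", "zygomaticus_major_right"],
    "surprise": ["frontalis"],
    "frown": ["corrugator_supercilii"],
}

def valves_for(expression: str) -> list[int]:
    """Return the valve channels to open for a given expression."""
    return [VALVE_CHANNELS[muscle] for muscle in EXPRESSIONS[expression]]

print(valves_for("smile"))  # prints [0, 1]
```

In a real build, opening each returned channel would drive a solenoid valve so the corresponding silicone muscle inflates and contracts like its human counterpart.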

“His expressions are not large at the moment because he is just a prototype, but he will have stronger muscles in the future.

“It’s been a really fun learning experience, and my PhD has actually gone in the direction of looking at how people judge personality traits from faces, so I might even use Andy for that research, which would be interesting.”

