Humanising technology
We are exploring ways of lowering the barriers between technology and humans to improve our lives. To achieve this vision, we develop novel human-computer interfaces and interactions that seamlessly integrate with a user's mind, body and behaviour, providing enhanced perception. We call these 'Assistive Augmentations'. We believe carefully designed Assistive Augmentations can empower people constrained by impairments to live more independently, and can extend a person's perceptual and cognitive capabilities beyond the ordinary.
We explore the design space of Assistive Augmentation along two dimensions: ability and bodily integration (see the figure). This creates four quadrants. The top-left quadrant focuses on empowering users with disabilities through interfaces with high bodily integration (e.g. bionic limbs and eyes). The bottom-left quadrant focuses on assisting users with disabilities through interfaces that are decoupled from the body (e.g. the white cane). The bottom-right quadrant focuses on enabling users to do things better with interfaces that are decoupled from the body (e.g. an email app on a smartphone). Finally, the top-right quadrant focuses on augmenting users with new capabilities through interfaces with high bodily integration (e.g. creating extra hands). We recognise the value of all four quadrants, as they serve different purposes. Creating such Assistive Augmentations poses many challenges, including the development of enabling technologies and a holistic design approach that ensures real-world applicability.
Our work on creating Assistive Augmentations is inspired by people who have developed extraordinary ways to overcome their limitations by using their senses in unconventional ways. Prominent examples include Evelyn Glennie, a world-renowned solo percussionist who is profoundly deaf but able to perceive music through her skin, and Ben Underwood, who could hear his surrounding environment and move around independently using self-taught echolocation after having his eyes removed as a toddler. With Assistive Augmentations, we aim to create similar possibilities regardless of a person's ability.
Examples of our work
Working towards this vision of humanising technology, we have developed novel human-computer interfaces such as Ai-See, which allows blind users to access information simply by pointing at objects and asking questions; Prospero, a memory training assistant that detects when a user can learn more efficiently; OM, which allows deaf users to 'feel' music; GymSoles, a smart insole that helps users perform exercises with correct body posture; and Kiwrious, which gives users easy access to the physical phenomena around them. Please visit our projects page to see all projects.
We explore the design space of Assistive Augmentation through various proof-of-concept projects in the application areas of assistive technology, technology-enhanced education and emotional wellbeing.
Our Research Areas
Human-Centered Machine Learning
New input and interaction concepts
Virtual & Augmented Reality