Shift in Perspective
The rise of smart devices and AI holds great potential to transform daily interactions and help people pursue their life goals, but current user interfaces often fall short, especially for individuals with disabilities who rely on assistive technologies. For instance, a blind person navigating a supermarket may find traditional accessibility features such as text-to-speech cumbersome and inefficient. This disconnect between user and device limits true empowerment.
Why is that? We believe that the conversation has been overly focused on user-system coexistence and compatibility (e.g. “I will use this device”, “This is a user-friendly device”) rather than integration and augmentation of ability (e.g. “This device is a part of me”, “I have evolved with technology to unlock my full potential”). A paradigm shift is thus needed: from viewing technology as a separate tool to considering it an integrated extension of the human body, mind, and identity. We call this new perspective on human augmentation “Assistive Augmentation”.
Take Adrianne Haslet-Davis, for instance, a ballroom dancer who lost her left leg in the Boston Marathon bombing. With a dance prosthesis, her ability to dance was restored, not just her leg. Technologies like this exemplify the transformative power of Assistive Augmentations.
Assistive Augmentation is defined by two core dimensions: Ability and Integration. We categorise ability into the perceptual, physical, and cognitive dimensions, and explore various methods for augmenting these capabilities: amplification, substitution, and extension to new modalities. The other core dimension, integration, pertains to how these augmentations fit into our lives. Assistive Augmentation approaches this holistically, through the bodily, temporal, identity, and social lenses.
Our work on creating Assistive Augmentations has been inspired by people who have developed extraordinary ways to overcome their limitations by using their senses in unconventional ways. Prominent examples include Evelyn Glennie, a world-renowned solo percussionist who is profoundly deaf but able to perceive music through her skin, and Ben Underwood, who, after having his eyes removed as a toddler, taught himself echolocation to hear his surroundings and move independently. With Assistive Augmentations, we aim to create similar possibilities regardless of a person's ability.
Examples of our work
Working towards this vision, we have developed novel human-computer interfaces such as Ai-See, which allows blind users to access information simply by pointing at objects and asking questions; Prospero, a memory training assistant that detects when a user can learn more efficiently; OM, which allows deaf users to 'feel' music; GymSoles, a smart insole that helps users perform exercises with correct body posture; and Kiwrious, which allows users to easily explore physical phenomena around them. Please visit our projects page to see all projects.
We explore the design space of Assistive Augmentation through various proof-of-concept projects in the application areas of assistive technology, wellbeing, and learning.
Areas of Focus
Assistive Technology
Emotional Wellbeing
Embodied Learning