Our research vision is to create Assistive Augmentations: intelligent systems that extend the limits of our perceptual and cognitive capabilities.

Our senses define the way we perceive the world and interact with the environment around us. While permanent or situational impairments often limit our ability to accomplish even simple tasks, a few people have developed extraordinary ways to overcome their limitations by using their senses unconventionally. Prominent examples include Evelyn Glennie, a world-renowned solo percussionist who is profoundly deaf but able to perceive music through her skin; and Ben Underwood, who could hear his surroundings and move independently using self-taught echolocation after his eyes were removed when he was a toddler.

Assistive Augmentations aim to enhance our perceptual and cognitive capabilities and, with the help of technology, offer anyone a way to overcome impairments. We believe carefully designed Assistive Augmentations can empower people constrained by impairments to live more independently and can even extend our perceptual and cognitive capabilities beyond the ordinary.

Working towards this vision, we have developed intelligent systems such as Ai-See, which allows blind users to access information simply by pointing at objects and asking questions; Prospero, a memory-training assistant that detects when a user can learn most efficiently; OM, which allows deaf users to 'feel' music; GymSoles, a smart insole that helps users perform exercises with correct body posture; and Kiwrious, which allows users to see the invisible physical phenomena around them.

Our Focus

We aim to design intelligent systems that leverage state-of-the-art technology and to develop innovative concepts and tools that pave the way for the next generation of Assistive Augmentations.

Our current application focus includes Assistive Technology (wearables with embedded AI to support deaf and blind users), EdTech (engaging learning platforms that make science fun), Well-being (home-based intelligent systems for early detection and reduction of diabetic foot ulcers), and Agri-food Technology (intelligent interfaces that support tea tasters with prediction and blend recommendation).

Our Research Areas

  • Human-Centered Machine Learning

  • Wearable interfaces, on-body feedback and electronic textiles

  • Applications of self-supervised learning techniques to understanding physiological, verbal, and visual information

  • New input and interaction concepts

  • Embedded electronics, rapid prototyping, and toolkits

  • Augmented Reality