SpiderVision is a wearable device that extends the human field of view to augment the user's awareness of events happening behind their back. It combines a front-facing and a back-facing camera, letting users focus on the front view while intelligent interface techniques cue them to activity in the back view. The extended back view is blended in only when analysis of the back camera's footage indicates that the scene is dynamically changing. In this project, we explore factors that affect the blended extension, such as view abstraction and blending area.
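The trigger condition above can be sketched in a few lines. This is a hypothetical illustration, not the actual SpiderVision implementation: frames are modeled as flat lists of grayscale pixel values, and names such as `is_dynamic` and the threshold value are illustrative assumptions.

```python
# Hypothetical sketch of the blending trigger: the back view is shown
# only when consecutive back-camera frames differ enough to indicate
# motion. Frames are flat lists of grayscale values (0-255).

def mean_abs_diff(prev, curr):
    """Average absolute per-pixel difference between two frames."""
    return sum(abs(p - c) for p, c in zip(prev, curr)) / len(curr)

def is_dynamic(prev, curr, threshold=10.0):
    """True when the back scene changed enough to warrant blending.
    The threshold is an assumed tuning parameter."""
    return mean_abs_diff(prev, curr) > threshold

# A static scene (sensor noise only) should not trigger blending;
# a large change, e.g. a passerby entering the frame, should.
static_a = [100] * 16
static_b = [101] * 16
moving   = [100] * 8 + [200] * 8

print(is_dynamic(static_a, static_b))  # False
print(is_dynamic(static_a, moving))    # True
```

A real system would of course operate on camera frames (e.g. via OpenCV) and smooth the decision over several frames to avoid flicker, but the per-frame test is the same shape.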

Example setting: a user wearing SpiderVision is reading a book. A passerby wants to get the user's attention, waves, and eventually throws a ball (see slideshow, second image). SpiderVision then blends in the back view, showing a live feed of the passerby throwing the ball (see slideshow, third image; here, a full blend overlaying the front and back views). This is only one of many visualizations we explore within the SpiderVision project.
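The "full blending" shown in the example is a per-pixel overlay of the two views. A minimal sketch, again with frames as flat grayscale lists and an assumed linear blend (the actual SpiderVision compositing may differ):

```python
def blend(front, back, alpha=0.5):
    """Linear per-pixel blend of front and back views.
    alpha=0 shows only the front view; alpha=1 fully overlays the back view."""
    return [round((1 - alpha) * f + alpha * b) for f, b in zip(front, back)]

front = [200, 200, 200, 200]   # bright front scene (the book)
back  = [0, 0, 0, 100]         # darker back scene (the passerby)

print(blend(front, back, alpha=0.0))  # [200, 200, 200, 200]
print(blend(front, back, alpha=0.5))  # [100, 100, 100, 150]
```

Restricting the blend to a sub-region of the display, or abstracting the back view before blending, corresponds to the "blending area" and "view abstraction" factors explored in the project.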


  • Kevin Fan, Jean-Marc Seigneur, Jonathan Guislain, Suranga Nanayakkara, Masahiko Inami. “Augmented Winter Ski with AR HMD”. In Proceedings of the 7th Augmented Human International Conference (AH ’16), ACM, Article 34. [PDF]
  • Kevin Fan, Jean-Marc Seigneur, Suranga Nanayakkara, Masahiko Inami. “Electrosmog Visualization through Augmented Blurry Vision”. In Proceedings of the 7th Augmented Human International Conference (AH ’16), ACM, Article 35. [PDF]
  • Kevin Fan, Jochen Huber, Suranga Nanayakkara, Masahiko Inami. “SpiderVision: Extending the Human Field of View for Augmented Awareness”. In Proceedings of the 5th Augmented Human International Conference (AH ’14), ACM, Article 49. [DOI]