
Yeo Kian Peen
Senior Research Engineer
Yeo Kian Peen received his B.Eng. and M.Eng. degrees from the National University of Singapore. He began his research career in bio-acoustics and signal processing as a research engineer in the Marine Mammal Research Laboratory (MMRL) at the Tropical Marine Science Institute (TMSI). In 2011, he joined the Augmented Senses Group at SUTD and moved into a new research area, Human-Computer Interaction (HCI). A hands-on person, he brings his skills in electronics and hardware design, together with his prior knowledge of acoustics, to his current research in the exciting area of sound augmentation.
Outside of work, Kian Peen is an avid independent backpacker who enjoys interacting with people from all over the world. In his free time, he also likes to cycle, swim, catch a good movie, or meet up with friends for food and drinks.
Publications
iSwarm: a swarm of luminous sea creatures that interact with passers-by
Schroepfer, T., Nanayakkara, S.C., Wortmann, T., Cornelius, A., Khew, Y.N., Lian, A., Yeo, K.P., Peiris, R.L., Withana, A., Werry, I., Petry, B., Otega, S., Janaka, N., Elvitigala, S., Fernando, P. and Samarasekara, N., 2014. iSwarm: a swarm of luminous sea creatures that interact with passers-by. In Singapore iLight 2014 (Marina Bay, Singapore, 7–30 March 2014).
StickEar: Making Everyday Objects Respond to Sound
Yeo, K.P., Nanayakkara, S.C. and Ransiri, S., 2013, October. StickEar: making everyday objects respond to sound. In Proceedings of the 26th annual ACM symposium on User interface software and technology (pp. 221-226).
StickEar: Augmenting Objects and Places Wherever Whenever
Yeo, K.P. and Nanayakkara, S.C., 2013. StickEar: augmenting objects and places wherever whenever. In CHI'13 Extended Abstracts on Human Factors in Computing Systems (pp. 751-756).
SpeechPlay: Composing and Sharing Expressive Speech Through Visually Augmented Text.
Yeo, K.P. and Nanayakkara, S.C., 2013, November. SpeechPlay: composing and sharing expressive speech through visually augmented text. In Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration (pp. 565-568).
SmartFinger: Connecting Devices, Objects and People seamlessly
Ransiri, S., Peiris, R.L., Yeo, K.P. and Nanayakkara, S.C., 2013, November. SmartFinger: connecting devices, objects and people seamlessly. In Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration (pp. 359-362).
FingerDraw: More than a Digital Paintbrush
Hettiarachchi, A., Nanayakkara, S.C., Yeo, K.P., Shilkrot, R. and Maes, P., 2013, March. FingerDraw: more than a digital paintbrush. In ACM SIGCHI Augmented Human.
EyeRing: A Finger Worn Input Device for Seamless Interactions with our Surroundings
Nanayakkara, S.C., Shilkrot, R., Yeo, K.P. and Maes, P., 2013, March. EyeRing: a finger worn input device for seamless interactions with our surroundings. In ACM SIGCHI Augmented Human.
AugmentedForearm: Exploring the Design Space of a Display-enhanced Forearm
Olberding, S., Yeo, K.P., Nanayakkara, S.C. and Steimle, J., 2013, March. AugmentedForearm: exploring the design space of a display-enhanced forearm. In ACM SIGCHI Augmented Human.