Towards a ‘Multisensory’ Future of Computing
26/01/18 in KB L.T 1.3 at 14:00
Published: Wednesday, 29 November 2017
The senses of vision and audition have dominated interaction in the field of Human-Computer Interaction (HCI) for decades, even though nature has provided us with many more senses for perceiving and interacting with the world around us. Increasingly, however, HCI researchers and designers are seeking to harness touch, taste, and smell in interactive tasks and experience design. Despite this growing interest in the different senses as interaction modalities, there is only a limited understanding of which tactile, gustatory, and olfactory experiences we can design for, and of how to integrate those sensory stimuli into interaction with and through technology in a meaningful way. In this talk, I will present a snapshot of the challenges and opportunities for multisensory HCI.
Marianna Obrist is a Reader in Interaction Design at the Department of Informatics, School of Engineering and Informatics, University of Sussex, UK. Marianna leads the Sussex Computer Human Interaction Lab (SCHI ‘sky’ Lab), a research group dedicated to the investigation of multisensory experiences for interactive technology. The interdisciplinary SCHI Lab team explores tactile, gustatory, and olfactory experiences as novel interaction modalities. This research is mainly supported by a five-year grant from the European Research Council. Before joining Sussex, Marianna was a Marie Curie Fellow at Newcastle University and, prior to that, an Assistant Professor of Human-Computer Interaction at the University of Salzburg, Austria. More details on her research can be found at: http://www.multi-sensory.info