Time+Place: Thursday 11/02/2016, 14:30, Room 337-8, Taub Bldg.
Title: Enhanced Human: Wearable computing that transforms how we perceive and interact with our world
Speaker: Jeremy Cooperstock - COLLOQUIUM LECTURE http://www.cim.mcgill.ca/~jer/
Affiliation: McGill University, Montreal
Host: Ran El-Yaniv

Abstract:

The growth of wearable computing brings with it opportunities to 
leverage body-worn sensors and actuators, endowing us with capabilities 
that extend us as human beings. Within the scope of my lab's work with
video, audio, and haptic modalities, we have explored a number of
applications that help compensate for or overcome human sensory
limitations. Examples include improved situational awareness for both 
emergency responders and the visually impaired, treatment of amblyopia, 
balance and directional guidance, and visceral awareness of remote 
activity. These applications impose minimal input requirements because
the mobile user rarely needs to interact with them directly. However,
more general-purpose mobile computing, involving richer manipulation of
digital content, requires alternative display technologies and motivates
new ways of interacting with this information that break free from the
limited real estate of small-screen displays.


Short Bio:
Jeremy Cooperstock is an associate professor in the Department of
Electrical and Computer Engineering, a member of the Centre for
Intelligent Machines, and a founding member of the Centre for
Interdisciplinary Research in Music Media and Technology at McGill
University. He directs the Shared Reality Lab, which focuses on computer
mediation to facilitate high-fidelity human communication and the
synthesis of perceptually engaging, multimodal, immersive environments.
He led the development of the world's first Internet streaming
demonstrations of Dolby Digital 5.1 and of multiple simultaneous streams
of uncompressed high-definition video, as well as a high-fidelity
orchestra rehearsal simulator and a simulation environment that renders
graphic, audio, and vibrotactile effects in response to footsteps.
Cooperstock's work on the
Ultra-Videoconferencing system was recognized by an award for Most
Innovative Use of New Technology from ACM/IEEE Supercomputing and a
Distinction Award from the Audio Engineering Society. The research he
supervised on the Autour project earned the Hochhausen Research Award
from the Canadian National Institute for the Blind and an Impact Award
from the Canadian Internet Registration Authority, and his Real-Time
Emergency Response project won the Gold Prize (brainstorm round) of the
Mozilla Ignite Challenge. Cooperstock has worked with IBM at its Haifa
Research Center and at the T.J. Watson Research Center in Yorktown
Heights, New York, and with the Sony Computer Science Laboratory in
Tokyo, Japan. He was a visiting professor at Bang & Olufsen, Denmark,
where he conducted research on telepresence technologies as part of the
World Opera Project.

Refreshments will be served from 14:15
Lecture starts at 14:30