EFRI-M3C: MAP4VIP: Man-Machine Co-Learning for Visually Impaired People
Project Status: Active (Started: 2011)
Research Description
The objective of this EFRI-M3C project is to develop models of sensorimotor control in order to establish a set of design criteria for improved assistive technologies for visually impaired people. Multimodal sensory information obtained by wearable machine sensors will substitute for and/or augment impaired vision through somatosensory and other novel stimulation ("transducing") methods. The project has three research threads: (1) Determination of the information needed by visually impaired people to perform wayfinding and arm-reaching tasks, and of the impact on task performance when transducing information through visual, auditory, and vibrotactile modalities. (2) Understanding of motor skill acquisition and control through in vivo measurements of brain activity and movement performance. (3) Development of a sensorimotor model through man-machine co-learning, which can be used to improve sensor/display designs and be applied to machines.
The intellectual merit of this research is that it will generate a theoretical foundation for a deeper understanding of the neural mechanisms of sensorimotor integration and motor learning, shedding new light on these functions in humans and machines. This will lead to new design concepts for alternative perception, formulation of the information required for successful orientation and wayfinding, and development of cost-effective, revolutionary mechatronic devices that assist visually impaired people in achieving mobility comparable to that of people with normal vision. The interdisciplinary team includes experts in engineering, computer science, psychology, and applied physiology from the City College of New York and Georgia Tech, tackling challenging problems on the boundaries of sensing, cognition, and action. Advisory board members, including experts in (neuro-)ophthalmology and human vision research, as well as counselors at the NYS Commission for the Blind and Visually Handicapped, will provide guidance for the study.
The broader impacts of this research will be assistive technologies for individuals with sensory impairments, whose numbers have been rising with the growing population of older adults in the US and around the world. The project will also create new, and expand existing, academic programs in assistive technologies, brain-computer interfaces, and sensorimotor integration. It will increase cross-campus educational opportunities for students at the two campuses, particularly those from under-represented groups.
Sponsors
- NSF EFRI-M3C Award on MAP4VIP
- IEEE/NSF Workshop on Multimodal and Alternative Perception for Visually Impaired People (MAP4VIP) (Sponsored by NSF Award #1327236)
- City Seeds – Grant for Interdisciplinary Scientific Research Collaborations, “Wearable and Multimodal Wayfinding for the Visually Impaired,” Zhigang Zhu (Computer Science, PI), Tony Ro (Psychology) and YingLi Tian (Electrical Engineering), 02/01/2011 – 01/31/2012