REM 2014


Virtual Environment Designs and Navigation using Unity3D

This cluster will allow RPs to practice JavaScript and/or C# for game design and multimodal sensor simulation for assisted navigation.
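At the core of any assisted-navigation simulation is route finding through a virtual environment. As a minimal, language-agnostic sketch (not part of the cluster materials; in Unity3D this logic would live in a C# script), breadth-first search over a 2D occupancy grid finds a shortest walkable path:

```python
from collections import deque

def find_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.

    grid: list of strings; '.' = walkable, '#' = obstacle.
    start, goal: (row, col) tuples. Returns a list of cells or None.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}   # each cell maps to its BFS parent
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Reconstruct the path by walking parents back to start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == '.' and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal unreachable
```

Because BFS expands cells in order of distance from the start, the first path it returns is guaranteed shortest, which is why it is a common starting point before moving to A* or Unity's built-in NavMesh.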


Mentoring Team
  • Zhigang Zhu
  • Wai L. Khoo
  • Greg Olmschenk
  • Yuying Gosser

Wearable Vibrotactile Designs Using Arduino and Bluetooth Technology

RPs will learn new technologies in smart sensors and mobile computing. This cluster will be run in collaboration with Vista Wearable, Inc., a startup founded by our students that aims to develop assisted-navigation devices for the blind.
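A typical first exercise with a vibrotactile device is mapping a range-sensor reading to motor strength. The sketch below (an illustrative assumption, not Vista Wearable's design; the thresholds `min_cm` and `max_cm` are hypothetical) shows the mapping in Python, which on an Arduino would be C/C++ feeding the result to `analogWrite`:

```python
def vibration_level(distance_cm, min_cm=20.0, max_cm=200.0):
    """Map an obstacle distance to an 8-bit PWM duty cycle (0-255).

    Closer obstacles produce stronger vibration; anything beyond
    max_cm is silent, anything within min_cm vibrates at full strength.
    """
    if distance_cm <= min_cm:
        return 255
    if distance_cm >= max_cm:
        return 0
    # Linear falloff between min_cm and max_cm.
    scale = (max_cm - distance_cm) / (max_cm - min_cm)
    return int(round(255 * scale))
```

A linear falloff is the simplest choice; perceptual studies often motivate nonlinear curves instead, which is exactly the kind of design question this cluster explores.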


Mentoring Team
  • Zhigang Zhu
  • Tony Ro
  • Edgardo Molina
  • Lei Ai
  • Frank Palmer

Mobile Computing in Detection and Recognition for the Blind

The RPs will learn programming skills to develop algorithms for text/object detection and recognition on smartphones and other mobile devices (such as Google Glass or RGB-D sensors), for assisting visually impaired people.
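Most text/object detection pipelines start by binarizing the image and grouping "on" pixels into connected components, whose bounding boxes become detection candidates. As a minimal self-contained sketch of that step (a toy stand-in for the OpenCV routines a real mobile pipeline would use; `min_area` is an illustrative parameter):

```python
def candidate_regions(binary, min_area=3):
    """Group 'on' pixels of a binary image into 4-connected components
    and return bounding boxes of components large enough to be
    detection candidates.

    binary: list of lists of 0/1.
    Returns a list of (top, left, bottom, right) boxes.
    """
    rows, cols = len(binary), len(binary[0])
    seen = set()
    boxes = []
    for r0 in range(rows):
        for c0 in range(cols):
            if binary[r0][c0] and (r0, c0) not in seen:
                # Flood-fill one component, tracking its extent.
                stack = [(r0, c0)]
                seen.add((r0, c0))
                top = bottom = r0
                left = right = c0
                area = 0
                while stack:
                    r, c = stack.pop()
                    area += 1
                    top, bottom = min(top, r), max(bottom, r)
                    left, right = min(left, c), max(right, c)
                    for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
                        if (0 <= nr < rows and 0 <= nc < cols
                                and binary[nr][nc] and (nr, nc) not in seen):
                            seen.add((nr, nc))
                            stack.append((nr, nc))
                if area >= min_area:  # discard specks too small to matter
                    boxes.append((top, left, bottom, right))
    return boxes
```

In a real system these candidate boxes would then be passed to a recognizer (e.g., an OCR engine for text), but the grouping step above is where detection begins.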


Mentoring Team
  • YingLi Tian
  • Zhigang Zhu
  • Hao Tang
  • Feng Hu
  • Greg Olmschenk

Multimodal Sensor Assisted Navigation Evaluation with Human Subjects

The RPs will be involved in human-subject experiments that evaluate how well these technologies assist visually impaired people with recognition and navigation.


Mentoring Team
  • Tony Ro
  • Zhigang Zhu
  • Lei Ai