WE ARE on a mission to help visually impaired people navigate in a sighted world.
About 36 million people worldwide are blind. For many of these people, navigating new spaces can be a cumbersome or frustrating experience as they listen and feel their way around their environment.
AR for VIPs seeks to harness the spatial-mapping power of augmented reality devices to solve "five-meter problems" for users: the difficulty of locating a precise destination, such as a doorway or a sign, in the last few meters of a journey. This could improve independence for blind people by letting them find and read objects such as bus-stop signs without sighted assistance.
SEE HOW IT WORKS
Our project uses augmented reality devices, combining spatial audio cues with synthesized speech, to deliver semantic information about the surroundings to blind and visually impaired people.
Watch our demonstration video below, or find audio descriptions here: https://bit.ly/2VtYl6M
The HoloLens builds a 3D map of the user's environment over time. Its infrared depth sensing produces a triangle mesh that lets applications approximate where surfaces and objects are.
We use the mesh generated by the HoloLens to place digital audio beacons that sonify obstacles in the environment.
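To give a feel for the idea, here is a minimal sketch (not the project's actual HoloLens code) of how an audio beacon can be driven by geometry: the beacon's volume falls off with the listener's distance to a mesh-derived obstacle point, and its stereo pan follows the bearing. All function and parameter names here are hypothetical.

```python
import math

def beacon_gain(listener, obstacle, max_range=5.0):
    """Linear attenuation: 1.0 at the obstacle, 0.0 beyond max_range.
    `listener` and `obstacle` are (x, y, z) points in meters."""
    d = math.dist(listener, obstacle)
    return max(0.0, 1.0 - d / max_range)

def beacon_pan(listener, facing, obstacle):
    """Stereo pan in [-1, 1]: negative means the obstacle is to the
    listener's left. `facing` is a unit vector in the horizontal plane."""
    dx = obstacle[0] - listener[0]
    dz = obstacle[2] - listener[2]
    # Signed angle between the facing direction and the direction to the obstacle.
    angle = math.atan2(dx, dz) - math.atan2(facing[0], facing[2])
    return math.sin(angle)

# A wall point 2.5 m straight ahead: half volume, centered in the stereo field.
print(beacon_gain((0, 0, 0), (0, 0, 2.5)))             # → 0.5
print(beacon_pan((0, 0, 0), (0, 0, 1), (0, 0, 2.5)))   # → 0.0
```

A real implementation would instead hand these positions to the platform's spatial audio engine, which handles 3D attenuation and head-related panning natively.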
We capture images with the HoloLens camera and read semantic information in the environment aloud with the help of Google's text recognition APIs.
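One step in such a pipeline, sketched here under assumed simplifications: once the recognition API returns text fragments, low-confidence noise must be dropped before anything is spoken aloud. The `(text, confidence)` pair format and the function name below are hypothetical, not Google's actual response schema.

```python
def build_utterance(ocr_results, min_confidence=0.6):
    """Filter low-confidence OCR fragments and join the rest into a
    single string for a text-to-speech engine.
    `ocr_results` is a list of (text, confidence) pairs."""
    kept = [text.strip() for text, conf in ocr_results if conf >= min_confidence]
    return " ".join(kept)

# Fragments read off a bus-stop sign; one garbled fragment falls below threshold.
results = [("Route 51B", 0.95), ("q#x", 0.21), ("Downtown Berkeley", 0.88)]
print(build_utterance(results))  # → "Route 51B Downtown Berkeley"
```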
School of Information
We are an interdisciplinary team of Master's students at UC Berkeley's School of Information. Our advisor is UCB faculty member and HCI expert Kimiko Ryokai.
XR Designer, Spatial Audio Composer
XR Engineer, Text Recognition
XR Designer and Engineer, Spatial Mapping
Extended Reality at Berkeley
Extended Reality at Berkeley is a student group dedicated to bringing virtual reality to the campus community. These undergraduates helped develop the original AR for VIPs prototype and are now responsible for taking it in exciting new directions!