Deaf people often have trouble understanding others, especially when no American Sign Language (ASL) interpreter is nearby or when the interpreter is outside their line of sight, which can happen at many important moments. Google Glass can now help them follow what a speaker says without an ASL interpreter physically present, thanks to the SignGlasses project, driven by a group at Brigham Young University.
The SignGlasses project aims to help deaf students and is currently conducting experiments at a planetarium, where students wear Google Glass and see the ASL interpreter directly on the device's display.
If the project proves successful, we expect to see it released as an app, which could help many deaf people in difficult situations.
The team is working closely with deaf students and with researchers from Georgia Tech to expand SignGlasses even further.
Another idea they are testing is helping users look up words they don't understand: the user points a finger at the word while pressing the camera button, and Glass displays a video definition of it.
All the information about the project and its results will be made public at the Interaction Design and Children event in June.
You can watch the video below to get a better sense of what the project intends to achieve.