Reports suggest that Google, as part of its Glass Project, is developing technology that will enable blind people to see through a sight-to-sound translation mechanism. The concept in use here is Sound Navigation and Ranging, or sonar, which is used by submarines and also by marine animals like dolphins to determine the positions of target objects. Let’s see how this works:
Active sonar creates a pulse of sound, or ping, which is directed towards the target; the system then listens for the reflection, or echo, of the pulse. This pulse is electronically generated using a sonar projector, which consists of a power amplifier, a signal generator and an electro-acoustic transducer. The distance to the target is found by measuring the time from pulse transmission to echo reception: half the round-trip time multiplied by the speed of sound gives the range.
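The ranging calculation described above is simple enough to sketch in a few lines of Python. This is an illustrative example, not Google's implementation; the function name and the choice of air as the medium are assumptions for the sketch.

```python
# Hypothetical sketch of active-sonar ranging: the distance to a target
# is half the round-trip echo time multiplied by the speed of sound.

SPEED_OF_SOUND_AIR = 343.0  # metres per second in air at ~20 °C (assumed medium)

def echo_range(round_trip_seconds: float,
               speed_of_sound: float = SPEED_OF_SOUND_AIR) -> float:
    """Return the one-way distance to the target in metres."""
    # Divide by two because the pulse travels to the target and back.
    return (round_trip_seconds * speed_of_sound) / 2.0

# Example: an echo returning after 0.02 s puts the target about 3.43 m away.
print(echo_range(0.02))  # 3.43
```

Submarine sonar would use the much higher speed of sound in seawater (roughly 1,500 m/s), but the calculation is the same.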
Google Glass already has a camera to see the target object and headphones to play sounds, but it still needs hardware to transmit a pulse towards the target and receive the reflected echo. Thus, we can expect some sight-to-sound translator device that connects with Google Glass.
The tech community is eagerly waiting to see how this project turns out. The anticipation is not just because of the sophistication embedded in the project, but because it intends to give the visually impaired a chance to see. We can only imagine how this revolutionary technology could change the lives of millions of people.