Live subtitles: How smart technology could help deaf people
By William Mager
There are many new technologies that can help people with disabilities, like live subtitling 24/7 for deaf people, but how well do they work?
Deaf people always remember the first time a new technology came on the scene, and made life just that little bit easier in a hearing world.
I’ve had many firsts. Television subtitles, text phones, the advent of the internet and texting all opened up opportunities for me to connect with the wider world and communicate more easily.
So when I first heard about Google Glass – wearable technology that positions a small computer screen above your right eye – I was excited. Live subtitling 24/7, and calling up an in-vision interpreter at the touch of a button. Remarkably, both seemed possible.
That was a year ago. Since then, Tina Lannin of 121 Captions and Tim Scannell of Microlink have been working to make Google Glass for deaf people a reality. They agreed to let me test out their headset for the day.
First impressions are that it feels quite light, but it is difficult to position so that the glass lens is directly in front of your eye.
Once you get it in the “sweet spot” you can see a small transparent screen. It feels as though it is positioned somewhere in the distance, yet it is in sharp focus. The moment you get the screen into that position feels like another first – another moment when the possibilities feel real.
But switching your focus from the screen to what’s going on around you can be a bit of a strain on the eyes. Looking “up” at the screen also makes me look like I’m a bad actor trying to show that I’ve had an idea, or that I’m deep in thought.
The menu system is accessed in two ways. There is a touchpad on the side which can be swiped back and forth, up and down, and you tap to select the option you want.