The enormous technological evolution of recent years has given us a lot of cool gadgets. These gadgets are not only fun to play with; they have underused potential to become answers to huge challenges our society faces. One of these challenges is communication between deaf and hearing people. During our hackathon, we developed a platform to unite them.
Communicating in our society can be quite challenging for deaf people. Translation difficulties create a gap between them and hearing people, causing prejudices about deaf people and their abilities and skills. This was clearly illustrated by an article in De Standaard, which reported that a deaf doctor was sent by VDAB to work at a sheltered workplace, only because she is deaf.
This made us dream of a world in which being deaf is not a restriction. We wanted to bridge the gap between deaf and hearing people using modern technology. It should be possible to teach hearing people sign language in an interactive way. When sign language becomes widely known, the gap closes and the possibilities become endless.
With this purpose in mind, we started researching technologies that can track gestures. The Leap Motion, a small device aimed at virtual reality applications and especially tailored for gaming, was the best fit. It can track the position and movement of the hands, fingers and phalanges. Its development kit also provides code that generates a real-time 3D rendering of the tracked hand. Imagine the possibilities!
Before implementing our solution, we did extensive research on the deaf community in Flanders. We also contacted Fevlado, the federation of Flemish deaf organizations, and asked them about the concrete problems deaf people face daily. In the end, we wanted to create something truly useful, something that would meet and even exceed expectations.
Our research taught us that sign language is region-bound, just like spoken language: there is no such thing as a universal sign language.
Based on this, we decided to focus initially on VGT (Vlaamse Gebarentaal, Flemish Sign Language), but to build a platform that is extensible and has the potential to support other sign languages and dialects.
Our research also taught us that converting spoken language to text alone is not enough for the majority of deaf people. They would still experience difficulties during meetings, for example, where many people talk at the same time, and it would be very hard to track who said what from the transcripts alone. For a speech-interpretation app, a translation to sign language would benefit a portion of the potential users.
Sign language is complex, so for the sake of the hackathon we decided to start lean and support only the manual alphabet. We recorded the hand gesture for every letter of the alphabet a dozen times; this data formed our initial dataset. By comparing this dataset to the data the Leap Motion generates in real time, we were able to match hand gestures to letters. Using machine learning algorithms, we trained the platform to map gestures to the manual alphabet more accurately.
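The matching step described above can be sketched as a nearest-neighbour classifier over the hand features the Leap Motion produces. Everything in this sketch is illustrative: the feature layout (a flattened vector of fingertip coordinates) and the toy data are assumptions for the example, not the actual hackathon dataset or code.

```python
import numpy as np

def classify_gesture(sample, dataset, labels, k=3):
    """Match a live gesture feature vector against recorded gestures
    using k-nearest neighbours with Euclidean distance."""
    distances = np.linalg.norm(dataset - sample, axis=1)
    nearest = np.argsort(distances)[:k]
    votes = [labels[i] for i in nearest]
    # Majority vote among the k closest recorded gestures.
    return max(set(votes), key=votes.count)

# Toy dataset: each row stands in for a flattened feature vector
# (e.g. fingertip x/y/z coordinates); in practice each letter of the
# alphabet was recorded about a dozen times.
dataset = np.array([
    [0.0, 0.0, 0.0],   # recordings of the letter "a"
    [0.1, 0.0, 0.1],
    [1.0, 1.0, 1.0],   # recordings of the letter "b"
    [0.9, 1.1, 1.0],
])
labels = ["a", "a", "b", "b"]

print(classify_gesture(np.array([0.05, 0.0, 0.05]), dataset, labels, k=2))  # → a
```

A simple distance-based matcher like this works as a first baseline; a trained classifier over the same feature vectors is the natural next step for improving accuracy.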
We implemented different use cases:
We are very proud of what we achieved, but we don't want this to be the end of our story. We keep dreaming of a world where anyone has access to this platform, can use it in their own source and target language, and can help the platform evolve to create a better world for everyone.