Run a Google search on the terms “Millennial” and “purpose” and you’ll find countless articles about the young generation’s desire to help others through their work.
It therefore came as no surprise that many of the notable projects at recent hackathons and pitch competitions, where student entrepreneurs and programmers come together to create their own products (and even companies), are tools built around social causes.
(I wrote about an all natural, low-cost toothbrush here and a cellular-free communications network for refugees here.)
The same can be said for Philadelphia-based Drexel University’s hackathon, where last year’s winning team designed a system that lets hearing-impaired people communicate effectively online.
Translation Based on Computer Vision
Nigel Coelho, a recent graduate from Drexel and a partner at First Round Capital’s college venture capital arm Dorm Room Fund, is the co-founder of Sign Me.
Sign Me aims to provide real-time translation for users of American Sign Language (ASL) using Microsoft’s Kinect motion sensor.
The version that won Drexel’s 2016 hackathon texted the translations to another user’s phone via Twilio’s API, making it well suited to conversations between ASL users and non-ASL users when no human translator is present.
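To make the SMS leg of that pipeline concrete, here is a minimal sketch of how a finished translation could be packaged for Twilio’s Messages REST endpoint. The credentials and phone numbers are hypothetical placeholders, and the request is only constructed, not sent; this is an illustration of the API shape, not the team’s actual code.

```python
from urllib.parse import urlencode

# Hypothetical placeholders -- not real credentials or numbers.
ACCOUNT_SID = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
TO_NUMBER = "+15551230001"    # the non-ASL user's phone
FROM_NUMBER = "+15551230002"  # a Twilio-provisioned number

def build_sms_request(translation: str):
    """Build the URL and form body for Twilio's send-message endpoint.

    Twilio expects a POST of To/From/Body form fields to the Messages
    resource under the account. Sending would also require HTTP basic
    auth with the account SID and auth token.
    """
    url = (
        "https://api.twilio.com/2010-04-01/Accounts/"
        f"{ACCOUNT_SID}/Messages.json"
    )
    body = urlencode({"To": TO_NUMBER, "From": FROM_NUMBER, "Body": translation})
    return url, body

# A recognized sign sequence becomes the text-message body.
url, body = build_sms_request("HELLO HOW ARE YOU")
```

In a deployed version, the tuple above would be POSTed with an HTTP client (or Twilio’s official helper library would wrap this entirely).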
After receiving emails from people in the ASL community about his tool, Nigel soon realized that many people are frustrated with the lack of digital resources for hearing impaired persons.
The array of sign-language translation apps is quite limited, and virtually no tools exist to let hearing-impaired people communicate effectively in the workplace without a human translator.
“Currently, 360 million people worldwide have some form of disabling hearing loss,” Nigel explained, “and employers are 34% less likely to hire an experienced candidate with a disability.”
He added, “The Americans with Disabilities Act prohibits discrimination against individuals with disabilities in all areas of public life.
“Corporate and public entities are obligated to facilitate communication between deaf people and non-deaf people in the workplace.”
For the next step, Nigel is setting his sights on a real-time ASL-to-text video chat application, since existing services like Skype and Google Hangouts do not support ASL translation and do not cater to the needs of the deaf population.
Nigel Coelho and his team members, Kush Patel, Tyler Tran, and Tyler Reynolds, already have a working prototype.
“Rather than hiring human translators, advances in computer vision allow ASL translation to be done quickly and effectively online,” said Nigel.
Nigel turned the venture into his senior design project at Drexel, and together with his team wrote a computer vision algorithm that translates hand gestures into text in real time while the user video chats with someone else.
Essentially, a deaf person can sign into the camera while video chatting with a non-deaf person or a person who is not well versed in ASL, and textual translations will appear on screen in real time.
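One practical detail in a per-frame pipeline like this is turning noisy frame-by-frame sign predictions into stable on-screen captions. The sketch below shows a common smoothing approach, which is an assumption on my part rather than the team’s published algorithm: a sign is emitted as a caption word only after being held for several consecutive frames, and a held sign is emitted once rather than once per frame.

```python
def stream_captions(frame_predictions, hold_frames=3):
    """Turn per-frame sign predictions into caption words.

    frame_predictions: iterable of sign labels (or None when no sign is
    detected), one per video frame, as a hypothetical classifier might
    produce. A label is emitted once it has been held for `hold_frames`
    consecutive frames; immediate repeats of the same word are suppressed.
    """
    caption = []
    run_label, run_len = None, 0
    for label in frame_predictions:
        if label == run_label:
            run_len += 1
        else:
            run_label, run_len = label, 1
        # Emit exactly once, at the moment the hold threshold is reached.
        if run_len == hold_frames and label is not None:
            if not caption or caption[-1] != label:
                caption.append(label)
    return " ".join(caption)

# Four frames of HELLO, two empty frames, then HOW / ARE / YOU held long
# enough to register:
print(stream_captions(
    ["HELLO"] * 4 + [None] * 2 + ["HOW"] * 3 + ["ARE"] * 3 + ["YOU"] * 5
))  # → HELLO HOW ARE YOU
```

In the actual application, the per-frame labels would come from the team’s Kinect-based computer vision model, and the resulting string would be rendered as the on-screen translation.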
The Problem with Translation
At this point, you’re probably wondering: why build a tool for sign language when everyone can communicate by typing in the first place?
Nigel hears this question a lot from people who have seen his team’s demo (myself included), and the answer is more complicated than one might think.
One thing that many people don’t realize is that ASL and English are different languages.
Despite being the predominant sign language of Deaf communities in the U.S., ASL is not a direct, word-to-word translation of the English language, and has a fundamentally different grammatical structure.
As such, those who are born deaf, or those who choose to use ASL as their primary mode of communication, may not internalize the linguistics and syntax of written English. Some of them have difficulty in constructing and comprehending sentences using written text.
*Don Grushkin, Professor of Deaf Studies, provides an excellent breakdown of this problem on Quora.*
Currently, Nigel’s application allows users to communicate using ASL, which bypasses the issue of textual expression.
The potential next step would be to create a real-time text-to-ASL translation that could help overcome the issue of textual comprehension for Deaf people as well.
“Nonetheless, some hearing impaired individuals who are not deaf at birth may learn English and typing at some point in their lives and are able to communicate online that way. In such cases, our application may be a choice for them to use,” said Nigel.
Looking ahead, Nigel sees further applications for his team’s tool beyond just real-time translations.
“Another use case could be making educational tools to train individuals for accuracy in American Sign Language,” he suggested. “This in turn can promote ASL literacy and help further integrate hearing impaired people into the workforce.”
Follow our monthly newsletter here to see how entrepreneurs are applying technology to social impact.