VIDEO: THIS 3D-PRINTABLE ROBOT ARM IS A SIGN LANGUAGE INTERPRETER...
A while back, engineers at the University of California, San Diego developed a prototype they call the "Language of Glove": a Bluetooth-enabled, sensor-packed glove that reads sign language hand gestures and translates them into text.
Now a team from the University of Antwerp is developing a robotic sign language interpreter. The first prototype of the robot hand, named Project Aslan, is mostly 3D-printed and can translate text into fingerspelling gestures – essentially the reverse of the Language of Glove – but the team's ultimate goal is a two-armed robot with an expressive face, able to convey the full complexity of sign language.
There have been a number of technological attempts to bridge the gap between the hearing and deaf communities, including smart gloves and tablet-like devices that translate gestures into text or audio, and even a full-size signing robot from Toshiba.
Project Aslan – which stands for "Antwerp's Sign Language Actuating Node" – is designed to translate text or spoken words into sign language. The Aslan arm is connected to a networked computer, so users can connect to the device over the local network and send it text messages, which the hand then signs. It currently uses fingerspelling, an alphabet system in which each individual letter is communicated through a separate gesture.
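The fingerspelling step described above can be thought of as a lookup: each letter maps to a preset hand pose, i.e. a set of target servo angles. Here is a minimal sketch of that idea in Python; the pose table and angle values are illustrative placeholders, not Aslan's actual calibration.

```python
# Hypothetical text-to-fingerspelling pipeline: each letter maps to a
# fixed hand pose, one target angle per finger servo. The angles below
# are made-up placeholders, not Aslan's real calibration data.

FINGERSPELLING_POSES = {
    # letter: (thumb, index, middle, ring, pinky) servo angles in degrees
    "a": (90, 10, 10, 10, 10),
    "b": (10, 170, 170, 170, 170),
    "c": (60, 90, 90, 90, 90),
    # ... one entry per letter of the alphabet
}

def spell(text):
    """Yield the sequence of (letter, pose) pairs to fingerspell `text`.

    Characters without a defined pose (spaces, punctuation) are skipped;
    a real system would instead insert a pause between words.
    """
    for char in text.lower():
        pose = FINGERSPELLING_POSES.get(char)
        if pose is not None:
            yield char, pose

for letter, pose in spell("Abc"):
    print(letter, pose)
```

Each yielded pose would then be sent to the motor controllers, which move the 16 servos into position before the next letter is signed.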
The robot hand is made up of 25 plastic parts, 3D-printed on an entry-level desktop printer, plus 16 servo motors, three motor controllers, an Arduino Due microcontroller board and a few other electronic components. The plastic parts reportedly take about 139 hours to print, and final assembly of the robot takes another 10. Check out Project Aslan in action in the video below.