November 17th, 2015
Roozbeh Jafari, Associate Professor in the Department of Biomedical Engineering at Texas A&M University, is leading the development of a tool for American Sign Language (ASL) translation. While previous attempts at automatic ASL translation have largely relied on cameras and visual tracking technology, Jafari’s project tracks muscle activity and external hand motion. “The sensor is based on EMG, or electromyogram technology,” Jafari said. “Combined with the external motion sensors, which show us the overall hand movement, the EMG allows us to discriminate between gestures,” he said. “A fine-grain of interpretation […] motion sensors give us the overall sense and muscle activities give us information about the fine-grained intent.”
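To make the idea concrete, here is a minimal, purely illustrative sketch of how features from EMG channels (fine-grained muscle activity) and an inertial motion sensor (coarse hand movement) might be fused and fed to a simple gesture classifier. This is not Jafari's implementation: the channel counts, feature choices, synthetic data, and gesture labels below are all assumptions made for illustration.

import numpy as np

def emg_features(window):
    """Summarize a window of multi-channel EMG: mean absolute value and
    zero-crossing count per channel (two common, simple EMG descriptors)."""
    mav = np.mean(np.abs(window), axis=0)
    zero_crossings = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, zero_crossings])

def motion_features(window):
    """Summarize a window of accelerometer/gyroscope samples: per-axis mean
    and standard deviation, capturing the overall hand movement."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def fuse(emg_window, imu_window):
    """Concatenate the two feature vectors: motion gives the coarse trajectory,
    EMG helps discriminate gestures whose trajectories look similar."""
    return np.concatenate([emg_features(emg_window), motion_features(imu_window)])

# --- toy demo on synthetic data (2 gestures, 4 EMG channels, 6 IMU axes) ---
rng = np.random.default_rng(0)

def synthetic_sample(gesture):
    emg = rng.normal(scale=0.5 + gesture, size=(200, 4))     # fake EMG window
    imu = rng.normal(loc=gesture, scale=1.0, size=(200, 6))  # fake IMU window
    return fuse(emg, imu)

# Tiny training set and a nearest-centroid classifier.
train_X = np.array([synthetic_sample(g) for g in (0, 1) for _ in range(20)])
train_y = np.array([g for g in (0, 1) for _ in range(20)])
centroids = np.array([train_X[train_y == g].mean(axis=0) for g in (0, 1)])

def classify(feature_vector):
    """Return the gesture label whose centroid is closest to the fused features."""
    return int(np.argmin(np.linalg.norm(centroids - feature_vector, axis=1)))

print(classify(synthetic_sample(1)))  # expected to print 1 on this toy data

A real system would replace the synthetic windows with streamed sensor data and the nearest-centroid step with a trained model, but the structure shown here (per-sensor feature extraction, then fusion, then classification) is the general pattern the quoted description suggests.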
The prototype was revealed this past June at the Institute of Electrical and Electronics Engineers (IEEE) 12th Annual Body Sensor Networks Conference, …