Gesture Recognition using Electromyographic, Spatial and Temporal Input Data

Date
2016
Department
Haverford College. Department of Computer Science
Type
Thesis
Language
eng
Access Restrictions
Open Access
Abstract
With the advancement of technology, electromyographic (EMG) and spatial data recognition is having a growing impact on accessible computing. I designed a customizable script that uses a combination of EMG, spatial, and temporal data, such that each user of the script can select a custom profile of gestures to use for typing. The gestures chosen for each custom profile are determined by the user’s level of ability or disability. Based on the custom profile each user selects and the speed at which each user types, we determine whether EMG, spatial, and temporal data can serve as a viable form of text input. While this research placed a strong emphasis on text input, it also supports the ideal of universal design in other contexts. The Myo armband was used to read and interact with the EMG, spatial, and temporal data. The interface of this script also revealed multiple techniques for scripting with the Myo that allow people with disabilities to use spatial data to type.
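
The profile-driven mapping the abstract describes can be pictured as a small dispatch loop: a user’s profile binds a subset of recognized gestures (an EMG-classified pose combined with a band of arm pitch) to keystrokes, and a hold-duration threshold supplies the temporal channel. The following is a minimal sketch in Python of that idea, not the thesis’s actual script or the Myo SDK’s API; the PoseEvent type, GestureProfile class, band thresholds, and emit_key callback are all hypothetical names introduced here for illustration.

from dataclasses import dataclass
from typing import Callable, Dict, Tuple

# Hypothetical event carrying one sample of armband data: an EMG-classified
# pose name, spatial orientation (arm pitch), and a timestamp.
@dataclass
class PoseEvent:
    pose: str         # e.g. "fist", "wave_in", "fingers_spread"
    pitch: float      # arm pitch in radians, the spatial channel
    timestamp: float  # seconds, the temporal channel (hold duration)

class GestureProfile:
    """Maps (pose, pitch band) pairs to keys per the user's chosen profile."""

    def __init__(self, bindings: Dict[Tuple[str, str], str],
                 hold_seconds: float = 0.3):
        self.bindings = bindings          # (pose, band) -> key
        self.hold_seconds = hold_seconds  # temporal gate against flickers
        self._started: Dict[str, float] = {}

    @staticmethod
    def _band(pitch: float) -> str:
        # Coarse spatial quantization: raised / level / lowered arm.
        if pitch > 0.3:
            return "up"
        if pitch < -0.3:
            return "down"
        return "level"

    def feed(self, event: PoseEvent, emit_key: Callable[[str], None]) -> None:
        key = self.bindings.get((event.pose, self._band(event.pitch)))
        if key is None:
            self._started.pop(event.pose, None)
            return
        start = self._started.setdefault(event.pose, event.timestamp)
        # Temporal condition: the pose must be held before the key fires.
        if event.timestamp - start >= self.hold_seconds:
            emit_key(key)
            self._started.pop(event.pose, None)

# Example profile: a user who can make a fist but not spread their fingers
# binds "fist" at three arm heights to three different letters.
profile = GestureProfile({
    ("fist", "up"): "a",
    ("fist", "level"): "b",
    ("fist", "down"): "c",
})

profile.feed(PoseEvent("fist", 0.5, 0.0), print)   # starts the hold timer
profile.feed(PoseEvent("fist", 0.5, 0.35), print)  # prints "a"

Under this sketch, disability-specific customization is just a different bindings dictionary: a user who cannot perform a given EMG pose simply omits it and distributes more keys across the spatial bands of the poses they can perform.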