Movement and Location Notation for American Sign Language
Date
2006
Department
Swarthmore College. Dept. of Linguistics
Type
Thesis (B.A.)
Language
en_US
Terms of Use
Full copyright to this work is retained by the student author. It may only be used for non-commercial, research, and educational purposes. All other uses are restricted.
Abstract
This paper outlines a hybrid notational system for transcribing the
movement and location parameters of signs in American Sign Language
(ASL). This system is designed to aid in the translation of ASL to
English. The paper begins by summarizing the current translation
technology, focusing on the instrumentation-based approach favored by
the New Jersey Institute of Technology. This approach uses a handshape
recognizer and an inertial positioning system to identify the handshape,
orientation, movement, and location components of ASL. The nonmanual
component cannot be measured with the current system. Of these five
parameters, only handshape and orientation are defined adequately by
existing notational systems for identification by instrument-based
approaches. The accepted linguistic descriptions (the Stokoe and
Movement-Hold systems) and several notational systems (HamNoSys,
Labanotation, and Farris) are surveyed, focusing on the assets and deficits
of their methods for representing movement and location with respect to
the inertial positioning system. A hybrid notational system for movement
and location parameters, based on the organization of data imposed by the
instrumentation used in its collection, is outlined.