An Action-Based Concept for the Phonetic Annotation of Sign Language Gestures
Research output: Contribution to book/Conference proceedings/Anthology/Report › Conference contribution › Contributed › peer-review
Abstract
Communicative actions are specific movements or gestures accomplished by vocal tract articulators (lips, tongue, velum, etc.) for speech, by facial articulators (eyebrows, eyelids, etc.) for co-verbal facial expression, and by other bodily articulators (hands, arms, etc.) for co-verbal gesturing. While action-based approaches already exist for spoken language processing, the aim of this paper is to adopt action theory for signed language processing (i.e. the production and perception of sign language). Method: An action-based method for the phonetic annotation of sign language has been developed, and a 100-sentence American Sign Language corpus has been analyzed using this method. Results: Five basic types of sign actions were identified, all indicating the importance of movement phases even if the goal of a gesture is to reach a specific target (e.g. a specific hand shape, orientation, location, and/or direction). Conclusion: This study is a starting point for investigating sign language production quantitatively in terms of a unified action theory.
Details
Field | Value
---|---
Original language | English
Title of host publication | Elektronische Sprachsignalverarbeitung 2010
Editors | Hansjörg Mixdorff
Publisher | Dresden: TUDpress
Pages | 33-39
Number of pages | 7
ISBN (print) | 978-3-941298-85-9
Publication status | Published - 1 Mar 2010
Peer-reviewed | Yes
Externally published | Yes
Publication series
Field | Value
---|---
Series | Studientexte zur Sprachkommunikation
Volume | 53
ISSN | 0940-6832
External IDs
Field | Value
---|---
ORCID | /0000-0003-0167-8123/work/168716959
Keywords
- Multimodal Communication