AvatAR: An immersive analysis environment for human motion data combining interactive 3D avatars and trajectories

Publication: Contribution in book/conference proceedings/anthology/report › Contribution in conference proceedings › Contributed

Contributors

Abstract

Analysis of human motion data can reveal valuable insights about the utilization of space and the interaction of humans with their environment. To support this, we present AvatAR, an immersive analysis environment for the in-situ visualization of human motion data that combines 3D trajectories with virtual avatars showing people's detailed movement and posture. Additionally, we describe how visualizations can be embedded directly into the environment, showing what a person looked at or what surfaces they touched, and how the avatar's body parts can be used to access and manipulate those visualizations. AvatAR combines an AR HMD with a tablet to provide both mid-air and touch interaction for system control, as well as an additional overview device to help users navigate the environment. We implemented a prototype and present several scenarios to show that AvatAR can enhance the analysis of human motion data by making data not only explorable, but experienceable.

Details

Original language: English
Title: Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI)
Editors: Simone Barbosa, Cliff Lampe, Caroline Appert, David A. Shamma, Steven Drucker, Julie Williamson, Koji Yatani
Pages: 23:1-23:15
Number of pages: 15
ISBN (electronic): 978-1-4503-9157-3
Publication status: Published - 2 May 2022
Peer-review status: No

External IDs

Scopus 85130554142
Mendeley 9400561e-4605-342e-a42d-fc3fa9f3d89d
dblp conf/chi/ReipschlagerBDM22
unpaywall 10.1145/3491102.3517676
ORCID /0000-0002-2176-876X/work/151435427

Keywords

  • Immersive Analytics, In-situ visualisation, analysing space utilization, augmented/mixed reality, human motion data, motion analysis