Dynascape: Immersive Authoring of Real-World Dynamic Scenes with Spatially Tracked RGB-D Videos
Publication: Conference contribution › Paper › Contributed › Peer-reviewed
Abstract
In this paper, we present Dynascape, an immersive approach to composing and playing back dynamic real-world scenes in mixed and virtual reality. We use spatially tracked RGB-D cameras to capture point-cloud representations of arbitrary dynamic real-world scenes. Dynascape provides a suite of tools for the spatial and temporal editing and composition of such scenes, as well as fine control over their visual appearance. We also explore strategies for spatiotemporal navigation and different tools for the in situ authoring and viewing of mixed- and virtual-reality scenes. Dynascape is intended as a research platform for exploring the creative potential of dynamic point clouds captured with mobile, tracked RGB-D cameras. We believe our work represents a first attempt to author and play back spatially tracked RGB-D video in an immersive environment, opening up new possibilities for incorporating dynamic 3D scenes into virtual spaces.
Details
Original language | English |
---|---|
Pages | 1–12 |
Publication status | Published - 2023 |
Peer-reviewed | Yes |
Conference
Title | ACM Symposium on Virtual Reality Software and Technology 2023 |
---|---|
Short title | VRST 2023 |
Event number | 29 |
Duration | 9–11 October 2023 |
Venue | Te Pae Christchurch Convention Centre |
City | Christchurch |
Country | New Zealand |
External IDs
ORCID | /0000-0002-8923-6284/work/142659862 |
---|---|
ORCID | /0000-0002-3671-1619/work/142659910 |
ORCID | /0009-0005-2434-6318/work/158304154 |
Scopus | 85175258949 |