Dynascape: Immersive Authoring of Real-World Dynamic Scenes with Spatially Tracked RGB-D Videos
Research output: Contribution to conferences › Paper › Contributed › peer-reviewed
Abstract
In this paper, we present Dynascape, an immersive approach to the composition and playback of dynamic real-world scenes in mixed and virtual reality. We use spatially tracked RGB-D cameras to capture point cloud representations of arbitrary dynamic real-world scenes. Dynascape provides a suite of tools for the spatial and temporal editing and composition of such scenes, as well as fine control over their visual appearance. We also explore strategies for spatiotemporal navigation and different tools for the in situ authoring and viewing of mixed and virtual reality scenes. Dynascape is intended as a research platform for exploring the creative potential of dynamic point clouds captured with mobile, tracked RGB-D cameras. We believe our work represents a first attempt to author and play back spatially tracked RGB-D video in an immersive environment, and it opens up new possibilities for incorporating dynamic 3D scenes into virtual spaces.
Details
| Original language | English |
| --- | --- |
| Pages | 1–12 |
| Publication status | Published - 2023 |
| Peer-reviewed | Yes |
Conference
| Title | ACM Symposium on Virtual Reality Software and Technology 2023 |
| --- | --- |
| Abbreviated title | VRST 2023 |
| Conference number | 29 |
| Duration | 9–11 October 2023 |
| Location | Te Pae Christchurch Convention Centre |
| City | Christchurch |
| Country | New Zealand |
External IDs
| ORCID | /0000-0002-8923-6284/work/142659862 |
| --- | --- |
| ORCID | /0000-0002-3671-1619/work/142659910 |
| ORCID | /0009-0005-2434-6318/work/158304154 |
| Scopus | 85175258949 |