Dynascape: Immersive Authoring of Real-World Dynamic Scenes with Spatially Tracked RGB-D Videos

Research output: Contribution to conferences › Paper › Contributed › Peer-reviewed

Abstract

In this paper, we present Dynascape, an immersive approach to the composition and playback of dynamic real-world scenes in mixed and virtual reality. We use spatially tracked RGB-D cameras to capture point cloud representations of arbitrary dynamic real-world scenes. Dynascape provides a suite of tools for the spatial and temporal editing and composition of such scenes, as well as fine control over their visual appearance. We also explore strategies for spatiotemporal navigation and different tools for the in situ authoring and viewing of mixed and virtual reality scenes. Dynascape is intended as a research platform for exploring the creative potential of dynamic point clouds captured with mobile, tracked RGB-D cameras. We believe our work represents a first attempt to author and play back spatially tracked RGB-D video in an immersive environment, and it opens up new possibilities for incorporating dynamic 3D scenes into virtual space.

Details

Original language: English
Pages: 1–12
Publication status: Published - 2023
Peer-reviewed: Yes

Conference

Title: ACM Symposium on Virtual Reality Software and Technology 2023
Abbreviated title: VRST 2023
Conference number: 29
Duration: 9–11 October 2023
Location: Te Pae Christchurch Convention Centre
City: Christchurch
Country: New Zealand

External IDs

ORCID /0000-0002-8923-6284/work/142659862
ORCID /0000-0002-3671-1619/work/142659910
ORCID /0009-0005-2434-6318/work/158304154
Scopus 85175258949

Keywords