Dynascape: Immersive Authoring of Real-World Dynamic Scenes with Spatially Tracked RGB-D Videos

Publication: Conference contribution › Paper › Contributed › Peer-reviewed

Abstract

In this paper, we present Dynascape, an immersive approach to the composition and playback of dynamic real-world scenes in mixed and virtual reality. We use spatially tracked RGB-D cameras to capture point cloud representations of arbitrary dynamic real-world scenes. Dynascape provides a suite of tools for the spatial and temporal editing and composition of such scenes, as well as fine control over their visual appearance. We also explore strategies for spatiotemporal navigation and different tools for the in situ authoring and viewing of mixed and virtual reality scenes. Dynascape is intended as a research platform for exploring the creative potential of dynamic point clouds captured with mobile, tracked RGB-D cameras. We believe our work represents a first attempt to author and play back spatially tracked RGB-D video in an immersive environment, and that it opens up new possibilities for incorporating dynamic 3D scenes into virtual spaces.

Details

Original language: English
Pages: 1–12
Publication status: Published - 2023
Peer-reviewed: Yes

Conference

Title: ACM Symposium on Virtual Reality Software and Technology 2023
Short title: VRST 2023
Event number: 29
Duration: 9–11 October 2023
Venue: Te Pae Christchurch Convention Centre
City: Christchurch
Country: New Zealand

External IDs

ORCID /0000-0002-8923-6284/work/142659862
ORCID /0000-0002-3671-1619/work/142659910
ORCID /0009-0005-2434-6318/work/158304154
Scopus 85175258949

Keywords