3D [Embodied] is a mixed reality performance that uses a virtual world as a platform to explore 3D immersive spatial virtual and physical displays. By combining real-time perspective transformations of the projected peripheral geometry with skeleton tracking of the dance performer, 3D [Embodied] creates a spatial augmented reality experience. Both the video and audio rendering are generated in real time.
3D [EMBODIED]
3D [Embodied] investigates 3D immersive spatial embodied design principles. It materializes the core concept of this study by investigating 3D video mapping interactions as an extended agent for the dance performer, connecting real-time interactions between 3D geometry perspective calculations and the body in motion.
Using Quartz Composer's node-based visual programming and 3D transformation graphic animation patches, the input from the Kinect camera is mapped onto the geometry of a 3D shape or object. By transforming the generative properties in the node, such as direction (along the x, y, and z axes), speed, and scale, the user can control and transform the 3D objects in real time.
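The mapping above is built visually in Quartz Composer rather than in code, but the underlying idea can be sketched in a few lines. The following is a minimal, hypothetical illustration (function and parameter names are ours, not from the actual patch): a normalized joint position from the skeleton tracker drives the direction, speed, and scale properties of a 3D object.

```python
def map_joint_to_transform(joint, base_speed=1.0):
    """Map a normalized (x, y, z) joint position, each in [-1, 1],
    to the generative transform properties of a 3D object.

    This is an illustrative sketch of the Kinect-to-geometry mapping,
    not the actual Quartz Composer patch logic."""
    x, y, z = joint
    return {
        # the rotation/translation axis follows the tracked joint
        "direction": (x, y, z),
        # lateral movement speeds up the animation
        "speed": base_speed * (1.0 + abs(x)),
        # depth (distance from the sensor) drives object scale,
        # clamped so the object never collapses to zero
        "scale": max(0.1, 1.0 + 0.5 * z),
    }

# Example: a hand tracked right of center, slightly low, toward the sensor
params = map_joint_to_transform((0.5, -0.2, 0.4))
```

Each frame of skeleton data would re-evaluate this mapping, so the projected geometry continuously responds to the dancer's body.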
The technical and conceptual research developed was published at ImmersiveMe ’13 ACM: dl.acm.org/citation.cfm?id=2512147
Abstract
Dance performance interconnects the notions of space and movement, providing the ideal framework for designing and researching immersive relations between the virtual and physical worlds: mixed reality performance. We begin by discussing conceptual approaches to immersive and mixed reality environments and reporting on current state-of-the-art solutions. We then present two proof-of-concept systems and frameworks that implement spatial augmented reality for dance performance. Our systems establish a real-time dialogue and interaction between 3D geometry and perspective grid calculations and body movement. We also present the design and implementation of our framework, outlining relevant problems and solutions we encountered in the process. Finally, the frameworks and design processes were validated through a field experiment, the performance itself, after which we gathered and analyzed feedback from the performers, the audience and the choreographer.
The output was rendered individually in Quartz Composer and assembled into the final shape of the stage design using MadMapper, which allowed the perspective transformation and warping of each plane to be adjusted individually.
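MadMapper performs this per-plane warping internally, but the core idea of "corner pinning" a rendered texture onto a physical surface can be sketched simply. The function below (our own illustration, not MadMapper's implementation) bilinearly maps a point from the unit square of the rendered output onto an arbitrary quadrilateral defined by four adjusted corners; a full projective warp would use a homography instead, but the corner-pin idea is the same.

```python
def warp_point(u, v, corners):
    """Bilinearly map a point (u, v) in the unit square onto a quad.

    corners: four (x, y) pairs in projector space, ordered
    [top-left, top-right, bottom-right, bottom-left].
    Illustrative sketch of per-plane corner pinning; real mapping
    software applies a projective (homography) warp.
    """
    tl, tr, br, bl = corners
    # interpolate along the top and bottom edges
    top = (tl[0] + (tr[0] - tl[0]) * u, tl[1] + (tr[1] - tl[1]) * u)
    bot = (bl[0] + (br[0] - bl[0]) * u, bl[1] + (br[1] - bl[1]) * u)
    # then interpolate between those edges
    return (top[0] + (bot[0] - top[0]) * v, top[1] + (bot[1] - top[1]) * v)

# Example: pin the unit square onto a 2x2 quad; its center maps to (1, 1)
center = warp_point(0.5, 0.5, [(0, 0), (2, 0), (2, 2), (0, 2)])
```

Dragging any of the four corners re-registers the rendered plane against its physical counterpart on the set, which is how the separate Quartz Composer outputs were aligned to the stage geometry.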
Technical description:
Concept / Interactive Design / Video Design / Stage Design by João Beira
Choreography by Yacov Sharir
Sound design by Bruce Pennycook
Dancers: Tawny Jessica, Lily Amelie Hayes, Billie Rose Secular and Reema Bounajem.
Technical support by Sebastian Kox (oneseconds.com), Yago de Quay and Marta Ferraz. Austin, 2013.
Nomination for Best Video Design 2013 by the Austin Critics Table Awards.
Presented at the Payne Theatre at the University of Texas at Austin, April 18-23, 2013.