VR 2017

2017 IEEE Virtual Reality (VR), March 18-22, 2017, Los Angeles, CA, USA


360° Video Cinematic Experience
Conference Papers
Ballroom A, Chair: Gerd Bruder
MR360: Mixed Reality Rendering for 360° Panoramic Videos
Taehyun Rhee, Lohit Petikam, Benjamin Allen, and Andrew Chalmers
(Victoria University of Wellington, New Zealand)
Abstract: This paper presents MR360, a novel immersive system that provides interactive mixed reality (MR) experiences using a conventional low dynamic range (LDR) 360° panoramic video (360-video) shown in head-mounted displays (HMDs). MR360 seamlessly composites 3D virtual objects into a live 360-video, using the input panoramic video itself as the light source that illuminates the virtual objects. Image-based lighting (IBL) is perceptually optimized to produce fast and believable results from the LDR 360-video. The most salient light regions in the input panoramic video are detected to limit the number of lights used to cast perceptible shadows, and the area of each detected light sets the penumbra width of its shadow, yielding realistic soft shadows. Finally, real-time differential rendering synthesizes the illumination of the virtual 3D objects into the 360-video. MR360 thus provides the illusion of interacting with objects in a video that are in fact 3D virtual objects seamlessly composited into the 360-video background. MR360 was implemented in a commercial game engine and tested with various 360-videos. Because the MR360 pipeline requires no pre-computation, it can synthesize an interactive MR scene from a live 360-video stream while delivering the realistic, high-performance rendering required for HMDs.
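The two pipeline steps named in the abstract, detecting salient light regions in the panorama and compositing virtual objects via differential rendering, can be sketched roughly as below. This is an illustrative approximation only, not the authors' implementation: the fixed luminance threshold, the Rec. 709 luminance weights, and both function names are assumptions, whereas the paper's detector is perceptually optimized and its soft-shadow model is more elaborate.

```python
import numpy as np
from scipy import ndimage

def detect_salient_lights(pano_rgb, thresh=0.95):
    """Find bright regions in an LDR panorama (values in [0, 1]).

    Thresholds luminance and groups connected bright pixels into
    individual light sources. Each light's pixel area could then
    drive the penumbra width (larger light -> softer shadow).
    """
    # Rec. 709 luminance from linear RGB (assumed weighting).
    lum = pano_rgb @ np.array([0.2126, 0.7152, 0.0722])
    mask = lum >= thresh
    labels, n_lights = ndimage.label(mask)  # 8-/4-connected components
    return [
        {"centroid": ndimage.center_of_mass(mask, labels, i),
         "area": int((labels == i).sum())}
        for i in range(1, n_lights + 1)
    ]

def differential_composite(background, with_objects, without_objects, obj_mask):
    """Debevec-style differential rendering.

    Adds the objects' lighting effect (shadows, inter-reflections)
    to the video frame as the difference between renders with and
    without the objects, then shows the rendered objects directly
    where they cover the background.
    """
    delta = with_objects - without_objects
    out = np.clip(background + delta, 0.0, 1.0)
    out[obj_mask] = with_objects[obj_mask]
    return out
```

In the paper the `with_objects`/`without_objects` pair corresponds to rendering the local scene proxy with and without the inserted 3D objects under the detected panoramic lighting; the sketch above only shows the per-pixel compositing arithmetic.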
