Publication | Open Access
Local Light Field Fusion
Citations: 1K
References: 45
Year: 2019
Realistic Rendering, Engineering, Novel Views, Physics-based Vision, Differentiable Rendering, Virtual Reality, Computational Imaging, Photonics, Light Field Imaging, Virtual Exploration, Machine Vision, Physics, Deep Learning, Computer Vision, Real World Scenes, 3D Vision, Scene Understanding, Extended Reality, Scene Modeling
Prior methods either require dense view sampling or offer little guidance on how to sample views to produce high-quality novel renderings. This study proposes a deep-learning algorithm that synthesizes novel views from an irregular grid of sampled images: each sampled view is expanded into a local light field via a multiplane image (MPI) representation, and novel views are rendered by blending adjacent local light fields. A sampling bound derived from plenoptic theory tells users how densely to capture a given scene, as demonstrated in an AR smartphone app. Applying this bound, the method attains the perceptual quality of Nyquist-rate view sampling while requiring up to 4,000× fewer views.
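The summary describes expanding each sampled view into a local light field via a multiplane image and blending adjacent fields. A minimal sketch of the core MPI rendering step, back-to-front "over" alpha compositing of fronto-parallel RGBA planes, followed by a proximity-weighted blend of two neighboring renderings (the array shapes and function names here are illustrative assumptions, not the paper's actual code):

```python
import numpy as np

def composite_mpi(rgba_planes):
    """Render one view from a multiplane image (MPI).

    Alpha-composites fronto-parallel RGBA planes from back to front
    using the "over" operator. `rgba_planes` has shape (D, H, W, 4),
    ordered back to front, with values in [0, 1]. (Shapes and ordering
    are illustrative assumptions.)
    """
    out = np.zeros(rgba_planes.shape[1:3] + (3,))
    for plane in rgba_planes:
        rgb, alpha = plane[..., :3], plane[..., 3:4]
        # "over" compositing: the nearer plane occludes what is behind it
        out = rgb * alpha + out * (1.0 - alpha)
    return out

def blend_adjacent(render_a, render_b, t):
    """Blend renderings from two neighboring local light fields.

    `t` in [0, 1] is the novel view's normalized position between the
    two source views (0 = at view a, 1 = at view b). A simple linear
    weight is assumed here for illustration.
    """
    return (1.0 - t) * render_a + t * render_b
```

In this sketch, views near one source camera are dominated by that camera's local light field, which is the intuition behind blending adjacent fields rather than rendering from a single global representation.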
We present a practical and robust deep learning solution for capturing and rendering novel views of complex real-world scenes for virtual exploration. Previous approaches either require intractably dense view sampling or provide little to no guidance for how users should sample views of a scene to reliably render high-quality novel views. Instead, we propose an algorithm for view synthesis from an irregular grid of sampled views that first expands each sampled view into a local light field via a multiplane image (MPI) scene representation, then renders novel views by blending adjacent local light fields. We extend traditional plenoptic sampling theory to derive a bound that specifies precisely how densely users should sample views of a given scene when using our algorithm. In practice, we apply this bound to capture and render views of real-world scenes that achieve the perceptual quality of Nyquist rate view sampling while using up to 4000× fewer views. We demonstrate our approach's practicality with an augmented reality smartphone app that guides users to capture input images of a scene and viewers that enable real-time virtual exploration on desktop and mobile platforms.
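The derived sampling bound relates how far apart views may be captured to the number of MPI depth planes. A rough illustration, assuming the standard pinhole disparity relation d ≈ f·b/z and a guideline that the maximum disparity of the nearest scene point between adjacent views should not exceed the number of planes; the function name, parameterization, and exact form are assumptions for illustration, not the paper's equation:

```python
def max_camera_baseline(num_planes, focal_px, z_min):
    """Illustrative view-spacing guideline (an assumption, not the
    paper's derived bound).

    For a point at depth z, the pixel disparity between two views a
    baseline b apart is roughly focal_px * b / z. Capping the nearest
    point's disparity at `num_planes` pixels gives a maximum baseline:
        b_max = num_planes * z_min / focal_px
    """
    return num_planes * z_min / focal_px
```

For example, with 32 planes, a 1000-pixel focal length, and a nearest scene depth of 2 m, this guideline allows adjacent views roughly 6.4 cm apart, far sparser than per-pixel Nyquist-rate sampling of the light field.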