Collaborative Illumination Estimation for Mobile Mixed-Reality Devices

Background

Light estimation is a critical component of realistic rendering for virtual scenes. In mixed reality, where virtual and physical worlds are merged, accurate light estimation is especially important: inconsistencies in shadow and highlight rendering between the two domains are particularly noticeable and diminish the intended sense of immersion. Current commercial technologies provide only coarse illumination estimates based on ambient light sensing of a scene’s average pixel values. More advanced academic solutions estimate directional light intensity by combining light transmission samples from scene geometry with machine learning; however, these approaches can be computationally expensive, slow to update, and prone to inaccuracy. A light estimation technique that balances illumination resolution, coverage, and update speed can therefore greatly enhance the user experience on mixed-reality devices.


Invention Description

Researchers at Arizona State University have developed an illumination estimation system that prioritizes computational efficiency and real-time update speed. The approach constructs an environment map (a representative lighting model of the scene) by using specular target objects to generate a radiance sample for each camera image pixel. Building on this technique, a novel networked feature addresses the inherent limitations of single-viewpoint sampling: multiple mobile devices collaboratively sense radiance samples, which are aggregated into a multi-viewpoint, interpolated environment map.
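
As a rough illustration of that data flow, the sketch below merges (direction, radiance) samples contributed by several devices into a shared equirectangular environment map and fills unobserved directions by interpolation. This is a minimal sketch under assumed conventions: the function names, the map layout, and the nearest-neighbor gap filling are illustrative stand-ins, not details of the patented system.

    # Illustrative sketch only: merging radiance samples from several devices
    # into one shared equirectangular environment map.
    import numpy as np

    def direction_to_texel(d, height, width):
        """Map a unit direction vector to a (row, col) texel of an equirectangular map."""
        x, y, z = d
        theta = np.arccos(np.clip(y, -1.0, 1.0))      # polar angle from the +Y (up) axis
        phi = np.arctan2(z, x) % (2.0 * np.pi)        # azimuth in [0, 2*pi)
        row = min(int(theta / np.pi * height), height - 1)
        col = min(int(phi / (2.0 * np.pi) * width), width - 1)
        return row, col

    def aggregate_samples(device_samples, height=64, width=128):
        """Average radiance samples from all devices into one environment map.

        device_samples: one list per device of (direction, rgb) pairs, where
        direction is a unit 3-vector and rgb is a length-3 radiance value.
        """
        accum = np.zeros((height, width, 3))
        count = np.zeros((height, width), dtype=int)
        for samples in device_samples:
            for direction, rgb in samples:
                r, c = direction_to_texel(np.asarray(direction, float), height, width)
                accum[r, c] += np.asarray(rgb, float)
                count[r, c] += 1

        env = np.zeros_like(accum)
        observed = count > 0
        env[observed] = accum[observed] / count[observed][:, None]

        # Crude gap filling: copy each unobserved texel from its nearest observed
        # neighbor (a stand-in for the system's multi-viewpoint interpolation).
        if observed.any() and not observed.all():
            obs_idx = np.argwhere(observed)
            for r, c in np.argwhere(~observed):
                nearest = obs_idx[np.argmin(((obs_idx - [r, c]) ** 2).sum(axis=1))]
                env[r, c] = env[nearest[0], nearest[1]]
        return env

    # Example: two devices each contribute a handful of radiance samples.
    dev_a = [((0.0, 1.0, 0.0), (1.0, 0.9, 0.8)), ((1.0, 0.0, 0.0), (0.2, 0.2, 0.3))]
    dev_b = [((0.0, -1.0, 0.0), (0.05, 0.05, 0.05))]
    env_map = aggregate_samples([dev_a, dev_b])

In practice, each device would convert reflections observed on the specular target into such samples, and the aggregation step above is where the collaborative, multiple-viewpoint coverage benefit appears.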


To evaluate runtime performance and perceptual efficacy, the system was implemented in the Unity 3D game engine and deployed on both Android and iOS devices. When update speed was prioritized, lighting estimation completed in as little as 15 milliseconds, while high-spatial-quality estimates required 200 milliseconds. In a user study spanning 26 augmented scenes, 66.7% of the 99 participants preferred results generated by this invention over those from competing techniques, while 12.6% were indifferent.


Potential Applications

•       Gaming

•       Workplace training

•       Education

•       Real estate

•       Medical imaging


Benefits and Advantages

•       Innovative – First system to use collaborative sensing for illumination estimation; balances operating speed with rendering quality

•       Effective – User study indicates strong preference over existing techniques

•       Comprehensive – Combines samples from multiple viewpoints to create a full environment map

•       Integrative – Framework can work in tandem with existing light estimation techniques


Homepage of Professor Robert LiKamWa


Patent Information: