The idea of a spatial computing revolution rests on a bedrock capability: seamlessly integrating virtual objects with the real world via an augmented reality interface. Google has now officially rolled out its ARCore Depth API for Android, which could be a starting gun for spatial computing development.
As Rajat Paharia, Product Lead, AR Platform, wrote in a post on Google's developer page, occlusion is the Depth API's key capability: the ability for digital objects to appear accurately behind real-world objects.
It sounds simple, but it has been a real stumbling block for AR, leading to glitchy experiences in which virtual objects appear to hover in front of, or teleport through, physical objects in the real world. Mastering occlusion helps ensure virtual objects feel real, which is key to the technology growing beyond silly social media masks and awful marketing gimmicks.
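The core of occlusion is a per-pixel depth test: given a depth map of the real scene and the rendered depth of a virtual object, the object is drawn only where it is closer to the camera than the real surface. The sketch below illustrates that comparison with made-up numpy arrays; the array values and shapes are illustrative assumptions, not output from any particular depth API.

```python
import numpy as np

# Hypothetical depth maps in meters (distance from the camera).
# In a real pipeline, real_depth would come from a depth API and
# virtual_depth from rendering the virtual object into the scene.
real_depth = np.array([[1.0, 1.0],
                       [3.0, 3.0]])
virtual_depth = np.full((2, 2), 2.0)  # virtual object 2 m away everywhere

# Draw a virtual pixel only where it is nearer than the real surface;
# elsewhere the real world occludes it.
visible = virtual_depth < real_depth
print(visible)  # top row occluded by a 1 m surface, bottom row visible
```

Here the top row of the virtual object is hidden behind a real surface 1 m away, while the bottom row shows in front of a surface 3 m away, which is exactly the "character hides behind your couch" effect described below.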
With the global market of WebAR-compatible devices approaching 3 billion, and as standards continue to improve, mobile AR is set to become a key delivery platform for immersive experiences. There's also a massive move toward always-on AR. Imagine a world in which AR is completely integrated into day-to-day life, rather than one-off marketing or storytelling campaigns. We're still a ways off, but unquestionably occlusion is one of the challenges that will need to be addressed.
Google first previewed ARCore Depth API last year. Its depth-from-motion algorithms generate a depth map with a single RGB camera, like the one found in phones. Google has used the time since that preview to explore the technology with developers across a range of use cases. One of those, the game studio Illumix, has been using occlusion to deepen the realism of one of its gaming experiences, Five Nights at Freddy's AR: Special Delivery. Characters in the game can now hide behind real-world objects and jump out.
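Depth-from-motion works by treating frames captured as the phone moves like a stereo pair: a feature that shifts between two viewpoints can be triangulated into a distance. A minimal sketch of that triangulation, using the standard stereo relation and assumed example numbers (this is the general principle, not Google's actual implementation):

```python
# Triangulating depth from two views of the same point:
#   depth = focal_length * baseline / disparity
focal_length_px = 500.0  # assumed camera focal length, in pixels
baseline_m = 0.05        # assumed camera translation between frames (5 cm)
disparity_px = 10.0      # assumed pixel shift of the feature between frames

depth_m = focal_length_px * baseline_m / disparity_px
print(depth_m)  # 2.5 meters
```

Note the inverse relationship: features that barely move between frames (small disparity) are far away, which is why depth estimates from a single moving camera get noisier with distance, and why dedicated depth sensors can improve quality, as the article notes below.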
Google Creative Lab also experimented with the technology, developing an app called Lines of Play, in which users create domino art in AR and topple the creations, which collide with furniture and other physical barriers in a room.
Paharia points out that improved sensor hardware will significantly expand the capabilities of AR. "While depth sensors, such as time-of-flight (ToF) sensors, are not required for the Depth API to work, having them will further improve the quality of experiences."
Such sensors could shorten scanning times, reduce lag, and help further integrate virtual and real-world experiences.