Ever since Google launched ARCore last year, an augmented reality software kit for developers, the company has been hard at work tweaking the technology to convince users that the augmented objects they see exist in the real world.
One of the main reasons for skepticism is that virtual objects often disappear as soon as a real object enters the camera's view – or, even less realistically, their digital bodies overlap with real-world tables and beds and appear clumsily pasted on top of them.
A new feature from Google will remedy this issue, according to the company. Dubbed the ARCore Depth API, the tool lets developers use a phone's camera to measure the distance to objects in a scene, as well as how far apart those objects are from one another.
SEE: Executive's guide to the business value of VR and AR (free ebook)
The Depth API takes multiple photos from a single camera as the phone moves around the room, then compares the images to estimate the distance to every pixel in the scene.
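The multi-view approach Google describes is related to classic stereo triangulation: once the camera has moved, the apparent shift of a point between two frames (its disparity) determines how far away it is – nearby objects shift more than distant ones. Below is a minimal, illustrative sketch of that relationship; the function name and numbers are invented for the example and this is not ARCore code:

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Estimate the depth (in metres) of a point seen in two frames.

    focal_length_px: camera focal length, in pixels
    baseline_m: how far the camera moved between the two frames, in metres
    disparity_px: how far the point appears to shift between frames, in pixels
    """
    if disparity_px <= 0:
        return float("inf")  # no measurable shift: effectively at infinity
    return focal_length_px * baseline_m / disparity_px

# A nearby object shifts a lot between frames; a distant one barely moves:
near = depth_from_disparity(focal_length_px=500, baseline_m=0.1, disparity_px=50)
far = depth_from_disparity(focal_length_px=500, baseline_m=0.1, disparity_px=5)
# near is 1.0 m, far is 10.0 m
```

Repeating this test for every pixel, across many frame pairs, yields the kind of dense depth map the API exposes.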
"With a single moving camera, we can give you a 3D understanding of the world," said Konstantine Tsotsos, software engineer at Google. "Once your camera understands 3D space, your content can collide with the world."
Estimating depth is typically done with specialized time-of-flight sensors, which emit artificial light signals and measure how long they take to bounce back from the subject. But Google said the Depth API works with a standard smartphone camera alone – although the addition of depth sensors will improve the quality of the effect.
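The principle behind a time-of-flight sensor is simple physics: the light pulse's round-trip travel time, multiplied by the speed of light and halved, gives the distance. A quick illustrative calculation (the timing value is a made-up example):

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second

def tof_distance(round_trip_seconds):
    """Distance implied by a light pulse's round-trip travel time.

    The pulse covers the camera-to-subject distance twice (out and back),
    hence the division by two.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after roughly 13.3 nanoseconds puts the subject
# about 2 metres from the camera.
d = tof_distance(13.34e-9)
```

The nanosecond timescales involved are why this normally requires dedicated hardware rather than an ordinary camera.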
One key application of the new technology is occlusion: the ability for digital objects to appear accurately in front of or behind real-world objects. Google demonstrated this with a video of one virtual cat perched awkwardly on top of a sofa and another positioned more realistically behind it.
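Conceptually, occlusion comes down to a per-pixel depth comparison: the virtual object is drawn at a pixel only where it is closer to the camera than the real surface the depth map reports there. The sketch below illustrates that test on toy data; the function and values are hypothetical, not part of the ARCore API:

```python
def composite_with_occlusion(camera_pixels, real_depth, virtual_pixels, virtual_depth):
    """Per-pixel occlusion test: show the virtual pixel only where it is
    closer to the camera than the real-world surface at that pixel."""
    output = []
    for cam, rd, virt, vd in zip(camera_pixels, real_depth, virtual_pixels, virtual_depth):
        if virt is not None and vd < rd:
            output.append(virt)  # virtual object sits in front of the real surface
        else:
            output.append(cam)   # real surface occludes the virtual object
    return output

# Real scene: a sofa 2 m away. A virtual cat at 1.5 m shows in front of it;
# the same cat placed at 2.5 m is hidden behind the sofa.
frame = composite_with_occlusion(
    camera_pixels=["sofa", "sofa"], real_depth=[2.0, 2.0],
    virtual_pixels=["cat", "cat"], virtual_depth=[1.5, 2.5])
# frame is ["cat", "sofa"]
```

Without a depth map there is nothing to compare against, which is why AR content has historically been pasted on top of everything in view.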
Last year, Pokemon Go developer Niantic had already experimented with occlusion, releasing a short demo that showed an augmented Pikachu ducking and hiding behind passers-by.
Although the video was recognized as a step forward in augmented realism, occlusion is still far from being the norm in AR. "Content often looks pasted on the screen rather than in the world," said Tsotsos.
In addition to occlusion, the new API will let virtual objects accurately bounce and splash across surfaces, or react to different textures. Developers could, for example, add snow piling up in a scene, or rain splashing against a sidewalk.
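The snow example can be pictured as particles accumulating on a height map derived from the depth data: each flake lands on top of whatever real or already-piled surface sits beneath it. A toy sketch of that idea, with invented names and values rather than any real engine code:

```python
def settle_snow(surface_heights, flake_columns):
    """Accumulate falling 'snow' particles onto a depth-derived height map.

    surface_heights: per-column height of the real surface, taken from the depth map
    flake_columns: column index where each successive snowflake falls
    """
    heights = list(surface_heights)  # copy so the original scene data is untouched
    for col in flake_columns:
        heights[col] += 1  # each flake lands on top of what is already there
    return heights

# A scene with a raised object in the middle column; three flakes fall:
piled = settle_snow([0, 2, 0], [1, 1, 0])
# piled is [1, 4, 0] -- snow piles higher on the raised surface
```

The same depth-aware surface information is what lets objects bounce or splash plausibly instead of passing through real geometry.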
SEE: How 5G will affect augmented reality and virtual reality
Dan Miller, developer at video games company Unity Technologies, said: "It's about trying to pull in as much information as possible from the real world into the digital world, so that you can help augmented objects feel grounded in the real world."
"Depth is just one additional layer to really have these objects feel they exist in the real world."
Earlier this year, Google released a new mode for ARCore called Environmental HDR, which gathers light data through the user's phone camera and extends it into an augmented scene, applying highlights and shadows accordingly.
The new Depth API, however, doesn't seem to be generally available just yet. Developers interested in working with the tool will have to respond to Google's call for collaborators.