Depth in Augmented Reality

New Augmented Reality features arrive on Android

Developers of Augmented Reality experiences for Android devices can now make use of the new Depth API that the search giant just released in ARCore 1.18 for Android and Unity, opening up a new set of possibilities.
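As an illustration, here is a minimal Kotlin sketch of how an Android app might check for and enable depth on an ARCore session; the helper name enableDepthIfSupported is hypothetical, and the sketch assumes a Session the app has already created.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session

// Hypothetical helper: turns on the Depth API for an existing ARCore session
// when the device supports it (available from ARCore 1.18 onwards).
fun enableDepthIfSupported(session: Session): Boolean {
    // Not every device can run the depth-from-motion pipeline, so check first.
    if (!session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        return false // fall back to a non-depth experience
    }
    val config = session.config
    config.depthMode = Config.DepthMode.AUTOMATIC
    session.configure(config)
    return true
}
```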

This API uses depth-from-motion algorithms to create depth maps without the need for specialized hardware, although ToF sensors will help improve the quality of the experience. It opens up a new set of possibilities, including occlusion of objects, that is, the ability for virtual objects to be hidden behind real objects with greater precision, practically as if they existed in the real world, along with other depth-based capabilities such as realistic physics, surface interactions, environment traversal and more.
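To make the idea of a depth map concrete, the following Kotlin sketch, modeled on Google's published ARCore depth samples, reads the distance in millimeters at one pixel of the depth image for the current frame; the function name depthMillimetersAt and the assumption that depth mode has already been enabled are illustrative, not part of the announcement.

```kotlin
import java.nio.ByteOrder
import com.google.ar.core.Frame
import com.google.ar.core.exceptions.NotYetAvailableException

// Hypothetical helper: samples the depth map at pixel (x, y) and returns the
// distance to the camera in millimeters, or null if depth is not ready yet.
fun depthMillimetersAt(frame: Frame, x: Int, y: Int): Int? {
    return try {
        val depthImage = frame.acquireDepthImage()
        try {
            // The depth image has a single plane of 16-bit samples.
            val plane = depthImage.planes[0]
            val byteIndex = y * plane.rowStride + x * plane.pixelStride
            val buffer = plane.buffer.order(ByteOrder.nativeOrder())
            // Interpret the sample as an unsigned 16-bit distance in millimeters.
            buffer.getShort(byteIndex).toInt() and 0xFFFF
        } finally {
            depthImage.close()
        }
    } catch (e: NotYetAvailableException) {
        null // depth data can take a few frames to become available
    }
}
```

A renderer can compare such a value against the depth of a virtual object to decide whether to draw it, which is the basic mechanism behind occlusion.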

Since presenting the preview at the end of last year, Google has been working with a number of select collaborators to explore the variety of ways in which depth can be used in Augmented Reality experiences.

As examples of experiences based on the new API, Google points to Illumix, whose game Five Nights at Freddy's AR: Special Delivery uses occlusion to hide characters behind objects; new lenses for Snapchat, including an underwater lens exclusive to Android; and Lines of Play, an experimental Android app that combines occlusion with collisions.

Additionally, Google notes that the new Depth API can also be used to place annotations in Augmented Reality during video calls, citing the TeamViewer Pilot application, a remote assistance solution, which uses depth to anchor annotations in the real environment with greater precision, making them easier for experts to follow during remote maintenance and support tasks.

It only remains to wait for the new experiences to arrive in different applications, whether through lenses, filters, or other routes.