Google's Spectacular AR Features Are All Set to Hit the Web and Mobile Apps Soon; The Tech Giant Demonstrates Select Features!

Ever since the official launch of ARCore last year, Google has been working to improve it. The tech giant is now set to reveal some advanced upgrades to physics and depth detection.

The depth-related upgrades will soon enable developers to implement occlusion, i.e. real-world objects blocking the view of virtual objects in a scene. For example, a virtual cat placed in your living room will disappear from view if you position the camera so that a real-world object such as a couch or table comes in between. This makes the entire scene look more realistic.

According to Google, this is handled by optimizing existing software, so you will not need a phone with a particular processor or depth sensor. All you have to ensure is that you have a recent Android phone with ARCore support.
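
For developers, that means depth support should come down to configuration rather than hardware. Below is a minimal sketch, assuming the Depth API is exposed through ARCore's standard session Config (the surrounding session and rendering setup is omitted):

```kotlin
// Minimal sketch (not Google's demo code): check whether the current device
// supports ARCore's software-generated depth and turn it on so that virtual
// objects can be occluded by real-world geometry.
import com.google.ar.core.Config
import com.google.ar.core.Session

fun enableDepthIfSupported(session: Session) {
    val config = session.config
    // Depth is computed in software from camera motion, so no special
    // sensor is required, only an ARCore-supported phone.
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    } else {
        config.depthMode = Config.DepthMode.DISABLED
    }
    session.configure(config)
}
```

The key point is that depth becomes a flag checked at runtime rather than a hardware requirement.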

Google has already set up a test environment to demonstrate the new depth technology, and according to various reports it works well. Some of it should already be available, as it rolled out alongside updates to Houzz (a home design app) and Google’s native AR in Search feature.

Furniture items viewed through the Houzz app’s “View in My Room 3D” feature will now support occlusion. Moreover, occlusion will be available on more than 200 million Android devices for objects with an AR model in Google Search.

Apart from the upgrades that have already rolled out, there is much more to the Depth API currently in the works that will not be available to the general audience anytime soon. However, developers might get access sooner than expected, as Google wants to collaborate with them to improve the technology.

These advancements go a step further: Google has essentially created a way for AR objects to engage with the real world more realistically.
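
To give a rough sense of what such interaction builds on, here is an illustrative sketch that samples the per-frame depth image the Depth API provides and checks whether real-world geometry sits in front of a virtual object at a given screen point. The function name isBehindRealWorld and the virtualDepthMeters parameter are hypothetical, and a real occlusion or collision pass would work per pixel or per surface rather than at a single point:

```kotlin
// Illustrative only: read a single depth value from ARCore's depth image
// (16-bit depth in millimetres) to decide whether real-world geometry sits
// in front of a virtual object at screen coordinates (x, y).
import com.google.ar.core.Frame
import com.google.ar.core.exceptions.NotYetAvailableException
import java.nio.ByteOrder

fun isBehindRealWorld(frame: Frame, x: Int, y: Int, virtualDepthMeters: Float): Boolean {
    val depthImage = try {
        frame.acquireDepthImage()
    } catch (e: NotYetAvailableException) {
        return false // Depth data is not ready for this frame yet.
    }
    try {
        val plane = depthImage.planes[0]
        val byteIndex = y * plane.rowStride + x * plane.pixelStride
        val buffer = plane.buffer.order(ByteOrder.nativeOrder())
        // Each pixel stores depth as an unsigned 16-bit value in millimetres.
        val realDepthMeters = (buffer.getShort(byteIndex).toInt() and 0xFFFF) / 1000f
        return realDepthMeters > 0f && realDepthMeters < virtualDepthMeters
    } finally {
        depthImage.close()
    }
}
```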

Google has also come up with a mini-game to demonstrate AR objects’ ability to move through an environment realistically, along with the Depth API’s surface-interaction abilities. It features a cooking robot that gets into a food fight with you, reacts to the furniture and walls of the environment, and responds appropriately to your attacks.

Google has no plans to make these demonstrations available to the general audience just yet, and for now a timeline for the release of these capabilities isn’t known. However, there is a chance that users might get hold of them through different apps and AR web experiences over the next 12 months.
