Google Teases New AR Features For Its Search & Lens Apps
Google focuses on practical AR functionality at its annual developer conference.
Google’s annual I/O Developer Conference kicked off earlier this morning at the Shoreline Amphitheatre in Mountain View, CA, with a two-hour keynote highlighting a wide range of updates to the company’s lineup of proprietary mobile software.
During the presentation, Google CEO Sundar Pichai took the time to showcase a variety of new camera and AR features coming to both the Google Search and Google Lens applications.
One such improvement is new AR functionality for Google Search that will let users view 3D animated models in augmented reality without switching to a separate dedicated app.
Google demoed the new feature live on-stage by searching for a great white shark and opening an augmented 3D animated model in a matter of seconds, all without leaving the Google Search app.
Google Lens is also receiving an upgrade in the form of several practical AR-based tools. The company showed how users can point their camera at a restaurant’s menu to have Lens instantly highlight the menu’s most popular items and provide additional information on the dishes. When pointed at a check, Lens can also calculate the tip automatically and even split the bill.
Google has also brought live translation to Google Go, its lightweight search app, allowing users to point their cameras at text to instantly convert it into their preferred language. There’s also the option to hear the translated text read aloud.
AR compatibility between Lens and participating publications—such as AR-enabled cookbooks that project animated visuals directly over the text—is on the way as well.
The Google I/O Developer Conference runs May 7th through May 9th and will cover everything from Android and Pixel to AR and Wear OS. We’ll be staying up-to-date with all the major announcements as they’re revealed.