Google Rolls Out New AR Features For Its Lens App
Copy text, discover new products, and translate foreign languages automatically.
While VR technology may have been absent at this year’s I/O Developer Conference in Mountain View, California, Google did reveal several AR and machine learning-based updates coming soon to Google Lens.
Today, those improvements begin rolling out to iOS and ARCore-compatible devices, offering users a bounty of helpful features designed to assist them throughout their day-to-day lives. For example, activate Google Lens and aim the camera at foreign-language text (Lens can detect more than 100 languages), and the app will instantly identify the language and layer a translation directly on top of the original.
Of course, the update does much more than translate. Aim your camera at a menu and Lens will now search the web for past reviews, highlight the establishment's most popular dishes, and display images and recommendations from Google Maps; aim it at the bill and the app will calculate an appropriate tip and even split the check.
These improvements also open up new possibilities for shopping. Point your camera at certain clothes and furniture and you're given options for similar items; aim it at a barcode and you can access all the relevant information about that particular product. Even when you're not selecting a specific item, Google Lens will automatically surface applicable data.
These AR and machine learning updates are rolling out now and will be available to all users by the end of the week. Google Lens is accessible via Google Assistant and Google Photos on Android, and via the Google app and Google Photos on iOS; certain Android devices, such as the Pixel series, already have Lens built into the camera.