Google Lens is getting an upgrade. At Google I/O 2019, Google announced several new features for Lens, including the ability for the tool to work within Search. Now, some of those features are finally rolling out.
Not all of the new features are rolling out right now, but some of the more powerful ones are. For starters, Google Lens will be able to automatically highlight which dishes are popular on a restaurant's menu. All you have to do is point the Google Lens camera at the menu itself, and popular dishes will be highlighted on your phone's screen. You'll also be able to see details about dishes, photos of them, and more. Not only that, but Lens can help split the bill.
The new features are rolling out to Lens users now and are expected to be available to all Lens users on both Android and iOS later this week. Lens can be found in Google Assistant and Google Photos on Android, and in the Google and Google Photos apps on iOS. The feature is also available in the camera app on many Pixel phones. Google Translate is being integrated into Lens, too. With Google Lens, you'll be able to point your camera at text in a foreign language, and a translated version of the text will appear overlaid on the sign, menu, or whatever else you're looking at.
Notably, Lens now has a text-recognition feature: point it at text, and you can then copy and paste that text into other apps and services.
Last but not least, Google Lens is making it a little easier to buy products you see in the real world. With Lens, you'll be able to point the camera at clothing or furniture and see similar items available online. If you can find the product's barcode, you'll be able to see that exact product and where it might be available for purchase, which is a nice touch.