Like a pair of sneakers someone’s wearing? Or maybe a dress? There are quite a few apps and services — like Amazon’s Firefly or Samsung’s Bixby Vision — that let you simply point your smartphone camera at an object and search for it or for similar styles. Google is following suit with a comparable feature in Google Lens, but it has the potential to reach far more people.
Google Lens is currently built into the Google Assistant on Android phones, as well as Google Photos. It lets you point the smartphone camera at objects to identify them, learn more about landmarks, recognize QR codes, pull contact information from business cards, and more. At its annual Google I/O developer conference, the search giant announced four new improvements to Lens, and we got to try them out.
SUBSCRIBE FOR THE LATEST VIDEOS
VISIT DIGITAL TRENDS
DT Daily: https://www.youtube.com/playlist?list=PL8110CBCACD741FEC
DT Originals: https://www.youtube.com/playlist?list=PLEA870D36335F60D2
DT Podcasts: https://www.youtube.com/playlist?list=PLZEIwIHCxaFVemFMYm9Uqixqt7RxRJnhf