Today I learned that you can identify plants and flowers using only the camera on your iPhone

Sometimes, even as a tech reporter, you can be caught off guard by how quickly technology is improving. Case in point: I only learned today that my iPhone offers a feature I've long wanted – the ability to identify plants and flowers from a photo alone.

It’s true that various third-party apps have offered this feature for years, but the last time I tried them, I was disappointed by their speed and accuracy. And, yes, there are Google Lens and Snapchat’s Scan, but it’s always less convenient to open an app I wouldn’t otherwise use.

But since the introduction of iOS 15 last September, Apple has offered its own version of this visual search feature. It’s called Visual Look Up, and it’s damn good.

It works very simply. Just open a photo or screenshot in the Photos app and look for the blue “i” icon underneath it. If there’s a small sparkling ring around the icon, then iOS has found something in the picture that it can identify using machine learning. Tap the icon, then tap “Look Up,” and it will try to dig up some useful information.

Tapping the “i” icon usually shows you more information about when you took the photo and the camera settings. If the ring sparkles, however, there is also Visual Look Up data to see.

After tapping the “i” icon, you’ll be able to search for more information based on a few select categories.
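For the curious, Apple doesn’t expose Visual Look Up itself through a public API, but its Vision framework ships a built-in image classifier that gives a flavor of the same on-device machine learning. Here’s a minimal sketch using the real `VNClassifyImageRequest` API; the helper name `classifyLabels` and the confidence cutoff are my own choices for illustration, not anything Apple prescribes:

```swift
import UIKit
import Vision

// Illustrative helper (not an Apple API): run the Vision framework's
// built-in classifier on an image and return the confident labels.
func classifyLabels(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
        let labels = (request.results ?? [])
            .filter { $0.confidence > 0.5 }  // keep only confident guesses; 0.5 is an arbitrary cutoff
            .map { $0.identifier }
        DispatchQueue.main.async { completion(labels) }
    }
}
```

Point it at a snapshot of a houseplant and the results will include generic labels like “plant” or “flower” – coarser than what Visual Look Up surfaces, since the species-level lookup happens through Apple’s own service.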

It works not only for plants and flowers but also for landmarks, art, pets, and “other objects.” It’s not perfect, of course, but it surprises me more often than it disappoints. Here are a few more examples from my own camera roll:

Visual Look Up works for landmarks, animals and art, as well as plants and flowers.
Image: The Verge

Although Apple announced this feature at WWDC last year, it hasn’t been widely publicized since. (I only noticed it via a link in one of my favorite tech newsletters, The Overflow.) Even the official support page for Visual Look Up sends mixed messages, telling you in one place that it’s “US only” and then listing other compatible regions on a different page.

Visual Look Up is still limited in availability, but access has expanded since launch. It’s now available in English in the United States, Australia, Canada, the United Kingdom, Singapore, and Indonesia; in French in France; in German in Germany; in Italian in Italy; and in Spanish in Spain, Mexico, and the United States.

It’s a great feature, but it also made me wonder what else visual search could do. Imagine taking a picture of your new houseplant, for example, only for Siri to ask, “Do you want to set reminders for a watering schedule?” Or snapping a picture of a landmark on holiday and having Siri search the web for opening hours and where to buy tickets.

I learned long ago that it’s foolish to pin your hopes on Siri doing anything too advanced. But these are the kinds of features we could eventually get with future AR or VR headsets. Hopefully, if Apple does introduce that sort of functionality, it will make a bigger splash.
