Learn how to use visual intelligence on your iPhone 16 to gather information about places or objects, identify items around you, interact with text in your surroundings, and more.

Visual intelligence is a part of Apple Intelligence that lets you use the iPhone camera to learn about objects, text, places, and other things around you.
For instance, if I want to know more about the plant on my desk, I can point my iPhone 16 at it and get details using visual intelligence. Or, if I spot a cool backpack in an airport lounge, visual intelligence can perform a web search to find more details or purchasing options for the item.
Additionally, if I point my iPhone at a local business, such as a restaurant, visual intelligence can display operating hours, menus, a quick option to reserve a table or order online, contact information, the website address, and more.
And if I’m in a foreign country, visual intelligence can quickly translate and summarize text on buildings, signboards, and other surfaces in just a few steps.
What you need
Visual intelligence is available on iPhone 16, 16 Plus, 16 Pro, and 16 Pro Max running iOS 18.2 or later with Apple Intelligence set up.
You can’t use visual intelligence on iPhone 15 Pro, 15 Pro Max, or earlier iPhone models since these phones lack the new Camera Control button. However, there are simple alternatives, which we will cover at the end of this tutorial.
Identify an object and search for it on Google
Apple’s visual intelligence uses Google to search for things you point your iPhone at and helps you find similar items. It’s similar to Google Lens, which is available on Android phones and in the Google app on iPhone.
- Click and hold (long-press) the Camera Control on your iPhone 16 to open visual intelligence.
- Now, point your iPhone toward the object as if you were taking a picture. Once the object is in the frame, tap the Search button. You can also take a picture first and then hit Search.
- Visual intelligence will search for the object and show you relevant results within a few seconds. You can tap a search result to visit the corresponding website, and if needed, hit the little Safari icon to open the page in a full browser.
Learn more about an object by asking ChatGPT
Imagine spotting a beautiful flower in a park and wanting to know more about it. Or encountering a lovely dog and being curious about its breed. In these moments, visual intelligence can provide instant information about the world around you.
- Point your iPhone toward an object and click and hold the Camera Control to launch visual intelligence.
- Tap the Ask button on the screen.
- ChatGPT will take a moment and reply with an explanation of what the object is.
- If you have follow-up questions, use the query box to ask for more information.
In my tests with several household objects, I found that it doesn’t always get things right. For instance, it couldn’t identify a small plant I keep on my desk, and instead of saying it couldn’t help, it confidently gave me wrong information, which is worse. So, please do your due diligence before relying on it for crucial details.
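By the way, Apple hasn’t documented how the Ask button talks to ChatGPT behind the scenes. If you’re curious about the general idea, here’s a minimal Swift sketch of the same concept using OpenAI’s public chat completions API. The API key, file name, and model here are placeholder assumptions, not what Apple actually uses:

```swift
import Foundation

// A rough sketch of "ask a vision-capable model about a photo."
// Placeholders: your own API key, a local photo, and the model name.
func askAboutPhoto() async throws {
    let apiKey = "YOUR_OPENAI_API_KEY"                     // hypothetical placeholder
    let imageData = try Data(contentsOf: URL(fileURLWithPath: "plant.jpg"))
    let dataURI = "data:image/jpeg;base64,\(imageData.base64EncodedString())"

    // One text part (the question) plus one image part (the photo).
    let body: [String: Any] = [
        "model": "gpt-4o-mini",
        "messages": [[
            "role": "user",
            "content": [
                ["type": "text", "text": "What is this object?"],
                ["type": "image_url", "image_url": ["url": dataURI]]
            ]
        ]]
    ]

    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)

    // Dig the assistant's reply out of the response JSON.
    if let json = try JSONSerialization.jsonObject(with: data) as? [String: Any],
       let choices = json["choices"] as? [[String: Any]],
       let message = choices.first?["message"] as? [String: Any],
       let answer = message["content"] as? String {
        print(answer)
    }
}
```

The gist: the photo travels as a base64 data URI alongside your question, and the model’s answer comes back as plain text, which is essentially what you see on screen when you tap Ask.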
Get more information about a business, restaurant, shop, or other place
Point your iPhone 16 at a restaurant, shop, famous building, or business, and visual intelligence will provide details like operating hours, contact info, location, and more. For some U.S.-based businesses, you can even view ratings & reviews, make reservations, or order delivery.
- Launch visual intelligence by clicking and holding the Camera Control button on iPhone 16.
- Point your phone at the business and click the Camera Control, hit the on-screen shutter button, or tap the business’s name in the camera viewfinder.
- In addition to showing the options to ask ChatGPT and do a Google search, visual intelligence may show buttons to access the contact information, hours of operation, restaurant menu, directions, and more. Just tap the Schedule, Order, Menu, Reserve, or other buttons on the screen.
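Under the hood, these place cards look a lot like the business data Apple Maps already exposes to developers through MapKit. Apple hasn’t confirmed that visual intelligence uses this exact API, but here’s a rough sketch with MKLocalSearch; the coffee shop name and the San Francisco coordinates are made-up examples:

```swift
import MapKit

// Look up a recognized business the way Apple Maps does.
func lookUpBusiness() {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = "Blue Bottle Coffee"    // hypothetical business name
    request.region = MKCoordinateRegion(
        center: CLLocationCoordinate2D(latitude: 37.7749, longitude: -122.4194),
        latitudinalMeters: 2_000,
        longitudinalMeters: 2_000
    )

    MKLocalSearch(request: request).start { response, _ in
        guard let place = response?.mapItems.first else { return }
        print(place.name ?? "Unknown")
        print(place.phoneNumber ?? "No phone number")      // powers a tap-to-call button
        print(place.url?.absoluteString ?? "No website")
        // place.openInMaps() hands off to Apple Maps for directions.
    }
}
```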
Interact with text in your physical space
Visual intelligence can also analyze text on a storefront, banner, pamphlet, or other object. You can use this to translate foreign words and phrases, summarize long passages, and act on contact information.
For instance, visual intelligence can identify the phone number, address, business hours, or website of a business, and provide quick action buttons to call, get directions in Apple Maps, or visit the website.
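Those quick actions resemble what Foundation’s long-standing data detectors do with plain text. Here’s a minimal sketch, assuming the text has already been recognized from the camera (the sample string is made up):

```swift
import Foundation

// Assume the text-recognition step has already produced this string.
let recognized = "Call (555) 010-9988, visit https://example.com, or stop by 1 Infinite Loop, Cupertino, CA"

// Data detectors pick out actionable items, much like the buttons visual intelligence shows.
let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link, .address]
let detector = try! NSDataDetector(types: types.rawValue)
let range = NSRange(recognized.startIndex..., in: recognized)

for match in detector.matches(in: recognized, options: [], range: range) {
    switch match.resultType {
    case .phoneNumber:
        print("Call:", match.phoneNumber ?? "")
    case .link:
        print("Open:", match.url?.absoluteString ?? "")
    case .address:
        print("Directions to:", match.addressComponents ?? [:])
    default:
        break
    }
}
```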
- Click and hold the Camera Control on iPhone 16 to open visual intelligence and point your phone at the text.
- Click the Camera Control once again, and visual intelligence will interpret the text and show you buttons to translate foreign text, summarize it using Apple Intelligence, read the text aloud, create a calendar event, send an email, and more.
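As for the text-reading step itself, it behaves much like Live Text, which developers can approximate with Apple’s Vision framework. A short sketch of that idea; the function and input image are hypothetical, and Apple hasn’t said visual intelligence uses this exact API:

```swift
import UIKit
import Vision

// Recognize the text in a captured photo, similar to what Live Text does.
func recognizeText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the top candidate for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        print(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate        // favor accuracy over speed
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```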
What if you have an older iPhone?
The dedicated Camera Control on the iPhone 16 series makes it easy to use visual intelligence to learn more about text and objects in your physical world. However, nothing it does is unique or new. So, if you have an older iPhone or even an Android phone, here’s how to do everything that visual intelligence does:
Search for objects on Google: Use Google Lens in the Google app, and it will do essentially the same thing. You can also assign the Action button or Back Tap to search for what’s on the screen.
Ask ChatGPT about an item: You can take a picture directly inside the ChatGPT app and learn more about it. If you have an iPhone 15 Pro or 15 Pro Max with Apple Intelligence set up, you can direct the camera at an object and then say, “Hey Siri, use ChatGPT and tell me what’s on the screen.”
Get information about a restaurant or place: You can simply do a web search for the business name. While using visual intelligence is certainly cooler than a web search, it doesn’t surface details for every restaurant, shop, or other place, so you’re often better off searching the web anyway.
Translate: Use the Camera option inside Apple’s own Translate app or Google Translate to translate foreign words around you.
Interact with text: Photos, Files, Notes, and other iOS apps can already recognize URLs, phone numbers, locations, and email addresses and help you take relevant action when you touch and hold the text in an image or note.
What do you think of visual intelligence on iPhone 16?