Visual Intelligence on iOS 26 Can Perform On-Screen Searches and Actions

Do more than just a screenshot with Apple Intelligence on your iPhone

by Ajaay Srinivasan

Published 2 days ago

[Image: Craig Federighi standing next to the Visual Intelligence On-Screen Actions banner. Image Credit: Apple (via YouTube)]

Summary

  • Apple's new Visual Intelligence feature on iOS 26 lets users search their on-screen content.
  • It uses Apple Intelligence to identify text, objects, and places and describe what a user is seeing on their iPhone.
  • The feature can recommend similar objects and extract event details to send them to the calendar.

During today's WWDC 2025 Keynote, Apple introduced a host of new Apple Intelligence features. One of the standout additions is an update to Visual Intelligence, which can now perform actions based on what's on an iPhone's screen.

With the new Visual Intelligence feature on iOS 26, users will be able to "search and take action on anything" that's visible on their iPhone screen. For instance, users can ask ChatGPT to describe what's on their screen to learn more about objects or places they're looking at. Apple says the feature will also work with Google, Etsy, and other supported apps to help users find relevant images and similar products.

[Image: Hands holding an iPhone showcasing on-screen Visual Intelligence for screenshots on iOS 26. Image Credit: Apple]

If a user is interested in a product on their screen, they'll be able to "search for that specific item or similar objects online", much like Circle to Search on Android. Similarly, if a user is looking at an event or meeting schedule on their screen, Visual Intelligence will be able to recognise it and suggest adding the event to their calendar. The feature utilises Apple Intelligence to identify key details like the event's date, time, and location before adding it to the calendar.

[Image: Hands holding an iPhone showcasing on-screen Visual Intelligence for adding events to the calendar on iOS 26. Image Credit: Apple]

To use the new on-screen actions, users can press the Volume Up and Power buttons (the same combination as taking a screenshot) and choose to explore more with Visual Intelligence.

The new feature is available for testing through the Apple Developer Program starting today. It's important to note that, being an Apple Intelligence-powered feature, it requires an iPhone 15 Pro, iPhone 15 Pro Max, or any iPhone 16 model running the iOS 26 Developer Beta.

Source

Apple
Ajaay Srinivasan

Guides Editor

Ajaay's love affair with technology started young, with the Nokia N-Gage piquing his interest. Since 2016, he's channeled his passion for tech into crafting explainers and guides on iOS, macOS, Android, social media, privacy & cybersecurity, and AI. When it's time to unplug, Ajaay enjoys playing EAFC, unwinding to music on a pair of open-backs, building his dream audiophile gear, or watching Arsenal struggle to keep a clean sheet.