Visual Intelligence on iOS 26 Can Perform On-Screen Searches and Actions

Do more than just a screenshot with Apple Intelligence on your iPhone

shot of the Visual Intelligence On-Screen Actions banner with Craig Federighi standing next to it

Image Credit: Apple (via YouTube)

Summary

  • Apple's new Visual Intelligence feature on iOS 26 lets users search their on-screen content.
  • It uses Apple Intelligence to identify text, objects, and places and describe what a user is seeing on their iPhone.
  • The feature can recommend similar objects and extract event details to send them to the calendar.

During today's WWDC 2025 Keynote, Apple introduced a ton of new Apple Intelligence features. One of the standout announcements is an update to Visual Intelligence, which can now perform actions based on what's on an iPhone's screen.

With the new Visual Intelligence feature on iOS 26, users will be able to "search and take action on anything" that's visible on their iPhone screen. For instance, users can ask ChatGPT to describe what's on their screen to find out more about objects or places they're looking at. Apple says this feature will also work with Google, Etsy, and other supported apps to help users find relevant images and similar products.

shot of hands holding an iPhone showcasing the on-screen visual intelligence for screenshots on iOS 26
Image Credit: Apple

If a user is interested in a product on their screen, they'll be able to "search for that specific item or similar objects online", similar to how Circle to Search works on Android. Likewise, if a user is looking at an event or meeting schedule on their screen, Visual Intelligence will recognise it and suggest adding the event to their calendar. The feature utilises Apple Intelligence to identify key details like the date, time, and location before adding the event.

shot of hands holding an iPhone showcasing the on-screen visual intelligence for adding events to calendar on iOS 26
Image Credit: Apple

To use the new on-screen actions, users can press the Volume Up and Power buttons (the same combination as taking a screenshot) and then choose to explore more with Visual Intelligence.

The new feature is available for testing through the Apple Developer Program, starting today. It's important to note that, being an Apple Intelligence-powered feature, it requires an iPhone 15 Pro, iPhone 15 Pro Max, or any iPhone 16 model running the iOS 26 Developer Beta.

Source

Apple
