Apple Unveils Visual Intelligence - Its Take on Google Lens

Apple has a new AI-powered feature on the iPhone 16 that turns the camera into a visual search engine. It is similar to Google Lens on Android, but, driven by Apple Intelligence, it also integrates with apps and services running on the phone.

Visual Intelligence is essentially AI vision, where a language model can analyze and understand images. This is something Claude, Gemini, and ChatGPT also excel at.

Apple's approach, which is deeply integrated into the iPhone 16, including access to the new Camera Control button, is likely to be more user-friendly; one example shown during the Glowtime event was adding an event from a poster to the phone's calendar.

Apple Visual Intelligence was one of the announcements that caught my attention at Glowtime; vision AI will likely be the most user-friendly kind of AI feature, as it allows AI to see the world around us.

Some of these vision AI features, such as copying text from an image or identifying the type of animal in a photo, have been available on the iPhone for some time, but Visual Intelligence brings them to the real world via the live camera.
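
For a sense of what that existing on-device capability looks like to developers, here is a minimal sketch using Apple's Vision framework, which exposes the same kind of text recognition that powers copying text from images. This is a developer-facing illustration, not the Visual Intelligence API itself:

```swift
import UIKit
import Vision

// Minimal sketch: extract text from a photo on-device with the Vision
// framework. Illustrative of the long-standing "copy text from an image"
// capability; not the Visual Intelligence API.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the top candidate string for each detected line of text.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```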

Using a combination of on-device and cloud-based processing (through Apple's Private Cloud Compute), the AI model can analyze what the camera is seeing in near real time and provide feedback.
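
Apple has not published how requests are split between the two tiers, but the behavior described (answer locally when possible, escalate to Private Cloud Compute otherwise) resembles a standard on-device-first fallback pattern. A hypothetical Swift sketch, with every type name invented for illustration:

```swift
import CoreGraphics

// Hypothetical sketch of an on-device-first, cloud-fallback flow.
// All types here are invented for illustration; Apple has not published
// a Visual Intelligence API.

struct SceneDescription {
    let summary: String
    let confidence: Double
}

protocol SceneAnalyzer {
    func describe(_ frame: CGImage) async throws -> SceneDescription
}

struct HybridAnalyzer {
    let onDevice: any SceneAnalyzer // runs locally, e.g. on the A18's Neural Engine
    let cloud: any SceneAnalyzer    // Private Cloud Compute; images deleted after use

    func analyze(_ frame: CGImage) async throws -> SceneDescription {
        // Most requests are answered locally and never leave the phone.
        let local = try await onDevice.describe(frame)
        if local.confidence >= 0.8 { return local }
        // Only ambiguous frames are escalated to the cloud tier.
        return try await cloud.describe(frame)
    }
}
```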

How an image is processed depends on the user. For example, if Apple Intelligence identifies an event in an image, the user can add the event to their calendar; if it identifies a dog, it can tell them the breed. Users can also have Apple Intelligence hand off to Google if there is a product they want to purchase.

According to Apple, images captured as part of an Apple Intelligence search are not saved, and images sent to the cloud for deeper analysis are deleted.

Much of the data collected by Apple Intelligence features, including Visual Intelligence, is processed on the device, especially on the iPhone 16 with its powerful new A18 processor; for data that is sent to the cloud, Apple has stated that it will make every effort to protect the information.

This is primarily handled by Private Cloud Compute, a new cloud system built on Apple silicon and a custom version of the iPhone operating system. This architecture not only ensures that no one but the user has access to the data, but is also open to third-party auditing.

If a user opts in to send data to third parties like Google for search or OpenAI's ChatGPT for deeper analysis, the same security cannot be guaranteed, but according to Apple, such sharing is always optional and requires explicit permission; nothing is ever transmitted without it.

Apple Visual Intelligence shows AI the world outside of your phone. You can take a picture of a bag of groceries and have AI generate a recipe, or take a picture of an empty refrigerator and have AI generate a shopping list.

Outside of food, it can be used to live-translate signs, identify potentially dangerous ingredients for people with food allergies, or locate a place from a simple photo.

If you take a picture of a dog, Apple Intelligence will analyze the image and tell you the breed. This also works for spiders and other animals.
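
Apple already ships an on-device image classifier that can attach labels like these. A minimal sketch using the Vision framework's built-in classifier, again as a developer-facing illustration rather than Visual Intelligence itself:

```swift
import UIKit
import Vision

// Minimal sketch: label the contents of a photo with Vision's built-in
// on-device classifier. Illustrative only; not the Visual Intelligence API.
func classify(image: UIImage) throws -> [(label: String, confidence: Float)] {
    guard let cgImage = image.cgImage else { return [] }
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    // Keep only reasonably confident labels, best first.
    return (request.results ?? [])
        .filter { $0.confidence > 0.3 }
        .sorted { $0.confidence > $1.confidence }
        .map { ($0.identifier, $0.confidence) }
}
```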

There are as many use cases as there are types of things to look at. It could be used to look up the history of a building, find a book review, or even get a link to buy a bicycle, which is impressive and logical for a feature built into the iPhone 16.
