During the iPhone and Apple Watch “Glowtime” event, Apple declared that the iPhone 16, including the base and Pro models, was built from the ground up with Apple Intelligence in mind. This includes updated Apple silicon with an improved Neural Engine, new hardware controls, and changes to the operating system coming as early as next month.
One of the most notable hardware upgrades in all four iPhone 16 models is the Camera Control button, which can trigger Apple Intelligence actions such as adding an event to the calendar from a flyer photo. It will also be possible to look up information about restaurants and dog breeds from images taken with Camera Control, two examples Apple cited in the Glowtime preview.
The iPhone 16 will also include other Apple Intelligence features such as a new conversational Siri, integrated writing tools in all apps, and smart summaries of notifications and emails.
Many of the Apple Intelligence features are part of iOS 18 and will also work on the latest iPads and the iPhone 15 Pro models, but the Action Button, Camera Control button, and faster Neural Engine make the iPhone 16 the best way to experience them.
Apple Intelligence will initially be available only in US English; other English localizations will arrive starting in December, with other languages to follow next year. Also, not all features will be available immediately, as many Apple Intelligence updates will be added over the next few months.
The initial Apple Intelligence features are part of the iOS 18.1 beta available to developers; Apple has stated that they will be available to all users in beta form next month. The main iOS 18 update will be released next Monday, September 16.
Craig Federighi, Apple's senior vice president of Software Engineering, said Apple Intelligence can “understand and create language and images, and take action on your behalf to simplify your daily life, all while being based on your personal context.”
Here is what Apple says Apple Intelligence can do.
Apple is bringing its own version of one of the most powerful AI capabilities, computer vision, to the iPhone 16 in the form of Apple Visual Intelligence. It uses AI to analyze images and perform tasks based on their content, and it works not only on the image itself but also on the text and location information within the image.
In a demo video, Apple showed a person holding up their camera to an event poster on a wall and being offered the chance to add the event to their calendar. They then held their phone up to a dog to identify its breed using Apple Intelligence.
If these features sound familiar, it's because Google has offered something similar through Google Lens. Vision AI is also something you can already use with Claude and ChatGPT on the iPhone. Apple has even stated that integrating Visual Intelligence in its camera app with Google Search and ChatGPT will allow for more detailed responses.
One of the Apple Intelligence features that most people will use all the time is the writing tools. These will be deeply integrated into iOS 18 and accessible from Slack, Messages, the browser, and any other app that requires writing.
Writing tools features include rewriting entire paragraphs for a specific audience as well as simple fixes such as spelling and grammar corrections. They can also be used for proofreading and for turning prose paragraphs into bulleted lists.
The iPhone 16 adds a new default app called Image Playground. This will allow users to use AI to create images of themselves and others, as well as general images for social sharing.
These features will be available in the Messages and Mail apps, along with the ability to use AI to create custom emojis. According to Apple, you simply describe to Siri what you want to see and it will create the image for you.
The most powerful image feature is not generating images but analyzing the images you already have. For example, if you describe in the Photos app a dress you think someone once wore, you can find all the images that show that dress.
What's even more impressive is that it can do this with video as well, finding a specific moment in a video stored in the Photos library.
What caught my eye in the Apple Intelligence demo was the automation element. For example, instead of showing the first line of an email in the preview window, Apple Intelligence shows an AI-generated summary of the email, giving the user a better idea of its content before opening it.
This level of summarization also applies to notifications, where Apple Intelligence automatically summarizes the purpose of a notification, making it easy to know whether it is worth opening or can simply be dismissed. Speaking of notifications, the AI in your phone will also be able to automatically put the most important or urgent notifications at the top of the list.
Siri is one of the most notable upgrades, with a new look and more conversational language capabilities, making it the “face” of Apple Intelligence.
Currently, you speak to the AI assistant as if you were typing a query into Google, but on the iPhone 16, Siri will understand what you are saying even if you change course in the middle of a sentence, and you can speak to it in natural language.
It also works when you are typing a query, providing an experience similar to, if more limited than, ChatGPT and Google Gemini. This is due to the large language model behind it, which has a huge training dataset.
This extends to a deep understanding of the phone and operating system, allowing users to get advice on how to perform functions and tasks directly from Siri.
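That kind of hands-on help depends on Siri being able to reach into apps, and the developer-facing route Apple has pointed to for this is its existing App Intents framework, which the Glowtime presentation itself didn't cover. As a rough, hypothetical sketch (the intent name, parameters, and calendar logic below are illustrative, not Apple sample code), an app might expose an action to Siri like this:

```swift
import Foundation
import AppIntents

// Hypothetical sketch: an app declaring an "add event" action that a more
// capable Siri / Apple Intelligence could invoke on the user's behalf.
struct AddCalendarEventIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Calendar Event"

    @Parameter(title: "Event Name")
    var eventName: String

    @Parameter(title: "Date")
    var date: Date

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would save the event via EventKit here; this sketch
        // only returns a confirmation for Siri to speak or display.
        let when = date.formatted(date: .abbreviated, time: .shortened)
        return .result(dialog: "Added \(eventName) on \(when) to your calendar.")
    }
}
```

Once an intent like this is declared, the system can surface it to Siri and Shortcuts; the richer, more conversational Siri Apple previewed is meant to chain such in-app actions together using your personal context.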