Apple's WWDC this year focused on design changes with iOS 26 and Liquid Glass, but we also saw some updates and new announcements for Apple Intelligence, the company's suite of AI features.
New Apple Intelligence announcements build on existing AI-powered features like Writing Tools, Message and Mail summaries, the ChatGPT integration, and others.
Foundation Models Framework
Apple unveiled a way for third-party apps to tap into Apple Intelligence called the Foundation Models Framework. This means developers can use Apple's framework to bring Apple Intelligence's on-device models into their own apps.
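For developers, the framework is accessed from Swift. Here's a rough sketch of what a call might look like; the type and method names (LanguageModelSession, respond(to:)) follow Apple's developer materials but should be treated as assumptions rather than a definitive API reference.

```swift
import FoundationModels

// Hypothetical sketch: ask the on-device model to summarize some text.
// LanguageModelSession and respond(to:) are assumed names based on
// Apple's WWDC materials; the shipping API may differ.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence." // assumed initializer
    )
    let response = try await session.respond(to: text)
    return response.content
}
```

Because the model runs on device, requests like this wouldn't need a network connection or a third-party AI service.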
New AI features for iOS 26
Voicemail features
Apple Intelligence already provides voicemail transcripts, but now it's adding call screening to filter out scammers and Hold Assist, which waits on hold for you and notifies you when you're off hold.
Hold Assist will let you know when you're off hold. Credit: Screenshot: Mashable
Polls and emojis in Messages
Within group chats, you can now create polls, and Apple Intelligence will compile the results. With Genmoji, you can mix emojis together to make new ones, and Image Playground gained a ChatGPT integration, so you can create images with OpenAI's model, too.
AI-assisted polls on iOS 26. Credit: Screenshot: Mashable
Live translation
Apple Intelligence now supports Live Translation, which translates text and voice in real time.
Live Translation for real-time text and voice translation. Credit: Screenshot: Mashable
Visual Intelligence for screenshots
iOS 26 is getting a visual search feature that combines Visual Intelligence with Apple Intelligence. Take a screenshot of whatever app you're looking at, and a new search function appears at the bottom of the screen. It can also recognize details in screenshots, like events, and pre-populate your calendar.
Use screenshots to search visually. Credit: Screenshot: Mashable
This story is developing...