[1]
Apple tiptoes with modest AI updates while rivals race ahead
On Monday, Apple announced a series of incremental Apple Intelligence updates at its annual Worldwide Developers Conference, focusing on practical features like live phone call translation and visual search rather than the ambitious race for AI breakthroughs that rivals have been promoting. Notably absent was any concrete update on the much-needed "more personalized" Siri that Apple first announced at last year's WWDC but has yet to demo publicly or describe in detail. (Siri still feels woefully outdated next to ChatGPT's Advanced Voice Mode, for example.) In our WWDC keynote preview last week, we pointed out that Apple has faced intense pressure to deliver on AI after overpromising features it wasn't ready to launch, a controversy that led to a reshuffle of the executives handling Apple's AI efforts. But that delivery did not materialize today. Reuters notes that Apple shares closed down 1.2 percent on Monday, potentially reflecting investor disappointment. "In a moment in which the market questions Apple's ability to take any sort of lead in the AI space, the announced features felt incremental at best," said Thomas Monteiro, senior analyst at Investing.com, in an interview with the news agency. "It just seems that the clock is ticking faster every day for Apple."

Incremental AI updates

The modest scope of Monday's AI announcements suggests Apple took a more cautious approach this time, baking improvements into various apps and OSes rather than making sweeping, high-level AI announcements. (Instead of AI, Apple's big marquee announcements at WWDC '25 centered on a graphical design update called "Liquid Glass" and a shift to year-based naming for its operating systems.) For example, the iPhone maker unveiled a feature named "Call Screening," which automatically answers calls from unknown numbers and transcribes the caller's purpose. It also introduced updates to Visual Intelligence, which helps users find products similar to those they photograph or see on their screens and links them to shopping apps. During Monday's keynote video, Apple demonstrated finding a jacket online and using the feature to locate similar items for sale in apps already installed on the user's device.

Developers, developers, developers?

Being the Worldwide Developers Conference, it seems appropriate that Apple also announced it would open access to its on-device AI language model to third-party developers, and that it would integrate OpenAI's code completion tools into its Xcode development software. "We're opening up access for any app to tap directly into the on-device, large language model at the core of Apple Intelligence," said Craig Federighi, Apple's software chief, during the presentation. The company also demonstrated early partner integration by adding OpenAI's ChatGPT image generation to its Image Playground app, though it said user data would not be shared without permission. For developers, Apple's inclusion of ChatGPT's code-generation capabilities in Xcode may represent an attempt to match what rivals like GitHub Copilot and Cursor offer software developers in terms of AI coding augmentation, even as the company maintains a more cautious approach to consumer-facing AI features. Meanwhile, competitors like Meta, Anthropic, OpenAI, and Microsoft continue to push more aggressively into the AI space, offering AI assistants (that admittedly still make things up and suffer from other issues, such as sycophancy). Only time will tell whether Apple's wariness to embrace the bleeding edge of AI will be a curse (eventually labeled a blunder) or a blessing (lauded as a wise strategy). Perhaps, in time, Apple will step in with a solid and reliable AI assistant that makes Siri useful again. But for now, Apple Intelligence remains more of a clever brand name than a concrete set of notable products.
[2]
Here are Apple's top AI announcements from WWDC 2025 | TechCrunch
Last year, Apple's WWDC keynote highlighted the company's ambitious strides in AI. This year, the company toned down its emphasis on Apple Intelligence and concentrated on updates to its operating systems, services, and software, introducing a new aesthetic it calls "Liquid Glass" along with a new naming convention. Nevertheless, Apple still attempted to appease the crowd with a few AI-related announcements, such as an image analysis tool, a workout coach, a live translation feature, and more.

Visual Intelligence is Apple's AI-powered image analysis technology that allows you to gather information about your surroundings. For example, it can identify a plant in a garden, tell you about a restaurant, or recognize a jacket someone is wearing. Now the feature will be able to interact with the information on your iPhone's screen. For instance, if you come across a post on a social media app, Visual Intelligence can conduct an image search related to what you see while browsing. The tool performs the search using Google Search, ChatGPT, and similar apps. To access Visual Intelligence, open the Control Center or customize the Action button (the same button typically used to take a screenshot). The feature becomes available with iOS 26 when it launches later this year.

Apple integrated ChatGPT into Image Playground, its AI-powered image generation tool. With ChatGPT, the app can now generate images in new styles, such as "anime," "oil painting," and "watercolor." There will also be an option to send a prompt to ChatGPT to let it create additional images.

Apple's latest AI-driven workout coach is exactly what it sounds like: it uses a text-to-speech model to deliver encouragement while you exercise, mimicking a personal trainer's voice. When you begin a run, the AI within the Workout app provides you with a motivational talk, highlighting key moments such as when you ran your fastest mile and your average heart rate. After you've completed the workout, the AI summarizes your average pace, heart rate, and whether you achieved any milestones.

Apple Intelligence is powering a new live translation feature for Messages, FaceTime, and phone calls. This technology automatically translates text or spoken words into the user's preferred language in real time. During FaceTime calls, users will see live captions, whereas for phone calls, Apple will translate the conversation aloud.

Apple has introduced two new AI-powered features for phone calls. The first, called call screening, automatically answers calls from unknown numbers in the background, allowing users to hear the caller's name and the reason for the call before deciding whether to answer. The second, hold assist, automatically detects hold music when waiting for a call center agent. Users can choose to stay connected while on hold, freeing them to use their iPhone for other tasks, and notifications will alert them when a live agent becomes available.

Apple also introduced a new feature that allows users to create polls within the Messages app. It uses Apple Intelligence to suggest polls based on the context of your conversations. For instance, if people in a group chat are having trouble deciding where to eat, Apple Intelligence will recommend starting a poll to help land on a decision.

The Shortcuts app is becoming more useful with Apple Intelligence. The company explained that when building a shortcut, users will be able to select an AI model to enable features like AI summarization.

A minor update is coming to Spotlight, the on-device search feature for Mac. It will now incorporate Apple Intelligence to improve its contextual awareness, offering suggestions for actions that users typically perform, tailored to their current tasks.

Apple is now allowing developers to access its AI models even when offline. The company introduced the Foundation Models framework, which enables developers to build more AI capabilities into their third-party apps that utilize Apple's existing systems. This is likely intended to encourage more developers to create new AI features as Apple competes with other AI companies.

The most disappointing news to emerge from the event was that the much-anticipated developments for Siri aren't ready yet. Attendees were eager for a glimpse of the promised AI-powered features that were expected to debut. However, Craig Federighi, Apple's SVP of Software Engineering, said they won't have more to share until next year. This delay may raise questions about Apple's strategy for the voice assistant in an increasingly competitive market.
[3]
Apple Is Pushing AI Into More of Its Products -- but Still Lacks a State-of-the-Art Model
Among the buzzier AI announcements at the event was Live Translation, a feature that translates phone and FaceTime calls from one language to another in real time. Apple also showed off Workout Buddy, an AI-powered voice helper designed to provide words of encouragement and useful updates during exercise. "This is your second run this week," Workout Buddy told a jogging woman in a demo video. "You're crushing it." Apple also announced an upgrade to Visual Intelligence, a tool that uses AI to interpret the world through a device's camera. The new version can also look at screenshots to do things like identify a product or summarize a webpage. Apple showcased upgrades to Genmoji and Image Playground, two tools that generate stylized images with AI. And it showed off ways of using AI to automate tasks, generate text, summarize emails, edit photos, and find video clips. The incremental announcements did little to dispel the notion that Apple is playing catch up on AI. The company does not yet have a model capable of competing with the best offerings of OpenAI, Meta, or Google, and still hands some challenging queries off to ChatGPT. Some analysts suggest that Apple's more incremental approach to AI development is warranted. "The jury is still out on whether users are gravitating towards a particular phone for AI driven features," says Paolo Pescatore, an analyst at PP Foresight. "Apple needs to strike the fine balance of bringing something fresh and not frustrating its loyal core base of users," Pescatore adds. "It comes down to the bottom line, and whether AI is driving any revenue uplift." Francisco Jeronimo, an analyst at IDC, says Apple making its AI models accessible to developers is important because of the company's vast reach with coders. "[It] brings Apple closer to the kind of AI tools that competitors such as OpenAI, Google and Meta have been offering for some time," Jeronimo said in a statement. 
Apple's AI models, while not the most capable, run on a personal device, meaning they work without a network connection and don't incur the fees that come with accessing models from OpenAI and others. The company also touts a way for developers to use cloud models that keeps private data secure through what it calls Private Cloud Compute. But Apple may need to take bigger leaps with its use of AI in the future, given that its competitors are exploring how the technology might reinvent personal computing. Both Google and OpenAI have shown off futuristic AI helpers that can talk in real time and see the world through a device's camera. Last month, OpenAI announced it would acquire a company started by the legendary Apple designer Jony Ive in order to develop new kinds of AI-infused hardware.
[4]
Apple Intelligence Takes a Backseat at WWDC 2025
Apple Intelligence was the focal point of WWDC 2024, with CEO Tim Cook promising it would "transform what users can do with our products -- and what our products can do for our users." But after an underwhelming rollout, excitement around Apple Intelligence was more muted at WWDC 2025 today. During an opening keynote, Craig Federighi, Apple's SVP of Software Engineering, said Apple needs "more time to reach a high-quality bar" for an AI-enhanced Siri, reiterating what Cook said during a recent earnings call. Apple will share more on that "in the coming year," he said. Instead, the big Apple Intelligence announcement is the company giving developers access to Apple's on-device large language model (LLM). Federighi highlighted two apps that will integrate Apple Intelligence. Kahoot, for example, will be able to create study guides, while the AllTrails camping app could suggest a hike based on what you tell it you want. Apple also says Automattic will add it to its Day One journaling app for "privacy-centric intelligence features." "Developers play a vital role in shaping the experiences customers love across Apple platforms," Susan Prescott, Apple's VP of Worldwide Developer Relations, said in a statement. "With access to the on-device Apple Intelligence foundation model and new intelligence features in Xcode 26, we're empowering developers to build richer, more intuitive apps for users everywhere." The Foundation Models framework has native support for Swift, "so developers can easily access the Apple Intelligence model with as few as three lines of code," Apple says. Apple also plans to expand language support for Apple Intelligence, and some OS updates will use the company's AI, like Workout Buddy in watchOS 26. But overall, OpenAI and Google don't have much to worry about just yet. When it launched in October, Apple Intelligence had only a few of its promised features, including text-to-Siri, summarized notifications, and Writing Tools. 
Notification summaries got messy, prompting Apple to pause them for news-related content. Siri's upgrades were also rather subtle, which resulted in an executive shuffle. Overall, the piecemeal releases failed to impress. Some iPhone 16 users sued for false advertising.
[5]
Apple Intelligence at WWDC: Everything Apple announced for iOS, macOS and more
It's safe to say Apple Intelligence hasn't landed in the way Apple likely hoped it would. However, that's not stopping the company from continuing to iterate on its suite of AI features. During its WWDC 2025 conference on Monday, Apple announced a collection of new features for Apple Intelligence, starting with upgrades to Genmoji and Image Playground. In Messages, for instance, you'll be able to use Image Playground to generate colorful backgrounds for your group chats. At the same time, Apple has added integration with ChatGPT to the tool, meaning it can produce images in entirely new styles. As before, if you decide to use ChatGPT directly through your iPhone in this way, your information won't be shared with OpenAI without your permission. Separately, Genmoji will allow users to combine two emoji from the Unicode library to create new characters. For example, you might merge the sloth and light bulb emoji if you want to poke fun at yourself for being slow to understand a joke. Visual Intelligence is also in line for an upgrade. Now, in addition to working with your iPhone's camera, the tool can scan what's on your screen. Like Genmoji, Visual Intelligence will also benefit from deeper integration with ChatGPT, allowing you to ask the chatbot questions about what you see. Alternatively, you can search Google, Etsy and other supported apps to find images or products that might be a visual match. And if the tool detects when you're looking at an event, iOS 26 will suggest you add a reminder to your calendar. Nifty that. If you want to access Visual Intelligence, all you need to do is press the same buttons you would to take a screenshot on your iPhone. As expected, Apple is also making it possible for developers to use its on-device foundational model for their own apps.
"With the Foundation Models framework, app developers will be able to build on Apple Intelligence to bring users new experiences that are intelligent, available when they're offline, and that protect their privacy, using AI inference that is free of cost," the company said in its press release. Apple suggests an educational app like Kahoot! might use its on-device model to generate personalized quizzes for users. According to the company, the framework supports Swift, Apple's own coding language, and the model is as easy as writing three lines of code.
[6]
Apple Intelligence announcements at WWDC: Everything Apple revealed for iOS, macOS and more
Apple Intelligence hasn't landed in the way Apple likely hoped it would, but that's not stopping the company from continuing to iterate on its suite of AI tools. During its WWDC 2025 conference on Monday, Apple announced a collection of new features for Apple Intelligence, starting with upgrades to Genmoji and Image Playground that will arrive alongside iOS 26 and the company's other updated operating systems. In Messages, you'll be able to use Image Playground to generate colorful backgrounds for your group chats. At the same time, Apple has added integration with ChatGPT to the tool, meaning it can produce images in entirely new styles. As before, if you decide to use ChatGPT directly through your iPhone in this way, your information will only be shared with OpenAI if you provide permission. Separately, Genmoji will allow users to combine two emoji from the Unicode library to create new characters. For example, you might merge the sloth and light bulb emoji if you want to poke fun at yourself for being slow to understand a joke. Across Messages, FaceTime and its Phone app, Apple is bringing live translation to the mix. In Messages, the company's on-device AI models will translate a message into your recipient's preferred language as you type. When they respond, each message will be instantly translated into your language. In FaceTime, you'll see live captions as the person you're chatting with speaks, and over a phone call, Apple Intelligence will generate a voiced translation. Visual Intelligence is also in line for an upgrade. Now, in addition to working with your iPhone's camera, the tool can scan what's on your screen. Like Genmoji, Visual Intelligence will also benefit from deeper integration with ChatGPT, allowing you to ask the chatbot questions about what you see. Alternatively, you can search Google, Etsy and other supported apps to find images or products that might be a visual match.
And if the tool detects when you're looking at an event, iOS 26 will suggest you add a reminder to your calendar. Nifty that. If you want to access Visual Intelligence, all you need to do is press the same buttons you would to take a screenshot on your iPhone. As expected, Apple is also making it possible for developers to use its on-device foundational model for their own apps. "With the Foundation Models framework, app developers will be able to build on Apple Intelligence to bring users new experiences that are intelligent, available when they're offline, and that protect their privacy, using AI inference that is free of cost," the company said in its press release. Apple suggests an educational app like Kahoot! might use its on-device model to generate personalized quizzes for users. According to the company, the framework supports Swift, Apple's own coding language, and accessing the model is as easy as writing three lines of code. An upgraded Shortcuts app for both iOS and macOS is also on the way, with support for actions powered by Apple Intelligence. You'll be able to tap into either the company's on-device model or its Private Cloud Compute model to generate responses as part of whatever shortcut you want carried out. Apple suggests students might use this feature to create a shortcut that compares an audio transcript of a class lecture to notes they wrote on their own. Here again, users can turn to ChatGPT if they want. There are many other smaller enhancements enabled by upgrades Apple has made to its AI suite. Most notably, Apple Wallet will automatically summarize tracking details merchants and delivery carriers send to you, so you can find them in one place. A year since its debut at WWDC 2024, it's safe to say Apple Intelligence has failed to meet expectations. The smarter, more personal Siri that was the highlight of last year's presentation has yet to materialize.
In fact, the company delayed the upgraded digital assistant in March, only saying at the time that it would arrive sometime in the coming year. Other parts of the suite may have shipped on time, but often didn't show the company's usual level of polish. For instance, notification summaries were quite buggy at launch, and Apple ended up reworking the messages to make it clearer they were generated by Apple Intelligence. With today's announcements, Apple still has a long way to go before it catches up to competitors like Google, but at least the company kept the focus on practical features.
[7]
Apple Intelligence Can Now Creep on Your iPhone Screen
Visual Intelligence brings Apple's AI features a little bit closer to competitors with the ability to see what you're looking at on your phone. It wouldn't be a developer keynote in 2025 without a little AI, right? As underwhelming as Apple Intelligence has been since its rollout in October last year, Apple seems to be committed to upgrading that experience with pivotal upgrades like... new Genmojis? Okay, so maybe WWDC 2025 wasn't a revolutionary year for Apple Intelligence, but there are still some upgrades worth noting, including a new feature that can watch what you're doing on your phone and then take specific actions depending on the scenario. Visual Intelligence, as Apple is calling it, is a feature that expands multimodal capabilities beyond the Camera app and into your iPhone screen. "Users can ask ChatGPT questions about what they’re looking at on their screen to learn more, as well as search Google, Etsy, or other supported apps to find similar images and products," says Apple. "If there’s an object a user is especially interested in, like a lamp, they can highlight it to search for that specific item or similar objects online." That doesn't sound novel by any means, but it does bring Apple Intelligence closer to competitors like Google, which has a Gemini feature that does pretty much the same thing. It also brings Apple Intelligence closer to the Holy Grail of "agentic AI," which is the tech world's way of describing AI that can do stuff for you. As ho-hum as multimodal features like Visual Intelligence have become in a very short period of time, they still have the power to actually make the phone experience better, in my opinion. I think I speak for most people when I say that using your iPhone isn't quite as simple as it used to be, and there are a few reasons for that. One reason is that we expect our phones to do a lot more than they used to, which means devices need to have more features to do all of those things. 
The problem is that keeping track of those features and finding a spot for them to exist in a UI isn't easy, and it makes software feel more bloated. Agentic AI has the ability to cut through the bloat and bring you to the thing you want to do faster. If that means I get to spend less time entering payment card information or navigating between apps on my phone, then I'm all for it. This is all theoretical right now since Visual Intelligence was just released, and we can't say for certain whether it works as promised, but I'm certainly not mad about the idea, even despite being a little underwhelmed. Visual Intelligence should also run its AI on-device, which is great, because sending data from my phone screen anywhere really wouldn't be high on my to-do list. It wasn't all about Visual Intelligence; Apple also unveiled new AI features like Live Translation in Messages and FaceTime to translate while you're texting or calling with someone. There were also updates to Genmoji and Image Playground that add further customization and new art styles for generated images and emoji. Additionally, Apple will open up its on-device foundation model for Apple Intelligence and invite third-party app developers to design their own AI features. "App developers will be able to build on Apple Intelligence to bring users new experiences that are intelligent, available when they're offline, and that protect their privacy, using AI inference that is free of cost," Apple said in a statement. "For example, an education app can use the on-device model to generate a personalized quiz from a user's notes, without any cloud API costs, or an outdoors app can add natural language search capabilities that work even when the user is offline." Again, that isn't exactly the flashiest news for Apple Intelligence, but it may be a solid way of expediting the development of new AI features, especially while Apple lags behind in the field of generative AI and large language models.
Speaking of lagging behind, one notable absence was Apple's AI-powered Siri upgrade, though Apple did address the AI elephant in the room, stating that we would hear more "later this year." That's not surprising by any means, but it's definitely indicative of Apple's stumbles on the AI front. This year's WWDC did little to assuage any concerns you may have over Apple's progress in AI, but it did move the needle forward just a bit, and that may be enough for most. Despite industry emphasis on AI features, consumers have a decidedly smaller appetite for them, so I doubt this year's update will stop anyone from running out and buying the latest iPhone. Anyone who is a part of the Apple Developer Program can use the new Apple Intelligence features today, while the first public beta will be available next month. If you're not interested in betas or you're not a developer, you'll have to wait until the fall to try these new features in full.
[8]
Apple unveils AI upgrades with live translation, creative tools
Search also gets sharper with expanded visual intelligence, letting users interact with what they're viewing. Be it asking questions about content, searching for similar products, or auto-adding events to the calendar, visual intelligence will do it all using the same buttons you'd press to take a screenshot. Apple Intelligence further elevates fitness via Workout Buddy on Apple Watch, providing personalized, real-time coaching based on workout history and live metrics processed privately on the device. Shortcuts also become more dynamic, allowing users to build smart workflows that summarize, generate, or compare content using on-device or Private Cloud Compute models. Developers, too, can now tap directly into Apple's foundation model through a Swift-supported framework, unlocking new possibilities for offline, privacy-first experiences inside their apps. Apple Intelligence can also automatically identify and categorize relevant actions in emails, websites, and notes for Reminders, summarize order tracking across Apple Wallet, and suggest polls or personalized chat backgrounds in Messages. It will also enhance productivity with mail and call transcription summaries, priority notifications, and smarter Siri interactions that maintain context and support typing queries. Crucially, Apple is prioritizing privacy. Most AI models run entirely on-device, and when cloud processing is needed, Private Cloud Compute ensures data is never stored or shared with Apple, verified continuously by independent experts. These privacy-first innovations, along with expanded language support and developer access, make Apple Intelligence smarter, more secure, and deeply woven into users' daily lives. Behind the scenes, Apple also opened access to its on-device large language model for third-party developers, allowing them to integrate powerful AI directly into their apps while maintaining user privacy. 
The company also integrated OpenAI's ChatGPT capabilities into some tools like Image Playground, underscoring a strategy of complementing its own AI with third-party offerings, similar to moves by Microsoft. Despite these advances, some analysts view Apple's AI updates as modest compared to the sweeping ambitions announced by rivals, with concerns growing about the company's pace in the fast-moving AI landscape. Apple Intelligence will support eight additional languages by year-end. This includes Danish, Dutch, Norwegian, Portuguese (Portugal), Swedish, Turkish, Traditional Chinese, and Vietnamese. The new features will be available for testing with a public beta launching next month through the Apple Beta Software Program.
[9]
Apple Intelligence just got smarter -- here are the best new features from WWDC
At WWDC 2025, Apple introduced a new phase of its AI strategy by giving third-party developers access to its proprietary Apple Intelligence models. The move signals a significant shift in how AI will be integrated across Apple's ecosystem -- though one notable feature was missing: the expected overhaul of Siri. Apple's giving developers a lot more power with its new AI framework, letting them tap directly into its large language models (LLMs) for the first time. That means we'll start seeing smarter features like image generation, natural language tools and predictive suggestions built right into third-party apps across iOS, macOS, iPadOS and more. Craig Federighi called it a major step up from previous tools like Shortcuts and App Intents. While Apple continues to emphasize its privacy-first approach, this new framework provides developers with deeper access to on-device intelligence than ever before. The developer update arrives alongside a redesign of Apple's user interfaces. A new "Liquid Glass" visual aesthetic and a simplified versioning system -- iOS 26, macOS 26, and so on -- aim to create a more cohesive and modern user experience across devices. Apple's translation tools are getting a big upgrade thanks to Apple Intelligence. The Translate app now features a real-time "Conversation" mode, making it easier to communicate face-to-face. Speak or type a phrase, and the app will instantly translate it and play the audio in another language. It's a feature designed to support users when traveling or chatting with someone who speaks a different language. Apple Intelligence can now translate text that's on your screen -- whether it's inside an app, on an image or in a video. Just ask Siri to translate the page, and it will take a screenshot, figure out the language and show you the translated text. It's a faster, seamless way to understand content without jumping between apps. 
One of the standout Apple Intelligence features announced today is Visual Intelligence, a new way to interact with what's on your screen. Billy Sorrentino, Apple's senior director of Human Interface, introduced the feature by showing how users can now screenshot anything -- from a social media post to an event flyer -- and immediately gain deeper insights. For instance, if you spot a photo of a concert poster, Apple Intelligence can automatically extract the date, time and location, offering to create a Calendar invite on the spot. However, it extends beyond simply reading text. Visual Intelligence also understands images contextually. In one demo, a screenshot of a mandolin led to ChatGPT (integrated via Apple Intelligence) suggesting songs that feature the instrument. This screen-aware capability bridges your visual input with Apple's AI engine, making it easier than ever to act on what you see, whether you're organizing your life or just following your curiosity. Apple Intelligence is also making its debut on the Apple Watch, with Workout Buddy, a new feature designed to bring personalized motivation to your wrist. Introduced by Stephanie Postlewaite, the feature uses a text-to-speech model trained on a real Fitness+ trainer, offering encouragement that sounds natural and familiar. It draws from your workout history and health data to provide context-aware coaching, like suggesting recovery when you've had a tough week or pushing you on a streak. This marks the first significant implementation of Apple Intelligence on watchOS, signaling a major step toward smarter, more adaptive fitness experiences powered by on-device AI. While Apple Intelligence received a prominent spotlight, the same couldn't be said for Siri. Expected updates that would enable more conversational capabilities, broader app integration, and contextual awareness were absent from the keynote, signaling we might not see those features until 2026. 
It's still unclear whether Apple's closed ecosystem can deliver the same level of flexibility and performance as more open AI platforms. The delay in Siri's upgrades doesn't help, and it may add to the sense that Apple is playing it safe while competitors push ahead in the voice assistant space.

For users: Many apps could begin offering smarter, more adaptive features later this year. However, those waiting for a more advanced, conversational Siri will need to wait until at least 2026.

For developers: The updated AI framework offers a new set of tools for building intelligent app experiences. However, Apple's privacy standards and system constraints may continue to limit how much developers can customize those interactions.

Apple is positioning its developer community as a key driver of AI innovation across its platforms. The tools introduced at WWDC are set to roll out with new operating system updates this fall. Whether this approach will satisfy users looking for a more dynamic, voice-driven experience remains to be seen, especially as competitors continue to push the boundaries of what AI assistants can do.
[10]
Apple Intelligence was firmly in the background at WWDC 2025 as iPad finally had its chance to shine
Apple put last year's wonderkid, Apple Intelligence, firmly in the corner today and focused instead on iOS 26, watchOS 26, tvOS 26, macOS 26, and iPadOS 26 (all of Apple's new operating systems now have a new name, reflecting the year they will be most active in) at this year's WWDC 2025. In fact, the whole keynote built steadily to the real star of the show, the iPad.

The new windowing system in iPadOS 26 looks like it finally makes the iPad capable of switching between multiple running apps with ease, and also adds a menu bar, which is context-sensitive to whichever app is in the foreground. This essentially makes the humble iPad less of a large iPhone and more like an extremely lightweight and portable Mac. It won't run Mac software, of course, but it will finally work like one, especially when plugged into a keyboard and trackpad. iPadOS 26 even gets its own version of the Preview app from macOS to look at PDFs with, and a new Files app that is more powerful and Finder-like.

The iPad upgrade got by far the most animated reaction from Craig Federighi, Apple's senior vice president of Software Engineering. Federighi enthused about the new iPadOS 26 with a passion I haven't seen since he introduced Apple Intelligence to us last year, calling it "the biggest iPadOS release ever." And there was no new hardware from Apple either! I was hoping for at least an upgrade to HomePod, but everything this year was about the various Apple OSs.

Where was Apple Intelligence in the keynote? It had a little recap right at the start, which focused on what Apple Intelligence features Apple had actually released over the course of the last year, you know, Genmoji, Writing Tools, Notification Summaries, etc., and then it just faded into the background.
Sure, Apple Intelligence was mentioned frequently throughout Apple's keynote, powering some of the most innovative features on display, like Visual Intelligence now being available every time you take a screenshot, or the ability to suggest when a poll might be a good idea in a group chat, or even letting you create your own original chat backgrounds. But Apple Intelligence, which last year was the new kid on the block, has now become just another part of the furniture of Apple's operating systems. There was no talk about a fully AI-powered Siri, or really any groundbreaking new Apple Intelligence features, although there were quite a few minor ones like Live Translation and new AI-powered Shortcuts.

But perhaps the background is where Apple Intelligence really belongs? It's fair to say that the world has gone crazy for AI, thanks to OpenAI and Google steaming ahead with ChatGPT and Gemini. It's almost impossible for companies not to get swept up in the unlimited possibilities that AI offers. And yet, are people actually asking for AI features in Macs, iPads, and iPhones? Of all the Apple Intelligence features that Apple has released over the last year, I don't really use any of them regularly, if at all. I played around with Genmoji for a day, then got bored. Notification summaries' attempts at summarizing very short text messages were so annoying, I'd rather just read the actual messages, which in most cases were just a few words longer. I do use AI every day, but I prefer to use it inside the fully featured apps from Google and OpenAI, which work fine on my iPhone and contain advanced voice modes for natural, human-like language interaction. This, for me, is where AI really shines, and not when it comes to trying to rewrite, or even read, my emails for me.

Apple did reveal one key detail at this year's WWDC 2025 that I think could change the game for Apple Intelligence.
With iOS 26, Apple is making its Foundation Models Framework available to developers for the first time. While this might not sound like big news right now, it means app developers will be able to integrate on-device AI into their apps going forward. The possibilities here are endless, and frankly, I think developers will do a better job than Apple has of coming up with creative ways to use AI. At a time when investors must be starting to wobble as Apple seems to have dropped the ball on AI, opening up its AI to developers might have just secured the company's future.
[11]
WWDC 2025: All the Apple Intelligence AI features coming to your devices
Apple's WWDC this year focused on design changes with iOS 26 and Liquid Glass, but we also saw some updates and new announcements for Apple Intelligence, the company's suite of AI features. New Apple Intelligence announcements build on existing AI-powered features like Writing Tools, Message and Mail summaries, the ChatGPT integration, and others. Apple unveiled a way for third-party apps to tap into Apple Intelligence called the Foundation Models Framework. This means developers can use Apple's API to build Apple Intelligence features into their own apps. Apple Intelligence already provides voicemail transcripts, but now it's adding call screening for scammers and Hold Assist, which conveniently notifies you when you're off hold. Within group chats, you can now create polls and Apple Intelligence will compile the results. Using Genmoji, you can also mix together emojis and use Image Playground to make new emojis. Image Playground also got a ChatGPT integration, so you can create images with OpenAI's model too. Apple Intelligence now supports live translation for real-time text and voice translations. iOS 26 is getting a visual search feature by combining Visual Intelligence with Apple Intelligence. By taking a screenshot of any app you're looking at, you can use a new search function on the bottom of your screen. It can also recognize content in screenshots, such as event details, and pre-populate your calendar.
[12]
Every Apple Intelligence upgrade coming to your Apple devices in iOS 26, iPadOS 26, macOS 26, and watchOS 26
Free AI upgrades coming to your iPhone, iPad, Mac, and Apple Watch

Apple just announced major free upgrades coming to Apple Intelligence-compatible devices, set to arrive as part of iOS 26, iPadOS 26, macOS 26, and watchOS 26 later this year. The new AI features coming to these devices were sporadically showcased throughout WWDC 2025, so we've compiled a list of all the major announcements to give you a breakdown of every Apple Intelligence announcement at the event. Unfortunately, Apple didn't showcase the Siri AI upgrade we'd been hoping for, but the Cupertino-based company did unveil a lot of new software improvements powered by Apple Intelligence. Here are the six major Apple Intelligence upgrades announced at WWDC 2025.

Live Translation "helps users communicate across languages when messaging or speaking," and is integrated directly into Messages, FaceTime, and the Phone app. Live Translation will be able to automatically translate messages, add translated live captions to FaceTime, and on a phone call the translation will be spoken aloud throughout the conversation, completely removing language barriers using AI. Privacy won't be an issue either, as Apple says the new translation tool runs on Apple's own AI models and "users' personal conversations stay personal."

Apple launched Genmoji and Image Playground as part of the first wave of Apple Intelligence features, and now the company is improving its generative AI image tools. Users can now turn text descriptions into emojis as well as mix together emojis and combine them with descriptions to create something new. You'll also be able to change expressions and adjust personal attributes of Genmojis made from photos of friends and family members. Image Playground is now getting ChatGPT support to allow users to access brand-new styles such as oil painting and vector art. Apple says, "users are always in control, and nothing is shared with ChatGPT without their permission."
Visual Intelligence might've already been the best Apple Intelligence feature, but now the exclusive iPhone 16 AI tool is even better. At WWDC, Apple announced that Visual Intelligence can now scan your screen, allowing users to search and take action on anything they're viewing across apps. You'll be able to ask ChatGPT questions about content on your screen via Apple Intelligence, and this new feature can be accessed by taking a screenshot. After pressing the same buttons you'd use to take a screenshot, you'll be asked to save the screenshot, share it, or explore more with Visual Intelligence. As someone who loves Gemini's ability to see your screen, I'm incredibly excited to see how Visual Intelligence handles its newfound ability to analyze what you're doing on your device.

The world's most popular smartwatch just got AI functionality in the form of Workout Buddy, a workout experience with Apple Intelligence that "incorporates a user's workout data and fitness history to generate personalized, motivational insights during their session." Apple says the new feature is a "first-of-its-kind workout experience" and will offer "meaningful inspiration in real time" to keep you motivated on your exercise. Once Apple Intelligence has analyzed your workout data, "a new text-to-speech model then translates insights into a dynamic generative voice built using voice data from Fitness+ trainers, so it has the right energy, style, and tone for a workout." Workout Buddy is the first exclusive Apple Intelligence feature on Apple Watch and will require an Apple Intelligence-supported iPhone nearby. At launch, Workout Buddy will be available in English and across the following workout types: "Outdoor and Indoor Run, Outdoor and Indoor Walk, Outdoor Cycle, HIIT, and Functional and Traditional Strength Training."

While this announcement might not grab any headlines, it's a big one for the future of Apple Intelligence: Developers now have access to Apple's Foundation Models.
What does that mean exactly? Well, app developers will be able to "build on Apple Intelligence to bring users new experiences that are intelligent, available when they're offline, and that protect their privacy, using AI inference that is free of cost." Apple's example is an education app using the Apple Intelligence model to generate a quiz from your notes, without any API costs. This framework could completely change the way we as users interact with our favorite third-party apps, now with the ability to tap into Apple's AI models and make the user experience even more intuitive.

Last but not least, Apple announced Apple Intelligence powers for the Shortcuts app. This is a major upgrade to one of the best apps on Apple devices, allowing users to "tap into intelligent actions, a whole new set of shortcuts enabled by Apple Intelligence." Apple says "Shortcuts are supercharged with Apple Intelligence," and you'll also be able to tap into ChatGPT to superpower your Shortcuts. Just like the Shortcuts app, the true power here will come down to user creations and how people tap into this new ability. As someone who uses Shortcuts on a daily basis, I'm incredibly excited to see how Apple Intelligence improves the experience.

Alongside these six major announcements, Apple also announced that Apple Intelligence will scan and identify relevant actions from your emails, websites, notes, and other content, and then automatically categorize them in the Reminders app. Elsewhere, Apple Wallet can now "identify and summarize order tracking details from emails sent from merchants or delivery carriers. This works across all of a user's orders, giving them the ability to see their full order details, progress notifications, and more, all in one place." Finally, Messages is getting Apple Intelligence poll functionality, which can detect when a poll might come in handy.
The Messages app is also getting AI-generated backgrounds that can be created for each conversation using Image Playground.
[13]
Apple falls further behind in the AI race
Why it matters: AI is widely seen as the largest technology shift in decades and could easily serve as an inflection point where existing leaders are dethroned and new ones crowned.

Driving the news: One year after unveiling an expansive vision for personalized AI that it has largely failed to deliver, the iPhone maker focused on a smaller set of tweaks and enhancements to Apple Intelligence.

Yes, but: The list of things Apple left unsaid looms larger than the improvements it did announce.

The big picture: Apple's incrementalism stands in sharp contrast to Google, which unveiled a host of AI features, many of which were the kinds of things that users can touch and use, such as its new tools for video creation.

Between the lines: Apple appeared eager not to overpromise this year, announcing only features it expects to be part of the fall release.

What they're saying: Angelo Zino, a senior vice president and equity analyst at CFRA Research, said he remains positive on Apple for the long term but called Monday's developer conference a "dud" that is testing investors' patience.
[14]
AI takes backseat as Apple unveils software revamp and new apps
AI announcements at WWDC limited to incremental features and upgrades despite pressure to compete

Apple's artificial intelligence features took a backseat on Monday at its latest annual Worldwide Developers Conference. The company announced a revamped software design called Liquid Glass, new phone and camera apps, as well as new features on Apple Watch and Vision Pro. But in spite of pressure to compete with firms that have gone all-in on AI, Apple's AI announcements were limited to incremental features and upgrades. Users will have a few new Apple Intelligence-powered features to look forward to, including live translation, a real-time language translation feature that will be integrated into Messages, FaceTime and the Phone app. The Android operating system has offered a similar feature for several years. Apple also introduced a new fitness feature called Workout Buddy, which uses an AI-generated voice to speak to you during your workouts.

Consumers might soon notice some AI improvements to the non-Apple apps on their phone as well. The company said it was enabling app developers to tap into Apple's on-device large language model to improve their experience with AI features within third-party apps. Consumers will be able to choose whether they want their data or information shared off-device and with the developers.

At last year's WWDC, Apple announced a suite of AI upgrades to Siri that were intended to make the virtual assistant more personable and dynamic. Many of those features have yet to be released in spite of specific commitments from Apple. "This work needed more time to reach our high-quality bar," Craig Federighi, Apple's senior vice president of software engineering, previously said of the delay. The silence on Siri was "deafening", wrote Forrester's VP principal analyst Dipanjan Chatterjee. "The topic was swiftly brushed aside to some indeterminate time next year.
Apple continues to tweak its Apple Intelligence features, but no amount of text corrections or cute emojis can fill the yawning void of an intuitive, interactive AI experience," Chatterjee wrote. "The end of the Siri runway is coming up fast, and Apple needs to lift off." Apple also announced a partnership with OpenAI's ChatGPT, a play to help the iPhone maker catch up with OpenAI, Microsoft and Google in the AI race. Wedbush Securities analyst Dan Ives said he expects Apple may need to forge more relationships with outside players to catch up with its competitors. "Overall, WWDC laid out the vision for developers but was void of any major Apple Intelligence progress as Cupertino is playing it safe and close to the vest after the missteps last year," Ives said. "We get the strategy but this is a big year ahead for Apple to monetize on the AI front, as ultimately Cook and co may be forced into doing some bigger AI acquisitions to jumpstart this AI strategy."
[15]
Apple plays catch-up
While Apple will unveil sweeping visual redesigns across its operating systems -- including a new "digital glass" interface inspired by its Vision Pro headset -- the AI announcements are expected to be underwhelming compared to the rapid-fire innovations competitors have unleashed in recent months. The contrast with Google I/O just three weeks ago couldn't be starker. Google's developer conference was a showcase of AI muscle-flexing: new models that generate music in real-time, realistic video creation from simple prompts, coding assistants that can tackle entire backlogs, and text-to-speech capabilities with customizable voices and accents. Google even introduced tools that can browse the web and use software under user direction -- early examples of the autonomous AI agents that many see as the next frontier.
[16]
Apple's AI event falls flat as iPhone maker struggles
Apple's flagship annual showcase has fallen flat as the iPhone maker fails to allay concerns that it is falling behind on artificial intelligence. The American tech giant was struck by a sell-off on Monday evening after it became apparent that its keynote presentation, led by chief executive Tim Cook, would only include minor software upgrades. Investors sent the stock price down by as much as 1.9pc during the presentation, wiping as much as $65.3bn (£48bn) off its market capitalisation. It comes amid growing frustration over Apple's failure to keep up with rivals on cutting-edge artificial intelligence (AI) developments.

Last month, Apple's former chief designer Sir Jony Ive seemingly took aim at the company, hitting out at the "legacy" products on the market and the "decades old" technology within them. Sir Jony was speaking as he announced he was joining artificial intelligence rival OpenAI, in a $6.5bn (£4.8bn) deal expected to create a new generation of devices that could challenge the iPhone.

On Monday night, Apple put great emphasis on a range of visual improvements it is making to all of its operating systems. It said iPads would now be able to work like traditional Mac computers, with apps able to run in windows. However, Daniel Ives, an analyst at Wedbush Securities, said the event was "overall a yawner" and that investors' patience was "wearing thin".

It follows years of setbacks for Apple as it battles to compete on artificial intelligence. While it was one of the first Silicon Valley companies to embrace AI with its voice assistant Siri, rivals have since rapidly overtaken its technology. Last month, Mr Cook admitted on an earnings call that "we need more time to complete our work on [improved Siri] features so they meet our high-quality bar". On Monday, Apple failed to announce any major AI updates for Siri. Reports have suggested that progress has been hampered by complications updating Siri using large language models.
The most notable AI announcement at the annual showcase was that it would give software developers access to the AI technology that is built into recent Apple devices. Craig Federighi, Apple's software chief, said: "We think this will ignite a whole new wave of intelligent experiences in the apps users rely on every day. We can't wait to see what developers create." The company introduced a live translation feature for phone calls. Customers will be able to speak in English to a French person and then hear their words read out in French. Apple said it would also have a call screening feature that will work like a personal assistant, automatically answering calls from unknown people and asking them for information so that a customer can decide whether to take the call. Thomas Monteiro, an analyst at Investing.com, said: "In a moment in which the market questions Apple's ability to take any sort of lead in the AI space, the announced features felt incremental at best."
[17]
Apple plays it safe on AI despite Wall Street pressure
Apple on Monday remained on its cautious path to embracing generative AI even as rivals race ahead with the technology and Wall Street expresses doubts over its strategy. The pressure was on Apple not to disappoint at its annual Worldwide Developers Conference (WWDC) a year after the iPhone juggernaut made a promise it failed to keep -- to improve its Siri voice assistant with generative AI. The annual WWDC is aimed at developers who build apps and tools to run on the company's products.

Despite last year's disappointment, Apple insisted on Monday it was still very much in the AI race, announcing incremental updates to its Apple Intelligence software, including the ability for app makers to directly access a device's AI capabilities. This would allow users to engage with apps using generative AI while offline, letting them interact ChatGPT-style with a hiking app, for example, while in remote areas without a connection. Apple CEO Tim Cook briefly mentioned that Siri's AI makeover was still under development and "needed more time to meet our high quality bar," which includes Apple's standards on privacy and data security. "We are making progress, and we look forward to getting these features into customers' hands," he added.

For Gadjo Sevilla, senior analyst for Emarketer, "the delays to Apple's in-house AI efforts will continue to draw scrutiny." "Especially since rivals like Google and Samsung are moving ahead by introducing new on-device AI capabilities, or partnering with AI startups like Perplexity (in Samsung's case) to provide users with AI features," he added.

The biggest announcement at the event was the renaming of Apple's operating systems so that releases better match their release year. The next iPhone operating system will be iOS 26, with matching version numbers coming to all of Apple's devices -- including the Mac, Watch and Vision Pro headset -- in the fall, in time for the likely release of the iPhone 17.
Today, Apple's operating systems have vastly different nomenclatures across devices, including the current iOS 18 for iPhone or macOS 15 for Mac computers. Apple also announced that the new operating system will be the first major iOS redesign since 2013, calling the new look "Liquid Glass."

Wall Street divided

The relationship between Apple and app-making developers has been strained in recent years, with developers chafing at the iPhone maker's high fees for getting access to the App Store. A marathon lawsuit by Fortnite maker Epic Games ended with Apple being ordered to allow outside payment systems to be used in the US App Store. Adding to doubts about Apple's direction is the fact that the legendary designer behind the iPhone, Jony Ive, has joined with ChatGPT maker OpenAI to create a potential rival device for engaging with AI. Apple also has to deal with tariffs imposed by US President Donald Trump in his trade war with China, a key market for sales growth and the place where most iPhones are manufactured. Trump has also threatened to hit Apple with tariffs if iPhone production isn't moved to the US, a change which analysts say would be impossible given the costs and capabilities required.

Wall Street analysts remain divided on Apple's prospects, with the stock down about 17% since the start of the year, wiping over $600 billion from its market value and leaving it far outshone by its Big Tech rivals. While some analysts remain optimistic about Apple's long-term AI monetization potential, others worry the company's cautious approach may prove costly in the longer term. WWDC "was void of any major Apple Intelligence progress as Cupertino is playing it safe and close to the vest after the missteps last year," said Dan Ives of Wedbush Securities. "We have a high level of confidence Apple can get this right, but they have a tight window to figure this out," he added.
[18]
Apple Tones Down AI Hype While Showcasing Next iPhone, Mac Features - Decrypt
Updates include new messaging features, real-time translation, improved developer tools, and OS upgrades for iPhone, iPad, Mac, Watch, and Vision Pro.

Apple wrapped its WWDC 2025 keynote on Monday with sweeping updates to its device operating systems and a striking new design. But for all the refinements across iPhone, iPad, Mac, Apple Watch, and Vision Pro, one question lingered: What happened to Apple Intelligence? When Apple's big push into AI was introduced at WWDC 2024, CEO Tim Cook described it as a "new chapter" for the company -- one that combined Apple's hardware with the growing momentum of generative AI. Apple Intelligence was meant to place the company in the same league as OpenAI, Nvidia, Google, and Microsoft. A year later, that promise remains largely unfulfilled, and has drawn industry-wide criticism as well as corporate upheaval. Indeed, its most significant impact on the AI landscape might be a research paper titled "The Illusion of Thinking," published last week, in which the company outlined the limitations of large language models and warned against overestimating their reasoning capabilities. The paper emphasized that while LLMs may appear intelligent, they mainly rely on pattern recognition.

Nonetheless, today's conference opened with Apple's Senior Vice President of Software Engineering Craig Federighi heralding its AI integration: "We're making the generative models that power Apple Intelligence more capable and more efficient, and we're continuing to tap into Apple Intelligence in more places across our ecosystem," he said. Federighi announced that Apple is opening its AI infrastructure to developers through a new "Foundation Models Framework" that allows apps to tap directly into the same on-device intelligence that powers Apple's own software. Updates to Xcode introduce generative tools for developers, including integration with ChatGPT, predictive code completion, and conversational programming via Swift Assist.
Perhaps Apple was under-promising in the hopes of over-delivering after the debacle of the Apple Intelligence rollout. Instead, the presentation today devoted more attention to a sweeping visual redesign of macOS and iOS, bringing more UX conformity across Apple's entire product suite. The redesign features "Liquid Glass" -- a responsive, context-aware design element that adapts to touch, content, and context across devices. The redesign affects everything from the lock screen to system icons, aiming to make transitions between Apple devices more seamless.

Other updates unveiled at WWDC include enhancements to Messages, which now supports polls, custom backgrounds, typing indicators, group payments, and improved spam filtering. Live translation enables real-time language translation in Messages, FaceTime, and phone calls using AI. The Phone app is receiving upgrades, including Hold Assist -- a way to retaliate against being put on hold via a standby mode that alerts you when the person you're trying to reach finally answers -- and Call Screening, which prompts callers to identify themselves before connecting. That feature, it's worth noting, came with Google Voice when it rolled out in 2009. Apple said the updates will be available in a public beta in July, with full releases coming in the fall.
[19]
iOS 26 Brings Plenty of New Updates to Apple Intelligence | AIM
Will Apple Intelligence finally be able to deliver on its promise?

Apple announced a list of updates to its artificial intelligence feature suite, Apple Intelligence, at the Worldwide Developers Conference (WWDC) 2025 on Monday. These new features will be available for supported iPhone models later this year, with iOS 26. The company announced a live translation feature integrated into Messages, FaceTime, and Phone applications. This feature helps translate text messages, phone calls, and audio from FaceTime calls in real time. In addition, Apple also announced an update to its AI-enabled emoji maker feature. Users can now mix two emojis to create one. This is in addition to the previously launched Genmoji feature, which lets users develop emojis with prompts in natural language.

Apple also announced a new update to Visual Intelligence features on Apple Intelligence, allowing AI to analyse screen content and answer relevant queries or take necessary actions. Users will be able to capture screenshots and highlight objects to search for similar items online. This feature is largely similar to the 'Circle to Search' feature available on Android devices. In addition, Apple Intelligence can take necessary actions based on the data in the image. For instance, if Apple Intelligence detects event details in the image, it will suggest an "Add to Calendar" button that pre-populates the date, time, and location. Users can also use the 'Ask' button to pose a question directly to ChatGPT about the highlighted object.

Apple also announced 'Workout Buddy' on the Apple Watch, which works with Apple Intelligence and uses workout data and fitness history to generate personalised and motivational insights during a workout session. Apple said a text-to-speech model analyses a user's workout and fitness history and provides voice-based motivational insights with the 'right energy, style and tone for a workout'.
Furthermore, Apple announced a new 'Foundation Models Framework', where developers can build on top of Apple Intelligence to bring new features to third-party apps. "For example, an education app can use the on-device model to generate a personalised quiz from a user's notes, without any cloud API costs," said Apple. Moreover, Apple announced the integration of Apple Intelligence into Shortcuts, which lets users automate actions and tasks on their iPhone. Users can now use Apple Intelligence capabilities (like writing or image generation) to create a Shortcut. This expands the list of features, alongside the ones currently available with iOS 18.

However, Apple has faced criticism throughout for the underwhelming features of Apple Intelligence so far. Recently, Eddy Cue, Apple's senior VP of services, revealed the company's intention to incorporate AI search into Safari, with support from either Perplexity, Google, or OpenAI. Additionally, Google CEO Sundar Pichai disclosed that Google intends to partner with Apple to integrate Gemini into Apple Intelligence this year. It was also reported that CEO Tim Cook, in a discussion with Pichai, mentioned that 'more third-party AI models' will be delivered to Apple Intelligence later this year.
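Apple has not yet published full documentation for the framework, but based on what was previewed at the keynote, the education-app quiz example might look roughly like the Swift sketch below. The `FoundationModels` import, `LanguageModelSession` type, and the `@Generable`/`@Guide` macros reflect the API surface Apple showed on stage; the exact names, signatures, and the `makeQuiz` helper here are illustrative, not verified against shipping documentation:

```swift
import FoundationModels  // Apple's on-device model framework (iOS 26+)

// Structured output the on-device model fills in directly; @Generable
// and @Guide are the guided-generation macros Apple previewed at WWDC.
@Generable
struct Quiz {
    @Guide(description: "Three short questions drawn from the user's notes")
    var questions: [String]
    @Guide(description: "The answer to each question, in the same order")
    var answers: [String]
}

// Hypothetical helper for the keynote's education-app example: all
// inference runs on device, so it works offline and has no API costs.
func makeQuiz(from notes: String) async throws -> Quiz {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Write a short quiz based on these notes: \(notes)",
        generating: Quiz.self
    )
    return response.content
}
```

The notable design point, if the shipped API matches the preview, is the typed output: instead of parsing free-form model text, an app declares a `@Generable` struct and receives populated fields back, which is what makes "a quiz from your notes" a few lines of code rather than a prompt-engineering exercise.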
[20]
Apple unveils new AI features at WWDC 25: What it means for iPhone users
Apple announced new artificial intelligence features at the Worldwide Developers Conference keynote June 9. The company launched testing versions of live translation, visual search and a workout assistant. Apple Intelligence will also be a part of the Reminders, Messages and Apple Wallet apps. "Last year, we took the first steps on a journey to bring users intelligence that's helpful, relevant, easy to use, and right where users need it, all while protecting their privacy," Craig Federighi, Apple's senior vice president of Software Engineering, said. "Now, the models that power Apple Intelligence are becoming more capable and efficient, and we're integrating features in even more places across each of our operating systems." The company also said it will open its large language model to third-party developers. "App developers will be able to build on Apple Intelligence to bring users new experiences that are intelligent, available when they're offline, and that protect their privacy, using AI inference that is free of cost," the company said in a news release accompanying the announcement. Federighi also teased that updates to Siri would be coming later this year.

How AI features will be implemented

The company said that its visual search feature, dubbed Visual Intelligence, will be extended to the screen from what it has already released using device cameras. Users will be able to ask ChatGPT questions about what they're looking at on their screen as well as search supported apps to find similar images and products. They will also be able to add events to their calendar from images on their screen. The live translation feature will automatically translate messages, generate translated captions on FaceTime and audio translations on phone calls. Apple said that the "Workout Buddy" will analyze data from a user's current workout along with their fitness history to provide inspiration in real time.
The feature will be available on Apple Watch and will require an Apple Intelligence-supported iPhone nearby as well as Bluetooth earbuds. The company also outlined other uses of Apple Intelligence across its apps.
WWDC 2025 AI announcements could help Apple catch up
The opening of the Foundation Models framework appears to be an attempt to spur the creation of new AI features and apps to help Apple catch up in the artificial intelligence market. "We think this will ignite a whole new wave of intelligent experiences in the apps users rely on every day," Federighi said. 9to5Mac noted before the conference that software announcements made last year rolled out slowly and AI upgrades to Siri and Swift Assist never materialized. "They had the big vision announcement last year. Not a lot of progress this year as they tried to build it. It is hard to build," Alex Kantrowitz, founder of the Big Technology podcast and newsletter, said on CNBC on Monday. Apple is expected to partner with Alibaba on AI in China, according to Reuters, but that deal has been delayed because of the tariff showdown between President Donald Trump and China. There is "growing confidence inside and outside China this AI launch will ultimately happen between Apple and Alibaba," Wedbush analyst Dan Ives said in a note to investors on Monday, June 9.
Apple stock and reactions
The affair was fairly subdued, Craig Moffett, co-founder and senior analyst at equity research firm MoffettNathanson, told CNBC. "What's the old joke: they set expectations really low and then they proceeded to meet them. They did pretty much what everybody expected," said Moffett. "The real question is where's the growth going to come from, and that's still a huge question mark." Apple stock fell more than 1% as the WWDC event played out. Shares opened Monday, June 9, at $204.39, but fell below $202 during the event. The company's stock price hit $259 in late December 2024, near the end of a three-month period in which Apple set an all-time record for iPhone upgrades.
But shares dipped to $172 in April 2025, a few weeks after the company announced it was delaying some artificial intelligence upgrades to Siri to 2026.
[21]
WWDC 2025: Apple announces new features for Apple Intelligence
Apple on Monday announced significant updates for Apple Intelligence, adding new AI-powered features to iPhone, iPad, Mac, Apple Watch, and Vision Pro. At its Worldwide Developers Conference (WWDC) event, Apple said these new features run on-device, meaning they don't require an internet connection and offer enhanced data security. Developers can also integrate Apple's on-device AI model into their apps starting today. Here are the key features introduced with Apple Intelligence:
Live Translation: Apple Intelligence now includes live translation for messages and phone calls. It supports eight new languages, including traditional Chinese, Turkish, and Portuguese. This allows real-time translation of typed or spoken content in the Messages, FaceTime, and Phone apps.
Genmoji and Image Playground: Apple introduced Genmoji and Image Playground for creating customised emojis and illustrations from text descriptions. Apple partnered with OpenAI's ChatGPT for artistic styles like oil painting and vector art.
Workout Buddy on Apple Watch: The Apple Watch now features "Workout Buddy," a personalised fitness assistant. Powered by Apple Intelligence, it uses your history and preferences to provide real-time motivational insights during workouts.
Apple also stated that Apple Intelligence features will support eight more languages by the end of 2025, including Dutch, Swedish, and Vietnamese. These features are available to developers for testing now and will roll out to users soon.
[22]
Apple Expands OpenAI Partnership Amid Rising AI Pressures | PYMNTS.com
Among the new offerings from the event are live translation capabilities added to the company's Messages, Phone and FaceTime apps, making it easier to have conversations when traveling abroad or talking with someone who speaks another language. Beginning with iOS 26, Apple's visual intelligence feature will be able to analyze images and text on an iPhone's screen. Users can also ask ChatGPT for more info about what they're looking at, and search Google for similar images or products. "Despite the breadth of its rollout, the underlying strategy underscores a fundamental tension," PYMNTS wrote following Monday's event. "Apple is betting that measured integration, meticulous design and a deep commitment to user privacy will matter more than rapid innovation in generative AI. Investors were less convinced, with the company's stock closing down 1.2% for the day." While Apple has long been known for excitement and innovation, the company "delivered nothing particularly exciting or innovative during the first day of the WWDC." In terms of the Siri upgrades, Apple Senior Vice President of Software Engineering Craig Federighi said the company needed "more time to meet our high-quality bar," per a report by The Wall Street Journal. "This restrained approach stands in contrast to rivals like Amazon, Google and Microsoft, which are embracing large language models and enterprise-scale AI solutions in aggressive and sometimes experimental ways," PYMNTS wrote. In the context of the larger AI economy, Apple is at risk of "strategic drift," PYMNTS added, as most enterprise AI innovation is taking place in the cloud, powered by APIs and platforms that allow fine-tuning, multi-modal inputs and integration with large data sets. "Apple's refusal to enter this space leaves it reliant on consumer hardware cycles and developer goodwill -- both of which may wane as competitors offer richer, more adaptable platforms," the report said.
[23]
Apple opens its AI to developers but keeps its broader ambitions modest
CUPERTINO, California (Reuters) - Apple announced on Monday a slew of artificial intelligence features, including opening up the underlying technology it uses for Apple Intelligence, in a modest update of its software and services as it lays the groundwork for future advances. The presentations at its annual Worldwide Developers Conference focused more on incremental developments, including live translations for phone calls and design changes to its operating systems, that improve everyday life rather than the sweeping ambitions for AI that Apple's rivals are marketing. A year after it failed to deliver promised AI-based upgrades to key products such as Siri, Apple kept its AI promises to consumers modest, communicating that it could help them with tasks like finding where to buy a jacket similar to one they have seen online. Behind the scenes in its tools for developers, Apple hinted at a strategy of offering its own tools alongside those from rivals, similar to the tack taken by Microsoft last month. Apple software chief Craig Federighi said the company will offer both its own and OpenAI's code completion tools in its key Apple developer software, and that the company is opening up the foundational AI model that it uses for some of its own features to third-party developers. "We're opening up access for any app to tap directly into the on-device, large language model at the core of Apple," Federighi said. In an early demonstration of how partners could improve Apple apps, the company added image generation from OpenAI's ChatGPT to its Image Playground app, saying that user data would not be shared with OpenAI without a user's permission. Apple is facing an unprecedented set of technical and regulatory challenges as some of its key executives kicked off the company's annual software developer conference on Monday. Shares of Apple, which were flat before the start of the event, closed about 1.2% lower on Monday.
"In a moment in which the market questions Apple's ability to take any sort of lead in the AI space, the announced features felt incremental at best," Thomas Monteiro, senior analyst at Investing.com, said. Compared with what other big AI companies are introducing, he added, "It just seems that the clock is ticking faster every day for Apple." As Apple executives discussed new features at the event in Cupertino, California, OpenAI announced a new financial milestone on Monday, reaching $10 billion in annualized revenue run rate as of June. While Apple shied away from making big promises of AI improvements for consumers, small touches behind the scenes, such as allowing developers to use ChatGPT's code generation tools in Xcode, which is required to make apps for Macs and iPhones, showed the company trying to keep pace with what its rivals are offering to software developers.
OS UPDATES
Federighi also said Apple plans a design overhaul of all of its operating systems. Apple's redesign of its operating systems centered on a design it calls "liquid glass," where icons and menus are partially transparent, a step Apple executives said was possible because of the more powerful custom chips in Apple devices versus a decade ago. Federighi said the new design will span operating systems for iPhones, Macs and other Apple products. He also said Apple's operating systems will be given year names instead of sequential numbers for each version. That will unify naming conventions that have become confusing because Apple's core operating systems for phones, watches and other devices kicked off at different times, resulting in a smattering of differently numbered operating systems for different products. In other new features, Apple introduced "Call Screening," where iPhones will automatically answer calls from an unknown number and ask the caller the purpose of their call.
Once the caller states their purpose, the iPhone will show a transcription of the reason for the call, and ring for the owner. Apple also said it will add live translation to phone calls, as well as allow developers to integrate its live translation technology into their apps. Apple said the caller on the other end of the phone call will not need to have an iPhone for the live translation feature to work. Apple's Visual Intelligence app - which can help users find a pair of shoes similar to ones at which they have pointed an iPhone camera - will be extended to analyzing items on the iPhone's screen and linked together with apps. Apple gave an example of seeing a jacket online and using the feature to find a similar one for sale on an app already installed in the user's iPhone. (Reporting by Stephen Nellis and Kenrick Cai in Cupertino, California; Additional reporting by Akash Sriram in Bengaluru; Editing by Kenneth Li and Matthew Lewis)
[24]
Apple keeps it simple on AI, unveils 'liquid glass' redesign
STORY: Apple is seemingly in no rush on artificial intelligence. The tech giant on Monday unveiled a series of upgrades to its AI tech. But analysts said they were more incremental than revolutionary, focusing on tools to help with everyday life. That included live translation of phone calls and messages. The relatively modest changes come a year after Apple set out a bold vision for AI, but then failed to deliver on promised upgrades to products like its Siri voice assistant. One observer told Reuters the company now seemed focused on making sure it could deliver on those promises, rather than making new claims. But Apple did have big news for developers, saying it would open up its AI model to them. Company Senior Vice President Craig Federighi says the change will open up huge opportunities for third-party apps: "We think this will ignite a whole new wave of intelligent experiences in the apps you use every day. For example, if you're getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging." The strategy is similar to one unveiled by Microsoft last month. And tech expert Carolina Milanesi says it will excite app creators: "This is the most significant tool that they're able to deliver to developers so that they can take advantage of Apple Intelligence on device in a secure and private way, but also, for especially smaller developers being able to do so without the added cost of using the model in the cloud." Apple also unveiled a new design language for its operating systems, dubbed "liquid glass". That makes icons and menus partially transparent. The company says it adds fluidity and expressiveness to the user experience.
Apple's WWDC 2025 showcases modest AI updates, focusing on practical features while competitors push ahead with more ambitious AI projects. The company's cautious approach raises questions about its position in the rapidly evolving AI landscape.
At the Worldwide Developers Conference (WWDC) 2025, Apple unveiled a series of incremental AI updates, branded as Apple Intelligence, focusing on practical features rather than groundbreaking AI advancements [1]. This cautious approach comes as competitors like Meta, Anthropic, OpenAI, and Microsoft aggressively push into the AI space with more ambitious projects [1].
Apple introduced several modest AI-powered features across its ecosystem:
Live Translation: A feature that automatically translates text or spoken words in real-time for Messages, FaceTime, and phone calls [2].
Call Screening and Hold Assist: AI-powered features for managing phone calls, including automatic answering of unknown numbers and detection of hold music [2].
Workout Buddy: An AI-driven workout coach that provides encouragement and summarizes exercise data [2].
ChatGPT Integration: Apple integrated OpenAI's ChatGPT into its Image Playground app for enhanced image generation capabilities [1][2].
In a significant move, Apple announced it would open access to its on-device AI language model to third-party developers [1][3]. The company introduced the Foundation Models framework, enabling developers to build more AI capabilities into their apps using Apple's existing systems [2][5]. This move is seen as an attempt to encourage more AI feature development within Apple's ecosystem [2].
Notably absent from the announcements was any concrete update on the "more personalized" Siri that Apple had promised at the previous year's WWDC [1]. Craig Federighi, Apple's SVP of Software Engineering, stated that they would not have more to share until next year, potentially raising concerns about Apple's strategy for its voice assistant in an increasingly competitive market [2][4].
The modest scope of Apple's AI announcements led to a 1.2% drop in the company's shares, reflecting potential investor disappointment [1]. Some analysts suggest that Apple's incremental approach to AI development may be warranted, given the uncertain user demand for AI-driven features on smartphones [3].
Apple's approach to AI differs from its competitors in several ways:
On-device processing: Apple emphasizes privacy and offline functionality by running AI models on personal devices [3].
Developer focus: By making its AI models accessible to developers, Apple leverages its vast reach with coders [3].
Cautious integration: The company is integrating AI features gradually into its existing products and services rather than launching standalone AI products [1][4].
While Apple's current AI offerings may seem modest compared to its competitors, the company's strategy of focusing on practical, privacy-centric features could pay off in the long run. However, as the AI landscape evolves rapidly, Apple may need to accelerate its AI development to maintain its competitive edge in the tech industry [1][3][5].
As the AI race continues, only time will tell if Apple's cautious approach will be viewed as a strategic blessing or a missed opportunity in the fast-paced world of artificial intelligence [1].