Curated by THEOUTPOST
On Sun, 16 Feb, 12:01 AM UTC
23 Sources
[1]
Apple Intelligence confirmed for iPhone, iPad and Mac. New update incoming for more languages and regions in April
Apple has confirmed that Apple Intelligence will expand to more languages and regions in April with the release of iOS 18.4, iPadOS 18.4, and macOS Sequoia 15.4. New supported languages include French, German, Italian, Spanish, Japanese, Korean, Simplified Chinese, and more, along with localized English versions for India and Singapore. The update will also bring Apple Intelligence to iPhone and iPad users in the EU for the first time and introduce support for Apple Vision Pro in U.S. English. Apple has announced that Apple Intelligence, its personal intelligence system, will soon support additional languages, including French, German, Italian, Brazilian Portuguese, Spanish, Japanese, Korean, and Simplified Chinese. Additionally, localized English versions will be available for Singapore and India. These new language options will roll out globally with the release of iOS 18.4, iPadOS 18.4, and macOS Sequoia 15.4 in April. Developers can begin testing these updates today. With these upcoming software releases, iPhone and iPad users in the EU will gain access to Apple Intelligence for the first time. The technology will also expand to a new platform with Apple Vision Pro, initially available in U.S. English, enhancing users' ability to communicate, collaborate, and express themselves in innovative ways. Apple Intelligence is designed with privacy at its core, leveraging on-device processing for many of its AI models. For tasks requiring more computational power, Private Cloud Compute extends the security of the iPhone to the cloud while maintaining strict privacy protections. Apple has confirmed that more features and enhancements for Apple Intelligence, including expanded Siri capabilities, will be introduced in the coming months. Apple also recently launched the iPhone 16e, the most affordable iPhone running Apple Intelligence thanks to its A18 chip.
[3]
Apple Intelligence to Expand to Vision Pro Headset in April | PYMNTS.com
Apple Intelligence will be available in beta in a software update, visionOS 2.4, with support for U.S. English, the company said in a press release. More features and support for other languages will be added throughout the year. "With Apple Intelligence for Vision Pro, users will be able to proofread, rewrite and summarize text using Writing Tools; compose text from scratch using ChatGPT in Writing Tools; explore new ways to express themselves visually with Image Playground; create the perfect emoji for any conversation with Genmoji; and much more," the release said. Apple introduced Apple Intelligence in June, saying the new suite of AI features would revolutionize the iPhone, iPad and Mac experience, while keeping users' data under lock and key. The company released Vision Pro, its first mixed-reality headset, in February 2024 along with 600 new apps and games that included forays into augmented reality and virtual reality from major retailers. In its Friday press release, Apple said the visionOS 2.4 update will also introduce an app for Vision Pro called Spatial Gallery that will feature a curated selection of photos and videos; an Apple Vision Pro app for iPhone that will enable users to quickly access apps and information for their Vision Pro; and enhancements to Guest User that will make it easier to share apps and experiences with iPhone or iPad users. Apple announced Friday, in another press release, that Apple Intelligence on other devices -- iPhone, iPad and Mac -- will expand to include more languages and regions in April with the release of iOS 18.4, iPadOS 18.4 and macOS Sequoia 15.4. The new languages will include French, German, Italian, Portuguese (Brazil), Spanish, Japanese, Korean, Chinese (simplified) and localized English for Singapore and India, according to the release. These languages will be accessible in nearly all regions of the world, per the release. 
"With the upcoming software updates, iPhone and iPad users in the EU will have access to Apple Intelligence features for the first time," the release said. More capabilities for Siri will be introduced in the coming months, per the release.
[4]
Apple: Apple Intelligence expands to more languages and regions in April
Apple Intelligence on iPhone, iPad, and Mac expands to more languages and regions in April Apple Intelligence, the personal intelligence system that delivers helpful and relevant intelligence, will soon be available in more languages, including French, German, Italian, Portuguese (Brazil), Spanish, Japanese, Korean, and Chinese (simplified) - as well as localized English for Singapore and India. These new languages will be accessible in nearly all regions around the world with the release of iOS 18.4, iPadOS 18.4, and macOS Sequoia 15.4 in April, and developers can start to test these releases today. With the upcoming software updates, iPhone and iPad users in the EU will have access to Apple Intelligence features for the first time, and Apple Intelligence will expand to a new platform in U.S. English with Apple Vision Pro - helping users communicate, collaborate, and express themselves in entirely new ways. Apple Intelligence marks an extraordinary step forward for privacy in AI and is designed to protect users' privacy at every step. It starts with on-device processing, meaning that many of the models that power Apple Intelligence run entirely on the device. For requests that require access to larger models, Private Cloud Compute extends the privacy and security of iPhone into the cloud to unlock even more intelligence. Apple Intelligence will continue to expand with new features in the coming months, including more capabilities for Siri.
[5]
Apple Intelligence will expand to more languages with iOS 18.4 in April: All you need to know
Just days after unveiling the budget-friendly iPhone 16e, Apple has announced the release timeline for its upcoming software update, iOS 18.4. This update, set to arrive in April, will bring significant improvements, including the expansion of Apple Intelligence to more languages. Let's take a closer look at what Apple has announced. In a blog post, Apple revealed that Apple Intelligence, the company's suite of AI features, will soon support new languages. These include French, German, Italian, Portuguese (Brazil), Spanish, Japanese, Korean, and simplified Chinese. Localised English versions for users in Singapore and India will also be introduced. These new language options will be available globally with iOS 18.4, iPadOS 18.4, and macOS Sequoia 15.4 in April, and developers can start testing these updates now. For the first time, Apple Intelligence features will be available to iPhone and iPad users in the European Union (EU). Additionally, Apple Intelligence will be integrated into the Apple Vision Pro in US English. According to the tech giant, this will help users "communicate, collaborate, and express themselves in entirely new ways." Apple emphasises privacy in AI with Apple Intelligence. The system primarily runs on-device, ensuring that personal data stays secure. For more complex tasks requiring advanced AI models, Apple uses Private Cloud Compute, which extends iPhone-level security to cloud processing while keeping user data protected. Apple has confirmed that Apple Intelligence will continue to expand with new features in the coming months, including more capabilities for Siri. With iOS 18.4 arriving in April, users can look forward to a smarter and more personalised Apple Intelligence experience.
[6]
Apple Vision Pro to Get Apple Intelligence, Spatial Gallery App in April
These features will initially be available to users in English (US) Apple Vision Pro is set to gain support for Apple Intelligence features in April, the company announced on Friday. The Cupertino company's suite of artificial intelligence features will be available on its first mixed reality headset with the next update to visionOS. Apple is also launching a new Spatial Gallery app that offers access to curated content on the Vision Pro, while a new app for iPhone will let users manage app downloads on the device. The Apple Vision Pro will also support a new Guest User mode, allowing the headset to be shared with other users. The company says that Apple Intelligence features will start rolling out to the Apple Vision Pro when visionOS 2.4 is rolled out in April. Features that will be part of the initial release will include Writing Tools (with ChatGPT integration), Image Playground (with a standalone Vision Pro app), and Genmoji. These features will initially be available to users who have set their device language to English (US). Users will also be able to create Memory Movies in the Photos app, which will also support natural language search queries, or use features like Smart Reply in the Mail and Messages apps. Other features coming to the Vision Pro include Image Wand in Notes, Priority Messages in Mail and Notification Centre, Mail Summaries, and Notification Summaries. One of the most anticipated features coming to the Apple Vision Pro is a Guest User mode that will allow owners of the spatial computer to share it with other users. Apple's description of the Guest User mode suggests that this won't be a full-fledged multi-user system that allows another user to have their own login and password. Instead, the Apple Vision Pro will allow a guest user to save their eye and hand setup for up to 30 days after using the device.
After updating to visionOS 2.4, Apple Vision Pro owners can use their iPhone or iPad to start a new Guest User session and select apps that can be accessed by other users on the headset. After updating to iOS 18.4, Apple Vision Pro owners will see a new application on their smartphone that allows remote management of apps on the headset. Users can also see information about the device, in addition to downloading apps and games, or see recommendations for new content (including 3D movies or Apple Immersive titles), according to the company. On the headset, Apple's visionOS 2.4 update will introduce a new Spatial Gallery on the Vision Pro. This app will show spatial photos, spatial videos, panoramas and other content that is designed to take advantage of the displays on the mixed reality headset. These will cover various topics, including "art, culture, entertainment, lifestyle, nature, sports, and travel". Apple says that the visionOS 2.4 update will roll out to the Apple Vision Pro in April, and the Apple Intelligence features will initially be available to users who have set their device language and Siri language to English (US). The features are also expected to roll out to other regions where the headset is available, including Australia, Canada, China, Hong Kong, France, Germany, Japan, Korea, Singapore, Taiwan, the UAE, and the UK.
[7]
Apple Intelligence Features in the Vision Pro Might Arrive by April
Apple Vision Pro could reportedly get artificial intelligence (AI) features with an upcoming update. As per the report, the Cupertino-based tech giant is planning to add Apple Intelligence features to the augmented reality (AR)/virtual reality (VR) headset with the visionOS 2.4 update, which is expected to be rolled out globally in April. Notably, a report last year claimed that the wearable device would be upgraded with AI features in 2025. Alongside, the headset is also expected to get an upgraded guest mode and a new spatial content app with the update. Bloomberg's Mark Gurman claimed in a report that Apple Intelligence features could be available to users by April, when the visionOS 2.4 update is expected to be shipped. Citing unnamed people with knowledge of the matter, the report claimed that the new features could be rolled out in a developer beta update as soon as this week. Gurman claimed that the AI features arriving with the upcoming update will include Writing Tools, Genmoji, and the Image Playground app. Other features will apparently not be available to users immediately, and might be added in future updates. Interestingly, this is the first time the tech giant has expanded Apple Intelligence beyond the iPhone, iPad, and Mac. Since its launch, the AI features have only been available on select iPhone, iPad, and Mac models. As per the report, the decision to expand the AI features to the Apple Vision Pro was taken because the device is powered by an M2 chipset and features 16GB of RAM, which is considered enough to support on-device AI processing. The first set of Apple Intelligence features will reportedly not include the AI-powered Siri. In a separate report, Gurman claimed that the virtual assistant is facing software bugs and engineering issues, and the AI features are not working consistently. As a result, the enhanced Siri could see a delayed release.
Apart from AI, the Apple Vision Pro will reportedly also get an upgraded guest mode that will allow users to manage access and apps directly via their iPhone. Additionally, a new spatial content app is also expected to debut that will support 3D images and panoramas.
[8]
The Vision Pro is getting Apple Intelligence in April | TechCrunch
Apple Intelligence is heading to the Vision Pro, as part of an upcoming operating system update. Apple confirmed on Friday that its generative AI platform will arrive on the extended reality headset as part of visionOS 2.4. A beta version of the software is currently available for developers. The public version is set for an April release. Like the iPhone and Mac before it, the Vision Pro will receive Apple Intelligence updates in waves. The first set includes several familiar offerings, focused primarily on generating text and images. The company sees the addition of features like Rewrite, Proofread, and Summarize as key components for on-device workflow. It's worth keeping in mind that Apple has framed Vision Pro as a "spatial computing" device since the outset. For all the video, gaming, and other entertainment features, the company has sought to set the system apart from its extended reality predecessors by positioning it as an extension of desktop computing - or, as TechCrunch framed it in our review, "The Infinite Desktop." As it stands, composing text is a mixed bag on the headset. The default typing method requires the wearer to look at a letter, before pinching two fingers together to select. While well implemented, it's cumbersome when faced with writing more than a word or two at a time. Voice addresses this bottleneck to a degree, and Apple's recent AI-powered Siri supercharge bodes well for the smart assistant's Vision Pro future. Apple is banking on the combination of voice dictation and generative AI writing tools to deliver a smoother experience and convince more Vision Pro users to incorporate the headset into their existing workflows. At the very least, features like Message Summaries and email Smart Reply streamline interaction with different apps, without taking the user away from a given task. Image Playground is the other big piece of the puzzle, bringing image generation to the wearable display as part of the visionOS 2.4 update.
The feature is integrated directly into the visionOS Photos app, allowing users to create images through verbal prompts. All of the above features have previously been rolled out on iOS, macOS, and iPadOS. There are no new Apple Intelligence features specific to Vision Pro arriving in this update. Along with visionOS 2.4, Apple has also launched a Vision Pro iPhone app arriving with iOS 18.4, which is also now in beta. The app serves a few different purposes. Foremost is the ability to browse visionOS content like TV shows and movies, which can then be transferred onto the headset. This feature appears to be, in part, a response to the limitations of wearing the headset, both in terms of personal comfort and battery life. If you're going to be scrolling through content, you might as well do it from the comfort of your iPhone. When an iPhone is unlocked and within proximity of the headset, the new app can also be used to manage guest accounts. The Vision Pro will prompt its owner when someone is attempting to sign in as a guest. A streaming image of the guest's in-headset view is accessible through the new app, as well.
[9]
Apple Intelligence is headed to the Vision Pro in April, dev beta available today
The rumors are true: Apple confirmed today that the Vision Pro will get Apple Intelligence features in April with the arrival of visionOS 2.4. A developer beta is also rolling out today for the less patient. As we've seen on other devices, Apple is starting out the Vision Pro's AI rollout with basic features. Those include Writing Tools, which can help you summarize, rewrite and proofread text, as well as generate text with ChatGPT; Image Playground for creating AI imagery; and Genmoji for building custom AI generated emojis and stickers. It really was only a matter of when Apple would bring Apple Intelligence to the Vision Pro. It's available on Macs running the M1 chip or later, so the spatial headset's M2 hardware is clearly more than capable. Apple Intelligence on the Vision Pro only supports English for now, but the company says more languages and AI features will be coming throughout the year. As for other minor Apple Intelligence capabilities, the Vision Pro will also be getting Priority Notifications and summaries, Smart Reply in Mail and Messages, and the ability to create Memory Movies in Photos. Arguably more useful to Vision Pro users, Apple is also introducing several apps and tweaks to make the headset a bit more useful. There's a new Spatial Gallery app exclusive to Vision Pro that will highlight spatial videos, photos and panoramas from Apple. Think of it as another way to enjoy the headset's immersive capabilities without waiting for another big budget Immersive Video to drop. The company says the Spatial Gallery app will include "stories and experiences" from brands like Red Bull (which is well known for making 360-degree videos for VR), as well as behind the scenes material from Apple shows like Severance and Shrinking. (Let's hope we can actually sit inside the creepy Lumon offices.) Additionally, a new Apple Vision Pro app for iPhone will let users better manage their headset experience.
They can remotely add apps and games to the Vision Pro, as well as explore content to check out later. The app will also serve as a way to let owners know when new content drops for the Vision Pro (like a new "Arctic Surfing" episode of the Boundless Immersive series, which arrives today), as well as explore other videos for the headset, including a library of nearly 300 3D movies. A standalone mobile Vision Pro app also makes sense, especially since Meta has offered something similar for its headsets for years. It's a sign that Apple is slowly making the Vision Pro platform a bit more consumer friendly, instead of just being a testbed for developers working in spatial computing. I don't think Apple will be lowering the Vision Pro's price anytime soon, but whenever we get a cheaper headset from the company, it'll be helpful to have content discovery features like the Spatial Gallery and the Vision Pro iPhone app. And speaking of user-friendly tweaks, Apple is also improving Guest User mode with visionOS 2.4. Now headset owners will be able to start guest sessions with their iPhone and iPad, remotely choose which apps are available to guests wearing their Vision Pro, and also kick off AirPlay mirroring remotely. Previously, that process involved putting on the Vision Pro first, enabling Guest User mode, and then passing it to someone else to test out.
[10]
Apple Intelligence could arrive on Vision Pro in April | TechCrunch
Apple is planning to add Apple Intelligence to its Vision Pro headset in an update that could come as early as April, according to Bloomberg's Mark Gurman. Just a couple weeks after Apple Intelligence was first announced in June 2024, Gurman reported that Apple was looking to bring its suite of AI tools to the Vision Pro, though there were questions to answer about how those tools would be reimagined for a mixed reality experience. Now Apple is reportedly aiming to include Apple Intelligence (including Writing Tools, Genmoji, and Image Playground) in its visionOS 2.4 software update, with a version available to developers as soon as this week. The Vision Pro's first Apple Intelligence offerings reportedly won't include an upgraded Siri. In fact, Gurman also said a long-promised upgrade to Siri more broadly could be delayed due to engineering problems and bugs.
[11]
Apple Plans To Breathe New Life Into Its Vision Pro Headset With Apple Intelligence, Expected To Arrive Via Software Update With No Hardware Requirements
Last year, Apple unveiled its Vision Pro headset to the world. The device has seen mixed reviews, and even though it is a competent machine, its sheer price tag alone is the reason most users would not get it. While visionOS 2 has been available for as long as iOS 18, it has not received any Apple Intelligence upgrades, for reasons Apple never explained. It has been six months since the release of Apple Intelligence, and it appears that the Vision Pro will soon get the company's AI features with no new hardware requirements. According to Bloomberg, Apple will finally bring its AI features to the Vision Pro headset with no hardware requirements. Some of the Apple Intelligence upgrades will include Writing Tools, Genmoji, Image Playground, and much more. The AI features will finally expand beyond the iPhone, iPad, and Mac, refreshing the headset's functionality and breathing new life into it, potentially attracting more customers. The report also mentions that Apple will introduce a new app for the Vision Pro, which will allow it to deliver more spatial content to the headset. The app could be a hub for all spatial content, possibly from Apple and other companies. While the Vision Pro already has other sources for the same content, the new app could offer official content authorized by Apple. Apple never formally promised its AI features for the Vision Pro, and it remains to be seen when the update will arrive for the headset. We believe that Apple Intelligence support could arrive later this year or in May, when Apple announces iOS 18.5 with new Siri enhancements. Gurman has also recently noted that Apple will delay the new Siri features by almost a month, which suggests the company is facing development challenges. It was previously noted that the company would release new Siri functionality with iOS 18.4 next month, but that now seems highly unlikely.
Apple is slated to release the first beta of iOS 18.4 next week, which will begin the beta cycle ahead of the update's public release next month. Next week, Apple also plans to launch the new iPhone SE 4, as Tim Cook has shared a teaser post on X, and we will be covering all the details extensively.
[12]
Apple Intelligence finally arrives on Vision Pro, but it's the new iOS app that might turn heads
Apple's most expensive piece of consumer hardware is finally getting Apple Intelligence. The latest visionOS 2.4 update, which becomes available as a developer beta today (February 18, 2025), brings Apple's brand of artificial intelligence to Vision Pro devices, though for now, it's only for US English devices. The move instantly helps better position the spatial computing platform against the upcoming Project Moohan headset, a mixed-reality device from Google, Samsung, and Qualcomm that promises to put Google Gemini at the center of the experience. The move by Apple also eases some frustration from Vision Pro owners who were wondering why Apple's most advanced consumer technology was lacking a feature available on all models of the iPhone 16, many iPads, and MacBooks. As has often been the case with the Apple Intelligence rollout, the update does not include every feature you'll likely soon find on iOS 18.4-supporting iPhones. However, it does include more than what we got in the initial iPhone Apple Intelligence rollout. More importantly, these features will integrate with Vision Pro's native functions, including voice and gestures. What's missing here is any kind of Siri update beyond the current digital assistant features, an exclusion that might frustrate some. On the other hand, now that Apple has opened the door to Apple Intelligence on Vision Pro, numerous updates are sure to follow. So, whatever Siri features you see on the iPhone 16 with iOS 18.4 are sure to eventually arrive on Vision Pro. We'd look forward to seeing a version that is aware of activities on the mixed reality headset and can take intelligent action based on what it sees through the device's multiple cameras and sensors (something that we expect from Project Moohan, whenever it finally ships).
With the visionOS 2.4 update, users will be able to use voice prompts to request writing changes like, "Make it more friendly," and they'll get rewriting and proofreading assistance. Image Playground will be integrated into Messages, just like on iOS. Smart Replies will be able to look at the contents of a message thread and create a contextual response. And you'll be able to use voice prompts to create Memory Movies. Those obviously exist on the iPhone but may take on a new dimension thanks to the Vision Pro's immersive field of view. Apple is also making several other changes to better connect your iPhone and Vision Pro experiences. The most notable one might be the new Vision Pro app on iOS, a utility that Apple arguably should have delivered when it launched the headset over a year ago. Think of the app as similar in some ways to the iPhone's Watch app. The Vision Pro app acts as both a promotional tool for fresh Vision Pro content and spatial experiences and a place where you can take remote actions. You can, for instance, use the app to add movies to watchlists, trigger app downloads, and learn details about your headset, like its serial number, software version, and the prescription for your Zeiss lens inserts (if you have them). The app is set to arrive with iOS 18.4 and only installs if the system knows you have a Vision Pro. Apple seems to be doing a lot of work to widen the Vision Pro tent. To that end, Guest Mode gets a valuable update that should ease the sharing process. With visionOS 2.4, Vision Pro owners can simply hand someone their Vision Pro. When the guest puts it on, a message appears on the Vision Pro owner's iPhone or iPad, letting them enable sharing from there and choose which apps to share. Inside Vision Pro, users will find a new Spatial Gallery app offering curated spatial content, including photos, videos, and panoramas, cutting across multiple genres like sports, entertainment, and travel.
Without a doubt, this is one of Vision Pro's most significant upgrades, and much of it is designed to help the headset better compete with upcoming devices from Google, Samsung, Meta, and others. None of it addresses the hefty $3,499 price tag, but at least there are now even more good reasons to make the investment.
[13]
Vision Pro will reportedly get Apple Intelligence as soon as April
The next big update could finally bring AI features to the headset. Apple Vision Pro may soon get an update bringing Apple Intelligence to the headset, according to a new report from Bloomberg. Apple Intelligence first came to the iPhone, iPad and Mac in the fall, bringing features such as Writing Tools, ChatGPT integration for Siri and more over the last few months. But Apple's AI hasn't yet made it to the Vision Pro. According to Mark Gurman, that may change with the visionOS 2.4 update, which sources said may arrive in April with Apple Intelligence and a new app for viewing spatial media like 3D images and panoramas. The first Apple Intelligence features to come to Vision Pro will reportedly include Writing Tools, Genmoji and Image Playground. Gurman reports that developers may get access to these features in beta "as soon as this week." Don't expect any major changes to Siri on Vision Pro just yet, though. According to Gurman, a "major AI overhaul" the company had been planning for the assistant has been delayed.
[14]
Apple plans to add these features to sell more Vision Pros, report says
The Vision Pro might get new features to hopefully sell more units. Apple's Vision Pro had a lot of hype before its launch, but since then, the VR headset has been a bit of a flop for Apple. The company struggled to sell the Vision Pro when it launched a year ago this month, no thanks to its $3,500 price tag. Apple already has another Vision Pro headset in the works, but it appears the company is not totally giving up on the hardware. In fact, it's going to add more features to make it more attractive in hopes of drumming up more interest. Apple Intelligence will come to the Vision Pro in its visionOS 2.4 software update, according to a report from Bloomberg Saturday. The company will also add an updated mode for guest users and a spatial content app. The new OS will reportedly roll out as soon as April, and developers could begin working with the feature within the week. Some of the Apple Intelligence features coming to the Vision Pro include the Writing Tools interface, Genmojis, and the Image Playground app, the report says. Apple didn't immediately respond to a request for confirmation. Apple Intelligence has yet to become a big selling feature for Apple. The feature launched last October following the release of the iPhone 16, but it still has yet to be fully "released." The number of tasks Apple Intelligence can currently do is limited, and the company is expected to fully release the AI later this year. Some of the features currently available with Apple Intelligence include the previously mentioned Writing Tools interface, Genmojis, and the Image Playground app. Siri also received an upgrade, as it can now tap into the power of ChatGPT to understand images and documents as well as summarize phone notifications. Although Apple Intelligence may not have all the bells and whistles of ChatGPT or Google's Gemini, it does offer some privacy when it's being used.
It can do this with Private Cloud Compute, which enables the use of the cloud to process a request. That request, however, is completely secured, with the data anonymized to ensure the privacy of the user. So far, Apple Intelligence is only available on the iPhone 15 Pro and iPhone 16 lineup, as well as the upcoming iPhone SE 4. Apple hardware that uses the company's M1 chip or later can also make use of Apple Intelligence.
[15]
Apple Vision Pro major upgrade tipped with visionOS 2.4 -- here's all the new features
Apple Intelligence, new Spatial app and revamped Guest mode are coming While the Apple Vision Pro launched at an absurdly high price and the company is reportedly working on less expensive models, Apple isn't totally abandoning its first mixed reality headset. A new report from Bloomberg claims that Apple is working to bring a number of new features to the headset, including Apple Intelligence, as part of the next visionOS update. Bloomberg's Mark Gurman reports that some of these "enhancements" will become available this week in a developer beta, with the public launch expected in April. Not every Apple Intelligence feature will make its way to the Vision Pro, but sources say the headset will get the Writing Tools interface, Genmojis and Image Playground. The Vision Pro features Apple's M2 chip and 16GB of RAM, so it should be able to handle on-device AI processing. Apple Intelligence was likely always slated to be available on the Vision Pro, but it's possible that the announcement of Android XR and Google's integration of its Gemini AI tools in future headsets pushed up Apple's timeline. The first devices running Android XR will come out later this year, with Samsung's Project Moohan expected to be the first headset running the mixed reality platform. Additionally, Bloomberg's report claims that Apple will launch a new app built specially for the Vision Pro meant for watching spatial content, including 3D images and panoramas. Apparently this new app is meant to "spur interest" in spatial media and entice more people to the product. Alongside that, Apple is supposed to release an immersive arctic surfing video for the TV app. Finally, the visionOS 2.4 update is supposed to feature an overhauled mode for guests. This mode allows you to lend the headset to others and could make it easier for people in your house to share the Vision Pro. It also lets you control the mode via your iPhone. 
According to unnamed sources, this guest mode feature is supposed to help Vision Pro owners promote the headset to friends and family -- essentially, in-home marketing for Apple. As mentioned, if you have a Vision Pro, you can potentially try out this update in a developer beta, which should launch soon. Otherwise, you'll have to wait until April to see what Apple actually releases.
[16]
Apple Aims to Boost Vision Pro with AI Features, Spatial Content App
Apple Inc. is planning to add Apple Intelligence to its Vision Pro headset, along with an updated mode for guest users and a spatial content app, as it seeks to boost interest in the device. The company aims to roll out the features as part of a visionOS 2.4 software upgrade targeted for as early as April, according to people with knowledge of the matter. The enhancements will become available in beta for developers as soon as this week, said the people, who asked not to be named discussing details of the update that aren't yet public.
[17]
Apple Vision Pro may be about to get its biggest update yet -- no new hardware required - 9to5Mac
Apple Vision Pro only just turned a year old, and it may be about to get its biggest update yet -- no new hardware required. That's because Apple Intelligence is finally rumored to be coming to Apple Vision Pro. When Apple announced its suite of AI features last June, it only promised support on the iPhone, iPad, and Mac. So far, Apple Intelligence is offered on iPhone 15 Pro, iPhone 16, and iPhone 16 Pro; M-series iPad Pro and iPad Air; the latest iPad mini; and M-series Macs. Apple never guaranteed that Apple Intelligence would come to Vision Pro in a future update. As I mentioned a few days ago: The biggest opportunity for Apple Vision Pro, aside from price and weight, is that it runs on Apple's second generation M-series processor and doesn't support Apple Intelligence. It's possible, however, that Apple could bring Apple Intelligence to the existing Apple Vision Pro without refreshing the hardware. The hardware only just turned a year old in the United States, and it has been arriving in markets over the last 12 months. However, a new rumor from Mark Gurman at Bloomberg suggests, for the first time, that the existing Apple Vision Pro will indeed receive Apple's AI features. If true, this will be a welcome surprise for customers of the $3500 headset. It's also a welcome rumor following the report that three new Siri features with Apple Intelligence may not be ready in time for iOS 18.4. Stay tuned as we expect a lot of Apple news in the coming days, including a new product launch on Wednesday that Apple has already announced.
[18]
A Huge Apple Vision Pro Update Arrives This April
Apple confirms that its visionOS 2.4 update, planned for April, will bring Apple Intelligence to the Vision Pro headset. More importantly, this update packs some very useful quality-of-life features, such as remote app downloads and a refined Guest User management system. I guess we'll start with the AI stuff. Apple Intelligence debuted on macOS and iOS late last year with a smorgasbord of writing-assist features, image generating tools, ChatGPT integrations, and notification summarization functionality, among other things. As far as I can tell, Apple plans to bring the full gamut of its AI suite to the Vision Pro. I won't be surprised if some features are delayed beyond the visionOS 2.4 update, as that's been a constant problem with the Apple Intelligence rollout. But we don't need to speculate too much, as Apple just opened an Apple Intelligence developer beta on Vision Pro that should give us a decent idea of what the April update will look like. Now that we're done talking about AI, let's move onto the more interesting stuff. This April, the iPhone will gain a Vision Pro management app that lets you remotely queue downloads to your headset. You can also use the app to browse Apple Immersive and Spatial Video content, access device information (such as OS version and serial number), set up Personalized Spatial Audio, and save ZEISS prescription lens information. visionOS 2.4 will also expand the Vision Pro's existing Guest User system, which allows you to temporarily share a headset with friends or family. Under the current system, Vision Pro owners may only set up Guest Users in the headset's Control Panel -- a slow and somewhat clunky process that feels a bit like musical chairs. 
Come April, you'll be able to set up Guest Users from your iPhone or iPad -- no need to put on the headset. You'll also gain the ability to stream the headset's video feed to your phone through AirPlay, which should make it easier to teach friends or family how to use Vision Pro. Finally, Apple is working on a Spatial Gallery that brings together Apple-curated spatial photos, spatial videos, and panoramas. Spatial Gallery will also highlight brand partnerships with Red Bull and other companies that produce spatial content, plus behind-the-scenes immersive footage from Apple TV+ shows like Severance. Those enrolled in the Apple Developer Program can begin testing the Apple Intelligence features in the visionOS developer beta today. Other users must wait until April for the stable visionOS 2.4 release, although there may be a public beta before then. Source: Apple via Engadget
[19]
Your Apple Vision Pro is getting a lot better with visionOS 2.4 - 9to5Mac
Apple is releasing the first beta of visionOS 2.4 today, and it just might be the biggest update yet for Apple Vision Pro users. The update brings support for Apple Intelligence, major improvements to Guest Mode, a new Spatial Gallery app, an Apple Vision Pro app for iPhone, and more. Here's everything you need to know... "Apple Vision Pro is helping users communicate, collaborate, and experience entertainment in entirely new ways -- and we're continuing to push the boundaries of what's possible in spatial computing with visionOS 2.4," said Mike Rockwell, Apple's vice president of the Vision Products Group. "With Apple Intelligence, Vision Pro users will be able to take their productivity and creativity to new heights using features like Writing Tools, Image Playground, and Genmoji. And we're excited for users to discover and share incredible new experiences with Spatial Gallery." With visionOS 2.4, Vision Pro is adding support for Apple Intelligence, Apple's suite of artificial intelligence features. When Apple Intelligence was introduced at WWDC last year, Apple told a clear story about the features on iPhone, iPad, and Mac. Apple Vision Pro, meanwhile, missed out on the fun. That's changing now. Genmoji on Apple Vision Pro lets you make custom emoji based on text descriptions and people in your Photos library. Genmoji is accessible directly from the keyboard in various Apple apps like Messages. Image Playground allows you to create images based on concepts, text descriptions, and people from your Photos library. You can access Image Playground with a dedicated app that has been designed specifically to take advantage of Vision Pro's spatial computing interface. Just like on other platforms, it's also integrated right into Messages. visionOS 2.4 also adds the familiar Writing Tools feature to Vision Pro. Writing Tools lets you easily proofread and rewrite your text, with built-in options for making your text friendlier, professional, and concise. 
You can also generate summaries, key points, lists, and tables based on your text. Writing Tools on Apple Vision Pro can also tap into ChatGPT to create text from scratch based on your prompts. Smart Reply identifies what's in an email or Messages thread and suggests relevant responses. The update also includes the memory movie feature in the Photos app to create custom memory movies based on your description. Apple Intelligence then finds the best photos and videos to craft a storyline. Natural language search in the Photos app is also now available. A few more tidbits on Apple Intelligence in visionOS 2.4: visionOS 2.4 also includes support for Priority Messages in Mail, Mail Summaries, Image Wand in Notes, Priority Notifications in Notification Center, and Notification Summaries. Apple Intelligence uses on-device processing to protect users' privacy whenever possible. For requests that require even larger models, Private Cloud Compute extends the privacy and security of Apple products into the cloud to unlock even more intelligence. When using Private Cloud Compute, users' data is never stored or shared with Apple; it is used only to fulfill the request. Independent experts can inspect the code that runs on Apple silicon servers to continuously verify this privacy promise, and are already doing so. As for now, Vision Pro retains its existing Siri experience without any further upgrades. For example, unlike on other Apple Intelligence platforms, you can't access ChatGPT via Siri on visionOS 2.4. Apple says more features and languages will come to Siri on Apple Vision Pro throughout the rest of the year. Apple has other new features coming to Vision Pro with visionOS 2.4 beyond Apple Intelligence support. First, there's a new Spatial Gallery app. This app aggregates spatial photos, spatial videos, and panoramas into a central place. All the content, curated by Apple for Apple Vision Pro users, is free and will be updated regularly. 
The Spatial Gallery app includes content across different categories like sports, travel, entertainment, and more. Here's how Apple describes it: visionOS 2.4 introduces Spatial Gallery, a new app that features a selection of spatial photos, spatial videos, and panoramas curated by Apple for Apple Vision Pro. With Spatial Gallery, users will enjoy breathtaking and intimate moments spanning art, culture, entertainment, lifestyle, nature, sports, and travel, with new content released regularly. At launch, users can discover remarkable perspectives from photographers like Jonpaul Douglass and Samba Diop; new stories and experiences from iconic brands including Cirque du Soleil, Red Bull, and Porsche; behind-the-scenes moments from Apple Originals like Disclaimer, Severance, and Shrinking; and special moments from top artists. In conjunction with iOS 18.4, visionOS 2.4 is adding a new "Apple Vision Pro" app to iPhone. In this app, Vision Pro users can discover content and spatial experiences, find useful tips about their device, and more. The app has a wide-ranging Discover page that showcases immersive video from the TV app, content from the Spatial Gallery app, and details on new apps and app updates. This is also where Apple will highlight new features coming to Vision Pro. The app also has a "My Vision Pro" section with information about serial numbers, optical inserts, AppleCare coverage, and more. This app will appear on your iPhone automatically after you update your iPhone to iOS 18.4 and your Apple Vision Pro to visionOS 2.4. If you don't have an Apple Vision Pro, you can download this app from the App Store to explore the Discover page and learn more about the spatial computing platform. Guest User is a way for Apple Vision Pro users to share their headset with friends, family, and colleagues. The first version of the feature required the Vision Pro owner to put on the device and select which apps they wanted to share with the guest user. 
Then, the guest user would put on Vision Pro and go through the hand and eye tracking calibration. With visionOS 2 last fall, Apple updated the feature to let a guest user save their hand and eye data for 30 days, so they wouldn't have to go through that calibration process each time. visionOS 2.4 makes another notable update to the experience. Now, the Vision Pro owner doesn't have to put on the Vision Pro before sharing it with a guest. Instead, the guest can put the headset on, then request access from the owner. The owner of the Vision Pro will see a popup on their iPhone, where they can easily approve that request and select which apps to share with the guest. During that flow, the Vision Pro owner can also turn on AirPlay so they can see what the guest is doing and help guide them. Apple has a couple of other small updates coming to Apple Vision Pro with visionOS 2.4. The Dictation feature now supports editing your text using your voice. You can also now solve inline math equations in the Notes app, just like the functionality that came to iOS 18 last year. The first developer beta of visionOS 2.4 is rolling out today. The update will be released to the general public in April.
[20]
Apple Is Making Vision Pro Easier (and Better) to Share
Apple announced visionOS 2.4, which is designed to make it easier to find content to enjoy on a Vision Pro, adds support for Apple Intelligence, and makes it easier to share the headset with guests. Apple Intelligence is rolling out to Vision Pro devices as part of the update, which includes Writing Tools, Genmoji, and Image Playground. Users can also create a memory movie that leverages AI, which uses assets on-device to make a video. Once complete, this video can be watched on a Vision Pro in an immersive environment. These and a host of other AI features will be coming in April. Apple is starting the rollout in English first, and more features and languages are coming later this year. The developer beta launches next week. The update to visionOS also adds what Apple calls Spatial Gallery, which is a new app that features spatial photos, videos, and panoramas that are curated by Apple for Vision Pro users. The company says it will be updated regularly and all of the content is free to Vision Pro users. Apple is also launching the Vision Pro app, which is most akin to the Watch app for Apple Watch users but is more feature-rich. It is meant to give visibility to Vision Pro and available immersive content as well as provide remote access to a Vision Pro from an iPhone. It can perform remote actions, such as adding content to Watch Lists in Apple TV+, which will then appear on Vision Pro when it is used next. Users can also remotely download apps to the Vision Pro from the iPhone screen without having to put the headset on first. New features will also be shown in this app, as well as what is called "My Vision Pro," which shows personalized information about a user's specific device. As noted, the experience is pretty similar to the Watch app, but one of the differentiators is that the Watch app is more of a utility app while the Vision Pro app has the "discover" aspect to it. 
This app will appear automatically on iOS 18.4 for users who own a Vision Pro, but can be downloaded manually in any region where the Vision Pro is supported, even if a user doesn't own a Vision Pro. Finally, Apple is making the guest and sharing experience much better. Presently, the owner of the Vision Pro has to put it on first before handing it to a guest. After the update, a guest can put it on and just request access, and the owner will get a prompt on their iPhone to grant access. Also from the iPhone, owners can select what they want the guest to see. It also allows them to guide guests through an experience, since they can see what the guest is doing in the Vision Pro on their iPhone. It's very similar to the in-store Vision Pro demonstration experience, although it's not identical. That said, it should be much easier to share the headset, making it possible for more people to try it out -- something that has been difficult given its high asking price. Last year, Apple told PetaPixel that it remains incredibly committed to immersive content. "Everything from the Final Cut team to the WebKit team, to all of the teams that you would expect on Vision Pro, to the photos team, the camera engineering team, the camera hardware engineering team who did the impossible of moving the camera modules around to accommodate for this," Della Huff, a member of the Product Marketing team at Apple, said in a special episode of The PetaPixel Podcast. "There's tons of teams that are invested and I think that just speaks to we're literally putting our money down on this future because we think it is so important." visionOS 2.4 is available now as a beta for developers. The public release is scheduled to arrive in April.
[21]
Apple Intelligence and ChatGPT Snail Their Way to the Vision Pro
The Vision Pro's visionOS 2.4 update will let you blast friends with "Genmojis," but the more useful feature is the ability to stream what your friends see inside Apple's headset. Did you think the one missing feature on Vision Pro was the ability to create AI-generated emojis to text friends? On Friday, Apple launched the first few Apple Intelligence features on its $3,500 Vision Pro. While the features are well-known if you've used a Mac in the last six months, they may indicate Apple's MR headset may also receive the cross-app AI-ified Siri whenever that finally becomes a reality. The visionOS 2.4 beta AI features work similarly to their Mac counterparts, though with UI changes to make it easier to type prompts without a mouse and keyboard. Like Mac or iOS, users can access the ChatGPT function with its limited AI prompt box. Like iOS or macOS, you can use AI writing tools to compose or proofread the text you write on Vision Pro. Why you're using Apple's headset to type an email is another question entirely. The newly updated Vision Pro now supports Genmoji for the Messages app and the Image Playground for generating a few cartoon-styled versions of your friends and pets, even though most end up on the wrong side of the uncanny valley. Finally, the Photos app has a new "Create a Memory" feature. Users can input a prompt with the in-headset keyboard or their voice, and then the AI will create a custom slideshow by selecting users' videos and photos that apply to that prompt. For example, Apple's prompt for "day in the skate park" cut together a loose assortment of clips with a few simple transitions. visionOS 2.4 isn't as major an update as the transition to visionOS 2, though Apple is trying to make its headset fill a niche within its wider ecosystem. This includes an all-new Vision Pro app for iOS. Users can use it to take remote actions on their AVP, like creating a watch list or downloading apps remotely. 
The app will also share your Vision Pro's serial number if needed. Anybody with a Vision Pro connected to their Apple account will see the new app appear on their iPhones when they upgrade to iOS 18.4. The update also offers a new app on Vision Pro called Spatial Gallery that collects spatial photos, spatial videos, and panoramas curated by Apple, alongside content captured in the format on devices that support it, like the iPhone 15 Pro and iPhone 16 models. Despite all this, what may be the most useful feature of them all is the new way Vision Pro works with guest users. The headset owner no longer needs to load in first before handing off the AVP to their friend to try it (who will inevitably put it on, tell you, "Oh, that's cool," and then never pick it up again). Instead, users will get prompted on their phone or iPad to set up a guest user. You can set what apps you will let them access, and you may even watch what they're doing by streaming it to your iPhone or iPad through AirPlay. It's akin to the same experience you get if you ever did Apple's in-store AVP demo experience, though you may not have as much direct control over the headset as a regular Apple "Genius." AI on AVP seemed inevitable, though it took longer than anticipated for AI to arrive in VR. The Vision Pro uses Apple's M2 chip, and the Cupertino, California tech giant has already stated that any Mac with M-series silicon should have what it takes for Apple Intelligence. Either way, all the new AI features won't offer a major change for anybody still using Vision Pro. The device is most helpful in creating a massive ultra-wide Mac screen or watching passive content on a pseudo-large display with AVP's dual 4K micro-OLED displays. There's the chance that the supercharged Siri promised to arrive sometime in the next few months may offer more capabilities and the ability to take actions on Vision Pro apps on users' behalf. 
I can imagine the planned "Visual Intelligence" features that will let you ask questions based on what the external cameras see could offer new ways to use the device if any first- or third-party apps take advantage of it. But, hell, all I really want is iPhone-to-AVP mirroring like I can do on my Mac with macOS Sequoia.
[22]
Apple Vision Pro Headset Gets Overdue Missing Features, Including Apple Intelligence
Apple's futuristic Vision Pro headset hit the one-year mark, and over that year it's felt like it's still missing some key features required to succeed. It now looks like Apple's addressing a few of those features, at least in small steps, with VisionOS 2.4. Apple Intelligence is finally arriving on Vision Pro, along with some helpful-sounding connected apps and guest mode features for iPhones and iPads. The new OS update can be tested in a developer beta arriving today, but the official OS update isn't coming until April. I've been waiting for these new features for a while. Apple Intelligence might be the flashiest new feature, but it's not likely to be the most meaningful yet -- the AI extras launching in this wave don't include any camera-enabled Visual Intelligence multimodal features like those Google Gemini will flex on Android XR. But it should at least be a foot in the door toward more AI getting added over time. I'm more excited that the Vision Pro is going to work better with iPhones and iPads. Not necessarily in all the ways I want, but a new app and a Guest Mode feature should give the headset remote-access functions that Meta's Quest headset has had for years. The Apple Intelligence features coming to Vision Pro are familiar; they mostly mirror what's already on Macs, iPhones and iPads. Their arrival is overdue: I expected them last year. Writing Tools, a mode integrated into several apps, is back again to summarize or generate text. There are message summaries. ChatGPT can be invoked for extra assistance. There's Apple's generative AI Genmoji, which makes emoji on demand, and Image Playground, which makes 2D images. None of these will generate anything 3D on Vision Pro... not yet. And Apple's AI-driven Memory Movies feature, which can generate photo and video galleries on demand, won't show any 3D "spatial" movies or photos yet. 
That's a bummer. The big missing thing, still, is Visual Intelligence. Apple's camera-enabled AI functions on iPhones, which are summoned using the Camera Button, scan the world and search or identify what's in view. Visual Intelligence would make sense on Vision Pro, which is literally a giant wearable display with world-viewing cameras studded everywhere. Right now, though, there's no Visual Intelligence feature in the mix. Or new Siri, for that matter. Apple's revamped Siri should be a part of iOS 18.4, but it won't be on the Vision Pro with this OS update. Google, meanwhile, is already integrating a multimodal camera-assisted Gemini AI into Android XR, seemingly on course to be available on day one of that OS' release. Apple could still introduce Visual Intelligence later this year. It's a likely candidate for VisionOS 3, which should be announced at Apple's WWDC developer conference, usually held in June. But, at least, Apple Intelligence arriving on Vision Pro -- as Apple already indicated it would -- shows that the current hardware can do more than Apple has allowed. I was frustrated that the Vision Pro never had a good working relationship, or even a connection with, iPhones. Meta's Quest headsets have had apps for phones for years that can browse and remote-download apps to the headset, sync phone notifications and remote-control Quest headsets to help people demo apps while you watch their experience on your phone screen. Apple's adding a lot of this with VisionOS 2.4 and iOS 18.4. An overdue Vision Pro iPhone app lets you remote-download apps and discover experiences coming to VisionOS. The app will also store details about the headset and prescription lens inserts. According to Apple, the app automatically appears on your iPhone with iOS 18.4 if you own a Vision Pro, but can also be downloaded from the App Store. There's also a new Guest Mode experience for sharing the headset. 
Apple's current process is weird and clunky, and it doesn't let you remotely observe what's going on in-headset to help. The new mode kicks in when someone else puts on the headset, and your nearby iPhone or iPad has a button to start a connection. It has an app picker that'll only make certain movies or apps appear on the headset, and it starts an AirPlay stream to watch whatever your guest is doing so you can guide them. It's a little strange that the new guest experience isn't launched from the Vision Pro app, and right now the app won't let you remotely launch or pause apps; it's a passive AirPlay stream. But again, it's a start -- and it sounds far better than what the Vision Pro had before. Apple is also introducing a curious new app called Spatial Gallery, which is described as a curated showcase of 3D photos and videos shot using the iPhone-based "spatial" capture format. The app sounds like a way to find other 3D content to watch in the Vision Pro, which is odd since Apple already has bullishly but slowly showcased its 180-degree 3D Immersive Video format in the headset, too. This is likely a tell that Immersive Video content production is facing a bottleneck. It's an expensive format to shoot and edit, requiring very specific high-end cameras. Meanwhile, with more basic 3D videos shot on an iPhone or other cameras and edited with a few apps that support spatial video editing (Final Cut Pro, DaVinci Resolve), it seems like Apple is splitting the difference to offer more Vision Pro-ready experiences as quickly as possible. In an ideal world, the future probably involves iPhone cameras developing better immersive 3D capture formats to make more impressive VR-ready content. The present, however, is split between phones having passable 3D capture and high-end professional cameras developing a different tier. 
I'd rather see Apple make further investments in useful immersive apps, but it looks like spatial video is going to be Apple's most achievable content move in the short term.
[23]
Vision Pro could soon get an Apple Intelligence upgrade, but can it boost sales?
AI integration and other features are to be introduced in an attempt to reignite interest in the costly AR headset. Apple Vision Pro has just celebrated its first birthday, but it's fair to say that the $3,500 AR headset hasn't been the slam dunk that Apple executives would have liked it to be. Just two months after release, the company was reportedly slashing shipments in half due to a lack of demand, and internally there's a worry that price isn't the only problem, with even early adopters using it less than anticipated. Now Bloomberg's Mark Gurman has revealed plans for a major software update aimed at giving existing hardware a shot in the arm to boost sales. Gurman's sources say these features could make up part of the visionOS 2.4 update that could arrive in beta this week, with a view to a full release in April. The headline feature is the introduction of Apple Intelligence for the headset. Apple's take on generative AI has previously only been available on recent iPhones, iPads and Macs, but with its M2 chipset and 16GB RAM, the existing Vision Pro hardware should be up to handling on-device processing. That means that owners of Vision Pro are set to receive "standard features" such as the Writing Tools interface laced with ChatGPT, Genmoji and the Image Playground app. If there are unique Apple Intelligence features for Vision Pro in the works, they aren't mentioned in the article. Away from AI, Apple reportedly has another couple of tricks up its sleeve to drive interest in the headset. Firstly, a new app is reportedly on the way to view "spatial content tailored to the device, including 3D images and panoramas aggregated from outside sources". There will also be an "immersive video" arriving on February 21 about arctic surfing, Gurman writes. Finally, Apple plans to introduce a "revamped mode for guest users", letting owners temporarily loan their devices to others. 
Not only will this make it easier for multi-user households (assuming they take the same optical inserts), but "the company believes such a process could help users excite their friends and family about the Vision Pro" which could potentially lead to sales. As solid as these additions all sound, that feels like a stretch. The biggest bar to mass adoption of Vision Pro has always been its sky-high pricing, with a single unit costing the same as seven Meta Quest 3 headsets. While making Vision Pro easier to demo to interested friends and colleagues might generate a few more sales, it's unlikely to be a serious game changer, and I'm dubious that the AI features will have much impact either. Not only does it assume people are excited about Apple Intelligence more broadly -- something that, anecdotally, I'm not really seeing outside of the tech press -- but the features Gurman mentions are all just replicating what you can get on iPhone, iPad and Mac. It's safe to assume that any early adopters of Vision Pro are already deep in the Apple ecosystem, so will this really be that exciting? Or, to put it another way, why use ChatGPT to compose a document in Vision Pro when it's easy enough to do on your Mac or iPhone? Of course, this is likely just a first step, and there's definitely potential for Vision Pro to do some truly amazing stuff with artificial intelligence. Its starting point, however, sounds pretty underwhelming from where I'm sitting. But what else can Apple do? It has to try to boost Vision Pro sales somehow, and potentially interested consumers will ultimately be less likely to bite if it appears software updates are dwindling and the company is losing interest. Nonetheless, it feels like the next big test for market appetite won't come through new software, but hardware. From my perspective, a lot is riding on that cheaper model that's reportedly in the works.
Apple announces the expansion of Apple Intelligence to support multiple new languages and regions with the upcoming release of iOS 18.4, iPadOS 18.4, and macOS Sequoia 15.4 in April, emphasizing privacy and introducing support for Apple Vision Pro.
Apple has announced a significant expansion of its Apple Intelligence system, set to roll out in April with the release of iOS 18.4, iPadOS 18.4, and macOS Sequoia 15.4. This update will introduce support for several new languages, including French, German, Italian, Brazilian Portuguese, Spanish, Japanese, Korean, and Simplified Chinese. Additionally, localized English versions will be available for users in Singapore and India.
For the first time, iPhone and iPad users in the European Union will gain access to Apple Intelligence features. The update will also expand Apple Intelligence to a new platform, with support coming to Apple Vision Pro in U.S. English. This integration aims to enhance users' ability to communicate, collaborate, and express themselves in innovative ways.
Apple Intelligence is designed with a strong emphasis on user privacy. Many of its AI models operate entirely on-device, ensuring personal data remains secure. For tasks requiring more computational power, Apple employs Private Cloud Compute, which extends iPhone-level security to cloud processing while maintaining strict privacy protections.
The visionOS 2.4 update for Apple Vision Pro will introduce several new features, including Apple Intelligence support, new immersive content, and a revamped guest user mode. Additionally, a new Spatial Gallery app will showcase curated photos and videos, and an Apple Vision Pro app for iPhone will provide quick access to apps and information.
Apple has confirmed that more features and enhancements for Apple Intelligence, including expanded Siri capabilities, will be introduced in the coming months. This ongoing development suggests Apple's commitment to improving and expanding its AI offerings across its product ecosystem.
Developers can begin testing these updates immediately, with the public release scheduled for April. This early access allows developers to prepare their apps and services for the expanded language support and new features.
The expansion of Apple Intelligence coincides with the recent launch of the iPhone 16e, which is described as the most affordable iPhone running Apple Intelligence, thanks to its A18 chip. This integration of AI features across Apple's product range, from budget-friendly options to high-end devices like the Vision Pro, demonstrates the company's strategy to make AI capabilities accessible across its entire ecosystem.
Apple CEO Tim Cook announces the expansion of Apple Intelligence to support multiple languages starting April, including localized versions for India and Singapore, during the company's Q1 2025 earnings call.
4 Sources
Apple is set to introduce its new AI-driven technology, Apple Intelligence, across its devices in October. This update promises to enhance user experience with advanced features for productivity, creativity, and accessibility.
12 Sources
Apple rolls out its AI-powered features internationally, including India, with iOS 18.4 update. The expansion brings new languages, enhanced privacy, and a suite of intelligent tools to iPhones, iPads, and Macs.
7 Sources
Apple announces plans to broaden language support for its AI-powered Apple Intelligence feature. The expansion will include German, Italian, Korean, Portuguese, and Vietnamese, enhancing the accessibility of AI capabilities across its devices.
6 Sources
Apple announces the release of its AI suite, Apple Intelligence, for EU iPhones and iPads in April 2025, alongside expanded language support and new features, marking a significant shift in its global AI strategy.
8 Sources
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved