9 Sources
[1]
Apple Unveils AI-Powered Live Translation Service for iOS 26
Thomas is a native of upstate New York and a graduate of the University at Albany. As a member of CNET's How To team, he writes about the intersection of policy, information and technology, and how you can best be served in that area. Outside of work, he can most often be found watching too many movies, reading too much, drinking too much coffee, or spending time with his cats. Language barriers look set to fall in Apple's forthcoming iOS 26 with the new Live Translation feature. Unveiled during the company's WWDC 2025 presentation, Live Translation will, as the name suggests, offer on-the-fly language translation across the Phone, Messages and Facetime apps in iOS 26, after years of the company dabbling with translation as its own app or via Siri. It's an AI-powered service, marking the company's latest effort to provide greater value to users for its Apple Intelligence AI systems, after the relatively tepid rollout in the last year. In Messages, texts will translate into the recipient's preferred language as you type them out. Messages you receive back will also be instantly translated for you. While Facetiming, Live Translation will offer captions in your preferred language as you converse with someone else speaking a different one, without interfering with the other person's voice. That's in contrast to its functionality in during phone calls, which will feature a spoke translation alongside the voice of the person you're speaking with. There's also no need to rehash any of those iMessage anxieties, as Live Translation will work even if the person you're communicating with isn't using an iPhone. For the time being, however, it's unknown how which and how many languages will be supported.
[2]
Apple Intelligence is getting more languages - and AI-powered translation
Apple just held its annual WWDC 2025, during which the company announced major updates coming to its operating systems across devices. During the conference keynote, Apple also revealed that it's expanding Apple Intelligence, the company's set of artificial intelligence (AI) features, to more languages than before. Apple Intelligence was already available in English, French, German, Italian, Portuguese (Brazil), Spanish, Japanese, Korean, and Chinese (simplified). Apple is adding eight more languages by the end of the year: Danish, Dutch, Norwegian, Portuguese (Portugal), Swedish, Turkish, Chinese (traditional), and Vietnamese. "Last year, we took the first steps on a journey to bring users intelligence that's helpful, relevant, easy to use, and right where users need it, all while protecting their privacy. Now, the models that power Apple Intelligence are becoming more capable and efficient, and we're integrating features in even more places across each of our operating systems," said Craig Federighi, Apple's senior vice president of software engineering. Adding new languages makes the Apple Intelligence features more inclusive, as it lets more people from different parts of the world communicate with the AI features. It can also expand a user's learning horizons to include more languages. For example, as someone who speaks multiple languages, I often ask ChatGPT questions in Spanish, Portuguese, German, and French, as I can rely on the AI chatbot to recognize each language and respond accordingly. Even as a student of new languages, I use the bot to ask questions about pronunciation, definitions, and examples.
Apple also showcased the new AI-powered Live Translation feature that can translate messages as they are typed, delivering them in the recipient's language; users then receive automatically translated messages in return. For phone calls, the translation is spoken aloud during the conversation. During FaceTime calls, Live Translation lets users follow along with translated live captions. Apple announced other updates to its AI features, including new updates to Visual Intelligence, integrations with Shortcuts, new languages coming by the end of the year, and improvements to Genmoji and Image Playground.
[3]
Your iPhone will translate calls and texts in real time, thanks to AI
Apple just held its annual WWDC 2025, during which the company announced major updates to its operating systems across devices. During the conference keynote, Apple also revealed that it's expanding Apple Intelligence, the company's set of artificial intelligence (AI) features, to include Live Translation for iPhone, iPad, and Mac, and more languages than before. Apple showcased the new AI-powered Live Translation feature that can translate Messages as they are typed, delivering them in the translated language. Users can then receive automatically translated messages in return. For phone calls, the translation is spoken aloud during the conversation. During FaceTime calls, Live Translation lets users follow along with translated live captions. "Last year, we took the first steps on a journey to bring users intelligence that's helpful, relevant, easy to use, and right where users need it, all while protecting their privacy. Now, the models that power Apple Intelligence are becoming more capable and efficient, and we're integrating features in even more places across each of our operating systems," said Craig Federighi, Apple's senior vice president of software engineering. Live Translation will be powered by Apple-built models that run entirely on the device, without sending conversations to the cloud for processing. Because Live Translation is part of Apple Intelligence, it'll only be available in supported iPhone, iPad, and Mac models. These include the iPhone 15 Pro or newer, and iPads and Macs with an M1 chip or newer. Apple Intelligence was already available in English, French, German, Italian, Portuguese (Brazil), Spanish, Japanese, Korean, and Chinese (simplified). Now, Apple plans to add eight more languages by the end of the year: Danish, Dutch, Norwegian, Portuguese (Portugal), Swedish, Turkish, Chinese (traditional), and Vietnamese.
[4]
Apple Intelligence is now open on device to third-party developers
Apple announced enhanced features for Apple Intelligence, opening its AI features to third-party apps to use on your device, and it now enables live translation, too. The company has expanded the number of supported languages and is building Apple Intelligence into more features across Apple Watch, iPad, iPhone, Mac and more. It is opening access to its on-device foundation models to any app that needs to use AI, even when you are offline. Past Apple Intelligence features include Writing Tools, Genmoji, Image Playground, Clean Up in Photos, Visual Intelligence, natural language search in Photos, and Private Cloud Compute, and Apple has made Siri more helpful and natural. Today, Apple announced live translation using Apple Intelligence, delivering a message in a preferred language. Apple senior vice president Craig Federighi made the announcement at the Worldwide Developer Conference (WWDC).
[5]
iOS 26 adds a translation tool that works inside calls, chats & video
Live translation in iOS 26 will turn the iPhone into a real-time interpreter for calls, messages and video chats, all without leaving your app or sending data to the cloud. Here's how it works. Live translation shows up where you already talk: in Messages, FaceTime, and calls. It provides real-time, on-device interpretation of conversations without sending data to the cloud. The feature is part of Apple Intelligence, a new system-wide framework for generative AI that works directly on supported devices. Live translation processes spoken and written conversations entirely on the iPhone. It's not entirely magic, though. You'll have to download the language models necessary for the translation. After that, you're good to go. Live translation lets users type a message in their own language and have it delivered in the recipient's language. In FaceTime, the feature provides live captions during video calls. For audio-only calls, it can translate and speak the conversation aloud. The translation is handled by Apple's on-device foundation models, meaning the system works offline and doesn't share transcripts or voice data externally. The technology is integrated into the Phone app, FaceTime, and Messages, making it available for real-time speech and text. Apple has emphasized that this approach prioritizes speed and security, keeping conversations local to the device. So, there's no need to shout into your phone hoping Siri gets it. We tested the new live translation feature in iOS 26 to see how it handled real-time messaging. We sent texts in German, and the recipient's iPhone automatically recognized the language. It didn't work perfectly. Some lines were delayed or didn't translate at all, but that's expected since we're running the beta and it's buggy. While iOS 26 will be available on iPhone 11 and newer, live translation is restricted to iPhone 15 Pro, iPhone 15 Pro Max, and all iPhone 16 models.
Users must have Apple Intelligence enabled, and both Siri and device language set to a supported option. The supported languages vary by app. In Messages, live translation works with English (U.S., UK), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Chinese Simplified. For Phone and FaceTime, support is narrower at launch. It's limited to English (U.S., UK), French, German, Portuguese, and Spanish. Apple says more languages will be added by the end of 2025. Google and Samsung already offer similar live translation features, but most rely on cloud processing. Google Translate and Live Translate in Android phones support dozens of languages across older and newer models. Apple is taking a more conservative path by requiring the latest chips and restricting access to certain apps. Where the company hopes to stand out is privacy. All translation occurs on-device and the system is deeply integrated into iOS. It offers smoother experiences in Messages and FaceTime than many third-party alternatives. The tradeoff is limited availability for now. Live translation is a meaningful upgrade for travelers, international families, and business users working across borders. Unlike standalone apps, Apple's solution is embedded in everyday communication tools. Live translation joins other Apple Intelligence features like smart writing tools, image generation, and personalized summaries. The release arrives alongside iOS 26's "Liquid Glass" design overhaul, Apple's most significant visual redesign since iOS 7. The interface now uses translucent effects and dynamic motion to give content more depth and focus. While Live Translation is driven by functionality, it's part of a broader shift in how Apple wants users to see and use their devices -- more intuitive, expressive, and intelligent. Apple's live translation feature brings real-time multilingual communication to native iPhone apps without compromising user privacy. 
For users with the latest devices, it's one of the most useful applications of Apple Intelligence so far.
[6]
Apple brings Live Translation to iMessage, phone calls, and more | AppleInsider
Apple is breaking down language barriers with a new Live Translation feature in iOS 26 powered by the company's on-device AI processing models. The Live Translation feature, announced during the company's WWDC 2025 event, brings real-time language translation to iOS 26. The feature will work across Messages, FaceTime, and other apps. For example, the Live Translation feature can immediately translate text into other languages as you type out a message in iMessage. As texts in other languages come in, the Apple Intelligence feature can instantly translate them for you. Apple says that the new feature works seamlessly across its first-party apps, even if the person you're talking to isn't using an iPhone. Apple also announced a Call Translation API for developers to use in their own third-party apps. Real-time translation is also making its way into non-communications apps later this year, with Apple Music bringing live lyric translation and pronunciation features to its app in iOS 26.
[7]
Apple Just Added a Brilliant iPhone Feature to Help Businesses and Consumers Communicate
To say that this will smash language barriers for business and personal iPhone users alike is no exaggeration. Apple's been slowly building up to this innovation for a few years, beginning with some simple translation features in Siri, then with a dedicated Translate app. These were useful features, but you had to invoke them by a deliberate action. Having translation actually built into the communications apps on the phone will make a huge difference. During the keynote presentation, Apple showed the system working to real-time translate a conversation between English and French during a FaceTime video call. With barely a pause after one person spoke, the AI took over and spoke in an artificial voice in the other language, while the translated text appeared on screen, caption-like. Think of it as foreign movie subtitles for real life. A similar trick happens during phone calls, without the live captioning. Text messaging is even more seamless, with the translation happening in real time as a user types.
[8]
Apple Intelligence Will Now Provide Live Translations on Your iPhone
Apple has also added ChatGPT support to Visual Intelligence Apple made several artificial intelligence (AI)-related announcements, branded as Apple Intelligence, at the Worldwide Developers Conference (WWDC) 2025 on Monday. During the keynote session, the company recapped the existing AI features and unveiled new features that are now available for testing, and will be rolled out to users later this year. These new features include Live Translation, Workout Buddy in Apple Watch, ChatGPT integration in Visual Intelligence, updates to Genmoji and Image Playground experiences, as well as AI capabilities in Shortcuts. Craig Federighi, the Senior Vice President (SVP) of Software Engineering at Apple, announced that the tech giant is now opening up access to its on-device foundation models to third-party app developers. These AI models also power several Apple Intelligence features. Developers can access these proprietary AI models to build new features within their apps, or entirely new apps, via the Foundation Models Framework. Apple highlighted that since these are on-device models, these AI capabilities will function even when a device is offline. Notably, it will also ensure that user data never leaves the device. Developers will not have to pay any application programming interface (API) costs for cloud inference. The framework natively supports Swift, allowing developers to access the AI models seamlessly. Additionally, the framework also supports guided generation, tool calling, and more. Federighi mentioned that Siri will not be getting the advanced AI features teased at last year's WWDC until 2026, which is when Apple will share more information about it. However, this year, the Cupertino-based tech giant is planning to ship a few more Apple Intelligence features. The biggest new arrival is Live Translation.
The AI-powered feature is being integrated into the Messages app, FaceTime, and the Phone app to allow users to easily communicate with those who speak a different language. It is an on-device feature, which means the conversations will not leave the users' devices. Live Translation will automatically translate messages in the Messages app. Users will see an option to automatically translate their messages as they type, and they can then send them to their friends and colleagues in a language they speak and understand. Similarly, when the user receives a new message in a different language, the feature will instantly translate it. On FaceTime calls, the feature will automatically add live captions in the user's language to help them follow along. During phone calls, Live Translation will translate what a person says in real time and speak it aloud. Apart from Live Translation, Apple is also updating Visual Intelligence. iPhone users can now ask ChatGPT questions while looking through their device's camera. The OpenAI chatbot will know what the user is looking at and understand the context to answer user queries. It can also search apps such as Google and Etsy to find similar images and products. Additionally, users can look for a product online just by highlighting it in their camera. Apple says Visual Intelligence can also recognise when a user is looking at an event, and automatically show a suggestion to add it to their calendar.
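As a rough illustration of the developer story described above, here is a minimal Swift sketch of calling an on-device model through the Foundation Models Framework. The type and method names (`SystemLanguageModel`, `LanguageModelSession`, `respond(to:)`) follow Apple's WWDC 2025 materials, but exact signatures in the shipping SDK may differ, and running it requires an Apple Intelligence-capable device:

```swift
// Hedged sketch: querying Apple's on-device foundation model via the
// Foundation Models Framework. Requires iOS 26 / macOS 26 SDK and an
// Apple Intelligence-capable device; API names may change before release.
import FoundationModels

func suggestReply(to message: String) async throws -> String {
    // Confirm the on-device model is usable (Apple Intelligence enabled,
    // supported hardware, model assets downloaded).
    guard SystemLanguageModel.default.availability == .available else {
        return "On-device model unavailable"
    }

    // A session holds conversation state; instructions steer the model.
    let session = LanguageModelSession(
        instructions: "You suggest short, polite replies to messages."
    )

    // Inference runs entirely on-device; nothing is sent to a server,
    // and no API cost is incurred.
    let response = try await session.respond(to: message)
    return response.content
}
```

Because inference happens locally, a call like this keeps working offline, which is the property Apple emphasized for both the framework and Live Translation.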
[9]
Apple Intelligence Adds Live FaceTime Translation and ChatGPT-Powered Image Creation
You can also generate images in different styles in Image Playground with the help of ChatGPT. At the WWDC 2025 event today, Apple announced a slew of new AI features for Apple Intelligence. First off, Live Translation can translate FaceTime and phone calls in real time. During FaceTime calls, translated live captions are displayed on the screen. And during phone calls, the translation is spoken aloud during the conversation. In addition, in the Messages app, messages are automatically translated as users type, delivering the message in the recipient's preferred language. This feature is powered by Apple's on-device local AI models. Apple says since the processing is done locally, "users' personal conversations stay personal." Next, Image Playground, which is part of Apple Intelligence, is getting support for ChatGPT image creation. You can now generate images in different styles, such as oil painting, vector art, etc., using ChatGPT. With regards to ChatGPT integration, Apple says, "Users are always in control, and nothing is shared with ChatGPT without their permission." Moreover, you can now interact with anything you see on the screen, as part of the Visual Intelligence upgrade. You can highlight objects on the screen and ask ChatGPT questions about what you are looking at. You can also search Google, Etsy, and other supported apps to find more information. Not to mention, you can extract calendar events from the live screen as well. Apple Intelligence is also coming to Apple Watch through an AI-powered Workout Buddy. It uses your fitness data to create a workout companion that generates motivational insights. There is even a text-to-speech model to generate a voice that matches your energy and style. As for availability, Apple says all these new features are "available for testing starting today through the Apple Developer Program." A public beta is coming next month to eligible devices.
Apple introduces Live Translation, an AI-powered feature in iOS 26 that offers real-time language translation across Phone, Messages, and FaceTime apps, enhancing communication without compromising privacy.
At the annual Worldwide Developers Conference (WWDC) 2025, Apple unveiled a groundbreaking feature for iOS 26: Live Translation. This AI-powered service aims to eliminate language barriers by offering real-time translation across various communication platforms [1][2].
Source: CNET
Live Translation integrates seamlessly into Apple's core communication apps:
Messages: As users type, their texts are instantly translated into the recipient's preferred language. Incoming messages are also automatically translated [1][3].
FaceTime: During video calls, Live Translation provides live captions in the user's preferred language, allowing for smooth conversations between speakers of different languages [1][3].
Phone Calls: For audio-only calls, the feature offers spoken translations alongside the original speaker's voice [1][5].
Live Translation is powered by Apple Intelligence, the company's suite of AI features. Key aspects include on-device processing by Apple-built foundation models, offline operation once the necessary language models are downloaded, and availability limited to supported hardware: iPhone 15 Pro or newer, and iPads and Macs with an M1 chip or newer [3][5].
Apple emphasizes user privacy in the implementation of Live Translation: all processing happens locally, so conversations, transcripts, and voice data are never sent to the cloud [3][5].
While Google and Samsung offer similar translation features, Apple's approach differs in key ways: rival systems mostly rely on cloud processing and support dozens of languages across older and newer devices, whereas Apple requires its latest chips, keeps translation entirely on-device, and integrates it directly into Messages, FaceTime, and the Phone app [5].
Live Translation represents a significant step in Apple's AI strategy: it is one of the most tangible applications of Apple Intelligence since the feature set's tepid first-year rollout, and it extends to developers through a Call Translation API for third-party apps [1][6].
As part of the broader iOS 26 update, which includes a visual redesign dubbed "Liquid Glass," Live Translation contributes to Apple's vision of more intuitive and intelligent devices [5].
Apple's Live Translation feature in iOS 26 marks a significant advancement in real-time language translation technology. By prioritizing privacy, offline functionality, and seamless integration, Apple aims to set a new standard in AI-powered communication tools. As the feature evolves and expands to more languages and devices, it has the potential to revolutionize how people communicate across language barriers.
Source: AppleInsider
Summarized by Navi