Curated by THEOUTPOST
On Wed, 31 Jul, 12:02 AM UTC
10 Sources
[1]
Apple Intelligence hands on - iOS 18.1 dev beta gives us an early look at a smarter Siri and more
Apple Intelligence is forming right before our eyes. This week, Apple shipped the iOS 18.1 dev beta, which incorporates more than a dozen Apple Intelligence updates across Phone, Notes, Photos, Mail, Writing, and Siri. Even at this early stage, it's clear that Apple's brand of generative artificial intelligence could change how you use your best iPhone. First, a big caveat. The iOS 18.1 dev beta is not for everyone. The name should give you a clue: this is for developers and intended as a way for them to familiarize themselves with the updated platform and its new Apple Intelligence capabilities. It'll help them understand how their own apps can tap into upcoming iPhone on-device (and Private Cloud Compute-protected off-device) generative AI capabilities. This is not a software update for consumers, especially since it's unfinished and not always stable. It's also incomplete. Areas of Apple Intelligence still to come in future betas and updates, arriving sometime this year and - yes - next year, include any kind of generative AI image creation (including Genmoji), photo editing, support for languages besides English (US), ChatGPT integration, Priority Notifications, and Siri's ability to add in personal context and control in-app actions (it can already control native device capabilities, though). I've been playing with it because it's my job, and, yes, I can't help myself. This is our first opportunity to see how Apple's AI promises translate into reality in Apple Intelligence. Is it possible for a feature to be both bold and subtle? Siri's Apple Intelligence glow-up in the iOS 18.1 dev beta removes the Siri orb and, at first glance, appears to replace it with the most subtle translucent bar at the bottom of the screen. On my iPhone 15 Pro, I can bring up Siri by saying its name, "Siri," or by double-tapping at the bottom of the screen. 
When I do the latter, I see a mostly see-through bar pulse at the bottom of the display, which indicates Siri is listening. However, if I don't touch the screen and just say "Siri" or "Hey Siri," I see a multi-colored wave pulse through the entire screen with the bezel glowing brightly for as long as Siri is listening. I can also trigger the same effect by long-pressing the power button. Obviously, the new look is exciting and quite probably subject to change. Perhaps more interesting are some of the new Apple Intelligence-infused capabilities that arrive with this early dev beta. In my experience, the new Siri is already better at remembering context. I've asked about locations and then dug into details about them without having to restate the state or city. This could be a time and frustration saver. While exploring the iOS 18.1 dev beta, I noticed the subtlety of Apple Intelligence's integration. With Type to Siri, for instance, there's no obvious indicator of where it lives or how to activate this type-only Siri interaction. But that also makes sense to me. When I want Siri's help but can't or don't want to speak my prompt (I'm not alone), typing it in could be quite helpful. To use Type to Siri, I just double-tapped at the bottom of my iPhone screen, and a prompt box popped up with the words "Ask Siri...". When I asked it to "Set an alarm for 2:45 PM," it did so just as silently as I had typed it. In fact, Siri didn't tell me the alarm was set, but when I checked by asking Siri out loud to "Show me my alarms," it showed me it was set. "Woah." That was my honest reaction when I saw the new Apple Intelligence-powered Priority section sitting at the top of my inbox. It had everything from security alerts I'd already addressed to some Nextdoor neighbor alerts that might truly be a priority for me based on their utter weirdness. 
I found I could open any of the emails directly from that box, though there's currently no way to swipe one away as you can with regular email - but I guess that's the point of "priority." The Summarize tool is pretty good and gave me brief, precise boil-downs for all kinds of emails. I mostly used it on lengthy newsletters. I think the summaries were accurate, though I could report to Apple with a thumbs down if they were not. I'm betting that people will come to use these quite often and may no longer read long emails. Another potential time saver, or a possible avenue for missing nuance. Guess we'll see. My favorite new Apple Intelligence feature, though, may be the Writing Tools. Sure, it's a bit hidden under copy and paste, but it is already fast, powerful, and kind of fun. I wrote a silly, poorly worded email to a coworker about a missing pizza, selected the text, and then slid over to the Writing Tools under Copy and Paste. The menu offers big buttons to proofread or rewrite and quick shortcuts to tones of voice: Friendly, Professional, or Concise. I can even get a summary or list of key points from my email. This email was brief, though, so I just needed the rewrites; they were spot on and a little funny, but that's because I wrote the original silly email. These tools work well in real-time conversations in iMessage, where I used them to ensure the right tone of voice. As a young cub reporter in the late 1980s, I used to attach a special microphone to the back of my handset. It snaked back to a tape recorder, capturing the audio for both sides of my interview. Today, our smartphones make it far easier to capture call audio. Perhaps nothing, though, is as frictionless as the new phone call recording and transcription in the iOS 18.1 dev beta. I conducted a brief two-minute call with my colleague, Jake. I initiated the call and tapped a small icon in the upper left-hand corner that let me record it. 
For transparency, the system announced to Jake that the call was being recorded. It also automatically captured the audio in the Notes app, where, within a minute of ending the call, I found a very accurate and clearly labeled transcription. I can tap on any section to replay the associated audio and search the entire transcription. Honestly, if this were the only iOS 18 update, I would install it just for this. You know how your iPhone creates photo collages or memories for you without you asking for them? Sometimes, they hit the right emotional note, and you're almost happy your phone did it. Other times, maybe not so much. Apple Intelligence in iOS 18 combines the automation of that system with generative AI smarts to let you craft the perfect memory. At least, that's the pitch. In the iOS 18.1 dev beta, "Create a Memory Movie" stood ready to do my bidding, the prompt box asking me, like a psychiatrist, to "Describe a memory." I tried various prompts, including some based on suggestions like "Summer 2024," but all I got was a combination of memory movies that were off the mark or that did not work at all. This is, of course, a super early beta, so I'm not surprised. At least I could see how the interface operates and the "show your work" image collage animation that appears before the system plays back the memory movie (complete with music that I could alter with selections from Apple Music). The system is also set to support natural language queries in search, which could bring Apple's Photos app into alignment with Google Photos, where I've long been able to search for almost any combination of keywords and find exactly the images I need. In my very early experience, search in the iOS 18.1 Photos app will only work once it has finished indexing all my iCloud-based photos. Since I have tens of thousands of photos, it may take a while. As I noted above, it's very early days for Apple Intelligence. 
The iOS 18.1 dev beta is a work in progress intended for developers and not recommended for the public. It's incomplete and subject to change. Even when the iPhone 16 arrives (maybe September), Apple Intelligence may still be under development. Some features will arrive fully baked on the new phone, others will arrive in software updates, and some could slide all the way into next year.
[2]
I've spent 24 hours with Apple Intelligence and while there's lots still missing, this first beta makes me very excited for the iPhone's AI-fuelled future
When Craig Federighi revealed Apple Intelligence back in June, the AI-sceptic in me was incredibly excited by Apple's first major step towards an AI-fuelled future. The company is known for not doing things first but doing things right, and a first look at AI features built with the end user in mind piqued my interest. After news began to circulate earlier this week that Apple Intelligence won't be ready to launch alongside the iPhone 16 in September, rumors swirled about the release date for the first developer beta to include Apple Intelligence. Now, nearly two months later, I've finally got my hands on an, albeit very early, version of Apple Intelligence, and I can't wait to see what the future of this software holds. I'm based outside the US, so the first thing I had to do to even gain access to Apple Intelligence was change my iPhone 15 Pro Max's language and region. After that, I restarted my device, made a cup of tea, and joined the Apple Intelligence waitlist, ready for a long wait before accessing the new AI powers. Merely a minute later, with an iPhone as hot to the touch as my freshly brewed chamomile tea, I was in -- Apple Intelligence had entered my world. Now, before I give my thoughts on the AI features I've tried over the last 48 hours, I should preface this by saying that this incredibly early build of Apple Intelligence is missing quite a lot of what people are excited to try. In fact, my colleague James has written about everything missing in the Apple Intelligence beta. Instead, I'm going to focus on the handful of AI features that do work on the best iPhones right now. Apple Intelligence's summary tool is pretty damn cool. It's completely built into iOS 18.1 and can be used at any point simply by selecting text. The tool functions differently depending on the app; in Safari, for example, if I find an article and open Reader, I can then summarize the text to make it easier to digest. 
In Mail, on the other hand, every email has a summarize button at the top that quickly gives me the key information of the message. None of this is new to those heavily invested in AI already, but having it directly embedded into iOS, iPadOS, and macOS is going to prove very useful. Obviously, there's an ethical debate here, and I'm not really a fan of summarizing content myself, but using the tool in the Mail app could quickly become one of the best Apple Intelligence features, especially for quickly skimming key information. Alongside the ability to summarize are the new proofreading and rewriting options. Time will tell whether this creates a weird utopia where every iPhone user communicates in the exact same tone of voice, but as it stands, tools like Grammarly should be quaking in their boots. With just a press of a button, Apple Intelligence reads what you've written and makes sure it's grammatically correct -- ideal for quickly working on the go. The rewrite feature gives you different tones of voice to choose from, and while it's going to prove useful for many, I find it deeply concerning. Just for a test, I went onto iMore and copied text from one of my articles to see if Google could pick up on its origin. Pasting the original text brings up the article in question, but searching for the same text after running it through Apple Intelligence's rewrite feature doesn't find anything from iMore's domain. I know that, like the summary tool, this isn't anything new, but the fact that millions of iPhone users will gain access to these tools embedded in iOS 18.1 freaks me out for the future. Siri's had a complete redesign in iOS 18.1, with the voice assistant now a gorgeous full-screen animation that beams around the edges of your device. You can also activate "Type to Siri" by double-tapping the bottom of your iPhone's display, perfect for times when you want to discreetly ask something in public. 
Siri is going to be at the core of the Apple Intelligence experience, but we'll need to wait until 2025 to get access to Siri 2.0. The updated Siri will have awareness of personal context and the applications on your display, so you can use it to get the most out of Apple Intelligence. In the WWDC keynote, Apple used an example of the AI capabilities pulling information from a poster of an event to determine whether or not someone could move their meeting and still make their daughter's theatre performance. Apple Intelligence was able to work out who the person's daughter was and the event in question, and, on top of all that, contact those involved and create new calendar events. Seriously impressive stuff. In this beta, however, Siri has none of those capabilities, and while it can answer more prompts than before, often opting to provide an answer rather than sending you a list of Google links, it's still not smart enough to show off the true power of AI. I love the new Siri animation and the flexibility of options on how to activate it, but Apple Intelligence as a whole feels a bit disjointed without the major update to everyone's favorite (yeah, right) voice assistant. Alongside the features I've mentioned above, Apple Intelligence can also make memory movies in the current iOS 18.1 developer beta, although for whatever reason it won't work on my device. There's also a neat feature that lets you transcribe phone calls to Notes, and from my brief testing it works very well. That said, as it stands, Apple Intelligence feels like a set of small enhancements to iOS that are nowhere near their potential. That's obviously to be expected, as iOS 18.1 isn't expected to release until October, but it does raise some questions about the future of Apple's AI tools. My main concern is the iPhone 16 at launch, especially if Apple Intelligence's arrival in October won't bring with it the next generation of Siri, which is core to the experience. 
The features on offer today, combined with headline features like Genmoji and Image Playground, will be a welcome addition to the iPhone experience, but without the personal assistant aspect of Siri 2.0, I fear that we'll quickly forget these tools are even baked in.
[3]
I tested Apple Intelligence on my iPhone 15 Pro Max: 3 ways it spoiled me rotten
Thank you, Apple Intelligence, for indulging my desire for convenience whenever possible. Credit: Kimberly Gedeon / Mashable Apple Intelligence, if it were personified, would be a royal attendant who feeds me grapes and fans me with palm leaves. I've never felt so pampered. Is this what it's like to be catered to? Is this what it's like to be spoiled rotten? If you've been out of the loop, Apple Intelligence is the Cupertino-based tech giant's new suite of AI features, which were announced at WWDC 2024 in June. Some of the most highly anticipated Apple Intelligence features, like Genmoji (AI-generated emojis) and Image Playground (AI-generated images), are not available yet. However, there are still some Apple Intelligence utilities you can test right now -- especially since they've made their first robust debut with the iOS 18.1 developer beta that Apple dropped on Monday. With my iPhone 15 Pro Max, I tested some Apple Intelligence features, and to be succinct, I appreciate how much it indulges my laziness. Of course, it's not perfect -- not yet, at least. After all, the iOS 18.1 developer beta is expectedly a bit rough around the edges as Apple collects feedback from testers. (This is why you should always back up your iPhone before installing any iOS beta; it can be risky.) Overall, though, Apple has something here that will make its "Pro" and "Pro Max" iPhone models even more enticing than ever. (Apple Intelligence is only available on the current-gen "Pro" variants: the iPhone 15 Pro and iPhone 15 Pro Max.) Apple basically put a TLDR (too long, didn't read) button in Safari, allowing me to skip ultra-long articles and get to the point. After opening Safari, I can press and hold the icon on the left of the URL bar, tap "Summarize," and wham, Apple Intelligence gives me "CliffsNotes," if you will, of any article I don't have time to read. It's perfect for when you want to get the gist of the story as quickly as possible. 
As someone with an attention deficit, I can't help but get a little antsy after stumbling upon a wall of text or a verbose story. Typically, I'd read a few sentences and give up. However, with the "Summarize" tool, I received short-and-sweet synopses of articles that ramble, meander, and seem to go nowhere fast. In fact, at times, I found myself wanting to read a lengthy article in its entirety after the AI-powered summary revealed that the story was juicier than expected. That being said, when it comes to delving into dense articles, why would I use Google Chrome? I'm hopping and skipping over to Safari to take advantage of that new Apple Intelligence-powered Summarize tool. The only downside, however, is that the Summarize tool tests my patience sometimes. It can be a few seconds too slow for my tastes, but this isn't unique to Apple Intelligence (ChatGPT, Copilot, and Gemini can be slowpokes, too). But admittedly, it's worth the wait. One of the most popular use cases for AI, whether it's ChatGPT, Google Gemini, or Copilot, is "tone tuning" text. For example, you can drop in an email draft and ask those AI tools to help adjust your tone. However, I'll admit that I was one of those people who thought, "Pfft, I don't need an AI chatbot to tell me how to make something sound more professional!" As it turns out, the Apple Intelligence-based tone-adjustment tool, which can be accessed via the new Writing Tools feature, is a lot more helpful than I thought. You'll be surprised how often you think you're coming across as friendly and congenial in emails and texts when, instead, your messages are being interpreted as prickly. To reduce the risk of this, I found myself using the "friendly" tone-adjustment tool to nix unintended snippiness. The best part is that I was able to use Writing Tools in almost any text field across the iOS 18.1 developer beta. I just highlight the words, hit "Writing Tools," and choose my desired tone. 
As a result, I've definitely seen more positive responses from my co-workers, friends, and other loved ones. I take a lot of pictures and selfies, but no, I don't categorize them or put them into albums (because, spoiler alert, I'm lazy). I just let them pile up into a haphazard collection of random snapshots. Every now and then, however, I need to find that one picture, which requires me to scroll endlessly to find it. Fortunately, with the iOS 18.1 developer beta, I can use natural language to search for a particular picture in the Photos app. For example, I typed the word "pancakes" to find a saved screenshot of my favorite IHOP order. This natural search feature can also detect text in a photo, which is helpful, too. However, the natural language-based search function isn't sophisticated yet. For example, while I can get away with typing something like "laptop" or "food" (and it will find photos that match those terms), it's not advanced enough to grasp more complex searches like "woman wearing a red shirt." Again, it's early days for this beta version of iOS 18, so the search function may get a boost when iOS 18 officially drops later this year. Siri received an AI-powered boost with Apple Intelligence, but my favorite perk is its contextual awareness. For example, if I am perusing a webpage in Safari, I can say something like, "Hey Siri, send this article to Jason." Siri is aware of which article I'm looking at, so it can snatch the URL and send it to my fiancé without my needing to lift a single finger. (I'm telling you -- I've been spoiled rotten.) As I hinted at the outset, Apple Intelligence nurtures my laziness -- and I dig that. I don't always want to suffer through several paragraphs to get to the author's point. I don't want to spend too much time hemming and hawing over how to best respond to someone. 
And finally, I don't want to scroll endlessly through my cluttered gallery to find a specific photo -- it's like finding a needle in a haystack. Apple Intelligence addresses all of those concerns. I was initially skeptical of Apple's new suite of AI features, but as it turns out, Apple Intelligence is useful, and yes, totally worth it. Apple Intelligence is expected to officially roll out with iOS 18 later this year (but keep in mind that reports claim that some features may be delayed).
[4]
I tried Apple Intelligence on macOS Sequoia -- here's my take on the new AI features
I've had my first taste of Apple Intelligence, and I wish I'd waited longer before sampling the goods. Not because Apple Intelligence is bad, but because it's not finished yet. Apple recently made it available on Macs via a late July update to the macOS Sequoia developer beta, which is only accessible by registered Apple developers. This is very early software. And if you're not a developer already running the macOS Sequoia beta, I suggest you wait until the full version of macOS Sequoia arrives in the fall before you play around with Apple Intelligence. I've been using the first Apple Intelligence features in macOS for a day or so now, and what's available is functional and capable, but painfully limited. To give you a sense of what's coming, I'll run through every major way in which Apple Intelligence changes the way you use macOS -- or at least, the ways I've found so far. One of the first surprises I encountered when testing Apple Intelligence on the beta version of macOS Sequoia is that it's not enabled by default -- you have to turn it on. Doing so is simple enough: just find the Apple Intelligence & Siri submenu in your Mac's System Settings menu, or launch Spotlight Search (Cmd + Space) and type Apple Intelligence, then select it from the results. Note that at least during the beta testing period, you have to tell Apple you are interested in using Apple Intelligence before you will be given the chance. During my testing there was a "Join the waitlist" button in the Apple Intelligence menu, and I had to hit that and wait about two hours before I was allowed to enable Apple Intelligence on my MacBook Pro (16-inch M3 Max). This waitlist will likely be a distant memory by the time Apple Intelligence finally debuts, but if you plan to try it early you should be ready for a (hopefully brief) wait. 
(And if you want to grab macOS Sequoia before it's released so you can check out Apple Intelligence early, check out our step-by-step guide to how to download macOS Sequoia!) One of the two big new Apple Intelligence features that's now available to try out in macOS Sequoia is Writing Tools, a suite of options you can access in select apps to do things like rewrite and summarize text. At first blush, these tools are useful but hardly exciting, or much different than what you can get from Copilot in Windows 11. During my early testing of the macOS Sequoia beta, Writing Tools became available in Apple apps like Notes and Pages, as well as third-party apps like Slack. Accessing them from a supported app is easy: you just highlight the text you want to edit, then mouse over the Writing Tools subsection of the pop-up menu. From there, you either click the specific tool you want to use, or you can click Show Writing Tools to show a visual palette of the available tools next to the text you're editing. During my testing, I was able to tell Apple Intelligence to proofread or summarize text for me, rewrite it as concise, friendly, or professional, and generate tables, lists, or key points. None of this functionality is particularly mind-bending or industry-leading, but what I was able to test at this pre-release stage works well. I have yet to catch Apple Intelligence making an error when rewriting or summarizing text, and the quality is about as good as you get from ChatGPT. The other big new feature that comes to macOS Sequoia with this Apple Intelligence injection is a supercharged Siri. The reason Apple Intelligence & Siri are now paired together in their own Settings menu is that the two are deeply intertwined. Siri gets a lot better on Macs that support Apple Intelligence, to the point that I now see a good reason to enable Siri and use it on a MacBook. See, until now, Siri was one of the things I immediately disabled on every new Mac. 
But with Apple Intelligence, it's useful enough that I actively want to use it. Now that I'm beta-testing a version of macOS Sequoia with Apple Intelligence, I can ask Siri to text a message to my partner and it just does it, smartly fixing any errors it encounters along the way. Of course, it's not always smooth or responsive. During my day or so putting the supercharged Siri through its paces, I've not noticed any major crashes or errors, but it does often take two or three requests before it understands what I'm asking. Simple requests, like "Hey Siri, open Notes," trigger the right response reliably enough. But more complicated requests often take a few tries before Siri gets the message, and the more advanced Siri features powered by Apple Intelligence -- like the ability to ask it to show you files a specific person sent you last week -- are still MIA. This first glimpse of Apple Intelligence on macOS Sequoia is an awfully limited one, but it does afford us a peek at what's coming for Mac fans before the end of 2024. And if I'm being honest, it's done nothing more than whet my appetite for the full kit and caboodle. When Apple showed us Apple Intelligence for the first time back at WWDC 2024, I thought it looked slick and, more importantly, smartly integrated in ways that suggest Apple is handling AI much better than Microsoft is on the PC front. Now that I've had a chance to use it at home, I'm mostly just frustrated we can't see more of what's possible with Apple Intelligence on Macs. There's a ton of key features still missing from the pre-release beta version of Apple Intelligence on macOS Sequoia, including the capacity to generate images and Genmoji. While you can try it out right now if you really want, I'm inclined to suggest it's best to wait for the full version before jumping headfirst into Apple Intelligence on Mac.
[5]
Every Apple Intelligence feature you can use in the iOS 18.1 beta | Digital Trends
Apple has released the first developer beta for iOS 18.1, and it's crucial for one key reason: Apple Intelligence. The suite of artificial intelligence features that Apple introduced at WWDC a few weeks ago is finally making its way to iPhones. Well, at least a select few of those features. Right now, Apple Intelligence is only available for the iPhone 15 Pro and iPhone 15 Pro Max. Also, make sure your device region is set to the United States and Siri's language is set to English (U.S.) to get the best of Apple Intelligence. Right now, there's a waitlist to enable Apple Intelligence on an iPhone, but according to several reports we've seen online, the waitlist clears in about 10 to 20 minutes. I managed to get the same done on iPadOS 18 in roughly five minutes. NOTE: Some of the Apple Intelligence features seem to require additional assets to work after you install the iOS 18.1 developer beta update. Hence, you may see a message that says, "Support for processing Apple Intelligence models on the device is downloading." Apple Intelligence features you can use now: Once you've downloaded the iOS 18.1 developer beta update, cleared the waitlist, and your phone has finished installing the Apple Intelligence models, you are ready to go. On the dedicated Apple Intelligence & Siri page in the Settings app, simply enable the Apple Intelligence toggle at the top. So far, all of the following Apple Intelligence features are live and ready to use. Writing Tools: One of the most hotly anticipated features of the Apple Intelligence portfolio is the systemwide Writing Tools. For example, in the Notes app, when you select text or double-tap to see the contextual menu, you see options like Proofread and Rewrite, and you can adjust the style with tones like Concise, Professional, or Friendly. More importantly, you also get one-tap options like text summarization, extracting key points from a text, turning it into a list, or formatting it in the form of a table. 
The whole process is near instantaneous, and it definitely feels quicker than what you would get while using cloud-linked tools like ChatGPT or Copilot on mobile. Mail summarization: Mail is set to receive a lot of upgrades, but one of the most useful of them all is the Summarize feature. I loved this feature in the Shortwave app, and now the pre-installed Mail app on your iPhone will summarize email contents with a single tap. Another neat feature is the AI-boosted voice note feature, which can now transcribe and summarize the contents of your voice recordings. You can directly embed voice clips in Notes files without having to switch to another app and have their summary or transcription always at your disposal with a single tap. A smarter Siri: The first major attraction is Siri. When summoned, it now plays a glowing light animation around the entire display. The assistant can now directly tap into Apple's own product knowledge and will guide you through your device-related queries. I also noticed that contextual understanding seems better, and it processes commands with an improved level of comprehension and alacrity. We also get a new way to invoke Siri: if you don't want to long-press the power button to summon Siri or interact with voice commands, just double-tap the navigation bar at the bottom, and a keyboard will slide up that lets you enter text commands for the virtual assistant. An intelligent Focus mode: Apple offers multiple Focus mode presets, such as gaming and work, and the ability to create a custom Focus mode. With iOS 18.1, you get a new preset called Reduce Interruptions, which intelligently parses the clutter of notifications using on-device intelligence. It reads through a notification's contents, and if they seem urgent, it will send an alert. It can be scheduled right from Control Center and lets you customize the behavior as well. 
For example, you can whitelist on a per-person basis to allow notifications from only a select few important people, and the same can be done for apps. All of the missing Apple Intelligence features: That said, a handful of Apple Intelligence facilities have yet to arrive in the iOS 18.1 beta. But that is not entirely unexpected. AI tools are notoriously error-prone, and it takes a lot of time to iron out the flaws so that they don't make the same dumb or disturbing mistakes. Bloomberg also reported that some of the Apple Intelligence features will take their own sweet time to release and that we can expect a few of them to make an appearance next year. Below are some of the biggest Apple Intelligence tricks that are yet to be seen. Siri awareness and app integration: We recently detailed a research paper on how Apple wants Siri to always be aware of what's on your screen so that it can answer your queries and assist users when needed. Apple finally made this official at WWDC, alongside a promising new ability to interact with other apps installed on your phone. Unfortunately, these Siri tricks have yet to appear in the Apple Intelligence package. ChatGPT: Apple says the new AI-boosted avatar of Siri will handle most of your chores, but in cases where it needs to dig deeper, it will show a prompt that lets you offload the task to ChatGPT. In case you missed it, ChatGPT is now multi-modal (capable of processing text, voice, and vision inputs), and with the latest GPT-4o update, its conversation and comprehension abilities have made a new leap. However, it seems Apple needs more time to polish and release this two-tier AI assistant system on compatible hardware. Priority notifications: This is one of those features that I've been eagerly waiting for. As the name suggests, this one automatically sifts through the barrage of notifications and will only highlight those that seem important. 
Apple says it will accomplish this by summarizing the notifications to allow a quick glance through all the clutter.

Mail upgrades

Apple showcased some convenient tricks like Priority Messages in the Mail app, conversation summaries, a new Primary section at the top, and an AI-charged Smart Reply for contextual quick replies. Some of these capabilities can already be experienced in Gmail and third-party apps like Shortwave, but they are yet to arrive in the first developer beta of the iOS 18.1 update.

Image eraser

If you've been wowed by the AI trick that seamlessly removes certain items in an image on a Google Pixel or Samsung phone, Apple has something similar in store for the Photos app. It lets you pick and remove unwanted elements from a picture, but alas, this trick hasn't arrived with the iOS 18.1 developer beta update.

Image Playground

Image Playground is one of the most fun aspects of Apple Intelligence. It lets users tap into the power of AI and create custom images on the go, which can then be used in apps such as Notes, Messages, and more. And Apple is not keeping it to itself: thanks to the Image Playground API, developers can also enable this facility in their own apps. This is an on-device process, which means that unless you are syncing or sharing data over the internet, you can create and insert AI-generated images wherever you want on your iPhone. On a related note, Apple also promised some cool upgrades like natural language search for finding specific images and text-driven custom memories, but so far we don't see them working as advertised in the Photos app.

Genmoji

Think of Genmoji as the next evolution of emoji, but with an endless canvas that is limited only by your creativity. Thanks to Genmoji, users can generate their own emoji with nothing but a text description. They can be used inline with text and can also be shared as stickers.
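The Reduce Interruptions and notification-prioritization behavior described above — parse a notification's contents for urgency, and let allowlisted people and apps through regardless — can be pictured with a toy filter like the one below. This is purely illustrative: Apple's on-device model is not public, and the keyword check here stands in for whatever language understanding it actually uses.

```python
from dataclasses import dataclass

# Words a toy urgency check might treat as time-sensitive (hypothetical).
URGENT_HINTS = {"urgent", "asap", "now", "emergency", "deadline"}

@dataclass
class Notification:
    sender: str
    app: str
    text: str

def should_alert(note, allowed_people=(), allowed_apps=()):
    """Deliver a notification if it comes from an allowlisted
    person or app, or if its contents look urgent."""
    if note.sender in allowed_people or note.app in allowed_apps:
        return True
    words = {w.strip(".,:!?").lower() for w in note.text.split()}
    return bool(words & URGENT_HINTS)

# Example: only the boss and anything urgent get through.
notes = [
    Notification("Boss", "Mail", "Lunch next week?"),
    Notification("GameCo", "PuzzleApp", "Your daily reward awaits!"),
    Notification("Clinic", "Phone", "Urgent: please call back now."),
]
delivered = [n.sender for n in notes if should_alert(n, allowed_people={"Boss"})]
print(delivered)  # ['Boss', 'Clinic']
```

The game notification is silenced, while the allowlisted sender and the urgent message are delivered, mirroring the per-person and per-app whitelisting the beta exposes.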
[6]
Apple's Intelligence is already here: this is everything we can do in its first beta version - Softonic
Apple has released the first beta of iOS 18.1 for developers, and with it comes Apple Intelligence. Apple's AI takes its first steps on the iPhone and does so with many new features. Not all of them are available yet, as some require more development time, but there is still much to discover now that Apple Intelligence is among us. For now, iOS 18.1, iPadOS 18.1, and macOS 15.1 are only available for devices with Apple Intelligence support. Once the updates are installed, assuming we have a developer account, the list of new features is as follows.

One of the standout features of Apple Intelligence is the set of writing tools built into the devices. These let us review texts by instantly correcting spelling, grammar, word choice, and sentence structure. In addition, they offer the option to rewrite texts to improve their clarity or modify their tone without altering the content itself. We can also select texts to obtain an automatic summary in different formats: paragraphs, lists, key points, or tables.

The new version of Siri features a new interface that displays a glow around the edge of the screen when activated, preventing it from obstructing the view of the screen itself. Additionally, Siri can now maintain context between requests and follow the conversation thread even if we change our request in the middle of a sentence. We can also write to Siri: simply double-tap the bottom of the iPhone or iPad screen to make a request in writing; on the Mac, press the Command key twice.

Both Mail and Messages now include quick-reply options based on the context of the received message and the response we want to give. Both applications can summarize the content of multiple notifications directly on the lock screen. In addition, Mail organizes urgent messages at the top of the mailbox and provides brief summaries in the inbox list, so we can understand the content at a glance without having to open the email.
The Photos app includes Memory Maker, a tool that lets us create movies from simple descriptions like "My beach vacation" or "Christmas in London." This feature automatically selects the most suitable photos and music, generates a short narrative, and presents the final movie. We can also search more naturally in the Photos app, search within our videos, and use autocomplete for searches.

The Focus modes have also been updated to reduce interruptions, allowing only important notifications to capture our attention. With the interruption-reduction mode, the system determines what to show us and silences everything else.

Another standout feature is the ability to record phone calls directly from the Phone app interface, always with consent notifications for all participants. Recordings and transcriptions are saved in the Notes app, where automatic summaries of the conversations can be generated. Similarly, any recording in the Notes app, not necessarily from the Phone app, can be transcribed, summarized, and so on.

In the Privacy and Security section, we have the option to access a report of our interactions with Apple Intelligence, which allows us to export the data used by Apple's artificial intelligence.

Yes, it is a beta, and yet the results we are seeing are truly impressive. It is remarkable how directly generative AI can be integrated into the system. While in China and Europe we will have to wait until next year for Apple Intelligence to be available due to the DMA, in the rest of the world, after installing the beta, we can join the waiting list for this feature. A feature, if we can call it that, that will change the iPhone forever.
[7]
What Apple Intelligence features are missing in the latest iOS 18 beta? Here are the AI features we're still waiting for on iPhone
After years of waiting, Apple's first foray into AI is finally here, and it's only just the beginning. Apple Intelligence was officially unveiled at WWDC 2024 and, though it is expected to revolutionize the way the best iPhones work, the new beta includes only a few features. I'm a particular fan of the new AI transcription function, and the writing tools could make for a good Grammarly replacement, but now that the full upgrade seems to have been delayed, it's hard not to feel like the new beta is a bit half-hearted. Here are all the features missing from the new Apple Intelligence beta.

Siri has been improved with a new look, and you can now flick between using your voice or typing to control it. It is much easier to access and has an improved knowledge base to work from when processing commands. It is also better at understanding the context of queries, but this is not Siri 2.0; it's much closer to Siri 1.1. It can't combine app use, like finding an image in Photos and then importing it into Journal. It also can't use on-screen awareness, like reading a text you were sent with an address and then adding it to the contact sheet of the person who sent it, without opening the app. This is all to say that Siri is not yet smart, but its new look should set it up for a much bigger upgrade later this year.

ChatGPT is not active on-device in any fashion just yet. When the full Apple Intelligence launches this year, users will be able to use the AI query engine to process requests on-device. These could include identifying plants, suggesting meals with certain ingredients, or just answering a simple question. You can still use ChatGPT via a search engine on any Apple device, but you will need to pay to use it. According to the Apple site, ChatGPT use on Apple devices will be free, and much easier to access.

Are you sick of superfluous notifications from pushy mobile games hogging your notification feed?
If so, the priority notifications tool can use context clues to put the most important notifications at the top of your screen. It will analyze how you use your iPhone, iPad, and Mac, and weigh notification importance based on that. If you are waiting for a call from your doctor, that missed-call notification will sit at the very top of your screen as you unlock it. As someone with multiple email inboxes and far too many social media accounts to pay attention to, this feature seems like a godsend to me, and I can't wait for it to launch later this year.

The Clean Up tool can erase any objects that got caught in the frame of a photo without your knowledge. Does an image have some rubbish in the background? Have you been pranked by the dreaded photo bomb? If so, you can simply swipe over anything you want removed from an image, and it will magically erase it. The reason this function needs AI to work is that it has to replicate whatever background is behind the thing you want to remove. Not only does it have to get rid of something, it has to fill in the beach, field, sky, or whatever other backdrop is behind the picture. When done right, images should look like you haven't used the tool in the first place. Perhaps best of all, this works on all old images, so you can edit photos from years ago in mere moments.

Apple Intelligence has a whole host of generative AI features. The Image Wand can transform a sketch in your notes into an actual image. Whether you are an architect or a casual doodler, this can help add dimension to a rough idea. At WWDC 2024, Apple showed off a rough sketch, and AI managed to analyze it and create an image based on that information. You don't even need to draw a sketch for this to work: you can circle empty space, and it can grab surrounding context to make a diagram or mockup. As well as this, the new Image Playground is another generative tool that can create images based on prompts, which can then be saved to Photos.
You can simply type in what you want, and it will try to make you a distinct image based on that information. As pointed out on the developer site, this app can be hosted within other apps, allowing users to create imagery in games, productivity tools, and more. A similar idea, Genmoji, is a new tool that uses generative AI to make new emoji. You can type in prompts, and it will create a sequence of emojis that you can send to others via apps like Messages. Interestingly, you can even pick out faces from your Photos app and create an emoji of them.

Mail does now have priority notifications and can summarize emails, which are both great additions, but it is missing some key features. It can't yet summarize priority emails, and the priority-notification functionality is rather inconsistent right now, defeating the purpose of having AI screen your inbox. Though this is more about Siri 2.0 not arriving, it also can't intelligently add Mail functionality to apps, like asking Photos to email someone an image you have taken with a caption of your choice.

If you've managed to get into the beta, you will have a handful of months to get used to all the new AI features before the next major batch arrives.
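The Clean Up behavior described earlier — remove an object, then fill in whatever backdrop sat behind it — is an image-inpainting problem. The crudest possible version of the idea is to borrow values from the surrounding background pixels, which the toy below does on a grayscale grid. Real tools use generative inpainting models, so this is only an illustration of the "fill from surroundings" principle, not anything like Apple's implementation.

```python
def clean_up(img, mask):
    """Fill "erased" pixels from the surrounding background by
    repeatedly averaging their already-known 4-neighbours."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    unknown = {(r, c) for r in range(h) for c in range(w) if mask[r][c]}
    while unknown:
        filled = set()
        for r, c in unknown:
            neighbours = [out[nr][nc]
                          for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                          if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in unknown]
            if neighbours:
                out[r][c] = sum(neighbours) / len(neighbours)
                filled.add((r, c))
        if not filled:          # nothing left to borrow from
            break
        unknown -= filled
    return out

img  = [[10, 10, 10],
        [10, 99, 10],   # 99 is the photo-bomber
        [10, 10, 10]]
mask = [[False, False, False],
        [False, True,  False],
        [False, False, False]]
print(clean_up(img, mask)[1][1])  # 10.0
```

The masked pixel is replaced by the average of its background neighbours, so the "object" vanishes into the surrounding texture — the same goal Clean Up pursues with far more sophisticated generation.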
[8]
Apple Intelligence: 5 AI-powered things you should do immediately
Apple Intelligence is here, at least in beta form. If you've got one of Apple's best iPhones, iPads, or Macs, you can hop on the beta right now and start using the latest and greatest in Apple AI. iOS 18.1 is here, along with iPadOS 18.1 and macOS Sequoia 15.1. If you're comfortable downloading the developer beta and you've got a compatible device, you can start sampling Apple's AI offering right now. Many of the headline features, such as generative imagery and the new Siri, aren't available yet, but there's still fun to be had. Here are five things you should do straight away using Apple Intelligence.

Apple's new Writing Tools include a handy feature that can rewrite any block of text using AI with just one tap. For example, perhaps you're writing the introduction to some notes, or you want to send an email; you can ask Apple Intelligence to rewrite your words for you. If you can't quite find the right phrase to politely decline an invitation, or you'd like to offer someone feedback succinctly, this is the feature for you. Simply highlight the text you want to rewrite and tap "Rewrite." You can even choose the tone you want: Friendly, Professional, or Concise.

Apple Intelligence also includes a nifty feature that can summarize pretty much anything. Notably, it can sum up an email in a nice, digestible three-line paragraph, or an entire web article. You can summarize web articles using the button in the bottom left corner of the Safari web browser, and you can summarize emails by tapping the "Summarize" button in the top right corner.

In Photos, you can ask Apple Intelligence to create a Memory Movie of your photos from trips, locations, certain people, or anything else. Head to the Photos app, and in the new Memories section tap "Create." From there you can describe a memory to Apple Intelligence, and it'll generate the movie. You can say things like "My recent trip to Las Vegas" or "Pictures of my dog and I."
If you take or make a call on the new beta, there's a new button in the top left corner of the call screen that lets you record your phone calls. It will notify participants before the recording starts, and transcriptions can be found in the Notes app. Once the recording is done, you can use Apple AI to generate summaries of your call, which can be kept with the transcript alongside the audio log.

Apple Intelligence's fully fledged Siri upgrade isn't here yet, and there's no ChatGPT for now, but you can at least play around with an improved version in iOS 18.1. Bringing up Siri will reveal the brand-new interface with the cool glowing purple border. Not only is the new animation delightful, but Siri is now a little smarter according to our early testing, and seems to prefer giving direct answers rather than web results. Siri should get an even bigger upgrade when Apple Intelligence rolls out in full later this year.

So there are five things you can do with Apple Intelligence right now. We're still waiting on a bunch of other new features, including priority notifications, Clean Up for Photos, generative AI features like Image Playground, and of course ChatGPT integration. Until then, this vital early testing will give Apple valuable data to improve the experience for the impending public release.
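The three-line email summaries mentioned above are, at heart, a summarization task. As a purely illustrative sketch — Apple's summarizer is an on-device language model that writes abstractive summaries, not this — here is a classic frequency-based extractive summarizer that keeps the few highest-scoring sentences of a message:

```python
import re
from collections import Counter

def summarize(text, max_sentences=3):
    """Naive extractive summary: score each sentence by how
    frequent its words are in the whole text, then keep the
    top few sentences in their original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    ranked = sorted(range(len(sentences)), key=lambda i: -score(sentences[i]))
    keep = sorted(ranked[:max_sentences])
    return " ".join(sentences[i] for i in keep)

email = ("The launch is moving to Friday. Please update the schedule. "
         "The launch team will meet Thursday. I had a sandwich for lunch.")
print(summarize(email))
```

Word-frequency scoring is decades-old and easily fooled, which is precisely why the digestible summaries in Mail and Safari lean on a language model instead.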
[9]
See Apple Intelligence in Action
Apple released iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 betas for developers yesterday, introducing an early version of Apple Intelligence. Not all of the Apple Intelligence features are implemented yet, but we thought we'd go through what's available and see just what kind of changes AI will bring to the iPhone, iPad, and Mac.

Siri has a new design with a glow around the edges of the display when active, and there is a Type to Siri option so you don't need to talk to Siri. Siri is able to maintain context between requests and follow along if you stumble over words, plus Siri can answer all kinds of questions about your devices thanks to the integration of Apple's product knowledge and support database.

Phone calls can be recorded, transcribed, and summarized; there are Smart Replies for Mail and Messages; Photos has a Memory Maker feature; and there's a Reduce Interruptions Focus mode that intelligently filters out notifications you don't need to see right away. If you want a more in-depth rundown of all of the features that are available, we have a dedicated guide that highlights what's in the beta now.

There are several Apple Intelligence features coming later, including Image Playground, Genmoji, ChatGPT integration, and the smarter version of Siri that has onscreen awareness and can do more in and across apps. You won't find these features in the current iOS 18.1 beta, but Apple plans to roll them out across 2024 and 2025.

Apple Intelligence is limited to developers at the current time, with Apple working to refine the Apple Intelligence features and eliminate bugs. A beta version of Apple Intelligence will be available for all users later this fall, after iOS 18, iPadOS 18, and macOS Sequoia launch. You will need an iPhone 15 Pro/Pro Max or later, or an Apple silicon Mac or iPad, to use Apple Intelligence due to the processor requirements.
[10]
Apple Finally Revealed a First Look at Apple Intelligence. Here's Why the Tech Giant's Slow and Steady Approach to AI Could Be Beneficial.
On Monday, Apple offered a first look at Apple Intelligence, its highly anticipated AI designed for devices like the iPhone, iPad, and Mac. Developers registered with Apple's $99-per-year membership now have access to the iOS 18.1 beta and can preview some of the AI features Apple promised in June. According to CNBC, Monday's preview includes features like AI-generated summaries for Mail, Messages, and voicemail, and AI text generation through Apple's Writing Tools service. Siri also has a new look and can better understand commands, even when a speaker trips over their words. Bigger features, like Siri being able to work within apps and integrate directly with OpenAI's ChatGPT, are not available in the preview.

Alon Yamin, co-founder and CEO of AI detection platform Copyleaks, says Apple's slow but steady approach to the AI race "is a testament to its commitment to user experience." "Over the last year, we've seen that Apple is in no hurry regarding generative AI, while other larger competitors rush to the market, often resulting in course correction," Yamin told Entrepreneur in an email. "While it may disappoint some users, this delay ultimately sets the stage for a more refined and reliable AI experience."

Though Apple's early developer preview of Apple Intelligence came out on Monday, a Sunday Bloomberg report stated that the iPhone maker will broadly roll out AI later than expected this year. Instead of arriving with iOS 18 and iPadOS 18 in September, the AI updates will roll out a few weeks later, by October, per Bloomberg. Apple has not commented on the report as of press time. Bloomberg previously reported that two important Siri AI upgrades, Siri understanding context and working within other iPhone and iPad apps, will come out next year.
Apple Intelligence currently works only with the latest iPhone models, the iPhone 15 Pro and Pro Max, which could prompt millions of iPhone users on older phones to upgrade.
Apple's latest iOS 18.1 developer beta introduces 'Apple Intelligence', a suite of AI-powered features set to transform user experience on iPhones and other Apple devices. This update showcases Siri's enhanced capabilities and various AI integrations across the operating system.
Apple has taken a significant leap into the world of artificial intelligence with the introduction of 'Apple Intelligence' in the iOS 18.1 developer beta. This new suite of AI-powered features is set to revolutionize the way users interact with their iPhones and other Apple devices [1].
At the heart of Apple Intelligence is a smarter, more capable Siri. The virtual assistant now boasts improved natural language processing and understanding, allowing for more complex and context-aware interactions. Users can now engage in multi-turn conversations with Siri, making the interaction feel more natural and human-like [2].
Apple Intelligence extends beyond Siri, integrating AI capabilities throughout the operating system. Some notable features include:
Smart Text Predictions: The keyboard now offers more accurate and context-aware text predictions, learning from user behavior over time [3].
Intelligent Photo Organization: The Photos app uses AI to automatically categorize and tag images, making searching and organizing photos more efficient [4].
Adaptive Battery Management: AI algorithms optimize battery usage based on individual user patterns, potentially extending device battery life [5].
True to Apple's commitment to user privacy, Apple Intelligence processes most data on-device rather than in the cloud. This approach ensures that personal information remains secure while still delivering powerful AI capabilities [1].
The introduction of Apple Intelligence also opens up new possibilities for app developers. With new APIs and tools, developers can integrate AI features into their apps, potentially leading to a new generation of intelligent applications for iOS [2].
While still in its early stages, Apple Intelligence has garnered positive initial reactions from developers and tech enthusiasts. Many see it as Apple's answer to the growing AI capabilities of competitors like Google and Microsoft [3].
However, it's important to note that the current beta version is likely just the tip of the iceberg. Apple is expected to continue refining and expanding these AI features before the public release of iOS 18.1, potentially introducing even more capabilities [4].
As Apple continues to develop and integrate AI across its ecosystem, users can look forward to a more intuitive, efficient, and personalized experience on their Apple devices. The introduction of Apple Intelligence marks a significant milestone in the company's AI journey, setting the stage for an exciting future in mobile technology [5].
© 2025 TheOutpost.AI All rights reserved