Curated by THEOUTPOST
On Tue, 29 Oct, 12:04 AM UTC
4 Sources
[1]
Apple Offers Up to $1 Million to Anyone Who Can Hack Its AI Servers
Apple CEO Tim Cook waxed poetic about the arrival of some Apple Intelligence features for the iPhone 15 Pro, Pro Max and iPhone 16 last week, saying in a tweet that the intro of Writing Tools along with new cleanup features for photos and a more conversational version of its Siri voice assistant is "the beginning of an exciting new era." But reviewers say Apple Intelligence isn't all that, at least not yet.

CNET editor Bridget Carey reminds us the new generative AI features will be available in the US in only a limited way -- you need to go to your iPhone settings to get on the wait list, and features like making your own emojis with AI (which Apple calls Genmoji) will come later. Meanwhile, CNET mobile reviewer Lisa Eadicicco said you shouldn't "expect your iPhone to feel radically different" and called the new features "a first step in what could hint at larger changes" later on. So far, she finds the message and notification summaries the most useful.

"What I've come to appreciate most is that I can look down at my phone after getting a barrage of texts or Slack messages and know whether it's an emergency just from the lock screen," she says in her iOS 18.1 early review. "The summaries aren't perfect (AI, as it turns out, can't nail sarcasm and doesn't know the inside jokes I share with my friends)," she adds. "But this type of functionality is exactly the type of passive, practical intelligence I'm hoping to see more of on smartphones in the future."

Though Apple continues to roll out AI features slowly as part of what software chief Craig Federighi said last month is the company's strategy to "get each piece right and release it when it's ready," one thing Apple feels very confident about is how it's handling privacy and security on the Private Cloud Compute, or PCC, servers that power some Apple Intelligence features.
That's why it's inviting hackers as well as privacy and security professionals and researchers to verify the security claims it's made about PCC and is offering bounties from $50,000 up to $1 million to anyone who finds a bug or major issue. PCC, Apple claims, is the "most advanced security architecture ever deployed for cloud AI compute at scale." Here's a nontechnical explainer of PCC, and you can find a more technical one from Apple here. Bottom line: Apple guarantees it protects all the data on your iPhone by keeping it on the device (which is known as on-device or local processing). If a complex AI task needs to be handed off to more-powerful computers in the cloud -- PCC servers running custom Apple chips -- the company promises it'll use "your data only to fulfill your request, and never store it, making sure it's never accessible to anyone, including Apple." "Because we care deeply about any compromise to user privacy or security, we will consider any security issue that has a significant impact to PCC," Apple says about the Apple Security Bounty reward program. "We'll evaluate every report according to the quality of what's presented, the proof of what can be exploited, and the impact to users." Here are the other doings in AI worth your attention. Since it was released two years ago, OpenAI's groundbreaking ChatGPT AI chatbot has been used to write letters, emails, ads and wedding vows; provide summaries of long reports; and craft software code for the 250 million people who use it every week. But despite the chatbot's ability to respond to users' prompts in plain, everyday English, one thing it couldn't do well was provide answers to search queries with links to the web source -- like Google does -- since the training data in OpenAI's large language model wasn't being updated with the latest news. 
That changed last week when the San Francisco startup delivered on its promise to evolve its popular gen AI chatbot into a gen AI search engine that can grab data for you from the web in real time. The new search functionality and up-to-date links are powered by OpenAI partner and investor Microsoft and its Bing search engine. "You can get fast, timely answers with links to relevant web sources, which you would have previously needed to go to a search engine for," OpenAI wrote in an Oct. 31 blog post. "Getting useful answers on the web can take a lot of effort. It often requires multiple searches and digging through links to find quality sources and the right information for you. Now, chat can get you to a better answer: Ask a question in a more natural, conversational way, and ChatGPT can choose to respond with information from the web. Go deeper with follow-up questions, and ChatGPT will consider the full context of your chat to get a better answer for you."

What does it all mean? It's another step in the escalating battle between OpenAI and Google, which has been adding gen AI functionality on top of its search engine with its Gemini chatbot. Meta is also reportedly working on an AI search engine. They're all fighting for your attention -- and a share of the money to be made in gen AI.

As for where OpenAI is getting those links to the latest news, it's signed licensing deals with publishers including the Associated Press, Axel Springer, Condé Nast, Dotdash Meredith, Financial Times, GEDI, Hearst, Le Monde, News Corp, Prisa (El País), Reuters, The Atlantic, Time and Vox Media, CNET's Imad Khan reported. Meanwhile, The New York Times is suing OpenAI and Microsoft for scraping its stories without permission. As for other content providers concerned that the AI company is co-opting their work without permission or compensation, Khan noted that "it's up to other news publishers if they want to have OpenAI's robots crawl their sites for information."
Meanwhile, OpenAI told Bloomberg News that three-quarters of its revenue comes from the 1 million paying subscribers (consumers and businesses) it's signed up. But that isn't enough to fund its operations -- the privately held company made history last month when it raised $6.6 billion in one of the largest venture capital funding rounds in US history.

Software engineering as a profession isn't going anywhere anytime soon. But as I noted in a recent column about the future of jobs, it's a profession that will be changed by AI as tools are used to write code more efficiently and effectively (and less expensively). I cited recent comments by executives at Amazon and Perplexity to make the point. Now add Google to the list. During the company's earnings call last week, CEO Sundar Pichai said that more than 25% of all new code at Google is generated by AI, then reviewed and accepted by engineers. "We're ... using AI internally to improve our coding processes, which is boosting productivity and efficiency," Pichai said. "This helps our engineers do more and move faster." You can read all his remarks here. But I'll just repeat that if software engineers need to start rethinking what they do, then it's probably time we all reflect on how AI will change our jobs in the not-too-distant future.

If you want to learn or hone your AI prompt-writing skills, Google has a new 10-hour, go-at-your-own-pace course offered through Coursera for $49. Called Prompting Essentials, it was developed by AI experts at Google and Google's DeepMind AI lab. The company says you don't need any prior experience with AI and prompting, and that you'll be able to "build a library of reusable prompts." I thought this bit was interesting: Google experts say most prompts are too short. A successful prompt averages 21 words, but "data shows user prompts are often shorter, containing nine words or less."

Oct. 30 marked the one-year anniversary of US President Joe Biden releasing his executive order on AI. The administration shared a list of the more than 100 actions completed by various federal agencies, but I'll note there's still no sweeping AI regulation in place in the US akin to what the European Union passed earlier this year as part of the EU AI Act.

Considering bangs (or a beard)? Add advice on hairstyles (and facial hair) to the list of things AI can offer you. CNET contributor Amanda Smith tested an app called Facetune, by Lightricks, that shows users how they'd look with different hairstyles, makeup, outfits or facial hair. Smith used the app to help her decide what to ask her hairstylist for before spending big bucks on a cut and color.
[2]
The Next Big iOS Upgrade Is Going to Make Your iPhone Look Very, Very Strange
Apple is infusing its phones with a new A.I., and it might be their biggest -- and riskiest -- bet yet. Apple Intelligence, long teased by the Big Tech giant as the fix for its laggard position within the A.I. game, left beta testing this week and rolled out to customers lucky enough to own Apple's most advanced models (i.e., the iPhone 15 Pro and anything released after). There are plenty of slick A.I. features to excite Apple fans, and to complement the brand's new custom-chip-powered hardware: "Writing Tools" for proofreading and editing messages, overhauls of Siri and the Mail app, the added ability to record and transcribe phone calls, and upgrades to the search and edit functions for your photo albums.

But, of all the iOS updates, the one you probably heard about and experienced the most was the auto-summarizer for your notifications -- and not necessarily for the best reasons, per the screenshots circulating on social media. While there were many such cases of odd text summaries, perhaps the most notable and infamous example of the feature landed two weeks ago, in a now-deleted viral tweet where developer Nick Spreen screenshotted his phone's interpretation of an inbound breakup text: "No longer in a relationship; wants belongings from the apartment." The notoriety has only heightened since then, to the point that CNET published a helpful guide on Monday for turning off "the most annoying Apple Intelligence feature." And that's not the only Intelligence goodie that's hardly living up to promise. As Quartz reported Wednesday, users are also complaining about an onerous software-update process, malfunctioning Siri mechanisms, and A.I.-generated responses that are even less accurate than those that spew from hallucinating engines like ChatGPT.
The mismatch between Apple Intelligence's actual quality and the way it's portrayed in the company's latest masterpieces of condescending advertising -- that is, as genius agents that can "write smarter" than you can -- is quite stark. Understandably, then, this intelligence doesn't appear to be driving a sales surge for the A.I.-customized iPhone 16 lineup. (Nope, not even for the Mac Mini.) But that shouldn't have been too surprising: A summer study in the Journal of Hospitality Marketing and Management found that consumers overwhelmingly lost interest in products when they were labeled as being A.I.-powered as opposed to simply "high-tech," while a recent CNET survey found that only 18 percent of its respondents viewed A.I. integration as their "main motivator for upgrading their phone." That's bad news in a year when iPhone sales have plunged globally -- to the point that Apple's yet again ceded its onetime position as world's biggest phone maker to Samsung. And while Apple reported record quarterly revenues to investors this week, analysts continue to warn that iPhone sales (still the corporation's biggest moneymaker by far) aren't getting a much-needed boost from the Apple Intelligence previews. "iPhone revenue stands as the report's Achilles' heel," Thomas Monteiro, a senior analyst for Investing.com, told me in an email. "Given the strong trend in consumer spending, the presented numbers indicate that users were generally unimpressed by the recent features, meaning that the next suite of A.I.-to-product offerings will need to do an overall better job to impress the public." The calculus behind Apple's A.I. approach was that it wouldn't be as hasty as its rivals (namely, Meta and Google) in its attempts to catch up to the ChatGPT era. Rather, it would steadily game out the most useful applications of the tech for its everyday products, and maximize its historic advantage as hardware and software pioneer. 
This was sensible thinking, especially in light of Google's endlessly clumsy A.I. foibles and Meta's horrific misinformation crisis. And the products Apple's been announcing lately definitely stand out (check out the AirPods Pro that also serve as hearing aids). If Apple wants to stick to a business model that's worked out quite well so far -- that being, a brand of lifestyle accessories with actual, practical use for your everyday needs -- it makes sense for the company to behave more frugally when it comes to splurging cash on and rolling out A.I. that users may not even want. Both Meta and OpenAI have already been learning that the hard way. The issue remains, however, that a lot of this A.I. remains fundamentally faulty, no matter who's making the model or what's operating it. Right now, every major A.I.-focused or -curious firm is hoping to build out veritable forests of data centers in the hopes that more data, energy, capacity, and training will finally free the biggest large language models from the curses of making shit up and getting basic facts wrong. Yet that race for superiority also involves these companies (especially Google) getting their A.I. engines to eat and regurgitate much of their own generated slop in turn, thus worsening the models by corroding the overall value of their training data. Having already flouted copyright law and invited numerous lawsuits in the quest to make god-A.I., Big Tech is falling into a vortex of self-propelling errors that's had the effect of polluting the entire digital information ecosystem. It's hard to tell right now, but there may very well be a cap on how far these models can go -- and Apple, even after having taken (supposedly) more careful study than its peers, is coming up on that right now, with a flurry of its notification-summary mistakes. Perhaps Apple has some remedies awaiting. 
But if, two years after ChatGPT's debut, Apple's biggest rollouts fall into the same PR nightmares that its brasher rivals are still overcoming, what does that say about Big Tech's biggest bet?
[3]
Column | Apple Intelligence is finally here, but you may not even want it
Apple's new AI summarizes things on your iPhone. But sometimes it's hilariously bad at it. This week, you can take a first bite of Apple Intelligence, which Apple says is a more useful, more thoughtful, more private, more made-for-iPhone version of artificial intelligence. Starting today, Apple Intelligence is available with a software update to iOS 18.1 for anyone using this year's iPhone 16, last year's iPhone 15 Pro or recent Macs and iPads. Apple requires you to turn on a setting to start using Apple Intelligence. After testing Apple Intelligence features on my iPhone for months, I've found that the AI still doesn't do much -- and sometimes doesn't act intelligent at all. Using Apple's AI also appears to drain my phone's battery faster. One thing Apple Intelligence does that you can't get from ChatGPT or Google Gemini is summarizing all your iPhone lock-screen notifications. That's mildly useful except when it goes bananas at least once or twice per day. One example: Last Thursday, Apple AI summarized a news headline as, "Steve Anderson urges Harris to endorse Harris." (The actual original headline was, "Fellow General Steve Anderson Tells John Kelly Why He Must Endorse Harris Now.") Apple didn't answer my questions about why the battery on my year-old iPhone 15 Pro lasts only until about 3 p.m. now that I'm using Apple Intelligence. (Many factors can influence battery life, but this seems like more than just a coincidence.) Apple smartly understands it has a monopoly over much of the data and screen real estate on your iPhone, and can use AI to summarize, organize and edit it for you.
And I like the more cautious approach Apple Intelligence takes with privacy, running functions locally on your device or on a special cloud service so your personal data isn't accessible to anyone else. (That's a big improvement over Meta, Microsoft and Google.) The problem is, Apple's AI capabilities are behind industry leaders -- by more than two years, according to some Apple employees cited by Bloomberg. Apple says there's more to come from Apple Intelligence, including the ability to generate images and some much-needed upgrades to its Siri assistant.

In recent days, Apple senior executives have suggested in interviews that an underwhelming debut is all part of its plan. They said Apple only wants to release AI products that get it "right" or are "ready." But is it actually ready? Apple Intelligence feels annoying and unfinished just often enough that I wouldn't blame anyone for leaving it switched off for now, or waiting another year before buying a new iPhone. To help you decide, here are highlights and lowlights of what Apple Intelligence does in its current form.

What Apple Intelligence does today

Summaries of notifications and emails

What's good: Glance at your lock screen and see short summaries of your messages, news alerts and other notifications. This could help you catch up quickly. My favorite Apple Intelligence feature: In the Inbox view of your Mail app, the two lines under the name of the sender and subject now include a short AI summary of the whole message. That's more useful than previewing whatever happened to be in the first two lines of the email.

What needs work: The summaries are right most of the time -- but just often enough, they're bonkers. On Saturday, it mis-summarized a chat about costumes as "Pugsley is little fester." (What?) And a text from a friend became: "Feedback received, well-received."
All AI is bad at humor and social context, but Apple Intelligence tries and often fails at summarizing texts where it truly isn't in on the joke -- and takes up space on your lock screen with its drivel. I've also seen it get confused about names and convey the exact opposite of the meaning in a message. (You can, in the settings for notifications, turn off the summaries.) The Mail app also tries to pull out certain messages it thinks are "priority" and put them at the top of your inbox. For me, it gets this wrong frequently -- for example, leaving calendar invites up there long after I've already added them to my calendar.

'Clean up' photos

What's good: See a stray hand in your photo? In the Photos app, tap edit and then a new eraser icon. You can tap, circle or "brush" what you want to remove and replace it with an AI-generated background that matches the rest of the photo.

What needs work: It struggles with even mildly complicated edits. When I tried to clean up power lines in a photo of trees at sunrise, it could erase only about half of the power lines. But Google Photos -- which first launched its "magic eraser" function back in 2021 -- did a better job, especially with the lines that intersected with trees. Meanwhile, Google Photos has evolved more AI capabilities, including making sure the faces of people in your photos are looking forward and smiling.

Writing help

What's good: Select some text in a note or email you're drafting, and Apple Intelligence will offer to proofread or rewrite it in a tone that is "friendly," "professional" or "concise."

What needs work: The Apple Intelligence proofreading function incorrectly told me to use the word "skepticically" instead of "skeptically" in a draft of this column. And I don't know why I'd rely on Apple Intelligence for writing help and not one of the more capable tools built into the software from Microsoft or Google that most people use to do their writing.
Google's Gemini, for example, can rewrite paragraphs following any style description you enter, including a poem.

'Smarter' Siri

What's good: Apple's assistant has a flashy new interface, and you can stumble over your words and Siri might still understand you. It didn't flinch, offering the San Diego weather when I asked, "What's, um, the sun like today down over in San Jose -- no, I mean San Diego?"

What needs work: Siri still isn't as smart as any of the leading chatbots, and it can't carry on a conversational chat with anything like their realistic-sounding human voices. Siri also still answers most complex questions with a Google search. A promised ability to send complex questions to ChatGPT is coming with the next software update, says Apple. Most concerning, Siri still doesn't really know me. I can't ask "when is my next haircut?" and have it pull the answer from my calendar. This kind of "personal context" is something Apple touted would come with Apple Intelligence. Someday.
[4]
iOS 18.1 & iPadOS 18.1 review: baby steps with Apple Intelligence
Apple Intelligence arrives one month into iOS 18 and iPadOS 18, bringing a few of the anticipated Apple Intelligence features to iPhones and iPads, but there's nothing revolutionary about Apple's AI push. When new technologies arrive, the world races to adopt them to get ahead of the game. Apple is rarely first to anything, or even in the top ten, and that's been the case with so-called artificial intelligence.

There's not much new in iOS 18.1 and iPadOS 18.1 beyond the Apple Intelligence updates. Apple has continued to refine the new dark and tinted icons, but beyond some new app splash screens, there's little else. This review will focus on what Apple Intelligence introduces to the operating systems. I dove more into using the iPhone 16 Pro with Apple Intelligence in my one-month review published earlier.

Apple Intelligence is Apple's attempt at integrating AI into products like iPhones, iPads, and Macs. The foundation of Apple Intelligence is built on large language models trained on licensed and publicly available data. Competitors rely heavily on server-side components and learning from how users interact with models. Apple has taken a more private and secure route by keeping models local to the device, and if something is sent to an Apple server, it's done privately with Private Cloud Compute. No user data, queries, inputs, or other information is used to train Apple Intelligence.

Writing Tools are available anywhere text can be input in the system. If you're in Apple Notes, the tools get additional UI since they're integrated into the app. The Writing Tool I use the most is Proofread. It's a simple grammar, punctuation, and spelling check that quickly replaced Grammarly. That's $140 a year I won't be spending on a tool that's arguably gotten worse since it shifted to "AI." Apple may have worked hard to avoid hallucinations associated with AI, but its tools aren't immune.
I tried Rewrite on a paragraph that mentioned LCUs without explaining the acronym. The Rewrite expanded the abbreviation to "Life Cycle Units," which is incorrect -- they are Landing Craft Units. I have no idea why it felt the need to expand that acronym or make up a meaning, but that's what happens with AI. The key difference between Apple Intelligence and everyone else's tools is that you can't ask Apple's to write an entire essay from scratch. Every tool relies on existing content. The user is still responsible for writing everything first. I do like that the Proofread tool gets its own editing UI in Apple Notes. You're able to arrow through corrections and revert them if needed. I'm interested in seeing how Writing Tools evolves over time. This free built-in system-wide extension is already useful in its infancy, and I expect Apple will add more syntax detection and correction over time. The most likely Apple Intelligence feature everyone will notice is summaries. Everything is summarized from notifications to websites with little to no user intervention. Notifications found on the Lock Screen are grouped by app, so if multiple notifications appear, an AI summary of those notifications will describe them. This feature is hit and miss depending on the topic. Conversations between family members are often summarized well, but humor or sarcasm are lost on the algorithm and can lead to some interesting results. Work notifications from Slack, summaries of news, and other more structured information tend to fare better at summarization. For example, if you get a cluster of Apple Card notifications together, the summary will tell you the net expense. But math isn't always a strong point here, as I've seen summaries confuse one total with another that results in telling me a water bill was $1,340. Summaries can be found in the Mail app too. The email preview that previously showed the useless and repetitive "we hope to reach you" introductions now has a short summary. 
When you open an email, you can click "Summarize" at the top to see a short description of the entire email thread. This is especially handy in long conversations where details can easily get buried. Summaries are also available in the Safari Highlights feature. Generally, webpage summaries are provided automatically, but more recent content likely hasn't been summarized, so a sparkle will show up on the menu option in the Address Bar. Reader Mode in Safari will show the generated summary and a table of contents as applicable. I've found AI summaries very useful across the system. It means saving time when sorting emails or glancing at incoming notifications. This has been especially useful during work when RSS feeds shove a lot of information in, and a quick summary is much easier to parse than a wall of text. Apple Photos has a couple of features associated with Apple Intelligence -- search, Memories, and Clean Up. Searching in Photos is improved thanks to Apple Intelligence parsing your library and understanding natural language. Find an exact moment in time by describing it, even if you don't remember when it happened. You can say something like "watching fireworks with Mom" or "at the lake with the dog," and relevant results will show up. It's quite useful. Users can generate a memory movie using a prompt. I tried a few prompts, and specificity helps, like "trips with this person" versus something abstract like "falling in love." I've always had a bit of a soft spot for memories in Photos, but they feel a bit stale, especially since the song choice always feels off the wall. That, and Live Photos cause random bursts of sound where you wouldn't expect it. Luckily, you can customize all of these aspects of a memory. Apple Intelligence gives users a good starting point, then it can be tweaked into something more usable. The most marketable feature is Clean Up, though it isn't exactly revolutionary. 
Instead of machine learning algorithms determining how to remove objects or fix things, AI is the backend. Clean Up works great, and there's the added UI element of tapping glowing objects to make them disappear. You can even use Clean Up on a face to pixelate it. I expect Clean Up and photo editing will get more enhancements with Apple Intelligence in the future. For now, it's passable and can be used instead of paying for a different tool.

A few things changed about Siri with iOS 18.1 and iPadOS 18.1, but they're not what you're hoping for with Apple Intelligence. The new glowing animation, Type to Siri shortcut, and ability to self-correct during a command are all that's here. The fancy glowing animation replaces the age-old Siri orb and serves no actual purpose. On iPad, when Siri is invoked, it takes up the whole edge of the large display, even on an external monitor.

A quick double tap on the bottom bar will summon Type to Siri on iPhone. It's a good way to enter a command when you don't want to speak out loud. Type to Siri has been a feature for a long time, but it previously required the user to enable it. However, the gesture is easy to invoke by accident, and I've had the keyboard pop up at really inconvenient moments. I'm trying to train myself to stop resting my thumb in that area to reduce inadvertent activations. Those who are annoyed by this can turn off the shortcut.

If you speak to Siri and make a mistake, stumble over your words, or correct yourself, it understands and replies accordingly. This is a great quality-of-life upgrade, but only for devices with Apple Intelligence, not for HomePod or Apple TV. If you're expecting Siri to be improved in capability, understanding, or context, that hasn't happened yet. Technically, Siri hasn't really changed at all in iOS 18.1 and iPadOS 18.1 -- the new context-aware Siri comes later.
App intents will be able to provide data to Apple Intelligence that Siri can surface, like Apple's example of surfacing flight data from an email. It will be interesting to see in practice, but it won't be available until early 2025 in a point-four release.

Technologists hyping the arrival of everything science fiction has promised have left the world excited for what life-changing features Apple might provide. Instead, Apple has taken a reserved approach with generative technologies to provide simple-seeming Writing Tools, Photos updates, and system-wide summaries. There has been a disconnect between what is being promised to consumers and what is actually shipping. Instead of the AI that might "steal our jobs," we've gotten an incredibly proficient set of tools capable of generating confident lies and trademark-violating images.

Things are moving quickly, and the technology behind AI is useful, especially in research and enterprise. However, the consumer use of the technology has become focused on productivity tools, image generation, and other something-from-nothing applications. The Google Pixel 9 is advertised with built-in AI, like adding yourself to a photo, generating images, getting AI weather summaries, generating lists in Google Keep, and summarizing a phone call. Some of these features are unique to the Pixel phone, but they don't feel revolutionary or particularly out of reach for Apple.

I've been asking myself and others what Apple is supposedly behind on, and the only answer I can find is marketing. Google, OpenAI, and others have been doing a great job making a name in early AI efforts, while Apple didn't start talking about AI proper until June 2024. Average users who don't pay close attention to technology seem to have no idea what Apple Intelligence is for, let alone AI in general. One person I asked said they hoped it could organize their photo library into albums based on content like screenshots. The Photos app already does this.
Another thought it would let them perform tasks the iPhone is already able to do with Shortcuts or Focus. For example, notifying their spouse when they are on their way home from work. There's a reason Apple Intelligence is underwhelming with its Writing Tools and object removal from photos -- AI isn't really all that exciting. It's an evolution of the machine learning tools we've been using for years and not some kind of Earth-shattering revolution. I believe technologists have been irresponsible in how they've portrayed AI to the general public. Even the name "artificial intelligence" implies more than there is. There's nothing intelligent about AI. But that's okay. Apple Intelligence has made a difference in my life, as I've mentioned before. Therefore, Apple must be doing something right in this reserved approach. So far I've focused on Apple Intelligence because that's what's new in iOS 18.1 and iPadOS 18.1. There's not much else to discuss as far as new features or changes, so here's where I'll dig into how these operating systems have been for the first month. The biggest change I notice every day is the change to Home Screen customization with dark mode icons. The new Control Center has been a nice update, but not enough apps have added buttons to make a difference in how I use it yet. The new Photos design is fine. I like that I have more control over organization, but it only makes me wish for features like Focus Filters more. The new Passwords app has made managing my passwords easier, but even more importantly, it's better for my family too. It is much easier to teach my family members good password practices when there's an app clearly built for this purpose. Math Notes has been a nice addition, especially on iPad where I can write things out. I tend to budget and plan with a little napkin math, jotting things down instead of doing a formal table. It's been great for that. 
There's not much to say that wasn't said in our initial iOS 18 review or iPadOS 18 review. Developers are only just getting around to adding optimized icons for tinted and dark modes, features for the new Control Center, and support for the new top bar, so there's not much to assess. There still isn't a AAA gaming story for iPhone or iPad despite ever-improving hardware; Game Mode kicks in, but if it makes a difference, I can't tell. I still can't reliably record the AppleInsider Podcast from an iPad, install a universal clipboard tool, or run apps like Pixelmator Pro. And Stage Manager still has the bugs I covered, like the cursor not knowing which app should be active.

We'll continue to examine iOS 18 and iPadOS 18 as Apple ships new point updates. Steady feature updates, new Apple Intelligence releases, and bug fixes can only make this release cycle better. I've found Apple Intelligence to be a good addition, even though it isn't overly flashy. Writing Tools and system-wide summaries have had a positive impact on my workflows, which is more than nothing. It will be interesting to see what Apple can do to address user needs before iOS 19 is revealed in June 2025. Perhaps we'll see a last-minute surprise similar to cursor support in iPadOS, or we'll just be left wanting, again.

I'm quite happy with how iOS 18 and iPadOS 18 have fared over the past month, and Apple Intelligence has proven to be a useful tool. There's still work to be done to address bugs in iPadOS, and iOS needs more developer support for its new features. That's what keeps this from earning a higher score.
Apple rolls out its AI features, Apple Intelligence, with a focus on privacy and security. The update brings new capabilities but faces criticism for inconsistent performance and battery drain issues.
Apple has finally entered the AI race with the introduction of Apple Intelligence, a suite of AI-powered features for its latest devices. The update, available for the iPhone 15 Pro, iPhone 16, and recent Macs and iPads, marks Apple's attempt to catch up with competitors in the AI space [1][2].
Apple Intelligence introduces several new capabilities, including Writing Tools, notification and message summaries, photo cleanup, and a more conversational Siri.
Apple emphasizes its commitment to user privacy and security with Apple Intelligence. The company claims its Private Cloud Compute (PCC) is the "most advanced security architecture ever deployed for cloud AI compute at scale" [1]. To back this claim, Apple is offering bounties of up to $1 million for identifying security vulnerabilities in PCC [1].
Despite the hype, early reviews of Apple Intelligence have been mixed, with reviewers describing the features as useful first steps rather than radical changes.
The introduction of Apple Intelligence hasn't significantly boosted iPhone sales, which have been declining globally [2]. Analysts suggest that AI features are not a primary motivator for phone upgrades, with only 18% of surveyed consumers citing AI integration as their main reason for upgrading [2].
Apple executives have hinted at a gradual rollout strategy, emphasizing the importance of getting each feature right before release [1][3]. Future updates may include image generation capabilities and improvements to Siri [3].
Apple's cautious approach to AI integration contrasts with more aggressive strategies from competitors like Meta and Google. While this may help Apple avoid some of the pitfalls experienced by its rivals, it also means the company is playing catch-up in the AI space [2][3].
As the tech industry grapples with challenges such as AI hallucinations, data quality, and legal issues surrounding training data, Apple's measured approach may prove beneficial in the long run. However, the company will need to address current shortcomings and expand its AI capabilities to remain competitive in this rapidly evolving landscape [2][3].
References
[3] An exploration of how AI is reshaping various job sectors, particularly in software engineering, and its integration into consumer technology.