4 Sources
[1]
The latest Apple Intelligence privacy scare is a lot of fuss about nothing, but here's how to stop your phone using Enhanced Visual Search (if you really want to)
Well, I have. It reminds me a bit of the popular scam trend, which had a resurgence in 2024, where people posted a statement denying Facebook the right to access their photos. It usually says something like "I do not authorize META, Facebook or any entity associated with Facebook to use my photos, information, messages or posts, past or future." Needless to say, like all good conspiracy theories, it contains a grain of truth - there are legitimate concerns about the security of our photos on social media. However, the whole thing is a hoax, and copying and pasting the text as a Facebook post will do absolutely nothing for your privacy concerns.

In Apple's case, again, there is an element of truth. Apple is indeed sending data about your iPhone photos to be analyzed by AI, and the feature is turned on by default; however, it's really nothing to worry about.

The feature in question is called Enhanced Visual Search. What this feature does is match places in your photos to famous locations and landmarks stored in a global index maintained by Apple. That way it can work out whether you're standing next to the Leaning Tower of Pisa or at Stonehenge, and automatically tag the location, without Apple seeing your photos. Apple released a policy document in November 2024 that states: "Enhanced Visual Search in Photos allows you to search for photos using landmarks or points of interest. Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy, and use an OHTTP relay that hides IP address. This prevents Apple from learning about the information in your photos."

Worrying about this as a security threat seems blown out of all proportion to me. However, if you really want to turn this feature off, head to Settings on your iPhone and then Apps. Find Photos and then scroll to the bottom of the settings.
Right at the bottom you'll see a slider for Enhanced Visual Search, which you can turn off. On a Mac, open the Photos app and go to Settings > General.

It's an odd time for Apple right now. It recently had to deny that it had ever sold its customers' Siri data for marketing purposes, after settling a $95 million class-action lawsuit focused on its Siri assistant. Apple recently released a statement about privacy and Siri, which states unequivocally that: "Apple has never used Siri data to build marketing profiles, never made it available for advertising, and never sold it to anyone for any purpose. We are constantly developing technologies to make Siri even more private, and will continue to do so."

Despite statements like this, I commonly hear my friends saying things like, "I swear my phone is listening to me and sending me adverts based on what I was discussing". I've never thought these kinds of things were true, but rather simple cases of coincidence and confirmation bias, yet urban myths like this never seem to really go away, despite statements from Apple. Perhaps the recent settlement and Apple's statement on the matter will simply fan the flames for more conspiracy theories, but I for one won't be turning off Enhanced Visual Search any time soon, or worrying that my iPhone is spying on me.
[2]
iPhone and Mac users are upset over Apple's automatic AI photo analysis
Some say Enhanced Visual Search contradicts Apple's stance on privacy.

Apple arrived at the AI party later than other smartphone makers, and its recent moves to catch up are causing a fair few controversies. While Apple Intelligence continues to invent fake news stories in its notification summaries, another controversy is brewing over a quietly released feature for identifying landmarks in photos.

It appears that Apple automatically opted users into Enhanced Visual Search with the release of iOS 18.1 and macOS 15.1 in October. The mechanism sends data about images stored in the Photos application to Apple servers in order to identify places of interest.

Apple says the mechanism behind Enhanced Visual Search is end-to-end encrypted and that neither it nor its partner Cloudflare can see the photos or access any identifying information. Images are first analysed by a local machine-learning model to find possible "regions of interest". The model calculates a vector embedding to represent that part of the image and uses homomorphic encryption to scramble the contents (Apple explains this process on its website) before sending it to a remote server, where computations are made to identify a matching landmark in a database.

But some users are unhappy about the way the feature was quietly implemented without asking them to opt in. And even if you opt out now, data from existing photos has already been sent to Apple servers. As reported by The Register, the software developer Michael Tsai wrote in his blog last week: "Apple is being thoughtful about doing this in a (theoretically) privacy-preserving way, but I don't think the company is living up to its ideals here. Not only is it not opt-in, but you can't effectively opt out if it starts uploading metadata about your photos before you even use the search feature. It does this even if you've already opted out of uploading your photos to iCloud."
Jeff Johnson, another software developer, wrote: "It ought to be up to the individual user to decide their own tolerance for the risk of privacy violations. In this specific case, I have no tolerance for risk, because I simply have no interest in the Enhanced Visual Search feature, even if it happened to work flawlessly."

Matthew Green, associate professor of computer science at the Johns Hopkins Information Security Institute, wrote on the Hacker News forum: "This is not how you launch a privacy-preserving product if your intentions are good, this is how you slip something under the radar while everyone is distracted."

The implementation does seem surprising considering how Apple has tried to make privacy a selling point. It even put up billboards with the line 'What happens on your iPhone stays on your iPhone'. This isn't just about privacy but about respecting user preferences. Even if the data is encrypted as Apple claims, it seems wrong that the company is deciding that users will use new features by default without even communicating that they exist.

If you want to turn off Enhanced Visual Search, go to Settings > Apps > Photos on iOS / iPadOS, or open the Photos app and go to Settings > General on a Mac, and uncheck the box.
[3]
Apple's new photo feature is sending data to its servers -- how to turn it off
Enhanced Visual Search comes enabled by default and could be putting your privacy at risk.

Many of the changes brought on by Apple Intelligence are centered around the camera and images on your iPhone, like making a movie out of your stored photos or generative image editing. But natural language search, which lets you find photos by describing them using regular language, was one of our favorites. In a new blog post, though, developer Jeff Johnson has shed light on some privacy concerns around a related feature, which was intended to help people use their iPhones to more easily search for landmarks and other points of interest.

Many Apple users have already updated their devices to iOS 18 without being aware that it comes equipped with a photo feature called Enhanced Visual Search. Here's how it works: the device detects a point of interest in a photo - the example getting thrown around is the Eiffel Tower - then on-device AI creates a mathematical fingerprint, or what Apple has described as "a vector embedding ... calculated for that region of the image." This vector embedding or fingerprint is then sent to Apple's servers, with a relay in the middle for privacy, alongside fake queries so the server doesn't know which one is legit. Cupertino's servers then return possible matches in an encrypted form.

While Apple's explanations have assured users this is all done while keeping photo data private, Johnson sees this as a privacy violation that was introduced silently and without consent. He writes in his blog: "I never requested that my on-device experiences be 'enriched' by phoning home to Cupertino... If something happens entirely on my computer, then it's private, whereas if my computer sends data to the manufacturer of the computer, then it's not private, or at least not entirely private."
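The decoy-query trick described above - mixing the genuine query with fakes so the server can't tell which one is real - can be sketched in a few lines. This is a toy illustration only, not Apple's actual protocol; the function name, batch size, and use of random bytes as decoys are all assumptions made for the example.

```python
import random

def build_query_batch(real_query: bytes, num_decoys: int = 3) -> tuple[list[bytes], int]:
    """Mix one real (already-encrypted) query with random decoy queries
    and shuffle the batch, so the server cannot tell which entry is
    genuine. The client remembers the position of its real query."""
    batch = [real_query] + [random.randbytes(len(real_query)) for _ in range(num_decoys)]
    random.shuffle(batch)
    return batch, batch.index(real_query)

# The server answers every query in the batch; the client keeps only
# the answer at real_index and discards the rest.
batch, real_index = build_query_batch(b"\x10\x20\x30\x40")
```

The privacy of this idea depends on decoys being indistinguishable from real traffic, which is why the real system sends encrypted blobs rather than raw bytes.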
The Enhanced Visual Search feature is part of the iOS 18 and macOS 15 release from September 16, 2024, and the company's overview of the features at that time did not explain or mention this tradeoff. Apple did say that search would now include "natural language queries and expanded understanding", but that doesn't really explain to users that their photo data is being sent back to Apple's servers by default.

On October 24, though, a brief explanation appeared in a highly technical Apple document on machine learning and homomorphic encryption, with a reference to EVS and privacy, explaining that the feature allows for searching for landmarks or points of interest. There's also a legal document which offers more information. However, the reality is that few users access or read these notices, and fewer still understand the information contained within them.

If you're worried about one of the best iPhones sending your photos and the data they contain back to Apple, Enhanced Visual Search is easy enough to turn off. Keep in mind, though, that this feature comes enabled by default, so you may also want to let your privacy-conscious friends and family know about it. You may have to provide tech support for some of your older relatives, but here's how to turn it off on your iPhone: you can opt out of the EVS feature anytime by going to Settings > Apps > Photos. From there, scroll all the way to the bottom and you will see an option to toggle Enhanced Visual Search on or off.

If you're using one of the best MacBooks or even a Mac mini M4, you can do the same thing by opening the Photos app, clicking on Settings within the app, and in the General tab you'll find the option to toggle Enhanced Visual Search on or off. And if there is a landmark or other point of interest you do want to use this feature with, you can always re-enable it by following the same steps and switching the toggle back on.
Hopefully, we'll see Apple address the fact that Enhanced Visual Search sends photo data back to its servers sometime soon. In the meantime, though, at least you now know how to turn off this feature that comes enabled by default in iOS 18.
[4]
Apple opts everyone into having their Photos analyzed by AI
Homomorphic-based Enhanced Visual Search is so privacy-preserving, the iPhone giant activated it without asking.

Apple last year deployed a mechanism for identifying landmarks and places of interest in images stored in the Photos application on its customers' iOS and macOS devices, and enabled it by default, seemingly without explicit consent. Apple customers have only just begun to notice.

The feature, known as Enhanced Visual Search, was called out last week by software developer Jeff Johnson, who expressed concern in two write-ups about Apple's failure to explain the technology, which is believed to have arrived with iOS 18.1 and macOS 15.1 on October 28, 2024. In a policy document dated November 18, 2024 (not indexed by the Internet Archive's Wayback Machine until December 28, 2024, the date of Johnson's initial article), Apple describes the feature as allowing users to search for photos using landmarks or points of interest, with the device privately matching places in photos against a global index maintained on Apple's servers.

Apple did explain the technology in a technical paper published on October 24, 2024, around the time that Enhanced Visual Search is believed to have debuted. A local machine-learning model analyzes photos to look for a "region of interest" that may depict a landmark. If the AI model finds a likely match, it calculates a vector embedding - an array of numbers - representing that portion of the image.

The device then uses homomorphic encryption to scramble the embedding in such a way that it can be run through carefully designed algorithms that produce an equally encrypted output. The goal is that the encrypted data can be sent to a remote system for analysis without whoever is operating that system knowing the contents of that data; they just have the ability to perform computations on it, the results of which remain encrypted. The input and output are end-to-end encrypted, and not decrypted during the mathematical operations, or so it's claimed.
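Homomorphic encryption can feel abstract, but the core property - computing on data while it stays encrypted - is easy to demonstrate. The toy below uses unpadded RSA, whose ciphertexts multiply homomorphically: the server multiplies two ciphertexts and the client decrypts the product. Apple's lattice-based scheme is far more sophisticated; treat this purely as an illustration of the general principle, with deliberately tiny, insecure parameters.

```python
# Toy demonstration of a homomorphic property using unpadded RSA:
# multiplying two ciphertexts yields a valid ciphertext of the product
# of the plaintexts, so a server can compute on numbers it cannot read.
p, q = 61, 53                       # small primes (insecure on purpose)
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (modular inverse)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
product_ct = (encrypt(a) * encrypt(b)) % n  # server side: ciphertexts only
assert decrypt(product_ct) == a * b         # client side: recovers 42
```

Note that unpadded RSA supports only multiplication; schemes like the one Apple describes support richer computations on encrypted vectors, which is what makes encrypted similarity search possible.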
The dimension and precision of the embedding are adjusted to reduce the high computational demands of this homomorphic encryption (presumably at the cost of labeling accuracy) "to meet the latency and cost requirements of large-scale production services." That is to say, Apple wants to minimize its cloud compute cost and mobile device resource usage for this free feature. With some server optimization metadata and the help of Apple's private nearest neighbor search (PNNS), the relevant Apple server shard receives a homomorphically encrypted embedding from the device, and performs the aforementioned encrypted computations on that data to find a landmark match from a database, returning the result to the client device without providing identifying information to Apple or its OHTTP partner Cloudflare.

Thus, Apple unilaterally began running people's photos through a locally running machine-learning algorithm that analyzes image details (on a purely visual basis, without using location data) and creates a value associated with what could be a landmark in each picture. That value is then used on a remote server to check an index of such values stored on Apple servers, in order to label within each snap the landmarks and places found in Apple's database.

Put more simply: You take a photo; your Mac or iThing locally outlines what it thinks is a landmark or place of interest in the snap; it homomorphically encrypts a representation of that portion of the image in a way that can be analyzed without being decrypted; it sends the encrypted data to a remote server to do that analysis, so that the landmark can be identified from a big database of places; and it receives the suggested location, again in encrypted form that it alone can decipher. If it all works as claimed, and there are no side-channels or other leaks, Apple can't see what's in your photos, neither the image data nor the looked-up label.
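Setting the encryption aside, the underlying lookup is a nearest-neighbor search over vector embeddings. The plaintext sketch below shows that step with invented three-dimensional embeddings and a toy landmark index; in the real system the embeddings are high-dimensional, the query arrives encrypted, and the comparison happens homomorphically.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Toy stand-in for Apple's global landmark index (values invented).
landmark_index = {
    "Eiffel Tower":          [0.90, 0.10, 0.05],
    "Stonehenge":            [0.10, 0.85, 0.30],
    "Leaning Tower of Pisa": [0.20, 0.15, 0.90],
}

def match_landmark(query_embedding: list[float]) -> str:
    """Return the landmark whose stored embedding is closest to the query."""
    return max(landmark_index,
               key=lambda name: cosine_similarity(query_embedding, landmark_index[name]))

print(match_landmark([0.85, 0.15, 0.05]))  # closest to the Eiffel Tower entry
```

The whole privacy question boils down to where this comparison runs: on-device it is trivially private, while running it server-side requires the homomorphic machinery described above.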
Apple claims that its use of this homomorphic encryption, plus what's called differential privacy - a way to protect the privacy of people whose data appears in a data set - precludes potential privacy problems.

"Apple is being thoughtful about doing this in a (theoretically) privacy-preserving way, but I don't think the company is living up to its ideals here," observed software developer Michael Tsai in an analysis shared Wednesday. "Not only is it not opt-in, but you can't effectively opt out if it starts uploading metadata about your photos before you even use the search feature. It does this even if you've already opted out of uploading your photos to iCloud."

Tsai argues Apple's approach is even less private than its abandoned CSAM scanning plan "because it applies to non-iCloud photos and uploads information about all photos, not just ones with suspicious neural hashes." Nonetheless, Tsai acknowledges Apple's claim that data processed in this way is encrypted and disassociated from the user's account and IP address.

While there's no evidence at this point that contradicts Apple's privacy assertions, the community concern has more to do with the way in which Apple deployed this technology. "It's very frustrating when you learn about a service two days before New Year's and you find that it's already been enabled on your phone," said Matthew Green, associate professor of computer science at the Johns Hopkins Information Security Institute in the US.

The Register asked Apple to comment, and as usual we've received no reply. We note that lack of communication is the essence of the community discontent. "My objection to Apple's Enhanced Visual Search is not the technical details specifically, which are difficult for most users to evaluate, but rather the fact that Apple has taken the choice out of my hands and enabled the online service by default," said Johnson in his second post.
He told The Register that it's unclear whether the data/metadata from your Photos library is uploaded before you even have a chance to disable the opt-out setting. "I don't think anybody knows, and Apple hasn't said," Johnson observed.
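The differential privacy that Apple cites alongside the encryption generally works by adding calibrated random noise, so that no individual record's contribution can be distinguished in the output. The sketch below shows the classic Laplace mechanism for a simple count query; it is a generic illustration of the technique, not Apple's specific mechanism, and the function and parameter names are assumptions for the example.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(max(1e-12, 1.0 - 2.0 * abs(u)))

def privatized_count(true_count: int, epsilon: float = 1.0) -> float:
    """Laplace mechanism: a count query has sensitivity 1, so adding
    Laplace(1/epsilon) noise yields epsilon-differential privacy."""
    return true_count + laplace_noise(1.0 / epsilon)

# The reported value stays close to the truth in aggregate,
# while masking whether any single record was included.
print(privatized_count(1000))
```

Smaller values of epsilon mean more noise and stronger privacy; the aggregate statistics remain useful while individual contributions stay hidden.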
Apple's new Enhanced Visual Search feature, which uses AI to analyze photos for landmarks, has raised privacy concerns among users and experts due to its default activation and data transmission methods.
Apple has quietly rolled out a new feature called Enhanced Visual Search (EVS) as part of its iOS 18 and macOS 15 updates, sparking a debate about user privacy and consent. The feature, which uses AI to analyze photos for landmarks and points of interest, has been enabled by default, raising questions about Apple's commitment to user privacy 1.
EVS uses on-device AI to detect points of interest in photos, creating a mathematical fingerprint or "vector embedding" for each identified region. This data is then sent to Apple's servers using homomorphic encryption and differential privacy techniques, which Apple claims protect user privacy 2.
This controversy highlights the ongoing tension between technological advancement and user privacy. As AI features become more prevalent, companies like Apple face the challenge of balancing innovation with user trust and consent. The debate around EVS may influence future approaches to AI integration and privacy policies in consumer technology 1234.