3 Sources
[1]
Craig Federighi claims that Apple Intelligence servers are "incredibly simple" -- and for a good reason - Softonic
Without storage, without management tools, and all verifiable. Craig Federighi, Apple's senior vice president of software engineering, has shed some light on how the company manages data privacy in its artificial intelligence cloud. According to Federighi, the servers Apple uses for these functions, called Private Cloud Compute (PCC), are deliberately designed to be "basic," and this is no coincidence: the simplicity is meant to ensure that user data remains private.

Apple's approach to its Apple Intelligence features and data handling is structured in three stages. First, the company performs as much processing as possible directly on the device, without sending data to external servers. Only when strictly necessary, when the device lacks the capacity for the required processing, is data sent to Apple's servers. As a last resort, and only with the user's consent, the company uses external services like ChatGPT to complete processing tasks.

In this context, one of Apple's biggest challenges has been achieving a level of privacy equivalent to end-to-end encryption (E2E) on servers that perform natural language processing. The problem with this type of processing is that the servers need to access the data we send them in order to perform AI inference, which goes against the principles of E2E, where not even the server should be able to read the data.

To solve this, Apple has implemented three main solutions. First, it has placed all server management tools, such as load balancers and data loggers, outside the protected zone, preventing them from decrypting the data. Second, it uses non-persistent storage: one of Apple's key decisions has been to design PCC servers in an extremely simple way, with no persistent storage such as hard drives or solid-state drives, which means they cannot store processed data. This decision eliminates any possibility of data being retained beyond the moment it is processed and returned to the device.
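The three-stage hierarchy described above can be sketched as a simple routing decision. This is a minimal illustration of the idea, not Apple's actual logic; the function name, the capacity threshold, and the consent flag are all assumptions made for the sketch:

```python
# Illustrative sketch of a three-tier processing hierarchy:
# 1) on-device, 2) Apple's PCC servers, 3) third party (with consent).
# All names and thresholds here are invented for illustration.

def route_request(task_complexity: int, device_capacity: int,
                  user_consented_to_third_party: bool) -> str:
    """Return which tier would handle an Apple Intelligence request."""
    if task_complexity <= device_capacity:
        return "on-device"              # preferred: data never leaves the device
    if task_complexity <= device_capacity * 10:
        return "private-cloud-compute"  # escalate to Apple's stateless servers
    if user_consented_to_third_party:
        return "third-party"            # e.g. ChatGPT, only with explicit consent
    return "declined"                   # no consent: the request is not sent out

print(route_request(3, 5, False))   # handled on-device
print(route_request(30, 5, False))  # escalates to PCC
```

The key property the sketch captures is that escalation is strictly ordered and the third tier is gated on consent, matching the hierarchy Federighi describes.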
In his interview with Wired, Federighi explains that PCC servers cannot retain any data after a restart, because the encryption system is designed to destroy the encryption keys every time the server reboots. As a result, after each restart the servers start from scratch with a new encryption key, making any previously processed data completely irrecoverable.

Third, and in line with the previous two measures, Apple has made each production version of the PCC server software publicly available for inspection. This allows anyone to verify that the servers function according to Apple's claims, but the system goes further: the records of all versions of the PCC servers are stored in a cryptographic attestation log, preventing any unverified server from processing data. In fact, an Apple device will refuse to send data to a server that has not been properly authenticated. "Creating the trust model in which our device will refuse to send a request to a server unless the signature of all the software that the server is running has been published in a transparency log has undoubtedly been one of the most unique elements of the solution, and completely critical to the trust model," comments Federighi.
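The per-boot key destruction described here can be illustrated with a toy stream cipher: once the boot-time key is discarded, ciphertext produced under it is junk. This is a sketch of the general ephemeral-key idea only, not Apple's implementation; a real system would use a hardware key manager and an authenticated cipher, while here a hash-based keystream stands in:

```python
import hashlib
import secrets

class EphemeralNode:
    """Toy model of a server whose data dies with its boot-time key.

    A fresh random key is drawn at every 'boot'; anything encrypted
    under it becomes unrecoverable once the node reboots and the old
    key is dropped. Illustrative only.
    """

    def __init__(self):
        self.reboot()

    def reboot(self):
        # New 256-bit key; the previous key is simply discarded.
        self._key = secrets.token_bytes(32)

    def _keystream(self, n: int) -> bytes:
        # SHA-256 in counter mode as a stand-in keystream.
        out, counter = b"", 0
        while len(out) < n:
            out += hashlib.sha256(self._key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return out[:n]

    def encrypt(self, data: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(data, self._keystream(len(data))))

    decrypt = encrypt  # XOR stream cipher: same operation both ways

node = EphemeralNode()
blob = node.encrypt(b"user request")
assert node.decrypt(blob) == b"user request"  # readable while the key lives
node.reboot()                                 # key destroyed on restart
assert node.decrypt(blob) != b"user request"  # old ciphertext now unrecoverable
```

The point of the sketch is that no explicit "wipe data" step is needed: destroying the key is sufficient to make everything encrypted under it cryptographically unrecoverable.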
[2]
Apple Intelligence servers are really basic, says Craig Federighi
Apple software SVP Craig Federighi says that the Private Cloud Compute servers used for Apple Intelligence features are really basic, and with good reason. The exec says it's one of a number of decisions the company made to ensure that its AI cloud servers form a "hermetically sealed privacy bubble" with your iPhone.

We've talked before about Apple's three-stage hierarchy for Apple Intelligence features, and we've also discussed the five safeguards Apple applies to its own servers, which include the "extraordinary step" of verifiable transparency.

In an interview with Wired, Federighi says that part of the privacy protection is achieved by making the PCC servers really basic (even if the chips aren't). It's hard to imagine a data center server without hard drives or SSDs for storing user data, but that's exactly what Apple has created. PCC servers are as bare-bones as possible. For example, they don't include "persistent storage," meaning they don't have a hard drive that can keep processed data long-term. Additional features further ensure there is no way for data to survive a reboot. They do incorporate Apple's dedicated hardware encryption key manager, known as the Secure Enclave, and randomize each file system's encryption key at every boot as well. This means that once a PCC server is rebooted, no data is retained and, as an additional precaution, the entire system volume is cryptographically unrecoverable. At that point, all the server can do is start fresh with a new encryption key.

One weakness that used to exist with iCloud is that data was encrypted but did not use end-to-end encryption, meaning that Apple, or a hacker who gained access to Apple's servers, could read the data. Apple has been gradually rolling out E2E encryption for more and more iCloud data (though you do need to enable it), but that posed a problem for PCC servers.
"What was really unique about the problem of doing large language model inference in the cloud was that the data had to at some level be readable by the server so it could perform the inference. And yet, we needed to make sure that that processing was hermetically sealed inside of a privacy bubble with your phone," Federighi says. "So we had to do something new there. The technique of end-to-end encryption -- where the server knows nothing -- wasn't possible here, so we had to come up with another solution to achieve a similar level of security."

The company's solution was two-fold. First, all the usual server tools that might allow an administrator (or hacker) access to your data, like load balancers and data loggers, sit outside of the protected area, so they cannot decrypt the data. Second, that lack of persistent storage: once the response is sent back to your phone, it is deleted and can never be recovered.

The "extraordinary step" Apple referenced previously is that absolutely anyone can check that the system works the way the company says it does. Apple is making every production PCC server build publicly available for inspection so people unaffiliated with Apple can verify that PCC is doing (and not doing) what the company claims, and that everything is implemented correctly. All of the PCC server images are recorded in a cryptographic attestation log, essentially an indelible record of signed claims, and each entry includes a URL for where to download that individual build. PCC is designed so Apple can't put a server into production without logging it. In addition to offering transparency, the system works as a crucial enforcement mechanism to prevent bad actors from setting up rogue PCC nodes and diverting traffic. If a server build hasn't been logged, iPhones will not send Apple Intelligence queries or data to it. That's an unprecedented step for any cloud company, says Apple.
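The enforcement rule described above, refusing any server whose build is not in the log, can be sketched as a client-side check. This is a minimal illustration under stated assumptions: the log is modeled as a plain set of published build digests, whereas a real transparency log would use signed entries with cryptographic inclusion proofs.

```python
import hashlib

# Published attestation log, modeled here as a set of build digests.
# (A real log would hold signed, append-only entries with inclusion
# proofs; a set is enough to show the device-side refusal rule.)
LOGGED_BUILDS = {
    hashlib.sha256(b"pcc-server-build-1.0").hexdigest(),
    hashlib.sha256(b"pcc-server-build-1.1").hexdigest(),
}

def client_will_send(attested_build: bytes) -> bool:
    """Device-side rule: only send a request to a server whose
    attested software measurement appears in the public log."""
    digest = hashlib.sha256(attested_build).hexdigest()
    return digest in LOGGED_BUILDS

assert client_will_send(b"pcc-server-build-1.1")      # logged: request proceeds
assert not client_will_send(b"rogue-unlogged-build")  # unlogged: refused
```

Because the check runs on the device before any data is sent, a rogue node that never appeared in the log simply receives no traffic, which is the enforcement property the article describes.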
"Creating the trust model where your device will refuse to issue a request to a server unless the signature of all the software the server is running has been published to a transparency log was certainly one of the most unique elements of the solution -- and totally critical to the trust model." While the interview mostly recapped information already known, the iPhone 16 launch naturally means a lot more people will be paying attention.
[3]
Apple Intelligence Servers Are Deliberately Kept Basic To Ensure Apple Continues With Its Commitment To Protect User Privacy
Apple might have taken its time to bring forward Apple Intelligence, but it is now working aggressively on new features and upgrades and plans to keep evolving the technology. While the AI-infused capabilities being incorporated into Siri and other apps are noteworthy, the company's senior vice president has recently stated that the servers behind Apple Intelligence are, in fact, very basic, and that keeping the setup this way is a deliberate choice.

The iPhone 16 lineup was recently announced at the "It's Glowtime" event, along with a handful of Apple Intelligence capabilities that will roll out by the end of the year. Apple never settles for the basics and is always searching for novel and advanced ways to distinguish itself and its systems, so given the initiatives the company is taking on the AI front, one might expect the same of the infrastructure that drives it. However, that is not the case with the Private Cloud Compute servers used for Apple Intelligence features.

According to Craig Federighi, Apple's Senior Vice President of Software Engineering, the Private Cloud Compute servers behind Apple Intelligence are deliberately basic, and that simplicity is a choice made to uphold Apple's commitment to protecting user privacy. The executive said this decision ensures the AI servers form a "hermetically sealed privacy bubble" with the device, leveraging on-device Apple Intelligence and minimizing interaction with user information. Apple follows a tiered strategy for its Apple Intelligence features: processing is done on-device where possible; if external processing is needed, Apple's own servers are the go-to; and if that does not suffice, ChatGPT is used with the user's permission.
While talking to Wired, Federighi explained that Apple opts for basic Private Cloud Compute (PCC) servers to enhance privacy protection, even though the chips used are, in fact, powerful. The company's PCC servers do not follow the traditional design in which storage is a major element, as Apple does not rely on hard drives or SSDs. PCC servers are as bare-bones as possible. For example, they don't include "persistent storage," meaning they don't have a hard drive that can keep processed data long-term.

Federighi stated that Apple goes a step further, adding features to the servers to ensure no data persists after a reboot. They incorporate Apple's dedicated hardware encryption key manager, known as the Secure Enclave, and randomize each file system's encryption key at every boot as well. This means that once a PCC server is rebooted, no data is retained and, as an additional precaution, the entire system volume is cryptographically unrecoverable. At that point, all the server can do is start fresh with a new encryption key.

One historical weak spot for the company was that some iCloud data, while encrypted, did not use end-to-end encryption, leaving room for hackers with server access to read it. This too is being addressed, as Apple has been gradually moving toward E2E encryption. Apple wants users to know about the aggressive approach it is taking to ensure the system does not compromise privacy, and it is transparent about the process, allowing anyone to check it for themselves: Apple is making every production PCC server build publicly available for inspection, so people unaffiliated with Apple can verify that PCC is doing (and not doing) what the company claims, and that everything is implemented correctly. Apple is working hard to ensure user data is protected and that the system is transparent in its processes and methods.
Craig Federighi, Apple's senior vice president of software engineering, reveals that Apple's intelligence servers are intentionally kept simple to prioritize user privacy. This approach contrasts with other tech giants' complex AI systems.
In a surprising revelation, Craig Federighi, Apple's senior vice president of software engineering, has disclosed that the company's intelligence servers are "incredibly simple" by design [1]. This statement comes as a stark contrast to the complex AI systems employed by other tech giants, raising questions about Apple's strategy in the rapidly evolving field of artificial intelligence.

Federighi explained that the simplicity of Apple's servers is intentional and serves a specific purpose. Unlike competitors who rely on vast amounts of user data to train their AI models, Apple's approach is fundamentally different. The company's servers are designed to be "really basic," supporting on-device intelligence rather than retaining personal information [2].

The core reason behind this strategy is Apple's unwavering commitment to user privacy. By keeping the servers simple, Apple ensures that personal data remains on the user's device wherever possible, minimizing the risk of privacy breaches. This approach aligns with Apple's long-standing philosophy of protecting user information and differentiates it from other tech companies that rely heavily on cloud-based data processing [3].

Apple's focus on on-device intelligence means that most of the AI processing occurs directly on the user's iPhone or other Apple devices. This not only enhances privacy but also allows for faster response times and the ability to function without an internet connection. The company's Neural Engine, integrated into its chips, plays a crucial role in enabling this local AI processing [1].

While this approach has been praised for its privacy-centric nature, it also raises questions about Apple's ability to compete in the rapidly advancing field of AI. Competitors like Google and Microsoft are investing heavily in large language models and cloud-based AI services. However, Apple seems confident that its strategy of prioritizing privacy will not hinder its ability to innovate and provide cutting-edge AI features to its users [2].

As the AI landscape continues to evolve, Apple's unique approach may set a new standard for privacy-conscious AI development. The company's ability to balance technological advancement with strong privacy protections could influence industry trends and consumer expectations. However, it remains to be seen how this strategy will impact Apple's competitiveness in the long run, especially as AI becomes increasingly central to technological innovation [3].

Summarized by Navi