3 Sources
[1]
Apple Looking to Fundamentally Change iPhone Memory Design to Enhance AI Performance
Apple is reportedly planning a significant shift in its iPhone hardware design by transitioning to discrete memory packaging to enhance on-device AI performance. Samsung, a key supplier of Apple's memory components, has begun research to accommodate the change at Apple's request, according to a new report from Korea's The Elec. The shift will mark a departure from the current package-on-package (PoP) method, where the low-power double data rate (LPDDR) DRAM is stacked directly on the System-on-Chip (SoC). Starting in 2026, the DRAM will instead be packaged separately from the SoC, which should significantly improve memory bandwidth and enhance the iPhone's AI capabilities.

The PoP configuration was first introduced with the iPhone 4 in 2010 and has been favored since then for its compact design. Stacking the memory directly atop the SoC minimizes the physical footprint, which is especially important for mobile devices where space is at a premium. However, PoP packaging imposes constraints that limit its suitability for AI applications, which require faster data transfer rates and more memory bandwidth. With PoP, the size of the memory package is constrained by the size of the SoC, capping the number of I/O pins and therefore limiting performance.

Moving to discrete packaging will allow the memory to be physically separated from the SoC, enabling the addition of more I/O pins. This design change should increase the data transfer rate and the number of parallel data channels. Separating the memory from the SoC also provides better heat dissipation.

Apple has previously used discrete memory packaging across the Mac and iPad product lines but later shifted to memory-on-package (MOP) with the introduction of the M1 chip. MOP shortens the distance between the memory and the SoC, reducing latency and improving power efficiency. For the iPhone, adopting discrete packaging may necessitate other design changes, such as shrinking the SoC or battery to create additional space for the memory component. It may also use more power and increase latency.

In addition, Samsung is reportedly working on next-generation LPDDR6 memory technology for Apple, which is expected to offer two to three times the data transfer speed and bandwidth of the current LPDDR5X. One variant under development, LPDDR6-PIM (Processor-in-Memory), integrates processing capabilities directly into the memory. Samsung is said to be collaborating with SK Hynix to standardize this technology.

The changes could appear beginning with 2026's "iPhone 18" devices, provided Apple can overcome engineering challenges related to SoC miniaturization and internal layout optimization.
[2]
Report: 2026 iPhone will prioritize AI upgrades by reworking RAM setup - 9to5Mac
Apple is always working on new iPhones that are multiple years out, and today a new report details a noteworthy internal change coming to the 2026 iPhone 18 lineup. The iPhone 16 has been called the first model 'built from the ground up' for AI. But unsurprisingly, future iPhones are making even more changes to optimize for an AI-heavy future.

Seok Jin Lee at The Elec writes: Samsung has started research to change the packaging method for the low-power double data rate DRAM used in iPhones. The South Korean tech giant's attempt to change the integrated circuit (IC) of the LPDDR to a discrete package is per Apple's request, sources said. This means the LPDDR will be packaged separately (i.e., discretely) from the system semiconductor. Cupertino is planning to apply the change starting in 2026... The change to discrete package is being done to widen the bandwidth of the memory for on-device AI.

Historically, Apple has used PoP (package-on-package) setups for RAM in its iPhones, meaning the RAM has been stacked on top of the main system chip. This approach has various advantages, but as The Elec notes, it's not necessarily the best setup for on-device AI: PoP is not optimal for on-device AI. Bandwidth is determined by data transfer speed, data bus width, and data transmission channels. Bus width and channel count are determined by the number of I/O pins. To increase the number of pins, the package needs to become larger. But in PoP, the memory's size is determined by the SoC, which limits the number of I/O pins on the package. [...] A discrete package also provides better heat regulation. On-device AI's parallel processing causes high levels of heat. Having a larger surface for the memory allows heat to be emitted over a wider area.

Apple has used discrete packaging for Macs and iPads in the past, but since the transition to Apple silicon with the M1, it's used on-package memory there too. It's likely that this change for the iPhone could make its way to the Mac and iPad too.

There's one thing this report strongly implies: Apple must expect to see significant demands placed on RAM for future iPhones due to on-device AI features in the works. Apple Intelligence only recently debuted, but as Apple and the rest of the tech world have made clear, AI will play a big part in our computing future. And whatever Apple has in the works now for new software, its 2026 iPhone 18 hardware needs to be ready.

Are you surprised by this report? Why or why not? Let us know in the comments.
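To make the bandwidth arithmetic in that quoted explanation concrete, here is a rough back-of-envelope sketch in Python. The 8533 MT/s per-pin rate is the top LPDDR5X speed grade and is used only as an illustrative figure; the 64-bit bus for the current PoP-style setup and the 128-bit bus for a hypothetical discrete package are assumptions for illustration, not numbers from the report.

```python
# Back-of-envelope peak memory bandwidth: per-pin transfer rate x bus width.
# All figures below are illustrative assumptions, not specs from the report.

def peak_bandwidth_gbs(per_pin_mtps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: (transfers/s per pin) * (bits per transfer) / 8 bits per byte."""
    return per_pin_mtps * 1e6 * bus_width_bits / 8 / 1e9

LPDDR5X_RATE = 8533  # MT/s, top LPDDR5X speed grade (assumed, not the iPhone's actual bin)

pop_64bit = peak_bandwidth_gbs(LPDDR5X_RATE, 64)        # narrow bus, PoP-style (assumed)
discrete_128bit = peak_bandwidth_gbs(LPDDR5X_RATE, 128) # wider bus enabled by more I/O pins (hypothetical)

print(f"64-bit bus:  ~{pop_64bit:.0f} GB/s")       # ~68 GB/s
print(f"128-bit bus: ~{discrete_128bit:.0f} GB/s")  # ~137 GB/s
```

The takeaway is simply that, at a fixed per-pin rate, peak bandwidth scales with the number of usable I/O pins (bus width and channels), which is exactly the quantity a PoP stack caps.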
[3]
Apple wants faster RAM to help speed up Apple Intelligence
Apple Intelligence on an iPhone will benefit from faster memory. Apple is working with Samsung to change how RAM is packaged for the iPhone, with the aim of widening the bandwidth to help in AI tasks.

While most smartphones have their RAM on the same package as the processor, Apple is looking to go against the norm. In order to allow for more RAM and also speed up memory access for Apple Intelligence tasks, it now wants to put the RAM and processor on separate chips. According to The Elec, Apple has asked Samsung to begin researching how best to package the DRAM memory used in iPhones.

The issue is that being on the same package as the processor is only quicker up to a point. To make faster RAM, Apple wants a larger DRAM package. But there are only so many connectors on the processor side, and consequently only so much RAM that can be on the same chip. Samsung has therefore been tasked with working out how to create a larger DRAM package and connect it back to the processor in the fastest way. Being a separate package will also help with heat, as on-device AI is an intensive process that makes the chips hot.

Reportedly, Apple could have instead chosen to use the kind of high-bandwidth memory (HBM) that is often found in servers. But this appears to have been rejected, as there are difficulties getting it small enough to fit in a phone and making it sufficiently low power to work with a phone's battery.

The physical constraints of an iPhone are also an issue when using a separate, or discrete, package for the RAM. Apple may have to reduce the size of its System on a Chip (SoC) processor, and perhaps even the battery, to fit the separate RAM in. While Samsung is said to have only just begun its research, Apple is believed to be planning to use the new method in 2026's iPhone 18 range.

The Elec is a decent source of information from within Apple's supply chain. It is less accurate in the predictions it forms about what Apple will do.
Apple is reportedly working on a significant change to iPhone memory design, moving from package-on-package to discrete packaging to enhance on-device AI capabilities. The change is expected to debut in the 2026 iPhone models.
Apple is reportedly planning a significant change in its iPhone hardware design, moving away from the current package-on-package (PoP) memory configuration to discrete memory packaging. This shift, expected to debut in the 2026 iPhone models, aims to enhance on-device AI performance by improving memory bandwidth and data transfer rates [1][2].

Since the iPhone 4 in 2010, Apple has used the PoP method, where low-power double data rate (LPDDR) DRAM is stacked directly on the System-on-Chip (SoC). While this approach minimizes the physical footprint, crucial for mobile devices, it imposes constraints that limit its suitability for AI applications [1].

The PoP configuration caps the number of I/O pins due to size limitations, restricting performance. This limitation becomes particularly significant for AI tasks, which require faster data transfer rates and increased memory bandwidth [1][3].
The transition to discrete packaging offers several benefits:
- Increased I/O pins: Separating the memory from the SoC allows for more I/O pins, potentially increasing data transfer rates and the number of parallel data channels [1].
- Improved heat dissipation: Discrete packaging provides better heat regulation, which is crucial for on-device AI's parallel processing that generates high levels of heat [2].
- Enhanced AI performance: The changes are expected to significantly improve the iPhone's AI capabilities by widening the memory bandwidth [3].
While the shift promises performance improvements, it also presents challenges:
- Space constraints: Adopting discrete packaging may require shrinking the SoC or battery to accommodate the separate memory component [1][3].
- Power consumption: The new design may use more power and potentially increase latency [1].
- Engineering hurdles: Apple faces challenges related to SoC miniaturization and internal layout optimization [1].
Samsung, a key supplier of Apple's memory components, has begun research to accommodate this change at Apple's request. The company is also working on next-generation LPDDR6 memory technology, expected to offer two to three times the data transfer speed and bandwidth of the current LPDDR5X [1]. One variant under development, LPDDR6-PIM (Processor-in-Memory), integrates processing capabilities directly into the memory; Samsung is collaborating with SK Hynix to standardize this technology [1].
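For a rough sense of scale, the sketch below applies the reported two-to-threefold LPDDR6 uplift to an assumed LPDDR5X baseline of about 68 GB/s (a 64-bit bus at 8533 MT/s, as in the earlier sketch); the baseline is an illustrative assumption, not a figure from the report.

```python
# Scale an assumed LPDDR5X baseline by the reported 2x-3x LPDDR6 uplift.
lpddr5x_baseline_gbs = 68.3          # assumed baseline: 64-bit bus at 8533 MT/s
uplift_low, uplift_high = 2.0, 3.0   # "two to three times" per the report

low = lpddr5x_baseline_gbs * uplift_low
high = lpddr5x_baseline_gbs * uplift_high
print(f"Projected LPDDR6 range: ~{low:.0f}-{high:.0f} GB/s")  # ~137-205 GB/s
```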
This move signifies Apple's commitment to enhancing on-device AI capabilities in future iPhones. The company expects significant demands to be placed on RAM due to upcoming AI features [2]. The change could potentially extend to other Apple products like Macs and iPads, which have used similar memory configurations in recent years [2]. As Apple continues to develop its AI offerings, including the recently debuted Apple Intelligence, this hardware change demonstrates the company's forward-thinking approach to ensuring its devices are equipped to handle the evolving demands of AI technologies [2][3].
Summarized by Navi