10 Sources
[1]
Arm is releasing the first in-house chip in its 35-year history | TechCrunch
Storied semiconductor and software company Arm Holdings is starting to make its own chips after nearly 36 years of licensing its designs to companies like Nvidia and Apple. At an event Tuesday in San Francisco, the company revealed the Arm AGI CPU, a production-ready chip built for running inference in an AI data center. The UK-based company developed the chip using its Arm Neoverse family of CPU IP cores and through a partnership with Meta. Meta is also the first customer of the Arm AGI CPU, which is designed to work harmoniously with the tech company's training and inference accelerator. Arm also counts OpenAI, Cerebras, and Cloudflare, among others, as launch partners. Arm's transition to making its own silicon has been anticipated for some time. The company started developing the chips back in 2023, according to CNBC reporting, and the processors are already available to order. TechCrunch reached out to Arm for more information regarding the timeline of the chip's development and release. While it might have been expected, the move is a historic deviation from Arm's long tradition of exclusively licensing its designs to other chipmakers. The company, which is majority owned by Japanese conglomerate SoftBank Group, will now be competing alongside many of its partners. The fact that Arm is producing a CPU, as opposed to a GPU, is also notable. GPUs, or graphics processing units, have drawn a lot of attention because they are used to train and run AI models. But CPUs are an equally important part of a data center rack. In its pro-CPU pitch, Arm notes that these chips handle thousands of distributed tasks, including managing memory and storage, scheduling workloads and moving data across systems. The CPU has become the "pacing element of modern infrastructure -- responsible for keeping distributed AI systems operating efficiently at scale," the company said. This puts new demands on CPUs and requires an evolution of the processor, Arm said.
CPUs are also becoming harder to come by. In March, Intel and AMD told their customers in China that wait times for their products would be longer due to CPU shortages, Reuters originally reported. Computer prices have also started to rise amid the growing shortage.
[2]
Arm Is Now Making Its Own Chips
Arm, one of the world's leading chip design firms, announced Tuesday that it is producing its own semiconductors. The move is a departure from its long-standing model of licensing intellectual property to companies that manufacture and sell chips themselves. Speaking to a live audience in San Francisco, Arm CEO Rene Haas made his pitch for how the new Arm CPU could benefit the tech industry and why this is the right time for the company to step outside of its lane and go head-to-head with other chipmakers. Arm's in-house chip efforts were rumored for years. Now, as artificial intelligence proliferates throughout the economy and demand for computing resources skyrockets, Arm is trying to capture a sliver of the market for central processing units (CPUs) optimized to handle AI workloads. The new chip is called the Arm AGI CPU, a nod to artificial general intelligence, an often-invoked but still hypothetical form of AI that could match human performance across domains. It's designed to be coupled with other chips in high-performance servers inside data centers and to handle agentic AI tasks. The chip is being fabricated by Taiwan Semiconductor Manufacturing Company, the world's leading semiconductor foundry, and is being built using TSMC's 3nm process. At the chip reveal event, Arm executives emphasized the company's history of designing energy-efficient chips, and claimed its new AGI CPU will be the "most efficient agentic CPU on the market." Compared to competitors like the latest x86 chips made by Intel and AMD, Arm says this chip will deliver better performance per watt (the computing delivered for each watt of electricity consumed) and could save customers billions of dollars in electricity spending. The first major customer of Arm's new chip is Meta, which the company says has received samples of the CPU. OpenAI, SAP, Cerebras, and Cloudflare, as well as the Korean tech firms SK Telecom and Rebellions, have also agreed to buy the chip.
Arm projects its AGI CPU will reach "full production availability" in the second half of this year. Nvidia CEO Jensen Huang, Amazon senior vice president and distinguished engineer James Hamilton, and Google AI infrastructure chief Amin Vahdat appeared in pretaped video testimonials praising Arm's new hardware. None committed to buying it, but all three tech giants already use Arm's designs in their own processors. Arm's history traces back to the late 1970s, when it was known as Acorn and produced microprocessors. In the 1990s the entity changed its name to ARM (Advanced RISC Machines) and its then-CEO began licensing the firm's chip designs to other companies. Arm, which has since dropped the all-caps "ARM" branding, saw its business boom during the mobile revolution. By the 2010s many of the world's largest tech companies, including Apple, Nvidia, Microsoft, Amazon, Samsung, and Tesla, were all relying on its technology. Arm appeared eager at the press event to demonstrate it has support from bold-faced names in the tech industry. While the company is mostly taking aim at chipmakers like AMD and Intel, which build CPUs based on a different architecture, it risks potentially alienating some of its longtime partners by releasing its own chip. Nvidia, which primarily makes GPUs, also bundles Arm-based CPUs into its rack systems. Earlier this year, Nvidia said it would sell stand-alone CPUs for the first time. Meta was one of its first buyers. Ben Bajarin, CEO and principal analyst at the research firm Creative Strategies, says that Arm could be perceived more as a competitor than partner as its strategy evolves. Right now, Arm is launching a streamlined CPU with a relatively small number of cores -- the chip's built-in processing units -- designed specifically for running AI agents, Bajarin points out. Over time, Arm may expand into more general-purpose CPUs, while AMD and Intel develop chips tailored for agentic AI. 
That would put the companies in more direct competition with one another.
[3]
Arm rolls its own 136-core AGI CPU to chase AI hype train
Turns out artificial general intelligence was a CPU this whole time Arm unveiled its first homegrown silicon -- yes, an actual chip, not another shake-n-bake blueprint -- during an event in San Francisco on Tuesday, and said that flagship customer Meta is set to deploy the 136-core CPU at scale later this year. Dubbed the AGI CPU, the British chip designer's first Arm-branded datacenter processor is designed with agentic AI in mind. You heard it here first folks, artificial general intelligence (AGI) is here and it's a ... it's a CPU. The new hardware represents a sea change in the British chip designer's business model. While Arm is no stranger to datacenter silicon, its involvement in those products up until now has been to license the core IP or instruction set architecture necessary to build them. Despite the hypemaxxed branding, the chip's Arm Neoverse V3 cores won't be running AI models themselves. That's a job for GPUs or one of the growing number of high-end AI ASICs. Instead, Arm sees its first datacenter CPU powering AI agents. In this respect, the chip will compete directly with Nvidia's standalone Vera CPUs and rack systems detailed at GTC last week. "We think that the CPU is going to be fundamental to ultimately achieving AGI," Mohamed Awad, Arm's EVP of cloud AI, told El Reg. While GPUs have gotten the lion's share of attention in recent years, the rise of agentic systems like OpenClaw have brought the need for general-purpose compute back into view. These frameworks need CPU cores and memory to write and execute code, automate tasks, and facilitate the reinforcement learning used to train next gen models. Arm is betting on the proliferation of these agents to drive a four-fold increase in CPU demand, and it's positioning its latest chip to capitalize on this trend. Arm's AGI CPU is a 300-watt part with 136 of its Neoverse V3 cores clocked at up to 3.7 GHz (3.2 GHz base), spread across two dies fabbed on TSMC's 3 nm process. 
The processor features 2 MB of L2 cache per core along with 128 MB of shared system-level cache (SLC). According to Awad, a concerted effort has been made to avoid including accelerators or functions which eat up die area and ultimately don't benefit the target workload. "The way that legacy CPUs had been built worried about things like support for legacy applications," he said. "We specifically didn't want to add things that weren't going to...be 100 percent utilized in the mission of this device." He added, "This is a clean sheet design meant to address all that." Unlike Nvidia's Vera, Arm has opted to forego simultaneous multithreading for its agent-optimized processors, with Awad arguing that one thread per core allows for more deterministic performance scaling. The CPU is fed by 12 channels of DDR5 -- presumably 6 channels per die -- with support for memory speeds up to 8800 MT/s. At 825 GB/s of aggregate bandwidth, that works out to 6 GB/s per core. Unlike many modern CPUs, the chip's memory and I/O functions are integrated into the same die as the compute in an effort to minimize latency. Because of this, each socket will be exposed to the operating system as two distinct NUMA domains. Finally, for I/O, the processor is equipped with 96 lanes of PCIe 6.0 connectivity and support for CXL 3.0. Meta, which is already deploying large numbers of Nvidia's Arm-based Grace CPUs and plans to use the company's Vera chips, will also be among Arm's first major CPU customers. As part of these efforts, Arm says that it has validated two different OCP rack designs. There's a 35 kW air-cooled rack with 32 compute blades which, if our math is right, works out to 8,704 cores per rack. The company has also validated an even denser 200 kW liquid-cooled rack with 42 eight-node servers, which works out to 45,696 cores. For reference, that's more than twice the core count of Nvidia's Vera ETL256 CPU racks at 22,528.
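The Register's back-of-the-envelope figures can be checked quickly. Every input below is a number quoted in the article; the 8-byte (64-bit) DDR5 channel width is our assumption, not something Arm disclosed:

```python
# Sanity check of the bandwidth and rack-density figures quoted above.
CORES_PER_SOCKET = 136
DDR5_CHANNELS = 12
DDR5_SPEED_MT_S = 8800      # mega-transfers per second, as quoted
BYTES_PER_TRANSFER = 8      # 64-bit DDR5 channel width (assumed)

# Theoretical peak memory bandwidth per socket, in GB/s
peak_bw_gb_s = DDR5_CHANNELS * DDR5_SPEED_MT_S * BYTES_PER_TRANSFER / 1000

# Per-core bandwidth using the article's 825 GB/s aggregate figure
per_core_gb_s = 825 / CORES_PER_SOCKET

# Rack-level core counts: 32 blades at 2 sockets each, and 42 servers
# at 8 single-socket nodes each
air_cooled_cores = 32 * 2 * CORES_PER_SOCKET
liquid_cooled_cores = 42 * 8 * CORES_PER_SOCKET

print(f"theoretical peak: {peak_bw_gb_s:.1f} GB/s")   # 844.8 GB/s
print(f"per core:         {per_core_gb_s:.2f} GB/s")  # ~6.07 GB/s
print(f"air-cooled rack:  {air_cooled_cores} cores")  # 8704
print(f"liquid rack:      {liquid_cooled_cores} cores")  # 45696
```

The theoretical peak lands a little above the quoted 825 GB/s aggregate, so the article's figure presumably bakes in some derating; the per-core and rack core counts match the quoted 6 GB/s, 8,704, and 45,696 exactly.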
Meta isn't the only customer lining up to gobble Arm's new processors. OpenAI, SAP, Cerebras, Cloudflare, F5, SK Telecom, and Rebellions are also listed as early customers. In addition to AI agents, Arm sees applications for the chip as a head node for custom accelerators, or even as a general purpose CPU for networking or storage. In fact, we're told that OEM partners, including Lenovo, are already working on 19-inch systems using the chip. Up until now, enterprise customers have had limited choices with regard to Arm datacenter silicon, with Ampere Computing being the only non-cloud-based player in town. Arm's AGI CPU is set to arrive later this year. Whether it'll be what actually brings about The Singularity is another matter entirely.
[4]
Arm to Sell Its Own Chips for First Time in Bid for AI Revenue
Arm Holdings Plc, which made its name licensing technology to semiconductor makers, will begin selling its own chips for the first time, aiming to claim a bigger piece of the massive spending on AI gear. Meta Platforms Inc. will be the first major customer for the UK-based company's chip, called an AGI CPU, Arm said Tuesday at an event in San Francisco. The product will have as many as 136 cores -- a measure of processing power -- and draw 300 watts of electricity, Arm said. Taiwan Semiconductor Manufacturing Co. will produce the chips. Under Chief Executive Officer Rene Haas, Arm has shifted from its roots as a provider of smartphone technology and taken a greater role in the data center market. The change is meant to help the business get more of the money generated by what is often complex and expensive work. The shift also helps Arm benefit from bigger-ticket purchases. Even the most expensive smartphone chips cost tens of dollars. The highest-end data center semiconductors can run in the tens of thousands. Arm decided to make the new chip because customers asked for it, Haas said. The product -- a central processing unit, often described as the brains of a computer -- is designed to work alongside the accelerator chips offered by companies such as Nvidia Corp.
It helps coordinate work between computers, prepares data and runs elements that provide a response to users making AI queries, Arm said. "The product that we're building is not only compelling -- but we actually have customers who are lined up to buy it," Haas said in an interview. The company said its product offers greater power efficiency compared with traditional CPU designs from Intel Corp. and Advanced Micro Devices Inc. That means that data center owners will be able to wring more computing power from the same footprint and electricity budget, Haas said. Arm's increasing reach is a direct threat to the so-called x86 data center products made by Intel and AMD, Haas said. Taking share from those traditional stalwarts in a rapidly expanding market will allow both his company and its customers to grow, he argues. "The market is plenty big enough for multiple players," Haas said. Arm faces plenty of competition in data center processors. A number of startups and established companies have sought to challenge Nvidia's dominance in the field with a variety of approaches. And Nvidia itself just introduced a new CPU lineup, targeting the category that Arm is now entering. Haas said his chip is aimed at a different part of the market than Nvidia's latest addition. Arm's chip move also threatens to complicate its relationship with customers. Most of the biggest buyers of data center silicon, including Meta, have their own in-house chip programs. And almost all of them license technology and designs from Arm. Data center operators buy chips from a range of suppliers. That includes Meta, which recently signed long-term deals with Nvidia, AMD and startup Cerebras Systems Inc. The social networking company plans to use the AGI CPUs with its other chips. 
"We worked alongside Arm to develop the Arm AGI CPU to deploy an efficient compute platform that significantly improves our data center performance density," Santosh Janardhan, head of infrastructure at Meta, said in a statement. Other companies -- including OpenAI, Cerebras and SK Telecom Co. -- also plan to deploy the AGI CPU in their infrastructure, Arm said. Off-the-shelf systems using the chip are out now from sellers such as Quanta Computer Inc. and Super Micro Computer Inc. They should be available in greater volumes in the second half of this year, Arm said. Under Haas, Arm has increased its revenue by more than 20% a year. Annual sales topped $4 billion for the first time in 2025. At the same time, Arm has maintained a startlingly high level of profitability. Gross margin, the percentage of revenue left after deducting costs of production, was 98% in its most recent quarter. Most of Arm's peers in the chip industry have much higher sales but lower margins. Even Nvidia, with its near lock on sales of AI accelerators, has margins in the mid-70% range. But Arm generates a tiny fraction of the revenue: Nvidia is on course for annual sales of $356 billion this fiscal year, according to Wall Street estimates. SoftBank Group Corp., which owns a majority stake in Arm, is also ramping up its own efforts to get into AI data centers. That push has involved acquiring chip startups and investing heavily in data center owners.
[5]
Arm unveils new AI chip, expects it to add billions in annual revenue
SAN FRANCISCO, March 24 (Reuters) - Arm Holdings announced a new artificial intelligence data center chip on Tuesday which it said will add billions of dollars of revenue and represent a significant shift in the company's strategy. The new chip, called the AGI CPU, will address data-crunching needed for a specific type of AI that is able to act on behalf of users with minimal oversight, instead of responding to queries as part of a chatbot. So-called agentic AI has jumpstarted demand for the central processing units (CPUs) produced by the likes of Intel (INTC.O) and Advanced Micro Devices (AMD.O). For years, Arm, majority-owned by Japan's SoftBank Group (9984.T), has relied only on intellectual property for revenue, licensing its designs to companies such as Qualcomm (QCOM.O) and Nvidia (NVDA.O) and then collecting a royalty payment based on the number of units sold. Last year, Arm signalled to investors it was investing in making its own chip, a process that can cost hundreds of millions of dollars, and that the company had hired key executives to assist with the effort. The AGI CPU will be the first chip under that new strategy. "It's a very pivotal moment for the company," CEO Rene Haas said in an interview with Reuters. The new chip will be overseen by Mohamed Awad, head of the company's cloud AI business, and Arm has additional designs in the works that it plans to release at 12- to 18-month intervals. Meta Platforms (META.O) will be the company's lead partner for the AGI CPU and the two companies worked together on the design. Arm's customers for the new chip include ChatGPT maker OpenAI, Cloudflare (NET.N), SAP (SAPG.DE) and SK Telecom (017670.KS).
Taiwan Semiconductor Manufacturing Co (2330.TW) is fabricating the device on its 3-nanometer technology; the chip is made from two distinct pieces of silicon that operate as a single chip. Arm plans to put it into volume production in the second half of this year and has already received test chips that function as expected. "It's back, and it works, and it's doing everything we thought it would," Haas said, referring to the new chip. In addition to the chip itself, Arm is working with server makers such as Lenovo (0992.HK) and Quanta Computer (2382.TW) to offer complete systems. For its current fiscal year, Wall Street expects Arm to generate a net profit of $1.75 per share on revenue of $4.91 billion, according to LSEG estimates. Reporting by Max A. Cherney in San Francisco; Editing by Muralikumar Anantharaman.
[6]
Arm launches own AI chip in high-stakes strategy shift
Meta and OpenAI will be among the first customers of Arm's long-awaited new AI processor, as the SoftBank-backed tech group begins a high-stakes shift in strategy from designing chips for other companies to producing them itself. Arm chief executive Rene Haas unveiled its debut "AGI CPU" in San Francisco on Tuesday. It will create a new rival not only to the traditional central processing units made by Silicon Valley stalwarts Intel and AMD but also several of the chip designer's own customers, including Nvidia, Google and Amazon. Haas called the launch of its first silicon product a "defining moment for our company". "With the expansion into delivering production silicon with our Arm AGI CPU, we are giving partners more choices all built on Arm's foundation of high-performance, power-efficient computing," he said. The long-anticipated Arm CPU -- plans for which were first reported by the FT -- marks a significant departure from Arm's traditional role as a "neutral" platform whose intellectual property is incorporated into chips designed by US tech groups. The Cambridge-headquartered group said the product is intended to meet untapped demand for chips that consume less power in an AI data centre, promising billions of dollars in cost savings for customers compared with traditional CPUs. Despite positioning it as an AI product, the chip will not compete directly with Nvidia's graphics processing units, which have become the workhorses of the AI boom. Instead, it will cater to a growing need for "orchestration" of fleets of AI agents, such as software programming tools Claude Code and OpenAI Codex, as well as other cloud-based AI applications. The chip is being manufactured by Taiwan Semiconductor Manufacturing Company, the same supplier used by Nvidia, Apple and other Arm licensees, and will ship at the end of this year. Meta would be the "lead partner" for the chip, Arm said. 
Other early customers include ChatGPT maker OpenAI, cloud provider Cloudflare, German enterprise software group SAP, South Korea's SK Telecom and AI chip designer Cerebras, which struck a $10bn infrastructure deal with OpenAI in January. Analysts say the move will transform Arm's business model. Selling its own chips is likely to produce far higher revenues than the licence fees and royalties that it collects from customers at present. But moving into hardware is also likely to damp its gross margins, which are among the tech industry's highest, hitting 98 per cent in its most recent quarter. Ahead of the announcement, analysts at HSBC described the prospective launch as a "game-changing" moment for Arm, with CPU shipments set to soar over AI infrastructure demand. BNP Paribas analysts echoed the sentiment, while adding that Arm needed to answer questions about how it would handle directly competing with its existing customers. Arm said that dozens of companies across the tech industry were "supporting the platform expansion", including Amazon Web Services, Google and Nvidia. Haas said Arm had been careful about entering the market with its own chips, reaching out to its customers to assess their reaction and receiving no pushback. US tech giants supported the move, he said, because Arm's growing reach in the data centre would help drive the growth of their software. Data centre CPUs have historically been dominated by Intel and AMD. Both companies use the same X86 chip architecture that rivals Arm's designs, which started out in mobile devices but has now been used in more than 325bn devices, from cars to servers. Huge infrastructure spending by Big Tech hyperscalers -- and the rapidly growing demands on energy infrastructure -- has seen a shift towards Arm-based chips. Arm on Tuesday claimed its new chip is twice as efficient as similar X86 chips when handling the most demanding AI workloads. 
But it stopped short of comparing them to Nvidia's Grace and Vera CPUs, which are paired with the semiconductor giant's market-leading GPUs and are based on Arm's technology. Arm's move comes as investors have generally cooled this year on AI infrastructure stocks, worrying about whether the huge capital spending on compute is sustainable. Nvidia has also run up against geopolitical tensions between Washington and Beijing that have delayed its planned return to the Chinese market. Haas said there was "no reason as far as we can tell" that the new Arm CPU could not be sold in China. "They don't fall under any export control restriction, so there isn't any issue there," although "we don't have any customers yet in China", he added. It would also "be a shame" if the UK's build-out of its AI infrastructure did not incorporate CPUs developed by its national champion, Haas added. "We've talked a lot to the folks inside the government on this, and we're hoping to do quite a bit there."
[7]
Arm Holdings, in Break From Past, Will Sell Its Own Computer Chips
For years, the company sold chip designs to other companies. Now it plans to sell its own chips for A.I. data centers. Arm Holdings licenses technology to semiconductor designers that powers nearly all mobile phones and many other products. Now, it has a chip of its own to sell. The company, a British unit of Japan's SoftBank, on Tuesday announced plans for the first silicon product that Arm will design and sell since its founding in 1990. It is a microprocessor aimed at data centers running artificial intelligence tasks. Meta, Facebook's parent company, helped develop the chip and signed up as its first user. Other early customers will include OpenAI, the A.I. pioneer, Arm said. Arm's new offering marks a drastic change in its business model. The company helped pioneer the concept of selling intellectual property rather than products, charging fees and collecting per-chip royalties from hundreds of companies that license Arm's underlying microprocessor architecture or the equivalent of blueprints to design chips. Pierre Ferragu, an analyst with New Street Research, called Arm's shift to selling chips "the most significant strategic pivot in the company's history." Nvidia, which has become the most valuable publicly traded company in the world thanks to its A.I. chips, has focused attention on chips it sells that handle specialized, number-crunching tasks essential to the development of A.I. systems.
[8]
Arm Lends a Hand, Launches In-House AI Chip With Meta as Its First Customer
Arm, one of the world's leading designers of semiconductors, is building its first-ever in-house chipset. The company will reportedly sell its inaugural line of CPUs, called the Arm AGI CPU, to Meta first, with a slew of other companies (including OpenAI, SAP, Cerebras, and Cloudflare) lining up to get in on the launch. For most of its existence, Arm has opted against producing its own chipsets, instead choosing to license its processor designs to other companies, who then manufacture them. Shifting to in-house manufacturing was an anticipated move and got the full tech launch event treatment. The company hosted a reveal in front of a live audience in San Francisco, per Wired, where CEO Rene Haas announced the Arm AGI CPU and received some pre-taped praise from Nvidia CEO Jensen Huang, Amazon senior vice president James Hamilton, and Google AI infrastructure head Amin Vahdat. Arm's arrival in the space as a manufacturer comes as some in the industry have raised concerns about CPU production slowdowns. Dion Harris, Nvidia's head of AI infrastructure, told CNBC earlier this year that, "CPUs are becoming the bottleneck in terms of growing out this AI and agentic workflow." Intel and AMD, which make CPUs based on different architectures, have reportedly warned customers to expect a growing delay in CPU deliveries as manufacturing struggles to keep up with demand. Much of that demand comes from AI infrastructure needs, which continue to grow in an effort to support agentic AI, according to a report from Futurum. Arm's initial offering seems aimed specifically at that niche. Per Wired, the company's AGI CPU is designed to work in tandem with other chips inside data centers to specifically handle tasks from AI agents.
(Despite the name "AGI" invoking the idea of artificial general intelligence, there's no indication this chip does anything to facilitate that theoretical benchmark, which multiple CEOs have now claimed to have achieved with no proof and minimal fanfare.) For the time being, Arm's move to manufacturing will probably be seen as a boon for the AI industry that is in desperate need of ramped-up manufacturing to meet demand. But it'll be interesting to see if it continues to be received that way if Arm goes from an ancillary offering to trying to own the AI chip market and eat up other companies' market share.
[9]
ARM Takes Matters Into Its Own Hands, Unveiling the 'AGI CPU' as Its First-Ever Silicon for Agentic AI
ARM made a massive announcement at its ARM Everywhere keynote: according to a new blog post, the firm will sell its own 'AGI CPU' for the first time. With agentic AI workloads, the CPU has started to become the next bottleneck for hyperscalers, which is why we have seen x86 solutions from Intel/AMD and ARM-based chips from NVIDIA gaining massive adoption among customers. In light of this, ARM has decided to capitalize on the hype by introducing its first-ever chip, called the ARM AGI CPU, marking a shift from an IP provider to an end-to-end silicon manufacturer. According to ARM's CEO, Rene Haas, this move is targeted at fulfilling enterprise demand driven by agentic AI workloads. Today marks the next phase of the Arm compute platform and a defining moment for our company. With the expansion into delivering production silicon with our Arm AGI CPU, we are giving partners more choices all built on Arm's foundation of high-performance, power-efficient computing, to support agentic AI infrastructure at global scale. Diving into the specifics of the AGI CPU, we are looking at up to 136 Arm Neoverse V3 cores per CPU, offering 6 GB/s memory bandwidth per core. The processor has 2 MB of L2 cache per core and runs at up to 3.7 GHz. As far as I/O specifications are concerned, you are looking at 96x PCIe Gen 6 lanes, along with CXL 3.0 memory expansion, allowing the processor to support "massively parallel, high-performance agentic workloads". Here's a complete rundown of the AGI CPU, based on the details disclosed: up to 136 Neoverse V3 cores across two dies, clocks up to 3.7 GHz (3.2 GHz base), 2 MB of L2 cache per core with 128 MB of shared system-level cache, 12-channel DDR5 at up to 8800 MT/s, 96 lanes of PCIe 6.0 with CXL 3.0 support, a 300 W power rating, and fabrication on TSMC's 3 nm process. In terms of rack-scale deployment, ARM offers ultra-thin 1OU (Open Unit) nodes, which are basically a shift away from multi-unit servers. A single chassis can host up to two nodes, providing a total of 272 cores per blade. The physical rack layout can house up to 30 of these open unit nodes, delivering a total of 8160 cores. You are also looking at unified memory pools connected via the CXL 3.0 fabric, and each rack is rated to run at 36kW with air cooling.
Given how prevalent CPU-only racks have become, ARM has designed its solution to meet market demand. ARM says its AGI CPU delivers "2 times higher" performance per rack compared to modern x86 solutions, and while it doesn't compare its solution to Vera, we expect it to be close to Vera as well, given the similar microarchitectures. The AGI CPU also allows vendors to mix-and-match their rack-scale configurations, since ARM has opened up support for any accelerator (Cerebras, Groq, Meta MTIA) that fits into standard OCP server designs. This essentially means that the exclusive advantage NVIDIA previously enjoyed from ARM's IP has likely come to an end.
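Those rack figures can also be cross-checked against the 300 W per-socket rating quoted elsewhere in this digest. Treating each 1OU node as a single socket is our assumption; everything else is a quoted number:

```python
# Cross-check of Wccftech's rack-density figures, and how much of the
# 36 kW rack budget the CPU silicon alone would consume.
CORES_PER_CPU = 136
NODES_PER_CHASSIS = 2
CHASSIS_PER_RACK = 30
RACK_POWER_W = 36_000
CPU_TDP_W = 300             # per-socket rating quoted in other write-ups above

cores_per_chassis = NODES_PER_CHASSIS * CORES_PER_CPU      # 272, as quoted
cores_per_rack = CHASSIS_PER_RACK * cores_per_chassis      # 8160, as quoted
sockets_per_rack = CHASSIS_PER_RACK * NODES_PER_CHASSIS    # 60 sockets
cpu_power_w = sockets_per_rack * CPU_TDP_W                 # 18000 W
watts_per_core = RACK_POWER_W / cores_per_rack             # rack-level W/core

print(cores_per_rack, cpu_power_w, round(watts_per_core, 2))
# -> 8160 18000 4.41
```

Under those assumptions the CPUs alone would draw about 18 kW, roughly half the rack's 36 kW rating, leaving the rest for memory, networking, fans, and conversion losses, and the rack works out to roughly 4.4 W per core.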
[10]
Arm launches its first chip and shifts strategy with Meta as key client
Arm has unveiled its first internally developed chip, dubbed the AGI CPU, designed for artificial intelligence inference in data centers. Meta is one of the first customers, alongside several partners such as OpenAI, Cloudflare, and SAP. Until now, Arm positioned itself as a provider of chip designs, maintaining a neutral stance towards its clients, which include Apple, Nvidia, and Google. With this initiative, the group is now becoming a direct competitor to certain players within its own ecosystem. This evolution comes amid surging demand for computing power driven by AI. While GPUs still dominate certain use cases, CPUs are regaining importance with the rise of so-called agentic applications, which require more general-purpose computing. Arm is highlighting the energy efficiency of its architecture, claiming its processor could offer up to twice the performance per watt compared to systems based on x86 architecture. The chip is currently being produced by TSMC using 3-nanometer technology, with a production ramp-up expected during the year. This launch takes place in a market estimated at nearly $1 trillion, where competition is intensifying. For Arm, this is a strategic bet aimed at capturing more value within the semiconductor supply chain, at the risk of redefining its relationships with long-standing partners.
After nearly four decades of licensing chip designs, Arm Holdings revealed its first internally developed processor, the AGI CPU, designed specifically for agentic AI workloads in data centers. Meta will be the first major customer to deploy the 136-core chip at scale later this year. The move marks a shift in business model for the semiconductor company as it competes directly with longtime partners Intel and AMD while capitalizing on surging AI infrastructure spending.
Arm Holdings announced its first internally developed processor at an event Tuesday in San Francisco, marking a historic departure from the semiconductor company's 36-year tradition of exclusively licensing chip designs to other manufacturers [1]. The AGI CPU represents a fundamental shift in business model for the UK-based firm, which has built its reputation providing intellectual property to tech giants including Nvidia, Apple, Qualcomm, and Microsoft [2].
Source: Wccftech
The AI chip is production-ready and designed specifically for running inference in an AI data center, built using Arm's Neoverse V3 family of CPU IP cores [1]. Meta Platforms will serve as the first major customer, having collaborated with Arm on the chip's development and already received samples [2][5].

The AGI CPU features 136 Neoverse V3 cores spread across two dies, manufactured by TSMC using its 3nm process [3]. The 300-watt processor operates at clock speeds up to 3.7 GHz, with a 3.2 GHz base frequency, and includes 2 MB of L2 cache per core alongside 128 MB of shared system-level cache [3].
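Those per-core figures imply a sizeable on-chip cache budget. A quick back-of-the-envelope check, assuming the 2 MB L2 is private to each core (typical for Neoverse designs, though the article doesn't say so explicitly):

```python
CORES = 136          # Neoverse V3 cores across the two dies
L2_PER_CORE_MB = 2   # L2 per core (assumed private, per typical Neoverse layouts)
SLC_MB = 128         # shared system-level cache

total_l2_mb = CORES * L2_PER_CORE_MB   # 272 MB of aggregate L2
total_cache_mb = total_l2_mb + SLC_MB  # 400 MB of on-chip cache overall
print(total_l2_mb, total_cache_mb)
```

That works out to roughly 400 MB of cache on the package, a figure in line with other recent many-core server parts.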
Source: The Register
The chip addresses growing demand for processors optimized for agentic AI: systems that can act on behalf of users with minimal oversight rather than simply responding to chatbot queries [5]. Mohamed Awad, Arm's EVP of cloud AI, explained that while GPUs handle model training and inference, CPUs manage distributed tasks including memory management, storage, workload scheduling, and data movement across systems [1]. Arm projects a four-fold increase in CPU demand driven by the proliferation of AI agents [3].

Arm executives emphasized the chip's power efficiency credentials, claiming the AGI CPU will be the world's "most efficient agentic CPU on the market" [2]. Compared to the latest x86 chips from Intel and AMD, Arm says its processor delivers better performance per watt and could save customers billions of dollars in electricity spending [2].

The processor features 12 channels of DDR5 memory supporting speeds up to 8800 MT/s, delivering 825 GB/s of aggregate bandwidth, approximately 6 GB/s per core [3]. Unlike many modern CPUs, memory and I/O functions are integrated into the same die as compute to minimize latency [3]. For connectivity, the chip includes 96 lanes of PCIe 6.0 and support for CXL 3.0 [3].
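The bandwidth figures are easy to sanity-check. A sketch, assuming the conventional 8 bytes per transfer for a 64-bit DDR5 channel; the 825 GB/s Arm quotes sits a little below the theoretical peak, presumably reflecting rounding or overhead:

```python
CHANNELS = 12
SPEED_MTS = 8800     # mega-transfers per second, per channel
BYTES_PER_XFER = 8   # 64-bit channel width (assumed)

peak_gbs = CHANNELS * SPEED_MTS * BYTES_PER_XFER / 1000  # 844.8 GB/s theoretical
quoted_gbs = 825                                         # figure from the article
per_core_gbs = quoted_gbs / 136                          # bandwidth share per core
print(peak_gbs, round(per_core_gbs, 2))
```

Dividing the quoted 825 GB/s across 136 cores gives about 6.07 GB/s per core, matching the article's "approximately 6 GB/s per core".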
Meta will deploy the AGI CPU at scale later this year, working alongside the social media company's training and inference accelerator [1]. "We worked alongside Arm to develop the Arm AGI CPU to deploy an efficient compute platform that significantly improves our data center performance density," said Santosh Janardhan, Meta's head of infrastructure.

OpenAI, Cerebras, Cloudflare, SAP, SK Telecom, and Rebellions have also committed as launch partners and early customers [1][3]. Off-the-shelf systems using the chip are available now from manufacturers including Quanta Computer, Super Micro Computer, and Lenovo, with greater volumes expected in the second half of this year.

CEO Rene Haas described the move as "a very pivotal moment for the company," noting that customers requested the product [5]. The direct chip vendor strategy aims to capture a larger share of AI infrastructure spending, with Arm expecting the AGI CPU to add billions of dollars in annual revenue [5].
Source: Wired
The shift helps Arm benefit from bigger-ticket purchases in data centers, where high-end processors can cost tens of thousands of dollars compared to smartphone chips priced in the tens of dollars. Under Haas, Arm has increased revenue by more than 20% annually, with sales topping $4 billion for the first time in 2025 while maintaining a 98% gross margin.
The company faces competition from Nvidia, which recently introduced standalone CPUs targeting similar markets, with Meta among its first buyers [2]. Ben Bajarin, CEO of Creative Strategies, notes that Arm could be perceived more as competitor than partner as its strategy evolves, particularly if the company expands beyond specialized agentic AI processors into general-purpose CPUs [2].

Arm has validated two OCP rack designs: a 35 kW air-cooled rack with 32 compute blades totaling 8,704 cores, and a denser 200 kW liquid-cooled rack with 42 eight-node servers delivering 45,696 cores, more than double the 22,528 cores of Nvidia's Vera ETL256 CPU racks [3]. The company plans additional chip designs at 12- to 18-month intervals [5].
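The rack core counts are internally consistent with the 136-core part. A minimal check, assuming two sockets per blade in the air-cooled rack and one CPU per node in the liquid-cooled one (the per-blade and per-node socket counts are our inference, not stated in the article):

```python
CORES_PER_CPU = 136

# 35 kW air-cooled rack: 32 blades, 8,704 cores total
air_sockets = 8704 // CORES_PER_CPU          # 64 sockets -> 2 per blade (inferred)

# 200 kW liquid-cooled rack: 42 servers x 8 nodes, one CPU each (inferred)
liquid_cores = 42 * 8 * CORES_PER_CPU        # 45,696 cores
ratio = liquid_cores / 22528                 # vs Nvidia's Vera ETL256 rack
print(air_sockets, liquid_cores, round(ratio, 2))
```

The liquid-cooled configuration comes out at just over twice the core count of the Vera rack, consistent with the "more than double" claim.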