29 Sources
[1]
Arm is releasing the first in-house chip in its 35-year history | TechCrunch
Storied semiconductor and software company Arm Holdings is starting to make its own chips after nearly 36 years of licensing its designs to companies like Nvidia and Apple. At an event Tuesday in San Francisco, the company revealed the Arm AGI CPU, a production-ready chip built for running inference in an AI data center. The UK-based company developed the chip using its Arm Neoverse family of CPU IP cores and through a partnership with Meta. Meta is also the first customer of the Arm AGI CPU, which is designed to work harmoniously with the tech company's training and inference accelerator. Arm also counts OpenAI, Cerebras, and Cloudflare, among others, as launch partners.

Arm's transition to making its own silicon has been anticipated for some time. The company started developing the chips back in 2023, according to CNBC reporting, and the processors are already available to order. TechCrunch reached out to Arm for more information regarding the timeline of the chip's development and release. While it might have been expected, the move is a historic deviation from Arm's long tradition of exclusively licensing its designs to other chipmakers. The company, which is majority owned by Japanese conglomerate SoftBank Group, will now be competing alongside many of its partners.

The fact that Arm is producing a CPU, as opposed to a GPU, is also notable. GPUs, or graphics processing units, have drawn a lot of attention because they are used to train and run AI models, but CPUs are an equally important part of a data center rack. In its pro-CPU pitch, Arm notes that these chips handle thousands of distributed tasks, including managing memory and storage, scheduling workloads, and moving data across systems. The CPU has become the "pacing element of modern infrastructure -- responsible for keeping distributed AI systems operating efficiently at scale," the company said. This puts new demands on CPUs and requires an evolution of the processor, Arm said.
CPUs are also becoming harder to come by. In March, Intel and AMD told their customers in China that wait times for their products would be longer due to CPU shortages, Reuters originally reported. Computer prices have also started to rise amid the growing shortage.
[2]
Arm Is Now Making Its Own Chips
Arm, one of the world's leading chip design firms, announced Tuesday that it is producing its own semiconductors. The move is a departure from its long-standing model of licensing intellectual property to companies that manufacture and sell chips themselves. Speaking to a live audience in San Francisco, Arm CEO Rene Haas made his pitch for how the new Arm CPU could benefit the tech industry and why this is the right time for the company to step outside of its lane and go head-to-head with other chipmakers.

Arm's in-house chip efforts were rumored for years. Now, as artificial intelligence proliferates throughout the economy and demand for computing resources skyrockets, Arm is trying to capture a sliver of the market for central processing units (CPUs) optimized to handle AI workloads. The new chip is called the Arm AGI CPU, a nod to artificial general intelligence, an often-invoked but still hypothetical form of AI that could match human performance across domains. It's designed to be coupled with other chips in high-performance servers inside data centers and to handle agentic AI tasks. The chip is being fabricated by Taiwan Semiconductor Manufacturing Company, the world's leading semiconductor foundry, using TSMC's 3nm process.

At the chip reveal event, Arm executives emphasized the company's history of designing energy-efficient chips, and claimed its new AGI CPU will be the "most efficient agentic CPU on the market." Compared to competitors like the latest x86 chips made by Intel and AMD, Arm says this chip will deliver better performance per watt -- a measure of how much computing output a chip produces for the power it draws -- and could save customers billions of dollars in electricity spending. The first major customer of Arm's new chip is Meta, which the company says has received samples of the CPU. OpenAI, SAP, Cerebras, and Cloudflare, as well as the Korean tech firms SK Telecom and Rebellions, have also agreed to buy the chip.
Arm projects its AGI CPU will reach "full production availability" in the second half of this year. Nvidia CEO Jensen Huang, Amazon senior vice president and distinguished engineer James Hamilton, and Google AI infrastructure chief Amin Vahdat appeared in pretaped video testimonials praising Arm's new hardware. None committed to buying it, but all three tech giants already use Arm's designs in their own processors. Arm's history traces back to the late 1970s, when it was known as Acorn and produced microprocessors. In the 1990s the entity changed its name to ARM (Advanced RISC Machines) and its then-CEO began licensing the firm's chip designs to other companies. Arm, which has since dropped the all-caps "ARM" branding, saw its business boom during the mobile revolution. By the 2010s many of the world's largest tech companies, including Apple, Nvidia, Microsoft, Amazon, Samsung, and Tesla, were all relying on its technology. Arm appeared eager at the press event to demonstrate it has support from bold-faced names in the tech industry. While the company is mostly taking aim at chipmakers like AMD and Intel, which build CPUs based on a different architecture, it risks potentially alienating some of its longtime partners by releasing its own chip. Nvidia, which primarily makes GPUs, also bundles Arm-based CPUs into its rack systems. Earlier this year, Nvidia said it would sell stand-alone CPUs for the first time. Meta was one of its first buyers. Ben Bajarin, CEO and principal analyst at the research firm Creative Strategies, says that Arm could be perceived more as a competitor than partner as its strategy evolves. Right now, Arm is launching a streamlined CPU with a relatively small number of cores -- the chip's built-in processing units -- designed specifically for running AI agents, Bajarin points out. Over time, Arm may expand into more general-purpose CPUs, while AMD and Intel develop chips tailored for agentic AI. 
That would put the companies in more direct competition with one another.
[3]
Arm moves beyond IP with AGI CPU silicon -- 136-core data center chip targets AI infrastructure with Meta as lead partner
Arm's first foray into selling production chips, not just licensing IP.

Arm today announced the AGI CPU, an up-to-136-core data center processor family that the company designed and will sell as finished silicon. The chip, built on TSMC's 3nm process with Neoverse V3 cores, was co-developed with Meta and represents the first time in Arm's 35-year history that the company has shipped its own production processor rather than licensing IP to partners. The AGI CPU has been designed for what Arm calls "agentic AI infrastructure," the CPU-side orchestration work required to coordinate accelerators and manage data movement in large-scale AI deployments.

136 Neoverse V3 cores at 300 watts

The chip packs up to 136 Neoverse V3 cores running at up to 3.2 GHz all-core and 3.7 GHz boost across two dies, all within a 300-watt TDP. It supports 12 channels of DDR5 memory at up to 8800 MT/s, delivering more than 800 GB/s of aggregate memory bandwidth, or 6 GB/s per core, with a target of sub-100ns latency. I/O includes 96 PCIe Gen6 lanes and native CXL 3.0 support for memory expansion and pooling.

Arm's reference platform is a 10U dual-node server compliant with the Open Compute Project's DC-MHS standard. Two AGI CPUs fit per blade, and a standard air-cooled 36kW rack holds 30 blades for 8,160 cores total. Arm has also partnered with Supermicro on a liquid-cooled 200kW configuration that houses 336 chips and more than 45,000 cores. Arm claims the AGI CPU delivers more than two times the performance per rack compared to the latest x86 platforms. That figure is, of course, based on the company's own internal estimates at this stage, not independent benchmarks.

GPUs have made most of the headlines surrounding AI hardware to date, but there's demand for more powerful general-purpose compute as agentic systems like OpenClaw explode in popularity.
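The rack-level figures quoted above follow from straightforward multiplication of Arm's stated specs. A quick back-of-the-envelope check (no measured data, only the vendor numbers cited in this article):

```python
# Sanity-check of the quoted rack and bandwidth figures.
# All inputs are Arm's published numbers, not independent measurements.

CORES_PER_CHIP = 136
CHIPS_PER_BLADE = 2
BLADES_PER_RACK = 30          # air-cooled 36 kW reference rack

cores_per_rack = CORES_PER_CHIP * CHIPS_PER_BLADE * BLADES_PER_RACK
print(cores_per_rack)          # 8160, matching the quoted total

# Liquid-cooled Supermicro configuration: 336 chips per 200 kW rack
LIQUID_CHIPS = 336
print(LIQUID_CHIPS * CORES_PER_CHIP)   # 45696, i.e. "more than 45,000 cores"

# Memory bandwidth: 12 channels of DDR5-8800, 8 bytes per transfer
channels, mt_s, bytes_per_transfer = 12, 8800, 8
peak_gb_s = channels * mt_s * bytes_per_transfer / 1000
print(peak_gb_s)               # 844.8 GB/s peak, consistent with ">800 GB/s"

# Per-core share of the quoted aggregate figure
print(round(800 / CORES_PER_CHIP, 2))  # ~5.88, i.e. the "6 GB/s per core" claim
```

The numbers hang together: the ">800 GB/s" aggregate is the sustained figure sitting below the 844.8 GB/s theoretical DDR5-8800 peak, and dividing it across 136 cores yields the rounded "6 GB/s per core" claim.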
Arm is clearly hoping it can meet and cash in on this demand -- and hopefully that won't be to the detriment of non-AI customers, who seem to have long since been forgotten by the likes of Nvidia and Micron.

OpenAI among early customers

Meta served as the lead partner on the project and plans to deploy the AGI CPU alongside its custom MTIA accelerators. Santosh Janardhan, head of infrastructure at Meta, said the two companies worked together on the chip and are committed to a multi-generation roadmap. Beyond Meta, Arm confirmed commercial commitments from Cerebras, Cloudflare, F5, OpenAI, Positron, Rebellions, SAP, and SK Telecom. Sachin Katti, head of industrial compute at OpenAI, said the AGI CPU will play a role in OpenAI's infrastructure by strengthening the orchestration layer that coordinates large-scale AI workloads.

Arm has historically operated as an IP licensing company. Its partners, from Apple to Nvidia to AWS, design their own chips using Arm's instruction set architecture and core designs. The AGI CPU adds a third option alongside IP licensing and Arm's Compute Subsystems (CSS) program: Arm-designed, production-ready silicon that customers can deploy directly. Arm said the AGI CPU product line will continue in parallel with the Arm Neoverse CSS product roadmap, and that follow-on products are already committed. The company seems keen to point out that this is an additive move rather than a pivot that competes with existing licensees, though how Arm manages that as it sells chips into the same data centers as Nvidia Grace, AWS Graviton, Google Axion, and Microsoft Cobalt remains to be seen.
[4]
AI-pilled Arm CEO teases mystery products for $1T TAM
Arm CEO Rene Haas took an ice-cold sip of the AI Kool-Aid during a keynote speech at the company's annual conference on Tuesday, teasing future products that he thinks will pump the British chip designer's total addressable market (TAM) to $1 trillion by the end of the decade. What are those products? That's a question for tomorrow. Tuesday's event was all about Arm's newly announced AGI CPU products, which will free the company from the shackles of its IP licensing model by enabling it to sell directly to end customers.

Haas has high hopes for agentic AI to accelerate the British chip designer's datacenter business. By the end of the decade, he predicts its silicon will catapult its datacenter TAM to more than $100 billion. During his Tuesday keynote at the Arm Everywhere conference, the CEO said the company currently competes for a datacenter market worth about $3 billion a year in royalties. "When we look at what's going on with agentic AI, the growth of CPUs; the benefit that power-efficient CPUs bring to the data center; we think this represents about $100 billion TAM for us in the future," he said.

These figures are predicated in large part on the belief that agentic frameworks, like OpenClaw, will quadruple the demand for CPU cores. While the models powering tools like OpenClaw will continue to run on specialized accelerators, the agentic systems built atop them don't. These agents run on CPU cores and need additional CPU compute and memory resources to execute the code generated by the models to automate tasks. Because these agent interactions aren't necessarily tied to a single user's request - one agent may call other agents to complete a task - the volume of traffic these workloads generate is expected to rise significantly. Arm already had a role to play here: its instruction set architecture is used in CPUs like Amazon's Graviton.
To further reduce the barrier to adopting its IP, Arm introduced compute subsystems in 2023 - essentially shake-n-bake processor blueprints containing all the ingredients necessary to create custom chips. Customers like Microsoft could tweak the recipe and send it off to their preferred fab to cook. Yet few organizations have the expertise or resources possessed by Microsoft or other hyperscalers. Arm on Tuesday therefore unveiled its first datacenter silicon to bear the Arm brand. The company worked with Meta on the AGI CPU, which the pair built to run agentic systems. We took a closer look at the 136-core part earlier on Tuesday, but suffice it to say Arm is going to need to ship a lot of them if it expects to be more than a minnow in a $100 billion pond.

To Haas' credit, at launch Arm's AGI CPU has already secured big-name customers like Meta, OpenAI, SAP, Cloudflare, and SK Telecom, all of whom intend to deploy the chip when it arrives later this year. However, few AI shops stick to one silicon supplier. As we reported earlier this year, Meta is also deploying large numbers of Nvidia's Grace CPUs to power its agentic systems, with plans to expand that footprint to include the GPU giant's new Vera CPUs as well. The social networking giant is also buying custom chips from Broadcom. That said, Arm still makes money on every chip that includes its licensed designs, so it wins either way.

Then there are the blue and red elephants in the room, Intel and AMD, which benefit from more than two decades of continuity around their x86-64 architecture. The CPU market has never been more competitive. However, Arm's EVP of Cloud AI Mohamed Awad argues that the company's AGI CPU is better suited to agentic tasks thanks to a streamlined core that foregoes extraneous functionality and doesn't rely on simultaneous multithreading, which he argues allows for more deterministic scaling. Whether that design is actually an advantage is up for debate.
For Vera, Nvidia opted for simultaneous multithreading (SMT) while Intel has already announced plans to bring hyperthreading back with its Coral Rapids Xeons after briefly abandoning the tech in its upcoming Diamond Rapids parts. Meanwhile, AMD's latest Epyc processors, due out later this year, will offer up to 256 cores. Even with SMT turned off, that's still nearly twice the core count of Arm's new chip. To stay competitive, Arm will be releasing new chips as early as next year, with a third-gen AGI CPU already under development.
[5]
Arm to Sell Its Own Chips for First Time in Bid for AI Revenue
Arm Holdings Plc, which made its name licensing technology to semiconductor makers, will begin selling its own chips for the first time, aiming to claim a bigger piece of the massive spending on AI gear. Meta Platforms Inc. will be the first major customer for the UK-based company's chip, called an AGI CPU, Arm said Tuesday at an event in San Francisco. The product will have as many as 136 cores -- a measure of processing power -- and draw 300 watts of electricity, Arm said. Taiwan Semiconductor Manufacturing Co. will produce the chips.

Under Chief Executive Officer Rene Haas, Arm has shifted from its roots as a provider of smartphone technology and taken a greater role in the data center market. The change is meant to help the business get more of the money generated by what is often complex and expensive work. The shift also helps Arm benefit from bigger-ticket purchases. Even the most expensive smartphone chips cost tens of dollars. The highest-end data center semiconductors can run in the tens of thousands.

Arm decided to make the new chip because customers asked for it, Haas said. The product -- a central processing unit, often described as the brains of a computer -- is designed to work alongside the accelerator chips offered by companies such as Nvidia Corp.
It helps coordinate work between computers, prepares data and runs elements that provide a response to users making AI queries, Arm said. "The product that we're building is not only compelling -- but we actually have customers who are lined up to buy it," Haas said in an interview. The company said its product offers greater power efficiency compared with traditional CPU designs from Intel Corp. and Advanced Micro Devices Inc. That means that data center owners will be able to wring more computing power from the same footprint and electricity budget, Haas said. Arm's increasing reach is a direct threat to the so-called x86 data center products made by Intel and AMD, Haas said. Taking share from those traditional stalwarts in a rapidly expanding market will allow both his company and its customers to grow, he argues. "The market is plenty big enough for multiple players," Haas said. Arm faces plenty of competition in data center processors. A number of startups and established companies have sought to challenge Nvidia's dominance in the field with a variety of approaches. And Nvidia itself just introduced a new CPU lineup, targeting the category that Arm is now entering. Haas said his chip is aimed at a different part of the market than Nvidia's latest addition. Arm's chip move also threatens to complicate its relationship with customers. Most of the biggest buyers of data center silicon, including Meta, have their own in-house chip programs. And almost all of them license technology and designs from Arm. Data center operators buy chips from a range of suppliers. That includes Meta, which recently signed long-term deals with Nvidia, AMD and startup Cerebras Systems Inc. The social networking company plans to use the AGI CPUs with its other chips. 
"We worked alongside Arm to develop the Arm AGI CPU to deploy an efficient compute platform that significantly improves our data center performance density," Santosh Janardhan, head of infrastructure at Meta, said in a statement. Other companies -- including OpenAI, Cerebras and SK Telecom Co. -- also plan to deploy the AGI CPU in their infrastructure, Arm said. Off-the-shelf systems using the chip are out now from sellers such as Quanta Computer Inc. and Super Micro Computer Inc. They should be available in greater volumes in the second half of this year, Arm said. Under Haas, Arm has increased its revenue by more than 20% a year. Annual sales topped $4 billion for the first time in 2025. At the same time, Arm has maintained a startlingly high level of profitability. Gross margin, the percentage of revenue left after deducting costs of production, was 98% in its most recent quarter. Most of Arm's peers in the chip industry have much higher sales but lower margins. Even Nvidia, with its near lock on sales of AI accelerators, has margins in the mid-70% range. But Arm generates a tiny fraction of the revenue: Nvidia is on course for annual sales of $356 billion this fiscal year, according to Wall Street estimates. SoftBank Group Corp., which owns a majority stake in Arm, is also ramping up its own efforts to get into AI data centers. That push has involved acquiring chip startups and investing heavily in data center owners.
[6]
Arm jumps as new AI chip to drive billions in annual revenue
March 25 - U.S.-listed shares of Arm Holdings jumped nearly 12% in premarket trading on Wednesday after the chip firm projected billions of dollars in annual revenue from its own new artificial intelligence data-center chip. The new chip marks a pivot for Arm, which has traditionally relied on licensing its designs to companies such as Nvidia (NVDA.O) and Qualcomm (QCOM.O) and then collecting a royalty payment based on the number of units sold. Unlike current chips that are designed to respond to queries as part of a chatbot, Arm's AGI CPU will be able to handle the data-crunching needs of "agentic AI", a system that acts on behalf of users with minimal oversight.

Arm expects the data-center chip to generate roughly $15 billion in annual revenue in about five years, CEO Rene Haas said in an interview with Reuters. Overall, the company expects to generate revenue of $25 billion in that period, and annual earnings of $9 per share, he said. "Arm has not taken a baby step, say the production of a die or a chiplet for its customers; it has jumped in with both feet, developing the highly performing and energy efficient Arm AGI CPU," Citigroup analysts said. "The industry move to inference and, in particular, agentic AI is showing the need for more CPUs."

The rise of "agentic AI" has already fueled stronger demand for similar chips, which are manufactured by companies like Intel (INTC.O) and Advanced Micro Devices (AMD.O). Shares of Intel were up 3.4%, while AMD rose more than 1%. Arm is trading at 63.08 times analysts' estimates for the company's earnings for the next 12 months, compared with AMD's 26.64 and Intel's 71.27, according to data compiled by LSEG.

Reporting by Kanishka Ajmera in Bengaluru; Editing by Leroy Leo
[7]
Arm launches own AI chip in high-stakes strategy shift
Meta and OpenAI will be among the first customers of Arm's long-awaited new AI processor, as the SoftBank-backed tech group begins a high-stakes shift in strategy from designing chips for other companies to producing them itself. Arm chief executive Rene Haas unveiled its debut "AGI CPU" in San Francisco on Tuesday. It will create a new rival not only to the traditional central processing units made by Silicon Valley stalwarts Intel and AMD but also several of the chip designer's own customers, including Nvidia, Google and Amazon. Haas called the launch of its first silicon product a "defining moment for our company". "With the expansion into delivering production silicon with our Arm AGI CPU, we are giving partners more choices all built on Arm's foundation of high-performance, power-efficient computing," he said. The long-anticipated Arm CPU -- plans for which were first reported by the FT -- marks a significant departure from Arm's traditional role as a "neutral" platform whose intellectual property is incorporated into chips designed by US tech groups. The Cambridge-headquartered group said the product is intended to meet untapped demand for chips that consume less power in an AI data centre, promising billions of dollars in cost savings for customers compared with traditional CPUs. Despite positioning it as an AI product, the chip will not compete directly with Nvidia's graphics processing units, which have become the workhorses of the AI boom. Instead, it will cater to a growing need for "orchestration" of fleets of AI agents, such as software programming tools Claude Code and OpenAI Codex, as well as other cloud-based AI applications. The chip is being manufactured by Taiwan Semiconductor Manufacturing Company, the same supplier used by Nvidia, Apple and other Arm licensees, and will ship at the end of this year. Meta would be the "lead partner" for the chip, Arm said. 
Other early customers include ChatGPT maker OpenAI, cloud provider Cloudflare, German enterprise software group SAP, South Korea's SK Telecom and AI chip designer Cerebras, which struck a $10bn infrastructure deal with OpenAI in January. Analysts say the move will transform Arm's business model. Selling its own chips is likely to produce far higher revenues than the licence fees and royalties that it collects from customers at present. But moving into hardware is also likely to damp its gross margins, which are among the tech industry's highest, hitting 98 per cent in its most recent quarter.

Ahead of the announcement, analysts at HSBC described the prospective launch as a "game-changing" moment for Arm, with CPU shipments set to soar on the back of AI infrastructure demand. BNP Paribas analysts echoed the sentiment, while adding that Arm needed to answer questions about how it would handle directly competing with its existing customers. Arm said that dozens of companies across the tech industry were "supporting the platform expansion", including Amazon Web Services, Google and Nvidia. Haas said Arm had been careful about entering the market with its own chips, reaching out to its customers to assess their reaction and receiving no pushback. US tech giants supported the move, he said, because Arm's growing reach in the data centre would help drive the growth of their software.

Data centre CPUs have historically been dominated by Intel and AMD. Both companies use the same x86 chip architecture that rivals Arm's designs, which started out in mobile devices but have now been used in more than 325bn devices, from cars to servers. Huge infrastructure spending by Big Tech hyperscalers -- and the rapidly growing demands on energy infrastructure -- has driven a shift towards Arm-based chips. Arm on Tuesday claimed its new chip is twice as efficient as similar x86 chips when handling the most demanding AI workloads.
But it stopped short of comparing them to Nvidia's Grace and Vera CPUs, which are paired with the semiconductor giant's market-leading GPUs and are based on Arm's technology. Arm's move comes as investors have generally cooled this year on AI infrastructure stocks, worrying about whether the huge capital spending on compute is sustainable. Nvidia has also run up against geopolitical tensions between Washington and Beijing that have delayed its planned return to the Chinese market. Haas said there was "no reason as far as we can tell" that the new Arm CPU could not be sold in China. "They don't fall under any export control restriction, so there isn't any issue there," although "we don't have any customers yet in China", he added. It would also "be a shame" if the UK's build-out of its AI infrastructure did not incorporate CPUs developed by its national champion, Haas added. "We've talked a lot to the folks inside the government on this, and we're hoping to do quite a bit there."
[8]
Arm just unveiled its first in-house chip. Raymond James says buy the stock
Investors should scoop up shares of Arm Holdings after the company unveiled its first in-house central processing unit chip amid the artificial intelligence data center boom, according to Raymond James. The investment firm upgraded Arm to outperform from market perform. It also set a $166 price target on shares, suggesting 23% upside. "We upgrade Arm to Outperform following the company's announced business model shift to include a fabless semiconductor element," analyst Simon Leopold said Wednesday in a note. "In our assumption of coverage, we advocated for Arm to go down this path because it would yield strong operating profit, aid growth and add a new dimension to the strategy."

The upgrade comes after Arm debuted Tuesday its first in-house chip, the AGI CPU -- a critical element for powering artificial intelligence inference in data centers. Shares jumped 13% in premarket trading following the announcement. Arm is jumping into the central processing unit fray as demand for AI data centers and their underlying hardware is booming. Hyperscalers Alphabet, Microsoft, Meta and Amazon have committed a combined nearly $700 billion in capital expenses to build new AI data centers. The AGI CPU, which was co-developed with Meta, will also fuel agentic AI applications, in addition to supporting inference workloads. That positions the chip for adoption by a wide variety of AI research and deployment firms, according to Raymond James.

"AGI CPU was designed to address the unique requirements of agentic AI and inferencing workloads, use cases including accelerator management, agentic orchestration, increased networking and data plane compute power to support the AI data centers," Raymond James' Leopold wrote. He added, "the industry-leading bandwidth allows for more effective threads of execution per rack vs x86 CPUs, and Arm claims 2x the performance of x86 CPUs in its high-end reference configuration."
Meta, which has earmarked up to $135 billion in capital expenditures this year, has committed to multiple generations of the chip, the analyst noted. Other adopters include Cerebras, Cloudflare, F5, OpenAI, Positron, Rebellions, SAP and SK Telecom. Arm executives said Tuesday at an event in San Francisco that the firm could notch roughly $1 billion in incremental revenue from the new AGI CPU through the end of fiscal year 2028. And those revenues could grow to $15 billion in fiscal year 2031, they said. Raymond James' call is in line with consensus on the Street. Of the 40 analysts covering the stock, 24 have a buy or strong buy on shares. Shares have risen 23% since the beginning of the year, even as the broader market has underperformed.
[9]
Arm just changed the rules, building its first-ever CPU and betting big on agentic AI
Serving tech enthusiasts for over 25 years. TechSpot means tech analysis and advice you can trust.

Ryan Shrout is a longtime technology analyst and industry veteran who has spent over two decades covering PC hardware, graphics, and semiconductors. He previously led technical marketing at Intel and was the founding editor of PC Perspective. He is currently President and GM at Signal65. You can follow him on X @ryanshrout.

Looking ahead: Today, Arm announced the Arm AGI CPU, the first production silicon product designed and sold directly by Arm in the company's 35-year history. This isn't an IP license and it's not a compute 'reference' design. It's a finished chip, manufactured by TSMC on 3nm, built for AI data centers and aimed squarely at agentic AI workloads.

For the past three years, every data center conversation has started and ended with GPUs. Training clusters and inference racks and accelerator roadmaps. If you worked in data center silicon and you were not talking about GPUs, people looked at you like you were lost. I know the feeling. I spent years at Intel leading data center marketing, trying to make the case that Xeon CPUs had a meaningful role in AI workloads. The response was polite but dismissive. AI was a GPU problem. The CPU was a host processor, a necessary tax on the system, not the point of the system.

Turns out that argument was just ahead of its time. The introduction of Arm's AGI CPU matters for reasons that go well beyond one product launch. Agentic AI is structurally changing the CPU-to-GPU ratio in data center infrastructure, and the industry is only beginning to grapple with what that means.

The CPU renaissance nobody expected

Training-era architectures assumed GPUs would dominate every phase of AI. For pure model training and high-throughput batch inference, that assumption holds. But agentic workloads have introduced an entirely different compute profile.
When an agent calls a tool, queries a database, waits for human approval, or orchestrates sub-agents, the GPU is allocated but not active. The CPU handles the work. A Georgia Tech and Intel research paper profiling five representative agentic workloads found that CPU-side tool processing accounts for up to 90.6% of total latency. Production data tells the same story. Anyscale documented that in real-world AI pipelines, CPU-heavy and GPU-heavy stages are often packaged together, leaving GPUs allocated but idle during CPU-bound work. By disaggregating those stages, they achieved an 8x reduction in GPU requirements for the same workload. And our own Signal65 testing shows that even for host node tasks, the CPU you pick can have a material impact on performance and TCO.

The pattern scales with agent complexity. Simple chatbot Q&A runs 90-95% on GPUs. RAG with retrieval shifts to 50-75% CPU. Multi-agent orchestration runs 60-70% CPU. Tool-heavy agents like SWE-Agent operate at 80-90% CPU. The more reasoning steps, tool calls, and sub-agent coordination in a workflow, the more the compute balance tilts toward the processor.

Futurum Group projects CPU-to-GPU ratios in AI clusters are climbing back toward 1:1 and forecasts CPU market growth reaching 34.9% by 2029, outpacing GPUs and XPUs. Bank of America estimates the total CPU market could more than double from $27 billion in 2025 to $60 billion by 2030. This is not speculative. AMD has said publicly that CPU demand is exceeding expectations, driven specifically by agentic AI applications. Intel has acknowledged it misjudged the demand trajectory and is reallocating wafer capacity from client to server. The supply side did not see this coming.

Agents are digital workers, and digital workers need compute

The important framing here goes beyond GPU idle cycles and orchestration overhead. Think about what agents actually do once the LLM reasoning step finishes.
An agent books a flight, processes an invoice, queries a CRM, compiles a report, schedules a meeting. The inference portion of that workflow runs on GPUs or accelerators. But the actual work the agent executes afterward, hitting APIs, interacting with enterprise applications, reading and writing to databases, moving data through web services, all of that runs on general-purpose compute. The world's enterprise software stack is not getting rewritten for accelerators. Email servers, database engines, ERP systems, CRM platforms, none of these are GPU-native, and nobody is going to make them GPU-native. But agents are going to be calling into these systems at machine speed, millions of transactions where a human used to generate dozens.

This is the insight that reshapes the demand model. Agents do not just need CPUs for coordination. They generate entirely new CPU workloads by doing real work, at digital speed, across systems that were built for CPUs and will remain on CPUs. That means net-new demand, not a reshuffling of existing infrastructure but an entirely new compute tier that did not exist before agentic AI. Arm puts a number on it in today's press release. Data centers are expected to require more than 4x the current CPU capacity per GW to support agent-driven applications. That is not a GPU story. That is a CPU story.

The Arm AGI processor, from blueprints to silicon

Into that demand inflection, Arm is making the biggest strategic bet in its history. The Arm AGI processor is the first product from a new data center silicon product line, with follow-on generations already committed. The specs are serious. Up to 136 Neoverse V3 cores running at up to 3.7GHz in a dual-chiplet design. Dedicated 2MB L2 cache per core. 6GB/s memory bandwidth per core at sub-100ns latency with DDR5-8800 support and up to 6TB memory capacity per chip. 96 lanes of PCIe Gen6, CXL 3.0 for memory expansion, and AMBA CHI extension links. All at a 300-watt TDP, manufactured by TSMC on 3nm.
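Those per-core bandwidth figures hang together arithmetically. Other coverage of the chip lists 12 channels of DDR5-8800; at 8 bytes per 64-bit transfer, the theoretical peak sits just above the 825 GB/s aggregate quoted elsewhere, and dividing across 136 cores lands on roughly the 6 GB/s per core Arm claims. A quick sanity check:

```python
# Back-of-envelope check of the memory bandwidth claims.
channels = 12          # DDR5 channels, per reporting elsewhere on this chip
mts = 8800             # DDR5-8800: megatransfers per second per channel
bytes_per_xfer = 8     # 64-bit channel width

peak_gbs = channels * mts * bytes_per_xfer / 1000
print(round(peak_gbs, 1))     # 844.8 GB/s theoretical peak (quoted aggregate: 825 GB/s)

per_core = peak_gbs / 136
print(round(per_core, 2))     # ~6.21 GB/s per core, in line with the ~6 GB/s claim
```

The small gap between 844.8 and 825 GB/s is unsurprising; vendor aggregate figures often reflect deliverable rather than raw theoretical bandwidth.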
Every core runs a dedicated thread with no SMT and no throttling under sustained load. That matters for agentic workloads where deterministic, predictable per-task performance across thousands of parallel operations is the design target, not peak single-thread burst.

The rack-scale story is where it gets especially interesting for data center operators. The reference server is a 1OU, 2-node design packing 272 cores per blade. Thirty blades in a standard air-cooled 36kW rack deliver 8,160 cores. Arm has partnered with Supermicro on a liquid-cooled 200kW configuration housing 336 AGI CPUs for over 45,000 cores. In these configurations, Arm claims more than 2x performance per rack versus the latest x86 systems, achieved through compounding architectural advantages: class-leading memory bandwidth that maintains throughput under sustained load, high single-threaded Neoverse V3 performance, and more usable threads doing more work per thread (obviously these kinds of claims need a lot of third-party validation, and it's something Signal65 is already looking at). Arm is also contributing the 1OU reference server design, firmware, and diagnostic tooling to the Open Compute Project under the DC-MHS standard. Arm is not just selling a chip, it is trying to seed an open platform ecosystem.

Arm's new business model: "We sell IP, compute systems, and chips."

The customer list tells a strategic story. Meta is the lead partner and co-developer, deploying AGI CPU alongside its custom MTIA accelerator with a multi-generation roadmap commitment. Additional confirmed deployment partners include Cerebras, Cloudflare, F5, OpenAI, Positron, Rebellions, SAP, and SK Telecom. OEM systems from ASRock Rack, Lenovo, and Supermicro are available to order now. And more than 50 ecosystem companies, including AWS, Broadcom, Google, Marvell, Micron, Microsoft, Nvidia, Samsung, SK Hynix, and TSMC, have issued supporting statements.
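The rack arithmetic is simple multiplication, and it checks out against the 136-core part:

```python
# Verifying the rack-level core counts quoted for the reference designs.
cores_per_cpu = 136
cores_per_blade = 2 * cores_per_cpu      # 1OU, 2-node blade = 272 cores

air_cooled = 30 * cores_per_blade        # 30 blades in a 36 kW air-cooled rack
liquid_cooled = 336 * cores_per_cpu      # Supermicro 200 kW configuration, 336 CPUs

print(air_cooled)      # 8160, matching the 8,160-core figure
print(liquid_cooled)   # 45696 -- the "over 45,000 cores" claim
```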
OpenAI offered a particularly telling quote, describing the AGI CPU as "strengthening the orchestration layer that coordinates large scale AI workloads." That is exactly the use case profile this chip was built for.

This was not a cold start

Arm did not show up to the data center market today with a blank resume. All three major hyperscalers already run Arm Neoverse-based server processors. AWS Graviton is now in its fifth generation with 192 Neoverse V3 cores. Microsoft Cobalt 200 is built on CSS V3. Google Axion runs on Neoverse N3. Nvidia Vera, launched at GTC, is also Neoverse-based. Over 1 billion Neoverse cores are deployed in data centers globally.

Independent performance data validates the trajectory. In Signal65 Lab Insight testing, Neoverse-powered AWS Graviton4 delivered up to 168% higher token throughput than AMD EPYC and 162% better performance than Intel Xeon in LLM inference testing with Meta Llama 3.1 8B. In database workloads, Graviton4 handled up to 93% more operations per second than x86-based instances. The Arm architecture is already winning performance benchmarks in the cloud. The AGI CPU brings that to a direct-sold product. This also means that rather than hearing about a new Arm architecture design and having to wait years to see it productized, that timeline could move up substantially, effectively increasing the competitive cadence for Arm versus its x86 alternatives.

The financial logic is straightforward. Under the IP licensing model, Arm collects 1-2% royalties per chip sold. By selling finished silicon directly, Arm captures the full chip margin. That is a potentially huge revenue increase per unit. Speculation about Arm making this move has circulated for over a year. What makes the timing notable is that agentic AI creates net-new demand that does not directly cannibalize the existing licensee business.
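The gap between those two models is easy to put in rough numbers. The chip price below is a made-up assumption for illustration only; the 1-2% royalty range is from this analysis, and Arm's CFO has cited roughly 50% gross profit on the new chip elsewhere in this coverage:

```python
# Illustrative only: asp is a made-up assumption, not a disclosed price.
asp = 10_000            # hypothetical average selling price per chip, USD
royalty_rate = 0.015    # midpoint of the 1-2% royalty range cited above
gross_margin = 0.50     # ~50% gross profit, per Arm's CFO (CNBC coverage)

ip_take = asp * royalty_rate       # licensing model: $150 per chip
silicon_take = asp * gross_margin  # direct sales: $5,000 gross profit per chip
print(silicon_take / ip_take)      # ~33x more gross profit per unit
```

The exact multiple moves with the assumed price and margin, but the order-of-magnitude difference is what drives the "huge revenue increase per unit" point.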
Competitive ripple effects

The current server CPU market splits roughly 60% Intel, 24% AMD, 6% Nvidia Grace, with the remainder spread across custom Arm-based designs. That mix is about to get more complicated.

Intel is under the most pressure. Already losing share to AMD, Intel has cancelled its mainstream Diamond Rapids-SP platform, leaving its highest-volume data center segment without a new generation until roughly 2028. Of all the incumbents, Intel has the least room to absorb a new entrant.

AMD has seen tremendous growth in the data center over the past several years, and EPYC Turin is genuinely competitive. But the concern here is whether that growth trajectory could slow under added competitive pressure. Meta has been one of the biggest AMD data center CPU customers. Meta co-developing a multi-generation CPU roadmap with Arm is a direct signal about where that infrastructure is heading. The power efficiency math, 2x performance per rack at lower power, is a structural challenge for x86 economics over time.

Nvidia is positioned as a partner, with Jensen Huang offering a supporting quote in the press release. But Grace and Vera could easily have been seen as the default high-performance Arm CPU for the data center. That assumption is now in question. The AGI CPU competes for the same orchestration and control plane workloads that Vera targets. The question going forward is whether Grace and Vera get scoped to Nvidia-specific infrastructure buildouts, paired with Blackwell GPUs in NVL racks, rather than serving as a general-purpose data center CPU platform.

Qualcomm is an interesting player to watch. Its data center ambitions have focused more on NPU-based accelerators than standalone CPUs, and the company has shown off rack-scale inference implementations with decent opening momentum. The AGI CPU does not directly compete with that approach, but it reshapes the conversation about who provides the CPU fabric around those accelerators.
The licensee risk for Arm is real but managed. SoftBank acquiring Ampere for $6.5 billion effectively neutralized the most direct conflict. Hyperscaler custom chip designers at AWS, Microsoft, and Google are unlikely to be disrupted since they design their own silicon on Arm IP. The broader perspective is that there is still plenty of room for all of these players. The market is expanding, not just reshuffling. For Arm starting from 0% share in direct silicon sales, and even Qualcomm starting from near-zero in data center CPU presence, every bit of growth is a positive. They are not defending share. They are taking it. And the agentic demand wave means the pie itself is growing.

What this means going forward

Three forces are converging simultaneously. Agentic AI is creating a new demand category where CPUs are the primary compute element, not a sidecar. Supply has not anticipated this demand, with constrained wafer capacity, stretched lead times, and rising prices across the industry. And Arm's entry as a direct silicon vendor with a purpose-built product validates the opportunity while reshaping the competitive landscape.

The most interesting strategic question is not whether the Arm AGI CPU succeeds in isolation. It is whether this moment represents a permanent shift in how we think about data center architecture, from GPU-centric to genuinely heterogeneous, with CPUs playing a far larger role than the training-era mindset assumed. For enterprises and cloud operators, the practical takeaway is simple. Agent-driven workloads are going to need more CPU capacity than anyone planned for, and the options for sourcing that capacity just expanded significantly.

This is not the end of GPU dominance. Training and large-scale inference remain GPU territory. But the work that agents do beyond the model is CPU work. And the industry is finally building for that reality.
[10]
Arm Holdings, in Break From Past, Will Sell Its Own Computer Chips
For years, the company sold chip designs to other companies. Now it plans to sell its own chips for A.I. data centers. Arm Holdings licenses technology to semiconductor designers that powers nearly all mobile phones and many other products. Now, it has a chip of its own to sell. The company, a British unit of Japan's SoftBank, on Tuesday announced plans for the first silicon product that Arm will design and sell since its founding in 1990. It is a microprocessor aimed at data centers running artificial intelligence tasks. Meta, Facebook's parent company, helped develop the chip and signed up as its first user. Other early customers will include OpenAI, the A.I. pioneer, Arm said. Arm's new offering marks a drastic change in its business model. The company helped pioneer the concept of selling intellectual property rather than products, charging fees and collecting per-chip royalties from hundreds of companies that license Arm's underlying microprocessor architecture or the equivalent of blueprints to design chips. Pierre Ferragu, an analyst with New Street Research, called Arm's shift to selling chips "the most significant strategic pivot in the company's history." Nvidia, which has become the most valuable publicly traded company in the world thanks to its A.I. chips, has focused attention on chips it sells that handle specialized, number-crunching tasks essential to the development of A.I. systems.
[11]
Arm rolls its own 136-core AGI CPU to chase AI hype train
Turns out artificial general intelligence was a CPU this whole time Arm unveiled its first homegrown silicon -- yes, an actual chip, not another shake-n-bake blueprint -- during an event in San Francisco on Tuesday, and said that flagship customer Meta is set to deploy the 136-core CPU at scale later this year. Dubbed the AGI CPU, the British chip designer's first Arm-branded datacenter processor is designed with agentic AI in mind. You heard it here first folks, artificial general intelligence (AGI) is here and it's a ... it's a CPU. The new hardware represents a sea change in the British chip designer's business model. While Arm is no stranger to datacenter silicon, its involvement in those products up until now has been to license the core IP or instruction set architecture necessary to build them. Despite the hypemaxxed branding, the chip's Arm Neoverse V3 cores won't be running AI models themselves. That's a job for GPUs or one of the growing number of high-end AI ASICs. Instead, Arm sees its first datacenter CPU powering AI agents. In this respect, the chip will compete directly with Nvidia's standalone Vera CPUs and rack systems detailed at GTC last week. "We think that the CPU is going to be fundamental to ultimately achieving AGI," Mohamed Awad, Arm's EVP of cloud AI, told El Reg. While GPUs have gotten the lion's share of attention in recent years, the rise of agentic systems like OpenClaw have brought the need for general-purpose compute back into view. These frameworks need CPU cores and memory to write and execute code, automate tasks, and facilitate the reinforcement learning used to train next gen models. Arm is betting on the proliferation of these agents to drive a four-fold increase in CPU demand, and it's positioning its latest chip to capitalize on this trend. Arm's AGI CPU is a 300-watt part with 136 of its Neoverse V3 cores clocked at up to 3.7 GHz (3.2 GHz base), spread across two dies fabbed on TSMC's 3 nm process. 
The processor features 2 MB of L2 cache per core along with 128 MB of shared system-level cache (SLC). According to Awad, a concerted effort has been made to avoid including accelerators or functions which eat up die area and ultimately don't benefit the target workload. "The way that legacy CPUs had been built worried about things like support for legacy applications," he said. "We specifically didn't want to add things that weren't going to...be 100 percent utilized in the mission of this device." He added, "This is a clean sheet design meant to address all that." Unlike Nvidia's Vera, Arm has opted to forego simultaneous multithreading for its agent-optimized processors, with Awad arguing that one thread per core allows for more deterministic performance scaling. The CPU is fed by 12 channels of DDR5 -- presumably 6 channels per die -- with support for memory speeds up to 8800 MT/s. At 825 GB/s of aggregate bandwidth, that works out to 6 GB/s per core. Unlike many modern CPUs, the chip's memory and I/O functions are integrated into the same die as the compute in an effort to minimize latency. Because of this, each socket will be exposed to the operating system as two distinct NUMA domains. Finally, for I/O, the processor is equipped with 96 lanes of PCIe 6.0 connectivity and support for CXL 3.0. Meta, which is already deploying large numbers of Nvidia's Arm-based Grace CPUs and plans to use the company's Vera chips, will also be among Arm's first major CPU customers. As part of these efforts, Arm says that it has validated two different OCP rack designs. There's a 35 kW air-cooled rack with 32 compute blades which, if our math is right, works out to 8,704 cores per rack. The company has also validated an even denser 200 kW liquid-cooled rack with 42 eight-node servers, which works out to 45,696 cores. For reference, that's more than twice the core count of Nvidia's Vera ETL256 CPU racks at 22,528.
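El Reg's "if our math is right" figures do check out, assuming two CPUs per blade in the air-cooled design (consistent with the two-node reference server):

```python
# Verifying the article's rack math.
cores_per_cpu = 136

air = 32 * 2 * cores_per_cpu      # 32 blades, assumed two CPUs each
liquid = 42 * 8 * cores_per_cpu   # 42 eight-node servers
vera_rack = 22528                 # Nvidia Vera ETL256 rack, per the article

print(air, liquid)                # 8704 45696
print(liquid / vera_rack > 2)     # True: more than twice Vera's core count
```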
Meta isn't the only customer lining up to gobble Arm's new processors. OpenAI, SAP, Cerebras, Cloudflare, F5, SK Telecom, and Rebellions are also listed as early customers. In addition to AI agents, Arm sees applications for the chip as a head node for custom accelerators, or even as a general purpose CPU for networking or storage. In fact, we're told that OEM partners, including Lenovo, are already working on 19-inch systems using the chip. Up until now, enterprise customers have had limited choices with regard to Arm datacenter silicon, with Ampere Computing being the only non-cloud-based player in town. Arm's AGI CPU is set to arrive later this year. Whether it'll be what actually brings about The Singularity is another matter entirely. ®
[12]
Arm Lends a Hand, Launches In-House AI Chip With Meta as Its First Customer
Arm, one of the world's leading designers of semiconductors, is building its first-ever in-house chipset. The company will reportedly sell its inaugural line of CPUs, called the Arm AGI CPU, to Meta first, with a slew of other companies, including OpenAI, SAP, Cerebras, and Cloudflare, lining up to get in on the launch. For most of its existence, Arm has opted against producing its own chipsets, instead choosing to license its processor designs to other companies, who then manufacture them. Shifting to in-house chips was an anticipated move and got the full tech launch event treatment. The company hosted a reveal in front of a live audience in San Francisco, per Wired, where CEO Rene Haas announced the Arm AGI CPU and received some pre-taped praise from Nvidia CEO Jensen Huang, Amazon senior vice president James Hamilton, and Google AI infrastructure head Amin Vahdat. Arm's arrival in the space as a chipmaker comes as some in the industry have raised concerns about CPU production slowdowns. Dion Harris, Nvidia's head of AI infrastructure, told CNBC earlier this year that "CPUs are becoming the bottleneck in terms of growing out this AI and agentic workflow." Intel and AMD, which make CPUs based on different architectures, have reportedly warned customers to expect a growing delay in CPU deliveries as manufacturing struggles to keep up with demand. Much of that demand is driven by AI infrastructure needs, which continue to grow to support agentic AI, according to a report from Futurum. Arm's initial offering seems aimed specifically at that niche. Per Wired, the company's AGI CPU is designed to work in tandem with other chips inside data centers to specifically handle tasks from AI agents.
(Despite the name "AGI" invoking the idea of artificial general intelligence, there's no indication this chip does anything to facilitate that theoretical benchmark, which multiple CEOs have now claimed to have achieved with no proof and minimal fanfare.) For the time being, Arm's move to manufacturing will probably be seen as a boon for the AI industry that is in desperate need of ramped-up manufacturing to meet demand. But it'll be interesting to see if it continues to be received that way if Arm goes from an ancillary offering to trying to own the AI chip market and eat up other companies' market share.
[13]
Arm unveils new AI chip, expects it to add billions in annual revenue
SAN FRANCISCO, March 24 (Reuters) - Arm Holdings announced a new artificial intelligence data center chip on Tuesday which it said will add billions of dollars of revenue and represent a significant shift in the company's strategy. The new chip, called the AGI CPU, will address data-crunching needed for a specific type of AI that is able to act on behalf of users with minimal oversight, instead of responding to queries as part of a chatbot. So-called agentic AI has jumpstarted demand for the central processing units (CPUs) produced by the likes of Intel (INTC.O) and Advanced Micro Devices (AMD.O). For years, Arm, majority-owned by Japan's SoftBank Group (9984.T), has relied only on intellectual property for revenue, licensing its designs to companies such as Qualcomm (QCOM.O) and Nvidia (NVDA.O) and then collecting a royalty payment based on the number of units sold. Last year, Arm signalled to investors it was investing in making its own chip, a process that can cost hundreds of millions of dollars, and that the company had hired key executives to assist with the effort. The AGI CPU will be the first chip under that new strategy. "It's a very pivotal moment for the company," CEO Rene Haas said in an interview with Reuters. The new chip will be overseen by Mohamed Awad, head of the company's cloud AI business, and Arm has additional designs in the works that it plans to release at 12- to 18-month intervals. Meta Platforms (META.O) will be the company's lead partner for the AGI CPU and the two companies worked together on the design. Arm's customers for the new chip include ChatGPT maker OpenAI, Cloudflare (NET.N), SAP (SAPG.DE) and SK Telecom (017670.KS).
Taiwan Semiconductor Manufacturing Co (2330.TW) is fabricating the device on its 3-nanometer technology; the chip is made from two distinct pieces of silicon that operate as a single chip. Arm plans to put it into volume production in the second half of this year and has already received test chips that function as expected. "It's back, and it works, and it's doing everything we thought it would," Haas said, referring to the new chip. In addition to the chip itself, Arm is working with server makers such as Lenovo (0992.HK) and Quanta Computer (2382.TW) to offer complete systems. For its current fiscal year, Wall Street expects Arm to generate a net profit of $1.75 per share on revenue of $4.91 billion, according to LSEG estimates. Reporting by Max A. Cherney in San Francisco; Editing by Muralikumar Anantharaman
[14]
Arm stock pops 6% as CEO Haas issues $15 billion revenue expectation for new chip
Inside Arm's $71 million chip lab where it's making its first ever CPU

Arm Holdings stock popped 6% in after-hours trading on Tuesday as CEO Rene Haas announced 2031 annual revenue expectations more than six times the company's 2025 revenue. Haas unveiled Arm's first in-house chip on Tuesday at an event in San Francisco, with Meta as the initial customer. CNBC got an exclusive first look at the chip earlier this month, visiting the lab Arm built for it in Austin, Texas. Arm stock closed about 1.5% lower on Tuesday following the chip announcement. Haas said Arm expects the new chip to generate roughly $15 billion in annual revenue by 2031, with total annual revenue of $25 billion and earnings per share of $9. "We may be under-calling that number," Haas said Tuesday. "I think the demand is higher than we think it is." It's a huge lift for the chip design firm that generated just over $4 billion in annual revenue in 2025. Central processing units are seeing a resurgence of demand as agentic AI changes compute needs. Haas predicted CPUs will see a fourfold increase in demand around agentic AI. The Arm AGI CPU is a data center chip optimized for AI inference. It's a long-anticipated move that marks a major change for the so-called Switzerland of chip firms as it enters into fresh competition with its customers. Arm CFO Jason Child said Arm is selling its new chip at about a 50% gross profit. "It expands our market to include customers that were not interested in an IP model, gives our current customers choice, and for Arm it creates a much larger profit opportunity," Child said at the event Tuesday.
[15]
Arm finally moves into silicon
The AGI CPU features 136 Neoverse V3 cores and promises double the performance of x86 processors for agentic AI workloads. The architect of many modern smartphones and some PCs, Arm Ltd., just put on a hard hat and picked up a hammer and a nail. Arm, whose intellectual property formed the blueprints of smartphones designed by Apple, Qualcomm, and Samsung, said Tuesday afternoon that it has entered the chip market with help from Meta. However, its first effort will target the data center, rather than the PC or phone. Arm's first chip will be called the Arm AGI CPU, designed for "AI data centers running agentic AI workloads" -- most likely, powering Meta's own AI efforts in the cloud. The chip, with 136 Arm Neoverse V3 cores per CPU, will be designed for 1U racks. It will offer twice the performance per rack as an x86 CPU, the company said. "AI has fundamentally redefined how computing is built and deployed. Agentic computing is accelerating that change," said Rene Haas, chief executive of Arm, in a statement. "Today marks the next phase of the Arm compute platform and a defining moment for our company." Arm was founded in 1990 as Advanced RISC Machines as a joint venture between Apple, Acorn Computer, and VLSI Technology. Its architecture helped underpin the Apple Newton, according to its corporate history. However, Arm executives realized that the business couldn't be built upon a single product and the company altered its business model to supply what's known as intellectual property, or the design of the chips themselves. It then licensed that IP to customers. Arm's current license agreements comprise two models: one that takes Arm's processor designs and bakes them into silicon, unchanged; and a "black box" architecture license. The latter is what customers like Apple and Samsung have: the freedom to design whatever chip they'd like, as long as it uses the Arm instruction set architecture.
It has traditionally been left up to Arm's customers to actually turn Arm's designs into silicon. Over 50 ecosystem players endorsed Arm's decision, including chip suppliers Mediatek, Micron, Marvell, and ST Micro. Major customers Apple, Nvidia, and Qualcomm were not among them. Arm has been rumored to be moving into the silicon space for some time. The N1/N1X, a chip from Arm's partnership with Nvidia, is also expected to arrive this year. Arm has also tipped its own roadmap for cores specifically designed for PCs and phones, called Niva and Lumex, respectively. Still, Arm gave no indication that it would compete with its customers in the PC or smartphone market. The silence of its major customers could be a tell. Who knows if your next phone or PC will include a microprocessor with "Arm" stamped upon it?
[16]
ARM stock price surges today after chip designer announces biggest pivot in its 35-year history
For over three decades, the British semiconductor firm had one primary business model: it designed chips and then licensed those designs to other companies, including Apple and Qualcomm, which would then make their own semiconductors based on Arm's designs. Under this business model, Arm essentially made the blueprints that other companies followed to make their own chips. And every time a company made a chip with Arm's blueprint, Arm earned a licensing fee. That license fee amounted to about 5% per chip made with Arm's blueprint, according to Bloomberg, meaning that if a chip cost $100, Arm made about $5 from it. But now Arm has announced that it will no longer be just a chip blueprint company. It will also begin making and selling its own chips directly to customers, and those chips will be designed to run AI workloads.
[17]
Arm launches 136-core AGI CPU for data centers - SiliconANGLE
Arm Holdings plc today debuted a new central processing unit, the AGI CPU, that's optimized to power artificial intelligence clusters. The company says the chip provides more than twice as much performance per server rack as Intel Corp. silicon. Additionally, Arm claims, the AGI CPU can help data center operators reduce hardware costs. The company expects the chip to provide savings of up to $10 billion per gigawatt of data center capacity. Arm is best known for designing CPU cores, the parts of a processor that carry out calculations. It also makes many of the other components necessary to build a chip. The company sells a mesh interconnect, a kind of miniature network that can link together multiple cores into a functioning CPU. The module is available alongside subsystems that perform supporting tasks such as regulating voltage levels. Arm licenses its CPU component designs to chipmakers that assemble them into processors. Until now, however, the company didn't sell a complete processor of its own. The launch of the AGI CPU marks Arm's entry into that market. "With the expansion into delivering production silicon with our Arm AGI CPU, we are giving partners more choices all built on Arm's foundation of high-performance, power-efficient computing, to support agentic AI infrastructure at global scale," said Arm Chief Executive Officer Rene Haas. The AGI CPU comprises two dies made using a 3-nanometer manufacturing process. The dies host up to 136 of Arm's latest Neoverse V3 server CPU cores. The core's instruction set, the language in which it expresses computations, includes extensions specifically optimized for AI workloads. Neoverse V3 also ships with multiple data protection features. A technology called RME enables it to run workloads in isolated, encrypted sections of a server's memory to block hacking attempts. The core can also automatically fix data errors that emerge in its cache. Each of the AGI CPU's cores includes 2 megabytes of L2 cache.
Arm optimized the chip for sustained performance, which means that it can maintain a consistent speed for extended time periods. Each of the AGI CPU's cores runs a single thread at a frequency of up to 3.7 gigahertz. Arm has released a reference server design for data center suppliers that plan to integrate the processor into their products. It's an air-cooled blade, or compact server, with two AGI CPUs. Data center operators can install up to 30 such blades in a standard air-cooled 36kW server rack for a total of 8,160 cores. Several hardware makers including Lenovo Group Ltd. are already shipping servers based on the AGI CPU. Arm says that its initial customer roster includes OpenAI Group PBC, Cloudflare Inc., Cerebras Systems Inc., SAP SE, Meta Platforms Inc. and several other companies. The Facebook parent is the AGI CPU's lead customer.
[18]
Arm Enters Merchant Silicon with 136-Core AGI Data Center CPU
Arm has formally entered the merchant silicon market with the launch of the AGI CPU, the first complete processor the company has ever sold under its own name. That is a notable change for a firm whose business has traditionally centered on licensing CPU architectures and core designs to other semiconductor vendors. With AGI CPU, Arm is no longer just enabling third-party server chips, it is now shipping one itself. The new processor is built on TSMC's 3 nm manufacturing node and is aimed squarely at AI-focused data center deployments. At the top end, the chip integrates up to 136 Neoverse V3 cores, with up to 2 MB of L2 cache per core and boost frequencies reaching 3.7 GHz. Arm also lists a 300 W TDP, making it a high-density server product designed for sustained throughput rather than low-power enterprise use. On the memory side, the platform supports twelve DDR5-8800 DIMMs for a maximum of 6 TB of RAM, while CXL 3.0 support allows for broader memory expansion and pooling. I/O is equally current, with 96 PCIe 6.0 lanes available for storage, networking, and accelerator connectivity. Arm is positioning the AGI CPU around the needs of AI inference and agentic AI infrastructure, where memory bandwidth, latency, and deterministic behavior are increasingly important. The company says the platform can deliver about 6 GB/s of memory bandwidth per core with latency below 100 ns, and it emphasizes a dedicated-core-per-thread approach intended to avoid idle threads and frequency collapse under sustained cloud workloads. Rather than marketing only a chip, Arm is also pitching a deployment model that scales efficiently at the rack level. To underline that point, Arm demonstrated multiple rack configurations based on the new processor. One example was a 36 kW air-cooled rack containing 30 blade servers and 60 CPUs, for a total of 8,160 cores. Another, developed with Supermicro, is a 200 kW liquid-cooled design capable of housing 336 CPUs and more than 45,000 cores. 
That makes it clear this launch is not limited to silicon alone; Arm is trying to establish a broader server platform ecosystem around AGI CPU. The strategic implications are difficult to ignore. By shipping its own finished data center CPU, Arm is moving into more direct competition with some of the same companies that license its technology to build custom server silicon. That includes hyperscale and semiconductor players such as Nvidia, Qualcomm, Amazon, Google, and Microsoft. Meta is the first major public customer and co-development partner, while Arm has also named Cloudflare, OpenAI, SAP, and SK Telecom among the companies associated with the rollout. For Arm, AGI CPU is more than a first chip; it is the start of a broader move into fully commercial infrastructure silicon.
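The published memory figures are internally consistent, which a few lines of arithmetic confirm. A minimal sketch, treating each of the twelve DDR5-8800 DIMMs as one standard 64-bit (8-byte) channel; that channel width is an assumption, not something the article states:

```python
# Sanity-check the published AGI CPU memory figures.
# Assumption: 12 DDR5 channels, 64-bit (8-byte) data path per channel.
CHANNELS = 12
TRANSFER_RATE_MT_S = 8800   # DDR5-8800: mega-transfers per second
BYTES_PER_TRANSFER = 8      # 64-bit channel (assumed)
CORES = 136

aggregate_gb_s = CHANNELS * TRANSFER_RATE_MT_S * BYTES_PER_TRANSFER / 1000
per_core_gb_s = aggregate_gb_s / CORES

print(f"aggregate: {aggregate_gb_s:.1f} GB/s")  # ~844.8 GB/s
print(f"per core:  {per_core_gb_s:.2f} GB/s")   # ~6.21 GB/s
```

Under that assumption the aggregate works out to roughly 845 GB/s, and dividing across 136 cores gives about 6.2 GB/s per core, matching Arm's "about 6 GB/s of memory bandwidth per core" claim.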
[19]
Arm releases first in-house chip in 35 years targeting AI data centers
Arm Holdings has introduced its first in-house chip, the Arm AGI CPU, marking a significant shift in its operational strategy after 35 years of exclusively licensing its designs. This new chip is designed for inference applications in AI data centers and has been developed using Arm's Neoverse family of CPU IP cores in collaboration with Meta, which is also the first customer for the product. The launch signifies Arm's entry into silicon production, positioning the company to compete directly with former partners. This move follows a developmental phase that began in 2023, with the chips now available for order. In addition to Meta, launch partners for the Arm AGI CPU include OpenAI, Cerebras, and Cloudflare. Arm emphasized the critical role of CPUs in modern infrastructure, stating that they handle essential tasks such as memory and storage management, workload scheduling, and data movement across systems. While GPUs have often been highlighted for their use in AI model training, Arm's focus on CPU production underscores its importance in efficiently operating distributed AI systems at scale. The announcement comes amid an ongoing CPU shortage affecting the market. Companies like Intel and AMD are experiencing longer product wait times, particularly in China. The shortage has contributed to rising computer prices, creating a growing demand for CPUs in the wake of increased reliance on AI technologies. Arm's transition into chip manufacturing is a notable deviation from its long-standing practice and reflects a response to evolving market demands and the competitive landscape of the semiconductor industry. The company is now aligning itself to address the challenges and opportunities presented by the shift toward more advanced computing solutions.
[20]
Meta, Arm to Develop a New Class of CPUs for AI Workloads
Meta and Arm announced a partnership on Tuesday to jointly develop a new class of CPUs that support heavy artificial intelligence (AI) workloads. These CPUs will be designed for data centres and large-scale AI deployments, and are not expected to be available to end consumers. The two companies also unveiled the Arm AGI CPU, the first chipset designed for Meta's data centres, and confirmed that several generations of chipsets will be co-developed to facilitate parallel, high-performance agentic workloads.

Meta, Arm to Develop AI Chipsets

In a newsroom post, the Menlo Park-based tech giant announced the partnership, detailing the focus areas. The collaboration will result in multiple generations of CPUs to address the massive compute demand. These chipsets will be optimised for data centres and large gigawatt-scale AI deployments, to help Meta bring new AI-powered innovations to users. The first release to come from the partnership is the Arm AGI CPU. Detailed in the chipmaker's blog post, the production-ready silicon was built on the Arm Neoverse platform. It has a dual-chiplet design with the memory and I/O on the same die, and a memory latency of sub-100 nanoseconds. It offers about 6 GB/s of memory bandwidth per core and a memory capacity of up to 6 TB per chip, and it supports frequencies of up to 3.7 GHz. "We worked alongside Arm to develop the Arm AGI CPU to deploy an efficient compute platform that significantly improves our data centre performance density and supports a multi-generation roadmap for our evolving AI systems," said Santosh Janardhan, Head of Infrastructure, Meta. Meta will serve as both the co-developer and the lead customer of the Arm AGI CPU, which the company says is designed to work alongside its Meta Training and Inference Accelerator (MTIA) silicon. Additionally, the chipset will be available to other AI players via Arm.
Meta, for its part, will release its board and rack designs for the CPU under the Open Compute Project later this year. Notably, apart from Meta, Arm's Neoverse platform also underpins custom silicon at large AI and cloud companies, including AWS Graviton, Google Axion, Microsoft Azure Cobalt, and Nvidia's Vera.
[21]
US stocks: Arm shares jump 15% as new AI chip to drive billions in annual revenue
U.S.-listed shares of Arm Holdings jumped nearly 12% in premarket trading on Wednesday after the chip firm projected billions of dollars in annual revenue from its own new artificial intelligence data-center chip. The new chip marks a pivot for Arm, which has traditionally relied on licensing its designs to companies such as Nvidia and Qualcomm and then collecting a royalty payment based on the number of units sold. Unlike current chips that are designed to respond to queries as part of a chatbot, Arm's AGI CPU will be able to handle data-crunching needs of "agentic AI", a system that acts on behalf of users with minimal oversight. Arm expects the data-center chip to generate roughly $15 billion in annual revenue in about five years, CEO Rene Haas said in an interview with Reuters. Overall, the company expects to generate revenue of $25 billion in that period, and annual earnings of $9 per share, he said. "Arm has not taken a baby step, say the production of a die or a chiplet for its customers; it has jumped in with both feet, developing the highly performing and energy efficient Arm AGI CPU," Citigroup analysts said. "The industry move to inference and, in particular, agentic AI is showing the need for more CPUs." The rise of "agentic AI" has already fueled stronger demand for similar chips, which are manufactured by companies like Intel and Advanced Micro Devices. Shares of Intel were up 3.4%, while AMD rose more than 1%. Arm is trading at 63.08 times analysts' estimates for the company's earnings for the next 12 months, compared with AMD's 26.64 and Intel's 71.27, according to data compiled by LSEG.
[22]
ARM Stock's Momentum Score Leaps Fueled By In-House AGI Chip Ambitions - ARM Holdings (NASDAQ:ARM)
Massive Surge In Technical Strength

The semiconductor designer recently saw its Benzinga Edge Stock Rankings momentum score more than double week-over-week, leaping from 19.64 to 52.09. This score evaluates a company's relative strength based on its price movement patterns and volatility over multiple timeframes, ranking it as a percentile against other stocks. The aggressive expansion has flipped Arm's near-term technicals. According to the Benzinga Edge price trend metrics, the stock is now in an upward trend in the short term, which covers the last couple of months, and in the medium term, reflecting the last couple of quarters. However, the long-term trend indicator shows the stock has been in a downward trend over the past year. Despite the positive momentum, Arm's value score sits at a remarkably low 3.89. Because the value metric evaluates a stock's relative worth by comparing its market price to fundamental measures of the company's performance, this low score indicates that the market is pricing Arm at a steep premium.

The $15 Billion AI Chip Catalyst

Arm Holdings Outperforms In 2026

ARM stock has returned 21.74% year-to-date, outpacing the loss of 6.34% in the Nasdaq Composite index during the same period. It was down 6.47% over the last six months, but up 8.10% over the year. The stock closed Tuesday 1.41% lower at $134.96, and was up 11.08% in Wednesday's premarket.
[23]
ARM Takes Matters Into Its Own Hands, Unveiling the 'AGI CPU' as Its First-Ever Silicon for Agentic AI
ARM made a massive announcement at its ARM Everywhere keynote: according to a new blog post, the firm will sell its own 'AGI CPU' for the first time. With agentic AI workloads, the CPU has started to become the next bottleneck for hyperscalers, which is why x86 solutions from Intel and AMD and ARM-based chips from NVIDIA have been gaining massive adoption among customers. In light of this, ARM has decided to capitalize on the momentum by introducing its first-ever chip, called the ARM AGI CPU, marking a shift from IP provider to end-to-end silicon vendor. According to ARM's CEO, Rene Haas, the move is targeted at fulfilling enterprise demand driven by agentic AI workloads: "Today marks the next phase of the Arm compute platform and a defining moment for our company. With the expansion into delivering production silicon with our Arm AGI CPU, we are giving partners more choices, all built on Arm's foundation of high-performance, power-efficient computing, to support agentic AI infrastructure at global scale." Diving into the specifics of the AGI CPU, we are looking at up to 136 Arm Neoverse V3 cores per CPU, offering 6 GB/s of memory bandwidth per core. The processor has 2 MB of L2 cache per core and runs at up to 3.7 GHz. As far as I/O specifications are concerned, you are looking at 96 PCIe Gen 6 lanes, along with CXL 3.0 memory expansion, allowing the processor to support "massively parallel, high-performance agentic workloads". In terms of rack-scale deployment, ARM offers ultra-thin 1OU (Open Unit) nodes, a shift away from multi-unit servers. A single chassis can host up to two nodes, providing a total of 272 cores per blade. The physical rack layout can house up to 30 of these blades, delivering a total of 8,160 cores. You are also looking at unified memory pools connected via the CXL 3.0 fabric, and each rack is rated to run at 36 kW with air cooling.
Given how prevalent CPU-only racks have become, ARM has designed its solution to meet market demand. ARM says its AGI CPU delivers "2 times higher" performance per rack compared to modern x86 solutions; while it offers no comparison against NVIDIA's Vera, we would expect the two to land close together, given their similar microarchitectures. The AGI CPU also lets vendors mix and match their rack-scale configurations, since ARM has opened up support for any accelerator (Cerebras, Groq, Meta MTIA) that fits into standard OCP server designs. This likely means the exclusive advantage NVIDIA previously drew from ARM's IP in this segment has now ended.
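The rack-level core counts above follow directly from the per-CPU spec, and the 36 kW rack rating implies a per-blade power budget comfortably above the CPUs' combined TDP. A quick sanity check; the per-blade wattage is a derived estimate covering memory, fans, and other overhead, not a figure ARM published:

```python
# Rack-scale arithmetic from the published AGI CPU figures:
# 136 cores per CPU, 2 CPUs per blade, 30 blades per 36 kW air-cooled rack.
CORES_PER_CPU = 136
CPUS_PER_BLADE = 2
BLADES_PER_RACK = 30
RACK_POWER_KW = 36
CPU_TDP_W = 300

cores_per_blade = CORES_PER_CPU * CPUS_PER_BLADE           # 272
cores_per_rack = cores_per_blade * BLADES_PER_RACK         # 8160
watts_per_blade = RACK_POWER_KW * 1000 / BLADES_PER_RACK   # rack budget per blade
cpu_share = CPUS_PER_BLADE * CPU_TDP_W / watts_per_blade   # TDP share of that budget

print(cores_per_blade, cores_per_rack)                      # 272 8160
print(f"{watts_per_blade:.0f} W/blade, CPU TDP is {cpu_share:.0%} of it")
```

The 272-cores-per-blade and 8,160-cores-per-rack figures check out, and at 1,200 W per blade the two 300 W sockets account for only about half the rack's rated power, leaving headroom for memory, I/O, and cooling.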
[24]
Arm jumps as new AI chip to drive billions in annual revenue
The new chip marks a pivot for Arm, which has traditionally relied on licensing its designs to companies such as Nvidia and Qualcomm and then collecting a royalty payment based on the number of units sold. US-listed shares of Arm Holdings jumped nearly 12% in premarket trading on Wednesday after the chip firm projected billions of dollars in annual revenue from its own new artificial intelligence data-center chip. Unlike current chips that are designed to respond to queries as part of a chatbot, Arm's AGI CPU will be able to handle data-crunching needs of "agentic AI", a system that acts on behalf of users with minimal oversight. Arm expects the data-center chip to generate roughly $15 billion in annual revenue in about five years, CEO Rene Haas said in an interview with Reuters. Overall, the company expects to generate revenue of $25 billion in that period, and annual earnings of $9 per share, he said. "Arm has not taken a baby step, say the production of a die or a chiplet for its customers; it has jumped in with both feet, developing the highly performing and energy efficient Arm AGI CPU," Citigroup analysts said. "The industry move to inference and, in particular, agentic AI is showing the need for more CPUs." The rise of "agentic AI" has already fueled stronger demand for similar chips, which are manufactured by companies like Intel and Advanced Micro Devices. Shares of Intel were up 3.4%, while AMD rose more than 1%. Arm is trading at 63.08 times analysts' estimates for the company's earnings for the next 12 months, compared with AMD's 26.64 and Intel's 71.27, according to data compiled by LSEG.
[25]
Arm's Big AI Chip Move With Meta Sends Stock Soaring - ARM Holdings (NASDAQ:ARM)
Arm Holdings Plc (NASDAQ:ARM) shares are rising in Wednesday's premarket session. The company announced a major expansion into production silicon with the launch of its first AI-focused data center CPU. This positions Arm to tap into the growing demand for AI infrastructure.

Expansion into Production Silicon

Arm Holdings has entered production silicon for the first time. The company launched its Arm AGI CPU for AI data centers, moving beyond IP and Compute Subsystems. The new Arm-designed AGI CPU targets agentic AI workloads. It delivers over 2x performance per rack compared to x86 platforms. It features up to 136 Arm Neoverse V3 cores, a 300-watt TDP, and high memory bandwidth. The CPU supports dense deployments with more than 45,000 cores per rack in liquid-cooled systems.

Rising AI Demand Driving CPU Needs

Arm said the shift addresses rising demand for CPUs as AI systems move toward continuously running agents. These agents require much greater compute capacity. Data centers may need more than four times the current CPU capacity per gigawatt, driving demand for more efficient, high-performance architectures. The company is expanding its platform to offer three options: licensing Arm IP, adopting Arm Compute Subsystems (CSS), or deploying Arm-designed silicon. The AGI CPU is expected to form the foundation for agentic AI infrastructure. It improves workload density, accelerator utilization, and power efficiency.

Ecosystem Support and Strategic Context

Meta Platforms Inc (NASDAQ:META) is the lead partner and co-developer. Other customers include Cerebras, Cloudflare, F5, OpenAI, SAP, and SK Telecom. Arm is also working with OEMs such as ASRock Rack, Lenovo, Quanta, and Supermicro. For decades, Arm's platform has focused on scalable, power-efficient computing across billions of devices. The move into silicon marks the next phase of that strategy. It extends Arm's architecture into data center infrastructure for AI at scale.
"Today marks the next phase of the Arm compute platform and a defining moment for our company," said CEO Rene Haas. "With the expansion into delivering production silicon with our Arm AGI CPU, we are giving partners more choices all built on Arm's foundation of high-performance, power-efficient computing, to support agentic AI infrastructure at global scale." Arm expects the new chip to generate about $15 billion in annual sales within five years, exceeding the revenue of its current operations.

Technical Analysis

Arm Holdings trades 19.7% above its 20-day SMA and 17.9% above its 100-day SMA, signaling strong short-term momentum. Shares have gained 8.59% over the past 12 months, reflecting steady long-term performance. The stock remains closer to its 52-week highs than lows, indicating a favorable broader trend. The RSI stands at 64.04, indicating neutral conditions with no signs of overbought or oversold levels. MACD shows a bullish signal at 3.1740, remaining above the signal line of 1.6370. This divergence suggests continued positive momentum in the stock. The combination of a neutral RSI and a bullish MACD suggests that while the stock is not overbought, it still has upward pressure.

Key Resistance: $159.00
Key Support: $125.00

ARM Stock Price Activity: ARM Holdings shares were up 12.81% at $152.25 during premarket trading on Wednesday, according to Benzinga Pro data.
[26]
Arm unveils AGI CPU for AI data centers and partners with Meta
Arm Holdings has announced the Arm AGI CPU, its first production silicon designed for AI data centers. This marks the first time the company is extending its compute platform beyond IP and Compute Subsystems (CSS) into full silicon products. At the same time, Arm confirmed a partnership with Meta, which will act as the lead partner and co-developer for the new CPU platform. Arm stated that its compute platform has powered billions of devices over the past three decades through its licensing model. With increasing demand for deploying Arm-based infrastructure at scale, the company is now expanding its strategy to include Arm-designed silicon products. This approach allows partners to choose between licensing Arm IP, adopting Arm CSS, or deploying complete Arm-designed CPUs. The Arm AGI CPU is built to support agentic AI workloads, where AI systems continuously run tasks involving reasoning, planning, and execution. Arm noted that this shift is increasing the volume of tokens processed across systems and driving higher demand for CPUs to handle coordination, reasoning, and data movement. As a result, data centers are expected to require more than four times the current CPU capacity per gigawatt, increasing the need for higher compute density within existing power limits. On deployment, Arm stated that the AGI CPU can deliver more than 2x performance per rack compared to x86 CPUs and may enable up to $10 billion in CAPEX savings per gigawatt of AI data center capacity. Meta will co-develop multiple generations of the Arm AGI CPU and use it to support its AI infrastructure. The company said its data centers are increasingly exceeding the capabilities of traditional CPUs as it builds systems for AI training and inference. The Arm AGI CPU will also work alongside Meta's custom silicon, the Meta Training and Inference Accelerator (MTIA), to improve coordination in large-scale AI deployments.
Meta added that the CPU will be used to support its applications and broader AI initiatives, and that it plans to release board and rack designs for the platform through the Open Compute Project later this year. Arm confirmed that the AGI CPU is supported by multiple partners across the ecosystem. The company is also working with OEMs and ODMs such as ASRock Rack, Lenovo, Quanta Computer, and Supermicro. Early systems are currently available, with broader availability expected in the second half of the year. Commenting on the launch, Arm CEO Rene Haas said: "AI has fundamentally redefined how computing is built and deployed. Agentic computing is accelerating that shift. Today marks the next phase of the Arm compute platform and a defining moment for our company. With the expansion into delivering production silicon with our Arm AGI CPU, we are giving partners more choices, all built on Arm's foundation of high-performance, power-efficient computing, to support agentic AI infrastructure at global scale." Commenting on the partnership, Santosh Janardhan, Head of Infrastructure, Meta, said:
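The gigawatt-scale framing above can be made concrete with a deliberately rough upper bound on socket count per gigawatt of IT power. This is illustrative only: it assumes the 300 W TDP is the entire per-socket budget and ignores memory, networking, cooling, and accelerators, so real deployments would fit far fewer CPUs:

```python
# Rough upper bound: how many 300 W AGI CPUs fit in 1 GW of IT power?
# Illustrative assumption: all power goes to CPU sockets (it never does in practice).
TDP_W = 300
CORES_PER_CPU = 136
GIGAWATT_W = 1_000_000_000

max_cpus = GIGAWATT_W // TDP_W
max_cores = max_cpus * CORES_PER_CPU

print(f"{max_cpus:,} CPUs")    # 3,333,333
print(f"{max_cores:,} cores")  # 453,333,288
```

Even as an over-generous ceiling, this shows why per-socket efficiency dominates the economics at gigawatt scale: every watt shaved off the TDP translates into thousands of additional sockets per facility.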
[27]
Arm shares rally as new AI chip to drive billions in annual revenue
Arm sparked a rally in shares of companies that make central processors on Wednesday with a prediction that its new data-center chip would bring in billions of dollars in annual revenue. Shares of the SoftBank Group-controlled company soared 20 per cent to their highest since November, while rivals Intel and Advanced Micro Devices also advanced more than five per cent each. Arm expects the data-center chip to generate roughly US$15 billion in annual revenue in about five years, CEO Rene Haas said in an interview with Reuters. Its forecast was the latest sign that rising use of AI technology that can create apps, write computer code and finish presentations with little human intervention would be a boon for CPU makers. The AI boom had so far mostly delivered gains for Nvidia, whose graphics processors are needed for training AI. Even Nvidia has been responding to the shift and earlier this month unveiled its own CPU chip for AI. For Arm, the new chip marks a departure for a company that has traditionally relied on licensing its designs to companies such as Nvidia and Qualcomm and then collecting a royalty payment based on the number of units sold. Unlike current chips that are designed to respond to queries on a chatbot, Arm said its AGI CPU will be able to handle data-crunching needs of "agentic AI." "Arm has not taken a baby step, say the production of a die or a chiplet for its customers; it has jumped in with both feet, developing the highly performing and energy efficient Arm AGI CPU," Citigroup analysts said. "The industry move to inference and, in particular, agentic AI is showing the need for more CPUs." Arm stock was last at $162.10 in morning trading, and poised to add more than $29 billion to its market value. HSBC analysts forecast that the combination of AGI CPU revenue and server CPU royalties will make fiscal year 2029 "the transitionary period where server CPUs take over smartphones as the dominant contributor" to Arm's overall revenue mix. 
Arm is trading at 63.08 times analysts' estimates for the company's earnings for the next 12 months, compared with AMD's 26.64 and Intel's 71.27, according to data compiled by LSEG.
[28]
Arm pivots strategy in move cheered by markets
Arm Holdings shares surged on Wednesday, emerging as the top performer on the Nasdaq 100. This rally follows the announcement of its first processor, the Arm AGI CPU. The group anticipates approximately $15bn in revenue by 2031 from this chip, with a gross margin exceeding 50%. This processor was designed specifically for artificial intelligence inference tasks, notably to address the rise of agentic AI. A strategic shift now better perceived The current enthusiasm contrasts with the market reaction during the initial signals of this strategic pivot. When publishing its Q1 2026 results, Arm had mentioned its intention to produce its own chips, an announcement that triggered a drop of more than 13% in the share price on July 31, 2025. Since then, the context has evolved, driven by the explosion in demand for processors adapted to new AI use cases. As Citi analyst Andrew Gardiner points out, these forecasts, deemed "well beyond the most optimistic scenarios," are accompanied by major launch partnerships, notably with Meta and OpenAI, as well as a list of other clients, confirming the existence of solid demand. Persistent questions regarding the model Despite this positive reception, some analysts remain cautious about this repositioning. Ross Seymore at Deutsche Bank believes this strategy could place Arm in direct competition with its historical clients. According to him, this change in model could weigh on profitability, with a possible erosion of margins and multiple compression, as the group is now compared more closely to players in the AI and computing sectors. This view is partially shared by Matt Bryson, an analyst at Wedbush, who calls for caution regarding the immediate success of this first generation of products. He notes, however, that the company seems to be taking the right approach by anticipating a gradual ramp-up, with several generations of chips before achieving true commercial success. 
At JP Morgan, Harlan Sur highlights challenges related to the supply chain. Short-term constraints could slow adoption, while long-term buy-in remains uncertain in an environment where many players, particularly in the cloud space, are already developing their own processors based on Arm architecture. He nevertheless points out that several structural partnerships already exist, such as those linking Meta to Nvidia and AMD.
[29]
Arm launches its first chip and shifts strategy with Meta as key client
Arm has unveiled its first internally developed chip, dubbed the AGI CPU, designed for artificial intelligence inference in data centers. Meta is one of the first customers, alongside several partners such as OpenAI, Cloudflare, and SAP. Until now, Arm positioned itself as a provider of chip designs, maintaining a neutral stance towards its clients, which include Apple, Nvidia, and Google. With this initiative, the group is now becoming a direct competitor to certain players within its own ecosystem. This evolution comes amid surging demand for computing power driven by AI. While GPUs still dominate certain use cases, CPUs are regaining importance with the rise of so-called agentic applications, which require more general-purpose computing. Arm is highlighting the energy efficiency of its architecture, claiming its processor could offer up to twice the performance per watt compared to systems based on x86 architecture. The chip is currently being produced by TSMC using 3-nanometer technology, with a production ramp-up expected during the year. This launch takes place in a market estimated at nearly $1 trillion, where competition is intensifying. For Arm, this is a strategic bet aimed at capturing more value within the semiconductor supply chain, at the risk of redefining its relationships with long-standing partners.
Arm Holdings is making its own chips for the first time in its 35-year history, unveiling the AGI CPU designed for AI workloads in data centers. Meta is the first customer for the 136-core processor, which Arm says will deliver superior performance per watt compared to traditional x86 platforms. The move shifts Arm from pure IP licensing to direct silicon sales.
Arm Holdings announced Tuesday it is selling its own chips for the first time, marking a historic shift for the company that has spent 35 years exclusively licensing semiconductor technology to partners like Nvidia and Apple [1]. At an event in San Francisco, CEO Rene Haas unveiled the Arm AGI CPU, a production-ready processor built specifically for AI workloads in data centers [2]. The chip packs up to 136 Neoverse V3 cores running at up to 3.2 GHz all-core and 3.7 GHz boost, all within a 300-watt TDP [3]. Built on TSMC's 3nm process, the AGI CPU represents Arm's departure from its long-standing IP licensing model and positions the company as a direct competitor to chipmakers it once only partnered with [5].
Meta is the first customer for the AGI CPU and served as the lead partner in developing the silicon [1]. Santosh Janardhan, head of infrastructure at Meta, said the two companies worked together on the chip and are committed to a multi-generation roadmap [3]. The social networking giant plans to deploy the AGI CPU alongside its custom MTIA accelerators to improve data center performance density [5]. Beyond Meta, Arm confirmed commercial commitments from OpenAI, Cerebras, Cloudflare, F5, SAP, Rebellions, and SK Telecom [3]. Sachin Katti, head of industrial compute at OpenAI, said the AGI CPU will strengthen the orchestration layer that coordinates large-scale AI workloads [3]. Off-the-shelf systems using the chip are available now from sellers such as Quanta Computer and Super Micro Computer, with full production availability expected in the second half of this year [5].

Arm designed the AGI CPU specifically for what it calls "agentic AI infrastructure," the CPU-side orchestration work required to coordinate accelerators and manage data movement in large-scale AI deployments [3]. Rene Haas has high hopes for agentic AI to accelerate Arm's data center business, predicting the company's data center silicon will catapult its total addressable market to more than $100 billion by the end of the decade [4]. These figures are based on the belief that agentic frameworks will quadruple the demand for CPU cores [4]. While the models powering agentic systems run on specialized accelerators, the agents themselves run on processors and need additional compute and memory resources to execute code generated by models to automate tasks [4]. The chip supports 12 channels of DDR5 memory at up to 8800 MT/s, delivering more than 800 GB/s of aggregate memory bandwidth with a target of sub-100ns latency [3].
Arm executives emphasized the company's history of designing energy-efficient chips and claimed the AGI CPU will be the world's "most efficient agentic CPU on the market" [2]. Compared to competitors like the latest x86 platforms made by Intel and AMD, Arm says this chip will deliver better performance per watt and could save customers billions of dollars in electricity spending [2]. The company claims the AGI CPU delivers more than two times the performance per rack compared to the latest x86 platforms, though this figure is based on internal estimates rather than independent benchmarks [3]. Mohamed Awad, Arm's EVP of Cloud AI, argues that the company's AGI CPU is better suited to agentic tasks thanks to a streamlined core that foregoes extraneous functionality and doesn't rely on simultaneous multithreading [4].
Arm's increasing reach is a direct threat to Intel and AMD's x86 data center products, though Haas argues the market is "plenty big enough for multiple players" [5]. The company faces competition from multiple fronts. AMD's latest Epyc processors, due out later this year, will offer up to 256 cores, nearly twice the core count of Arm's new chip even with simultaneous multithreading turned off [4]. Nvidia just introduced a new CPU lineup targeting the same category, though Haas said his chip aims at a different part of the market. The move also risks complicating Arm's relationships with customers, as most of the biggest buyers of data center silicon have their own in-house chip programs that license technology from Arm [5]. Ben Bajarin, CEO and principal analyst at Creative Strategies, notes that Arm could be perceived more as a competitor than partner as its strategy evolves [2].

Arm said the AGI CPU product line will continue in parallel with the Arm Neoverse CSS product roadmap, with follow-on products already committed [3]. To stay competitive, Arm will be releasing new chips as early as next year, with a third-gen AGI CPU already under development [4]. The company decided to make the new chip because customers asked for it, with Haas stating the product is "not only compelling—but we actually have customers who are lined up to buy it" [5]. The shift helps Arm benefit from bigger-ticket purchases, as even the most expensive smartphone chips cost tens of dollars while the highest-end data center silicon can run in the tens of thousands [5]. Under Haas, Arm has increased revenue by more than 20% a year, with annual sales topping $4 billion for the first time in 2025 [5].
[2]
[3]
[4]