2 Sources
[1]
Nvidia-backed Enfabrica releases system aimed at easing memory costs
SAN FRANCISCO, July 29 (Reuters) - Enfabrica, a Silicon Valley-based chip startup working on solving bottlenecks in artificial intelligence data centers, on Tuesday released a chip-and-software system aimed at reining in the cost of memory chips in those centers. Enfabrica, which has raised $260 million in venture capital to date and is backed by Nvidia (NVDA.O), released a system it calls EMFASYS, pronounced like "emphasis."

The system aims to address the fact that a portion of the high cost of flagship AI chips from Nvidia or rivals such as Advanced Micro Devices (AMD.O) is not the computing chips themselves, but the expensive high-bandwidth memory (HBM) attached to them that is required to keep those speedy computing chips supplied with data. Those HBM chips are supplied by makers such as SK Hynix (000660.KS) and Micron Technology (MU.O).

The Enfabrica system uses a special networking chip that it has designed to hook the AI computing chips up directly to boxes filled with another kind of memory chip called DDR5 that is slower than its HBM counterpart but much cheaper. By using special software, also made by Enfabrica, to route data back and forth between AI chips and large amounts of lower-cost memory, Enfabrica is hoping its chip will keep data center speeds up but costs down as tech companies ramp up chatbots and AI agents, said Enfabrica Co-Founder and CEO Rochan Sankar.

Sankar said Enfabrica has three "large AI cloud" customers using the chip but declined to disclose their names. "It's not replacing" HBM, Sankar told Reuters. "It is capping (costs) where those things would otherwise have to blow through the roof in order to scale to what people are expecting."

Reporting by Stephen Nellis in San Francisco; Editing by Jamie Freed
[2]
Nvidia-backed Enfabrica releases system aimed at easing memory costs
Enfabrica, a Silicon Valley startup, has introduced EMFASYS, a chip-and-software system designed to reduce memory costs in AI data centers by utilizing cheaper DDR5 memory alongside expensive high-bandwidth memory.
Enfabrica, a Silicon Valley-based chip startup, has unveiled a chip-and-software system called EMFASYS, aimed at addressing the escalating cost of memory in artificial intelligence (AI) data centers. The company, which has secured $260 million in venture capital funding and counts Nvidia among its backers, is tackling one of the most pressing challenges in the AI industry [1].
The development of EMFASYS is driven by the recognition that a significant portion of the expense associated with flagship AI chips from industry leaders like Nvidia and Advanced Micro Devices (AMD) is not the computing chips themselves. Instead, it's the costly high-bandwidth memory (HBM) attached to these chips that drives up prices. HBM is essential for keeping the high-speed computing chips supplied with data, and is typically provided by manufacturers such as SK Hynix and Micron Technology [2].
Enfabrica's EMFASYS system introduces a novel approach to memory management in AI data centers. The system employs a specially designed networking chip that directly connects AI computing chips to boxes filled with DDR5 memory chips. While DDR5 is slower than HBM, it is significantly less expensive, potentially offering substantial cost savings for data center operators [1].
A key component of the EMFASYS system is Enfabrica's proprietary software. This software efficiently routes data between AI chips and large volumes of lower-cost memory, aiming to maintain data center speeds while reducing overall costs. This approach is particularly relevant as tech companies continue to scale up their chatbot and AI agent capabilities [2].
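To make the memory-tiering idea concrete, below is a minimal, purely illustrative sketch in Python of a two-tier memory pool: a small fast tier stands in for HBM, a larger cheap tier stands in for fabric-attached DDR5, and least-recently-used data spills to the cheap tier and is promoted back on access. The class name, capacities, and LRU policy are assumptions for illustration only; the reporting does not describe Enfabrica's actual placement policy or software interface.

```python
# Illustrative sketch only: a toy two-tier memory manager that mimics the general
# idea described above -- keep hot data in a small, fast pool (analogous to HBM)
# and spill colder data to a larger, cheaper pool (analogous to fabric-attached
# DDR5). Names, capacities, and the LRU policy are assumptions, not Enfabrica's
# actual EMFASYS design or API.
from collections import OrderedDict


class TieredMemory:
    def __init__(self, fast_capacity: int):
        self.fast_capacity = fast_capacity  # number of objects the fast tier can hold
        self.fast = OrderedDict()           # small, fast tier (HBM-like), LRU-ordered
        self.slow = {}                      # large, cheaper tier (DDR5-like)

    def put(self, key, value):
        """Store a value, spilling the least recently used item to the slow tier."""
        self.fast[key] = value
        self.fast.move_to_end(key)
        if len(self.fast) > self.fast_capacity:
            evicted_key, evicted_value = self.fast.popitem(last=False)
            self.slow[evicted_key] = evicted_value

    def get(self, key):
        """Fetch a value, promoting it back to the fast tier if it was spilled."""
        if key in self.fast:
            self.fast.move_to_end(key)
            return self.fast[key]
        value = self.slow.pop(key)          # stands in for a fetch over the fabric
        self.put(key, value)                # promote back into the fast tier
        return value


# Usage: cache four data blocks with only two fast slots available.
mem = TieredMemory(fast_capacity=2)
for i in range(4):
    mem.put(f"block_{i}", bytes(16))
print(sorted(mem.fast))  # the two most recently used blocks stay in the fast tier
print(sorted(mem.slow))  # older blocks spill to the cheaper tier
```

In this toy model the application always reads and writes through the fast tier; the point of the cheaper tier is that capacity can grow without buying more of the expensive fast memory, which mirrors the cost argument described above.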
Enfabrica Co-Founder and CEO Rochan Sankar emphasized that EMFASYS is not intended to replace HBM entirely. Instead, it aims to cap costs that would otherwise "blow through the roof" as AI applications scale up to meet growing demand. The company has already secured three "large AI cloud" customers, although their identities remain undisclosed [1].
The introduction of EMFASYS could have far-reaching implications for the AI industry. By potentially reducing one of the most significant cost factors in AI data centers, Enfabrica's solution may enable more companies to scale their AI operations cost-effectively. This development could accelerate the adoption and deployment of AI technologies across various sectors, fostering innovation and competition in the rapidly evolving field of artificial intelligence.
Summarized by Navi