5 Sources
[1]
Nvidia AI challenger Groq announces European expansion -- Helsinki data center targets burgeoning AI market
Positioning itself as leaner and faster at inference workloads, Groq is hoping to outcompete Nvidia at its own game.

American AI hardware and software firm Groq (not to be confused with Elon Musk's AI venture, Grok) has announced it is establishing its first data center in Europe as part of its push into the rapidly expanding EU AI market, as reported by CNBC. The company is looking to capture a sizeable portion of the inference market, leveraging its Language Processing Unit (LPU), an application-specific integrated circuit (ASIC) that it claims delivers faster, more efficient inference than GPU-driven alternatives.

"We decided about four weeks ago to build a data center in Helsinki, and we're actually unloading racks into it right now," Groq CEO Jonathan Ross said in his interview with CNBC. "We expect to be serving traffic to it by the end of this week. That's built fast, and it's a very different proposition than what you see in the rest of the market."

It's that speed and efficiency of hardware and operation that Ross believes will give Groq an edge in a market currently dominated by Nvidia. While the established graphics card manufacturer has cornered the market for AI training hardware, Ross believes Groq is well positioned to take over the day-to-day running of those models by powering the inference calculations that allow them to function so effectively.

"Inference tends to be a higher volume, but lower margin business," he explained, suggesting Groq was happy to take that on. He even suggested Nvidia's shareholders would be happy, because it would help propel the industry forward, benefiting Nvidia in the long term. However, he pulled no punches in going after Nvidia as competition for that inference business. Although complimentary of the power and impressive capabilities of Nvidia GPUs, Ross argued that those general-purpose chips weren't designed with AI in mind. Groq's LPUs are.
LPUs are ASICs designed specifically for AI inference. They use on-chip memory to reduce latency, and they are built to handle the linear algebra that large language models require. Ross claims this makes Groq's LPUs both fast and efficient, using around a third of the power of traditional GPU designs to produce the same output.

Ross isn't just banking on the speed of his company's hardware, though; he's banking on its speed of responsiveness. "[Nvidia CEO] Jensen said at GTC, if you want to get GPUs in two years, you need to put your [Purchase Orders] in now [...] That's just a ridiculous requirement," he said. "For us, it's about six months, which is a fourth of the time, and that totally changes your ability to make predictions about what you need."

"Nvidia can only build as many GPUs as it's looking to build this year, because it uses very exotic components like HBM [...] We don't use any of that, and so we're not as supply limited, and that's really important for inference."

Backed by major investments from Samsung and Cisco, Groq's new data center is being built in partnership with American data center firm Equinix. That will help it get up and running faster and provide opportunities for further expansion. Nvidia has its own long list of AI partners building what CEO Jensen Huang called "AI factories" all over the world, promising hundreds of billions of dollars of investment.

It feels a little like companies are taking sides, and that may be as important on the software front as it is for the hardware. Nvidia has dominated much of the professional graphics card space for some time thanks to its CUDA software stack, which has become part of many early AI development toolkits built around its CUDA-X platform, in addition to the sheer size and scale of the overall business. But even with Nvidia's gargantuan position in the market, Groq appears confident it can compete.
With its LPUs using a much simpler design than Nvidia GPUs, and leveraging a generic compiler, Groq claims its hardware and software can be leaner, faster, and just as capable, if not more so. Ross claimed Groq was ready for its own smaller, upstart competition too, suggesting that even as hot as the AI inference industry is, Groq should come out ahead of startups beginning to build alternative offerings. "You'll get lots of startups popping up, but building an AI chip is expensive. You're going to be spending anywhere from a quarter to half a billion dollars to get that thing to market, and you can't fund everyone to do that," Ross continued.

When the interviewer pressed him on staff retention, highlighting the problems OpenAI had recently faced with Meta poaching staff with large signing bonuses, Ross appeared as unconcerned as Sam Altman (despite losing researchers to Meta's efforts). "I think we've had an easier time finding and retaining talent, because we're a little adjacent to the AI research space [...] That said, it is a hot industry, and there is a lot of pull for the best talent. In our case, I think a lot of people view us as having a very high growth trajectory to be successful, and they'd like to be on that path, so people join us very much for the equity and growth potential."

Regardless of any potential staffing issues, Groq is likely to receive a warm welcome in Europe, with many countries in the EU and the UK looking to invest heavily in AI inference data centers in the coming years as they look to compete with US and Chinese efforts. If this data center can truly start serving customers within weeks of decision makers launching the endeavour, it's unlikely to be the last. However, the pace and scale of adoption will remain crucial to the company's future success. Nvidia made $35.6 billion in data center hardware alone last quarter, while AMD made $3.7 billion.
If Groq wants to be a meaningful competitor in the space, it needs to appeal to customers and companies looking for ASIC solutions for AI workloads, a very specific subsection of the market. Groq will also need an answer on how well its hardware scales, as potential hyperscalers looking for an ASIC solution will be watching the company closely. That's in tandem with potential competition from other, more established ASIC businesses such as Broadcom, Marvell, and MediaTek. If Groq proves itself, it could join the same cohort of indirect Nvidia competitors looking for a slice of the extremely hot AI pie.
[2]
AI chip startup Groq expands with first European data center
Jonathan Ross, chief executive officer of Groq Inc., during the GenAI Summit in San Francisco, California, US, on Thursday, May 30, 2024.

Artificial intelligence semiconductor startup Groq announced Monday it has established its first data center in Europe as it steps up its international expansion. Groq, which is backed by investment arms of Samsung and Cisco, said the data center will be located in Helsinki, Finland, and is in partnership with Equinix.

Groq is looking to take advantage of rising demand for AI services in Europe, following other U.S. firms that have also ramped up investment in the region. The Nordics in particular is a popular location for data facilities, as the region has easy access to renewable energy and cooler climates. Last month, Nvidia CEO Jensen Huang was in Europe and signed several infrastructure deals, including data centers.

Groq, which is valued at $2.8 billion, designs a chip that the company calls a language processing unit (LPU). It is designed for inferencing rather than training. Inferencing is when a pre-trained AI model interprets live data to come up with a result, much like the answers produced by popular chatbots.

While Nvidia has a stranglehold on the chips required for training huge AI models with its graphics processing units (GPUs), there is a swathe of startups hoping to take a slice of the pie when it comes to inferencing. SambaNova; Ampere, a company SoftBank is in the process of purchasing; Cerebras; and Fractile are all looking to join the AI inference race.

European politicians have been pushing the notion of sovereign AI, where data centers must be located in the region. Data centers located closer to users also help improve the speed of services. Global data center builder Equinix connects different cloud providers together, such as Amazon Web Services and Google Cloud, making it easier for businesses to have multiple vendors.
Groq's LPUs will be installed inside the Equinix data center, allowing businesses to access Groq's inference capabilities via Equinix. Groq currently operates data centers running its technology in the U.S., Canada, and Saudi Arabia.
[3]
AI chipmaker Groq opens first European data centre in Finland
The nine-year-old start-up has raised more than $2.5bn, backed by VCs including Access Ventures, AJI Capital and APL Ventures.

Artificial intelligence semiconductor start-up Groq is establishing its first European data centre in Helsinki, Finland, to meet the region's growing hunger for AI. The start-up offers AI inference capabilities, the critical tech that underpins today's generative AI boom. The company, known for its low-latency AI chips, was founded in 2016 by former Alphabet AI chip engineer Jonathan Ross. Start-ups such as Groq, along with competitors including Ampere, are growing in an AI chip sector dominated by the likes of Nvidia.

The company's entry into Europe is in collaboration with Equinix, a digital infrastructure company with a strong presence in Ireland. The Finnish data centre follows another data centre the two companies built together in Dallas, US.

In 2024, 13.5pc of EU-based enterprises with 10 or more employees used AI, up from 8pc the year before, while more than 41pc of large enterprises used the technology. Many companies, especially larger firms, are making a conscious shift towards AI, with government policies also recognising the importance of using the tech to improve returns and grow.

The new European footprint brings AI inference capacity closer to users across Europe, Groq said, which is expected to offer lower latency, faster response times at scale and stronger data governance for its clients. The company's global network, spread across the US, Canada and the Kingdom of Saudi Arabia, serves more than 20m tokens per second, the company added. According to reports, the company is valued at $2.8bn.
"As demand for AI inference continues at an ever-increasing pace, we know that those building fast need more - more capacity, more efficiency and with a cost that scales," said Ross, who is also the CEO of the company.

Regina Donato Dahlström, the managing director for the Nordics at Equinix, said that the Nordics is a "great place" to host AI infrastructure. "With its sustainable energy policies, free cooling and reliable power grid, Finland is a standout choice for hosting this new capacity."

According to the International Energy Agency, data centres, both AI-driven and non-AI-driven, could use 80pc more energy in 2026 than in 2022. In addition, electricity consumption in AI-driven data centres is expected to rise by 90TWh, or around 4pc of the EU's current electricity consumption. The EU states that AI-focused data centres tend to cluster in particular geographical locations, contribute to pressure on local grids, and could involve trade-offs with climate goals, land use and energy affordability. Meanwhile, pressure on Ireland's grid is already on the rise, with data centres using 22pc of the country's electricity in 2024.
[4]
AI chip firm Groq expands to Europe
Groq, an artificial intelligence semiconductor startup, announced Monday the establishment of its first European data center in Helsinki, Finland, in partnership with Equinix, as part of its international expansion efforts. The move allows Groq to capitalize on growing demand for AI services in Europe, following similar investments by other U.S. firms. The Nordic region is desirable for data facilities because of its access to renewable energy and cooler climates. Last month, Nvidia CEO Jensen Huang signed several infrastructure deals in Europe, including data center agreements.

Groq, valued at $2.8 billion and backed by investment arms of Samsung and Cisco, designs language processing units (LPUs) intended for inferencing rather than training. Inferencing involves a pre-trained AI model interpreting live data to produce results, such as those from chatbots. While Nvidia dominates the market for chips required to train AI models using graphics processing units (GPUs), Groq is among several startups, including SambaNova, Ampere (which SoftBank is acquiring), Cerebras, and Fractile, aiming to gain market share in AI inference.

Groq sparks LPU vs GPU face-off

According to CEO Jonathan Ross, Groq is seeking to differentiate itself from competitors, including Nvidia, in several areas. Ross stated in a Monday interview with CNBC that Nvidia chips use expensive components, such as high-bandwidth memory, with limited suppliers. In contrast, Groq's LPUs do not use these components, and its supply chain is primarily based in North America.

"We're not as supply limited, and that's important for inference, which is very high volume, low margin," Ross told CNBC's "Squawk Box Europe." Ross added, "And the reason that we're so good for Nvidia's shareholders is, we're happy to take that high volume but lower margin business and let others focus on the high-margin training."
Ross also emphasized Groq's rapid deployment capabilities, noting that the company decided to build the Helsinki data center just four weeks before unloading server racks at the location. "We expect to be serving traffic starting by the end of this week. That's built fast and so it's a very different proposition from what you see in the rest of the market," Ross said.

European politicians have advocated for sovereign AI, which requires data centers to be located within the region. Data centers closer to users can also improve service speeds. Equinix, a global data center builder, connects cloud providers like Amazon Web Services and Google Cloud, allowing businesses to use multiple vendors. Groq's LPUs will be installed in the Equinix data center, providing businesses access to Groq's inference capabilities through Equinix. Groq currently operates data centers using its technology in the U.S., Canada, and Saudi Arabia.
[5]
Nvidia's AI Chip Rival Groq Launches First European Data Center - NVIDIA (NASDAQ:NVDA)
Groq, a U.S.-based startup specializing in AI chips, has launched its first European data center in Helsinki, Finland, as part of the company's international expansion and its goal of meeting rising demand for AI services across Europe.

The Details: Groq's new Helsinki location was developed in partnership with Equinix, a major global data center provider. Finland was chosen due to factors that make it ideal for large-scale data centers, including reliable energy infrastructure, an abundance of renewable energy and a cool climate, according to CNBC. The decision to build the Helsinki data center was made just four weeks before its launch, highlighting Groq's ability to move quickly. The company expects to start serving customers almost immediately.

Why It Matters: Groq, which is valued at $2.8 billion and backed by investors such as Samsung and Cisco, is positioning itself as a challenger to established players like NVIDIA Corp. (NASDAQ: NVDA) in the AI inference market. Groq specializes in inferencing, the process of running live data through pre-trained AI models. Inferencing is less demanding than training new AI models, a process that usually relies on Nvidia's GPUs due to its high computational requirements. Unlike some competitors, Groq's chips do not rely on expensive, hard-to-source components like high-bandwidth memory, and most of its supply chain is based in North America, making Groq less vulnerable to global supply disruptions and tariffs. Groq already operates data centers in the United States, Canada and Saudi Arabia. The new Helsinki center strengthens its presence in Europe and supports its goal of delivering fast, affordable AI inference capabilities worldwide.
Groq, an AI chip startup, has established its first European data center in Helsinki, Finland, aiming to compete with Nvidia in the AI inference market. The company's rapid deployment and efficient chip design could reshape the AI hardware landscape.
Groq, an American AI hardware and software firm, has announced the establishment of its first European data center in Helsinki, Finland [1][2][3]. This move marks a significant step in the company's international expansion and its bid to compete in the rapidly growing AI industry, particularly in the European market.
Jonathan Ross, CEO of Groq, highlighted the company's ability to move quickly, stating that the decision to build the Helsinki data center was made just four weeks prior to its launch [1][4]. This rapid deployment capability sets Groq apart from its competitors, with Ross claiming, "We expect to be serving traffic starting by the end of this week. That's built fast and so it's a very different proposition from what you see in the rest of the market" [4].
At the heart of Groq's offering is its Language Processing Unit (LPU), an application-specific integrated circuit (ASIC) chip designed specifically for AI inference calculations [1]. The company claims that its LPUs are both faster and more efficient than traditional GPU designs, using around a third of the power to produce the same output [1].
While Nvidia dominates the market for chips used in training AI models, Groq is positioning itself to capture a significant portion of the inference market [2]. Ross believes that Groq's hardware and software can be leaner, faster, and potentially more capable than Nvidia's offerings [1].
Groq's European expansion is being carried out in partnership with Equinix, a global data center provider [2][3]. The company is backed by major investors, including Samsung and Cisco, and is valued at $2.8 billion [2][5]. This financial backing and strategic partnership provide Groq with the resources and infrastructure needed to compete on a global scale.
The choice of Helsinki for Groq's first European data center is strategic. The Nordic region is known for its access to renewable energy and cooler climates, making it ideal for data center operations [2][3]. Additionally, European politicians have been advocating for "sovereign AI," which requires data centers to be located within the region [2].
Groq is entering a competitive field, with other startups like SambaNova, Ampere, Cerebras, and Fractile also vying for a share of the AI inference market [2]. However, Groq believes its unique approach to chip design and rapid deployment capabilities will give it an edge.
As demand for AI services continues to grow, particularly in Europe, Groq's expansion could potentially reshape the AI hardware landscape. The company's focus on efficient, fast-deploying inference solutions positions it well to meet the needs of businesses looking to implement AI technologies quickly and cost-effectively [1][4][5].