4 Sources
[1]
OpenAI turns to Google's AI chips to power its products, The Information reports
June 27 (Reuters) - OpenAI has recently begun renting Google's (GOOGL.O) artificial intelligence chips to power ChatGPT and other products, The Information reported on Friday, citing a person involved in the arrangement.

The move, which marks the first time OpenAI has used non-Nvidia chips in a meaningful way, shows the Sam Altman-led company's shift away from relying on backer Microsoft's (MSFT.O) data centers, potentially boosting Google's tensor processing units (TPUs) as a cheaper alternative to Nvidia's (NVDA.O) graphics processing units (GPUs), the report said.

As one of the largest purchasers of Nvidia's GPUs, OpenAI uses AI chips to train models and for inference computing, a process in which an AI model uses its trained knowledge to make predictions or decisions based on new information. OpenAI hopes the TPUs, which it rents through Google Cloud, will help lower the cost of inference, according to the report. However, Google, an OpenAI competitor in the AI race, is not renting its most powerful TPUs to its rival, The Information said, citing a Google Cloud employee. Neither OpenAI nor Google immediately responded to Reuters requests for comment.

OpenAI planned to add Google Cloud service to meet its growing needs for computing capacity, Reuters had exclusively reported earlier this month, marking a surprising collaboration between two prominent competitors in the AI sector.

For Google, the deal comes as it is expanding external availability of its in-house TPUs, which were historically reserved for internal use. That has helped Google win customers including Big Tech player Apple (AAPL.O) as well as startups like Anthropic and Safe Superintelligence, two OpenAI competitors launched by former OpenAI leaders.

Reporting by Juby Babu in Mexico City; Editing by Alan Barona
[2]
OpenAI Said to Turn to Google's AI Chips to Power Its Products
These AI chips will be used to power the company's various services.

OpenAI has recently begun renting Google's artificial intelligence chips to power ChatGPT and its other products, a source close to the matter told Reuters on Friday.

The ChatGPT maker is one of the largest purchasers of Nvidia's graphics processing units (GPUs), using the AI chips to train models and also for inference computing, a process in which an AI model uses its trained knowledge to make predictions or decisions based on new information. OpenAI planned to add Google Cloud service to meet its growing needs for computing capacity, Reuters had exclusively reported earlier this month, marking a surprising collaboration between two prominent competitors in the AI sector.

For Google, the deal comes as it is expanding external availability of its in-house tensor processing units (TPUs), which were historically reserved for internal use. That helped Google win customers including Big Tech player Apple as well as startups like Anthropic and Safe Superintelligence, two OpenAI competitors launched by former OpenAI leaders.

The move to rent Google's TPUs signals the first time OpenAI has used non-Nvidia chips meaningfully and shows the Sam Altman-led company's shift away from relying on backer Microsoft's data centers. It could potentially boost TPUs as a cheaper alternative to Nvidia's GPUs, according to The Information, which reported the development earlier. OpenAI hopes the TPUs, which it rents through Google Cloud, will help lower the cost of inference, according to the report. However, Google, an OpenAI competitor in the AI race, is not renting its most powerful TPUs to its rival, The Information said, citing a Google Cloud employee. Google declined to comment, while OpenAI did not immediately respond to Reuters when contacted.
Google's addition of OpenAI to its customer list shows how the tech giant has capitalized on its in-house AI technology from hardware to software to accelerate the growth of its cloud business.
[3]
OpenAI turns to Google's AI chips to power its products: The Information - The Economic Times
OpenAI has recently begun renting Google's artificial intelligence chips to power ChatGPT and other products, The Information reported on Friday, citing a person involved in the arrangement.

The move, which marks the first time OpenAI has used non-Nvidia chips in a meaningful way, shows the Sam Altman-led company's shift away from relying on backer Microsoft's data centres, potentially boosting Google's tensor processing units (TPUs) as a cheaper alternative to Nvidia's graphics processing units (GPUs), the report said.

As one of the largest purchasers of Nvidia's GPUs, OpenAI uses AI chips to train models and also for inference computing, a process in which an AI model uses its trained knowledge to make predictions or decisions based on new information. OpenAI hopes the TPUs, which it rents through Google Cloud, will help lower the cost of inference, according to the report. However, Google, an OpenAI competitor in the AI race, is not renting its most powerful TPUs to its rival, The Information said, citing a Google Cloud employee. Neither OpenAI nor Google immediately responded to Reuters requests for comment.

OpenAI planned to add Google Cloud service to meet its growing needs for computing capacity, Reuters had exclusively reported earlier this month, marking a surprising collaboration between two prominent competitors in the AI sector. For Google, the deal comes as it is expanding external availability of its in-house TPUs, which were historically reserved for internal use. That helped Google win customers including Big Tech player Apple as well as startups like Anthropic and Safe Superintelligence, two OpenAI competitors launched by former OpenAI leaders.
[4]
NVIDIA Might Be Losing AI Dominance As OpenAI Shifts To Google's Chips - Report
This is not investment advice. The author has no position in any of the stocks mentioned. Wccftech.com has a disclosure and ethics policy.

Google has managed to win away some of OpenAI's AI computing requirements from NVIDIA, suggests a new report from The Information. The report, which quotes a single source, shares that OpenAI has turned to Google to rely on the latter's tensor processing units (TPUs) to power its AI products. OpenAI has primarily been believed to rely on Microsoft, Oracle and others for access to NVIDIA's GPUs to train and run its AI software. According to the report, OpenAI is interested in lowering its operating costs by using Google's TPUs, though Google has not rented out its latest TPUs to its AI rival.

Google launched its seventh-generation TPUs in April, which it claimed were its first AI chips designed for inference. The TPUs made quite a splash in 2024 when Apple revealed that it had used the firm's TPUs to train its Apple Intelligence platform. However, training is just one part of the AI equation, as AI chips rely on inference to generate outputs in response to user queries.

According to The Information, OpenAI is using Google's TPUs to power ChatGPT and other AI products. The shift has been recent, and the publication quotes a single anonymous source in its report. OpenAI is using Google's chips instead of NVIDIA's to reduce costs, as the firm has faced heavy usage lately, particularly after introducing a new image generation feature earlier this year. The Information also adds that Google is purportedly seeking to offer its TPU chips to cloud computing infrastructure providers. NVIDIA's high prices and the short supply of its GPUs have opened up a market for alternative chips that can offer users lower costs. If Google is able to convince the infrastructure providers to adopt its chips as well, it could chip away at NVIDIA's market share as supply continues to be constrained.
Google also relies on its own chips to train its AI models. The firm is one of the most diverse AI companies in the industry, as it operates across both the hardware and software stacks. Along with its TPUs, Google also offers its Gemini AI model and has been busy integrating Gemini across its platforms, such as Gmail and Google Search. OpenAI's reliance on Microsoft and Oracle has meant that it has relied exclusively on NVIDIA's GPUs. Oracle has positioned itself to hold one of the largest inventories of NVIDIA GPUs in the industry. If OpenAI relies on TPUs from Google, then Google will have managed to break NVIDIA's near-monopoly status in the high-end AI industry. Earlier this year, OpenAI made headlines after it partnered with Oracle as part of President Trump's Stargate project, in a decision that appeared to snub Microsoft.
OpenAI has begun using Google's TPUs to power ChatGPT and other products, marking a significant shift from its reliance on NVIDIA GPUs and Microsoft's data centers.
In a surprising move that could reshape the AI hardware landscape, OpenAI has recently begun renting Google's artificial intelligence chips to power its flagship product ChatGPT and other offerings [1]. This marks the first time OpenAI has used non-NVIDIA chips in a significant capacity, signaling a potential shift in the AI chip market.
OpenAI, known as one of the largest purchasers of NVIDIA's graphics processing units (GPUs), is exploring Google's tensor processing units (TPUs) as a potentially cheaper alternative [2]. The company hopes that renting TPUs through Google Cloud will help lower the cost of inference computing, a crucial process where AI models use trained knowledge to make predictions or decisions based on new information [3].
This move also represents a shift away from OpenAI's reliance on backer Microsoft's data centers, potentially diversifying its infrastructure and reducing dependency on a single provider [1].
For Google, this deal comes at a time when the tech giant is expanding the external availability of its in-house TPUs, which were historically reserved for internal use [2]. This expansion has already helped Google win customers including Apple and AI startups like Anthropic and Safe Superintelligence [3].
NVIDIA has long dominated the AI chip market, but this development could potentially challenge its position. The high prices and short supply of NVIDIA's GPUs have created an opportunity for alternative chip providers [4]. If Google can convince more cloud computing infrastructure providers to adopt its TPUs, it could start to erode NVIDIA's market share.
Despite this collaboration, it's worth noting that Google and OpenAI remain competitors in the broader AI race. Reports suggest that Google is not renting its most powerful TPUs to OpenAI, indicating a careful balance between cooperation and competition [1].
This development follows OpenAI's earlier plans to add Google Cloud services to meet its growing computing capacity needs, marking an unexpected collaboration between two prominent AI competitors [3].
As the AI industry continues to evolve rapidly, this shift in hardware strategy by OpenAI could have far-reaching implications for the competitive landscape, potentially reshaping the dynamics of AI chip supply and demand in the coming years.
The Trump administration is planning a series of executive actions to increase energy supply and infrastructure for AI development, aiming to maintain U.S. leadership in the AI race against China.
9 Sources
Technology
1 day ago
Microsoft's in-house AI chip, codenamed Braga, has been delayed by at least six months, pushing its mass production to 2026. The chip is expected to underperform compared to Nvidia's Blackwell, highlighting the challenges tech giants face in developing custom AI processors.
8 Sources
Technology
17 hrs ago
Meta Platforms is in advanced talks with private capital firms to raise $29 billion for building AI data centers, highlighting the company's aggressive push into artificial intelligence infrastructure.
4 Sources
Business and Economy
17 hrs ago
Google releases Gemma 3n, an open-source AI model designed for on-device use, capable of running on just 2GB RAM. This multimodal model supports various input types and works across 140 languages, marking a significant advancement in accessible AI technology.
3 Sources
Technology
1 day ago
Anthropic, a leading AI company, has initiated the Economic Futures Program to research and address the potential economic fallout of AI, including job losses and market disruptions. The program offers grants, hosts policy forums, and aims to gather comprehensive data on AI's economic impact.
5 Sources
Business and Economy
17 hrs ago