Curated by THEOUTPOST
On Wed, 18 Dec, 8:02 AM UTC
2 Sources
[1]
Why OpenAI Needs So Much Money
Cade Metz has covered artificial intelligence for more than 15 years.

Early last year, OpenAI raised $10 billion. Just 18 months later, the company had burned through most of that money. So it raised $6.6 billion more and arranged to borrow an additional $4 billion. But in another 18 months or so, OpenAI will need another cash infusion because the San Francisco start-up is spending more than $5.4 billion a year. And by 2029, OpenAI expects to spend $37.5 billion a year.

OpenAI's accelerating expenses are the main reason the corporate structure of the company, which began as a nonprofit research lab, could soon change. OpenAI must raise billions of additional dollars in the years to come, and its executives believe it will be more attractive to investors as a for-profit company.

In many ways, artificial intelligence has inverted how computer technology used to be created. For decades, Silicon Valley engineers designed new technologies one small step at a time. As they built social media apps like Facebook or shopping sites like Amazon, they wrote line after line of computer code. With each new line, they carefully defined what the app would do. But when companies build A.I. systems, they go big first: They feed these systems enormous amounts of data. The more data companies feed into these systems, the more powerful they become. Just as a student learns more by reading more books, an A.I. system can improve its skills by ingesting larger pools of data. Chatbots like ChatGPT learn their skills by ingesting practically all the English-language text on the internet.

That requires larger and larger amounts of computing power from giant data centers. Inside those data centers are computers packed with thousands of specialized computer chips called graphics processing units, or GPUs, which can cost more than $30,000 apiece. The cost is pushed higher because the chips, data centers and electricity needed to do this digital work are in short supply.
Sean Holzknecht, chief executive of Colovore, a data center operator whose facilities are adopting specialized chips used to build A.I., said this new kind of computing facility cost 10 to 20 times what a traditional data center does.

These chips spend months running the mathematical calculations that allow ChatGPT to pinpoint patterns in all that data. The price tag for each "training run" can climb into the hundreds of millions of dollars.

"Imagine needing to read the internet over and over and over," said David Katz, a partner at Radical Ventures, a venture capital firm that has invested in A.I. start-ups. "This is the most computationally intensive task the world has ever seen."

Google, Microsoft, OpenAI and others are now working to expand the global pool of data centers needed to build their technologies. They plan to spend hundreds of billions to increase the number of computer chips manufactured each year, install them in facilities across the world and secure the electricity needed to run them.

Those costs are particularly onerous when companies like OpenAI, Google and Anthropic offer chatbots to consumers at no charge. Some of them are charging consumers around $20 a month to use their most powerful technologies -- and even that may not recoup the cost of delivery. (The New York Times has sued OpenAI and its partner, Microsoft, claiming copyright infringement of news content related to A.I. systems. The two companies have denied the suit's claims.)

Since building the initial version of ChatGPT, OpenAI has steadily improved its chatbot, feeding it increasingly large amounts of data, including images and sounds as well as text. The company recently unveiled a version of ChatGPT that "reasons" through math, science and computer programming problems. It built this technology using a technique called reinforcement learning. Through this process, the system learns additional behavior over months of trial and error.
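The reported figures make the scale of these costs easy to check with simple arithmetic. A minimal sketch, assuming a hypothetical 10,000-GPU cluster (the article gives only the per-chip price and the "thousands of chips" scale):

```python
# Back-of-envelope estimate of A.I. training hardware costs, using the
# figures reported above. The GPU count is an illustrative assumption.
GPU_PRICE = 30_000    # dollars per chip, "more than $30,000 apiece"
NUM_GPUS = 10_000     # assumed cluster size ("thousands of ... chips")

hardware_cost = GPU_PRICE * NUM_GPUS
print(f"Hardware alone: ${hardware_cost:,}")
```

Even before counting the data center itself (10 to 20 times the cost of a traditional facility, per Colovore) or months of electricity, the chips alone reach $300 million, which is consistent with training runs priced in the hundreds of millions of dollars.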
Trying to solve various math problems, for instance, it can learn which methods lead to the right answer and which do not. When people use this system, it "thinks" before responding. When someone asks it a question, it explores many possibilities before delivering an answer.

OpenAI sees this technology, called OpenAI o1, as the future of its business. And it requires even more computing power. That is why the company expects that its computing costs will grow sevenfold by 2029 as it chases the dream of artificial general intelligence -- a machine that can do as much as the human brain, or more.

"If you are trying to chase science fiction," said Nick Frosst, a former Google researcher and co-founder of Cohere, a start-up that builds similar technology, "the costs will keep going up."
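The trial-and-error process described above can be illustrated with a toy sketch. This is not OpenAI's actual method, just a minimal epsilon-greedy reinforcement-learning loop with invented "solution methods" whose success rates are assumptions:

```python
import random

# Toy reinforcement learning by trial and error: pick among candidate
# "solution methods", observe whether the answer came out right, and
# shift preference toward methods that succeed. Rates are invented.
SUCCESS_RATE = {"guess": 0.1, "brute_force": 0.5, "algebra": 0.9}

def attempt(method):
    """Return 1 if this method solved the problem, else 0."""
    return 1 if random.random() < SUCCESS_RATE[method] else 0

def train(trials=5000, epsilon=0.1, seed=0):
    random.seed(seed)
    value = {m: 0.0 for m in SUCCESS_RATE}  # estimated success rate
    count = {m: 0 for m in SUCCESS_RATE}
    for _ in range(trials):
        if random.random() < epsilon:       # explore a random method
            method = random.choice(list(SUCCESS_RATE))
        else:                               # exploit the best so far
            method = max(value, key=value.get)
        reward = attempt(method)
        count[method] += 1
        # Incremental average of observed rewards for this method.
        value[method] += (reward - value[method]) / count[method]
    return value

values = train()
print(max(values, key=values.get))
```

After thousands of attempts, the learner settles on the method that most often produces right answers, which is the core idea behind learning over "months of trial and error" -- at industrial scale, each attempt is itself an expensive computation.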
[2]
Why does OpenAI need so much money?
OpenAI's escalating expenses and funding requirements highlight the enormous costs associated with developing advanced AI systems, potentially leading to changes in the company's structure and raising questions about the sustainability of AI development.
OpenAI, the San Francisco-based artificial intelligence company, is facing an unprecedented financial challenge as it pursues cutting-edge AI technology. Despite raising $10 billion early last year, the company burned through most of that funding in just 18 months, necessitating an additional $6.6 billion raise and a $4 billion loan arrangement [1][2].
The company's expenses are staggering, with current annual spending exceeding $5.4 billion. Even more astonishing is OpenAI's projection that by 2029, its yearly expenditure will reach $37.5 billion [1][2]. This exponential increase in costs is driving the company to consider transforming from its original nonprofit research lab structure to a for-profit entity to attract more investors [1][2].
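The projected jump from $5.4 billion to $37.5 billion a year can be sanity-checked with quick arithmetic. The roughly sevenfold growth factor follows directly from the reported numbers; the implied annual growth rate below is our own calculation (assuming a five-year horizon from late 2024 to 2029), not a reported figure:

```python
# Check the reported spending trajectory: from more than $5.4 billion a
# year today to a projected $37.5 billion a year by 2029.
current = 5.4e9   # dollars per year, reported current spending
by_2029 = 37.5e9  # dollars per year, reported 2029 projection

ratio = by_2029 / current
print(f"Growth factor: {ratio:.1f}x")       # roughly "sevenfold"

# Implied compound annual growth over an assumed five-year horizon.
years = 5
cagr = ratio ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.0%}")
```

Spending would have to grow by nearly half again every year for five years straight, which helps explain why repeated multibillion-dollar fundraising rounds are expected.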
The development of AI systems has inverted the traditional approach to creating computer technology. Instead of building applications step-by-step with carefully defined code, AI companies like OpenAI are adopting a "go big first" strategy [1][2]. This approach involves feeding enormous amounts of data into AI systems, which become more powerful as they ingest larger pools of information.
The primary driver behind OpenAI's massive funding needs is the computing power required to process and analyze vast amounts of data. This necessitates thousands of specialized GPUs that can cost more than $30,000 apiece, data centers that cost 10 to 20 times as much as traditional facilities, and large supplies of electricity -- all of which are in short supply.
David Katz, a partner at Radical Ventures, describes AI development as "the most computationally intensive task the world has ever seen" [1][2].
Despite the financial challenges, OpenAI continues to push the boundaries of AI technology. It has fed its chatbot increasingly large amounts of data, including images and sounds as well as text, and recently unveiled OpenAI o1, a model built with reinforcement learning that "reasons" through math, science and computer programming problems [1][2].
OpenAI's financial situation reflects broader trends in the AI industry. Google, Microsoft and others plan to spend hundreds of billions of dollars on chips, data centers and electricity, even as they offer chatbots to consumers at no charge or for subscriptions of around $20 a month that may not recoup the cost of delivery [1][2].
As OpenAI and other companies pursue increasingly advanced AI systems, the associated costs are expected to continue rising. Nick Frosst, co-founder of Cohere and former Google researcher, succinctly summarizes the situation: "If you are trying to chase science fiction, the costs will keep going up" [1][2]. This raises important questions about the sustainability of AI development and the potential need for new funding models in the industry.
Reference
[1]
[2]
OpenAI, the company behind ChatGPT, is experiencing explosive growth but facing significant financial losses. As it seeks new funding and considers restructuring, questions arise about its long-term sustainability and impact on the AI industry.
8 Sources
OpenAI, the artificial intelligence company behind ChatGPT, is reportedly in discussions for a new funding round that could value the company at $150 billion. This move comes as the AI race intensifies and development costs soar.
19 Sources
OpenAI CEO Sam Altman reveals that the company is losing money on its $200 monthly ChatGPT Pro subscriptions due to unexpectedly high usage, highlighting the challenges of balancing AI costs with sustainable pricing in the rapidly evolving AI industry.
10 Sources
OpenAI, the creator of ChatGPT, has raised $10 billion in just one week through a combination of venture funding and a credit facility. This massive influx of capital comes as the company faces significant financial challenges and debates over its future direction.
66 Sources
OpenAI announces a shift towards a for-profit structure, citing the need for substantial capital to compete in AI development. The move aims to attract more investors while maintaining its mission through a public benefit corporation model.
34 Sources
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved