2 Sources
[1]
Sam Altman Says Intelligence Will Be a Utility, and He's Just the Man to Collect the Bills
In an appearance at BlackRock's U.S. Infrastructure Summit on Wednesday, OpenAI CEO Sam Altman offered a surprisingly clear articulation of how he imagines the future of artificial intelligence; it's just not clear if he meant it the way it sounded. While speaking with Adebayo Ogunlesi (who happens to be a member of OpenAI's board of directors), Altman said, "We see a future where intelligence is a utility, like electricity or water, and people buy it from us on a meter," which conjures up the nightmarish image of someone being unable to pay their intelligence bill. Altman expanded on this idea, stating that his company has a "fundamental belief in abundance of intelligence" and arguing, "One of the most important things in the future is that we make intelligence, to borrow an old phrase from the energy industry that didn't quite work: 'Too cheap to meter.'"

Evoking energy costs is a bold choice for Altman, because the failure to ever achieve that "too cheap to meter" status has made AI expansion a major pain point for residents who have the displeasure of calling a data center their neighbor. Altman's company and the industry he has become the face of have been responsible for skyrocketing energy costs across the country (though they are at least starting to agree to foot the bill). And describing intelligence as "too cheap to meter" doesn't quite sound the same as when that phrase is applied to energy; it sounds more like "you get what you pay for." But Altman's point is simple enough: AI companies are currently in the business of selling "tokens," the units that models use to process and generate text, and as demand scales up, compute becomes finite, meaning companies will either have to charge more per unit or simply not meet demand. Avoiding that outcome, where access to AI comes with a big bill, means a rapid expansion of processing power, which isn't exactly cheap itself.
And while OpenAI and other firms have agreed to pick up the tab on energy costs for these projects, the funding for those data center buildouts is starting to look shaky. OpenAI just backed out of a planned expansion to its Stargate project in Texas due to financing issues. Given the way Altman is talking, suggesting that intelligence could be a utility, it's hard not to recall previous comments from him and OpenAI CFO Sarah Friar calling on the federal government to essentially guarantee their investments. Friar said she expects a federal "backstop" to guarantee that the company will be able to finance its massive and rapidly expanding data center infrastructure. Altman echoed the comments in a separate appearance, stating, "Given the magnitude of what I expect AI's economic impact to look like, I do think the government ends up as the insurer of last resort." The execs later walked back the suggestion that the government should treat them as "too big to fail," but it seems Altman is once again dabbling in that idea, albeit less directly. Calling intelligence a "utility" tacitly acknowledges that it may need to be subsidized by the government, the way other utilities are. He has just seemingly left that particular part out of his roadmap to the future.
[2]
Just like electricity, AI will soon become metered service for all, says OpenAI CEO Sam Altman
OpenAI is building new data centres with partners to support the growing use of AI. OpenAI CEO Sam Altman believes the future of artificial intelligence may be far simpler than many people imagine. Speaking at the BlackRock US Infrastructure Summit in Washington, DC, Altman explained that AI may soon become something people use the way they use electricity or water. Instead of paying a fixed monthly fee for AI tools, users could pay based on how much computing power they use. Altman believes AI will gradually turn into a basic service that people use whenever they need help with work, learning or problem-solving. The idea is that intelligence delivered through machines will function like a utility.

Altman explained that the role of AI in everyday work is already expanding faster than many people expected. He pointed out that AI tools in several industries are now capable of completing tasks that once required hours of effort from skilled professionals. Software development is one area where this shift is clearly visible: AI systems increasingly help engineers write, test, and review code. The technology is not limited to programming, either; today's AI tools also support research, science, and other fields that rely on detailed analysis and large amounts of information. Altman said that employees are beginning to spend less time on technical execution and more time guiding the AI systems that perform those tasks. He believes AI will be able to handle bigger and bigger tasks over time: right now, an AI system might finish work that usually takes a few hours, but in the near future the same systems could take on projects that normally take several days or even weeks.

He also explained that he personally uses AI when running OpenAI. When he comes up with a new product idea or thinks about a business plan, he often asks AI tools for feedback before sharing the idea with his team. Creating such powerful AI systems, however, requires enormous computing power, and huge data centres with specialised hardware are needed to train and run these models. To support this, Altman concluded, OpenAI is building more infrastructure with partners like Microsoft and Oracle while exploring massive global chip initiatives with investors like SoftBank.
OpenAI CEO Sam Altman outlined his vision for artificial intelligence becoming a utility like electricity or water at BlackRock's U.S. Infrastructure Summit. Users would pay based on computing power consumed, he explained. But his comments raise concerns about energy costs, infrastructure financing, and potential government subsidies for AI expansion.
Sam Altman delivered a bold vision for the future of artificial intelligence at the BlackRock U.S. Infrastructure Summit in Washington, DC, speaking alongside Adebayo Ogunlesi, a member of OpenAI's board of directors. The OpenAI CEO stated plainly: "We see a future where intelligence will be a utility, like electricity or water, and people buy it from us on a meter."1
This concept suggests that AI will soon become a metered service, with users paying based on how much computing power they consume rather than fixed monthly fees. Altman expanded on this vision by invoking an old energy industry phrase, expressing his company's goal to make intelligence "too cheap to meter."1

The path toward making AI a utility requires immense computing power and infrastructure that OpenAI currently lacks. AI companies sell "tokens"—units that models use to process and generate responses—and as demand scales, compute becomes finite.1
This scarcity means companies must either charge more per unit or fail to meet demand. To avoid this outcome, OpenAI is building new data centers with partners including Microsoft and Oracle, while exploring massive global chip initiatives with investors like SoftBank.2
However, the AI infrastructure expansion faces significant hurdles. OpenAI recently backed out of a planned expansion to its Stargate project in Texas due to financing issues, signaling that funding for data center buildouts is starting to look shaky.1
Altman's choice to evoke energy costs proves particularly fraught, given that the AI industry has been responsible for skyrocketing energy costs across the country. Residents living near data centers have experienced this pain firsthand, though AI companies are at least starting to agree to foot the bill for these expenses.1
The energy and financial costs of AI expansion have created a major pain point that extends beyond individual communities. OpenAI CFO Sarah Friar previously stated she expects a federal "backstop" to guarantee the company will be able to finance its massive and rapidly expanding data center infrastructure. Altman echoed these comments in a separate appearance, suggesting the government should serve as "the insurer of last resort" given AI's expected economic impact.1
While both executives later walked back suggestions that the government should treat them as "too big to fail," Altman's utility framing carries a tacit acknowledgement that government subsidies may be necessary, similar to how other utilities receive federal support.1
Sam Altman explained that AI tools are already completing tasks that once required hours of effort from skilled professionals, with software development showing this shift most clearly. AI systems increasingly help engineers write, test, and review code, while also supporting research, science, and other fields requiring detailed analysis.2
Employees are beginning to spend less time on technical execution and more time guiding AI systems that perform the tasks. Altman believes AI will handle bigger tasks over time—systems that currently finish work taking a few hours could soon take on projects normally requiring several days or weeks. He revealed he personally uses AI when running OpenAI, asking AI tools for feedback on new product ideas or business plans before sharing them with his team.2
This growing reliance on metered service models and the need for electricity-like accessibility raises critical questions about who will ultimately pay for the infrastructure required to deliver intelligence as a utility.