Nvidia's Jensen Huang pushes tokens as AI's new currency, but experts say the metric falls short


Nvidia CEO Jensen Huang is championing tokens as the fundamental unit of the AI economy, predicting employees may need token budgets worth half their salary. But while token economics offers a compelling framework for measuring AI adoption and billing, critics argue it measures volume rather than value—creating potential budgetary challenges without clear links to business outcomes.

Nvidia Positions Tokens as the Foundation of AI Economics

Nvidia CEO Jensen Huang used the company's annual GTC conference this week to promote a vision where tokens become the defining currency of artificial intelligence. These basic units of output from large language models (roughly 1,300 tokens generate 1,000 words of text) are being positioned as the key AI metric for measuring both production efficiency and economic value [1]. Huang's framework centers on cost per token as the critical measure: as long as Nvidia's chips produce tokens at the lowest cost and demand outstrips supply, he argues, the AI boom remains healthy [1].
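The tokens-to-words ratio above implies a simple back-of-the-envelope cost model. The sketch below uses the article's ~1,300-tokens-per-1,000-words figure; the per-million-token price passed in is an illustrative assumption, not a quoted rate.

```python
# Rough cost estimate for generated text, using the article's figure of
# ~1,300 tokens per 1,000 words. Prices are assumptions for illustration.

TOKENS_PER_WORD = 1300 / 1000  # ~1.3 tokens per word (article's ratio)

def output_cost(words: int, usd_per_million_tokens: float) -> float:
    """Estimate the inference cost of generating `words` words of text."""
    tokens = words * TOKENS_PER_WORD
    return tokens / 1_000_000 * usd_per_million_tokens

# e.g. a 1,000-word document at an assumed $0.09 per 1M tokens:
print(round(output_cost(1000, 0.09), 6))
```

At prices in the cents-per-million range, a single document costs a fraction of a cent; the budgetary pressure discussed later comes from volume, not unit price.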

Source: PYMNTS

The shift toward token economics reflects broader changes in how enterprise AI spending is structured. Unlike the seat-based pricing that characterized earlier software generations, token consumption offers granular, real-time measurement tied directly to behavior [2]. OpenAI data reveals the scale of this transformation: average reasoning-token consumption per organization has surged approximately 320 times over the past 12 months, signaling systematic integration of more intelligent models into expanding products and services [2]. Huang went further, estimating that employee token allocations could eventually reach half of base salary in value, stating at GTC that he could "totally imagine in the future every single engineer in our company will need an annual token budget" [2].

The Missing Link Between Token Production and Customer Value

Despite the compelling narrative around measuring AI adoption through tokens, significant gaps remain in connecting token production to actual value creation. Falling token costs do not automatically mean that AI-powered services become valuable or generate revenue across the industry, as Huang suggests [1]. The fundamental problem is that tokens measure volume, not outcome [2].

This disconnect becomes stark when examining unit economics. A poorly structured prompt that forces a model to iterate or regenerate consumes more tokens than a concise query, yet both may produce equally useful, or equally useless, output [2]. More troubling, if an AI agent saves a customer service representative 15 minutes but costs $4 in inference tokens to run, the ROI can turn negative [2]. Organizations attempting to link AI usage to business outcomes struggle to connect token-consumption metrics with tangible value delivery.
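The per-task ROI check described above can be sketched in a few lines. The hourly labor rate is an assumption added for illustration; the 15-minute and $4 figures come from the article's example.

```python
# Hedged sketch of per-task agent ROI: labor value saved minus inference
# spend. The hourly rate is an illustrative assumption, not a sourced figure.

def task_roi(minutes_saved: float, hourly_rate_usd: float,
             token_cost_usd: float) -> float:
    """Net value of one agent run: labor saved minus token cost."""
    labor_value = minutes_saved / 60 * hourly_rate_usd
    return labor_value - token_cost_usd

# Article's example: 15 minutes saved vs. $4 of tokens. At an assumed
# $12/hour, the run is a net loss:
print(task_roi(15, 12.0, 4.0))  # -1.0
```

The sign of the result flips entirely on the assumed wage, which is exactly why token counts alone cannot stand in for business value.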

The situation grows more complex with newer AI models. Reasoning models that emerged in late 2024, starting with OpenAI's o1, consume far more tokens to arrive at answers. They are now being supplemented by agents that promise to automate white-collar work, bringing an explosion in token use and potentially hefty bills for companies offering workers unlimited AI access [1]. Software engineering has seen initial efforts to measure how token use links to output, with some firms attempting to apportion tokens to workers. Tech companies envision AI becoming core to employment, with a white-collar worker's cost equaling salary plus a certain number of tokens per month, but that remains a pipe dream for now [1].

Profitability Concerns for AI Companies and Commoditization Risks

The second major gap in token economics involves how the companies producing tokens will maintain profitability. If AI factories all use Nvidia's latest chips, gaining a cost-per-token advantage or retaining pricing power becomes difficult [1]. The price declines accompanying plunging production costs tell a concerning story: when OpenAI launched GPT-4 two years ago, it charged $33 per 1 million tokens; today, it charges only 9 cents per 1 million tokens from its cheapest model [1]. While beneficial for customers, this feeds worries about commoditization.
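Expressed as a ratio, the price drop quoted above is striking. Both prices are the article's figures; the calculation is just the quotient.

```python
# The price decline above as a ratio: $33 per 1M tokens at GPT-4's launch
# vs. $0.09 per 1M tokens for the cheapest model today (article's figures).

launch_price = 33.00   # USD per 1M tokens, GPT-4 at launch
current_price = 0.09   # USD per 1M tokens, cheapest model today

decline_factor = launch_price / current_price
print(round(decline_factor))  # roughly a 367x price decline in two years
```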

These concerns echo debates from cloud computing's early days, when skeptics questioned whether Amazon Web Services could profit from selling basic storage and computing as commodities. Cloud providers eventually built higher-value platforms on which customers could run entire businesses, though oligopoly dynamics and regulatory pressure reducing switching costs may also explain their healthy profit margins [1]. Whether OpenAI and Anthropic can execute a similar transformation remains unclear, though the opportunity exists [1]. For now, intense competition among frontier AI companies continues, and how it resolves will significantly shape industry profitability [1].

Budgetary Challenges and the Risk of Misaligned Incentives

As companies move from pilots to production deployments, and from experimental chatbots to thousands of autonomous agentic workflows running continuously, token consumption has created what some describe as massive budgetary leaks [2]. While unit prices fall, overall enterprise AI spending keeps rising as systems scale: more users, more complex models, and heavier workloads all drive greater token consumption and consequently higher costs [2].
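The interplay of falling unit prices and rising consumption is easy to see numerically: when usage grows faster than prices fall, the total bill still climbs. The growth and decline factors below are illustrative assumptions, not figures from the article.

```python
# Why spend rises even as unit prices fall: consumption growth can outpace
# price declines. The 20x usage growth and 5x price drop are assumptions
# chosen for illustration only.

def annual_spend(tokens_per_year: float, usd_per_million: float) -> float:
    """Total yearly inference bill at a given per-million-token price."""
    return tokens_per_year / 1_000_000 * usd_per_million

year1 = annual_spend(1e9, 1.00)           # 1B tokens at $1 per 1M tokens
year2 = annual_spend(1e9 * 20, 1.00 / 5)  # 20x the usage, 5x cheaper tokens
print(year1, year2)  # 1000.0 4000.0 -> spend quadruples despite cheaper tokens
```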

The dynamic invites comparisons to earlier enterprise metrics that proved easier to game than to interpret. Click-through rates once served as a proxy for advertising effectiveness; hours logged functioned as a productivity measure. Both created incentives that diverged from the intended outcomes [2]. If token consumption becomes a performance indicator tied to employee evaluations, workers may optimize for the frequency of AI interaction rather than the quality of their work [2]. Knowing that "AI spend is up 40%" provides little insight without systems linking every workload, tenant, and token to an owner or a business outcome [2]. As organizations navigate this emerging economic reality, they need visibility that connects AI adoption and billing to measurable value, not just consumption volume.

TheOutpost.ai

© 2026 Triveous Technologies Private Limited