More than 1 million businesses now use OpenAI tools, and the usage data is striking
ChatGPT's workplace footprint grew roughly ninefold in a year. The company says the real prize is not consumer adoption but fixing how work gets done inside firms.
OpenAI published a report on December 17, making the case that enterprise adoption of its tools has reached a scale worth taking seriously, and the numbers it released are difficult to dismiss.
More than 1 million business customers now use OpenAI products. ChatGPT message volume grew 8 times year over year. Workplace seats across ChatGPT products exceed 7 million. ChatGPT Enterprise seats specifically grew approximately 9 times in the same period.
The figures come from de-identified, aggregated usage data and a survey of 9,000 workers across nearly 100 enterprises. The methodology carries the usual caveats of company-commissioned research, but the scale of the numbers, if accurate, marks a meaningful shift in how AI is actually being used rather than merely discussed.
Why OpenAI says enterprise is the point
The report frames the enterprise focus as more than a commercial strategy. Most economically valuable activity, the company argues, happens inside firms rather than in consumer interactions, and solving problems at the organisational level is what funds broader access to AI.
That framing reflects something real about the unit economics of large language model development. Consumer products generate engagement. Enterprise contracts generate the sustained revenue that pays for the next generation of models.
Ronnie Chatterji, OpenAI's chief economist, outlined where the company sees the next phase heading: stronger performance on economically valuable tasks, better understanding of organisational context, and a shift from asking models for outputs to delegating complex, multi-step workflows. The implication is that current enterprise adoption, however fast it is growing, is still closer to the beginning of the curve than the middle.
The productivity claims
75% of workers surveyed said AI improved either the speed or quality of their output. ChatGPT Enterprise users reported saving 40 to 60 minutes per active working day, with data science, engineering, and communications teams reporting savings at the higher end of 60 to 80 minutes.
The operational findings are more specific. 87% of IT workers reported faster issue resolution. 85% of marketing and product users reported faster campaign execution.
Self-reported productivity data from a company's own survey of its customers should be read carefully. People who have chosen to use a tool tend to report that it helps them. But the directionality of the findings, and their consistency across different job functions, are at least suggestive that something is happening at a workflow level rather than just a sentiment level.
The API numbers reveal something different
The API usage figures are the most technically revealing part of the report. More than 9,000 organisations have processed over 10 billion tokens. Nearly 200 have exceeded 1 trillion tokens. Average reasoning token consumption per operation increased approximately 320 times.
That last figure is the one worth sitting with. Reasoning tokens are more expensive and computationally intensive than standard outputs. Customers using them at 320 times the rate they did a year ago are building applications that require the model to think through problems rather than retrieve or summarise information. That points toward the agentic and multi-step workflow use cases Chatterji described, rather than simpler query-response interactions.
The Custom GPTs and Projects numbers point in the same direction. Weekly users of those features rose roughly 19 times year to date, and about 20% of Enterprise messages were processed through a Custom GPT or Project. Businesses are not just giving employees access to ChatGPT. They are building tailored versions of it around their own workflows and data.
What the report does not address
The report is a marketing document as much as it is a research document, and it leaves several obvious questions unanswered. It does not address how deeply integrated these tools are into critical workflows versus supplementary tasks. Saving an hour a day on drafting emails is a different kind of adoption than having AI make decisions that affect customers or revenue.
It also does not address churn or the distribution of usage within organisations. A seat count of 7 million tells you how many people have access. It does not tell you how many use the tools daily, weekly, or at all.
What the report does establish is that the adoption curve for enterprise AI is steeper than most forecasters predicted two years ago, and that the companies best positioned to capture what comes next are those that have already embedded AI deeply enough into their workflows that removing it would be disruptive. OpenAI is making a clear argument that it is building that kind of dependency at scale.
The recap
- OpenAI published a report on enterprise AI on December 17, 2025
- ChatGPT message volume grew 8x year‑over‑year in enterprises
- Report says next phase will emphasise delegation of multi‑step workflows