Summary: OpenAI reportedly expects to burn roughly $115B through 2029 as it scales infrastructure, data, and R&D for next‑generation models. The figure underscores how capital‑intensive frontier AI has become, with spending driven largely by cutting‑edge chips and power.
Why it matters
Partners and customers should expect pricing and service tiers to evolve with compute costs. Investors and vendors will watch the funding mix (revenue vs. capital raises) and any build‑own‑operate deals for data centers and energy.
Key facts
- Forecast: ~$115B cumulative burn through 2029 (per The Information, via Reuters)
- Drivers: GPU/accelerator spend, training runs, inference scaling
- Risk: supply of chips and power constraining growth
What to watch
Long‑term supply contracts with chipmakers and power providers; the margin profile of inference versus training; and signs of model‑as‑infrastructure tie‑ups.