OpenAI announced a seven-year, $38 billion deal with Amazon Web Services to power products like ChatGPT and Sora. The agreement gives OpenAI access to hundreds of thousands of Nvidia GPUs to train and run its models, with capacity slated to come online by the end of 2026 and room to expand in 2027 and beyond.
“Scaling frontier AI requires massive, reliable compute,” OpenAI CEO Sam Altman said. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
The data centers will deploy hundreds of thousands of chips, including Nvidia’s GB200 and GB300 AI accelerators, in clusters built to power ChatGPT’s responses, generate AI videos, and train OpenAI’s next wave of models.
Wall Street welcomed the announcement: Amazon shares climbed to new highs, while shares of long-time OpenAI backer Microsoft briefly dipped on the news.
OpenAI’s move underscores the enormous compute demands of running large-scale generative AI. After a recent restructuring that gave it more operational and financial independence from Microsoft, OpenAI has struck separate cloud arrangements with companies such as Google and Oracle, and is reportedly exploring its own GPU hardware to ease supply pressures.