OpenAI CEO Sam Altman Says the Company Is 'Out of GPUs'

An anonymous reader quotes a report from TechCrunch: OpenAI CEO Sam Altman said that the company was forced to stagger the rollout of its newest model, GPT-4.5, because OpenAI is "out of GPUs." In a post on X, Altman said that GPT-4.5, which he described as "giant" and "expensive," will require "tens of thousands" more GPUs before additional ChatGPT users can gain access. GPT-4.5 will come first to subscribers to ChatGPT Pro starting Thursday, followed by ChatGPT Plus customers next week.

Perhaps in part due to its enormous size, GPT-4.5 is wildly expensive. OpenAI is charging $75 per million tokens (~750,000 words) fed into the model and $150 per million tokens generated by the model. That's 30x the input cost and 15x the output cost of OpenAI's workhorse GPT-4o model. "We've been growing a lot and are out of GPUs," Altman wrote. "We will add tens of thousands of GPUs next week and roll it out to the Plus tier then [...] This isn't how we want to operate, but it's hard to perfectly predict growth surges that lead to GPU shortages."
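To put those multiples in concrete terms, here is a minimal sketch of the pricing arithmetic. The GPT-4.5 rates are stated in the article; the GPT-4o rates are back-derived from the quoted 30x input / 15x output multiples, and the sample request size is purely hypothetical.

```python
# GPT-4.5 API rates as stated in the article (dollars per 1M tokens).
GPT45_INPUT_PER_M = 75.00
GPT45_OUTPUT_PER_M = 150.00

# GPT-4o rates back-derived from the stated 30x / 15x multiples.
GPT4O_INPUT_PER_M = GPT45_INPUT_PER_M / 30    # $2.50 per 1M input tokens
GPT4O_OUTPUT_PER_M = GPT45_OUTPUT_PER_M / 15  # $10.00 per 1M output tokens

def request_cost(input_tokens, output_tokens, in_rate, out_rate):
    """Dollar cost of one request at the given per-million-token rates."""
    return (input_tokens / 1_000_000) * in_rate + (output_tokens / 1_000_000) * out_rate

# Hypothetical request: 10k tokens in, 2k tokens out.
print(round(request_cost(10_000, 2_000, GPT45_INPUT_PER_M, GPT45_OUTPUT_PER_M), 4))  # 1.05
print(round(request_cost(10_000, 2_000, GPT4O_INPUT_PER_M, GPT4O_OUTPUT_PER_M), 4))  # 0.045
```

At these rates, the same request costs roughly 23x more on GPT-4.5 than on GPT-4o, which is consistent with Altman's framing of the model as "giant" and "expensive."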
