OpenAI CEO Sam Altman’s ambitious plans for artificial intelligence, potentially requiring 100 million GPUs and an estimated $3 trillion investment, underscore the escalating demand for computing power in the rapidly evolving AI landscape.
Generative AI is proving to be a resource-intensive technology, necessitating vast computing power and cooling infrastructure to support its advancements. OpenAI has reportedly expressed dissatisfaction with Microsoft’s inability to meet its cloud computing demands, a key component of their multi-billion-dollar agreement. The ChatGPT maker even suggested that Microsoft’s shortcomings could lead to a rival AI firm achieving Artificial General Intelligence (AGI) first.
Earlier this year, OpenAI unveiled the “Stargate project,” a $500 billion initiative aimed at constructing data centers across the United States to bolster its AI capabilities. This project’s announcement reportedly led to Microsoft losing its status as OpenAI’s primary cloud provider. However, a recent report by The Wall Street Journal indicates that the Stargate project is facing difficulties in getting off the ground, with OpenAI reportedly now aiming to build a smaller data center by the end of the year.
Despite these challenges, Sam Altman recently stated that OpenAI is on track to bring “well over 1 million GPUs online” by the close of the current year. This development follows revelations that OpenAI was compelled to undertake “unnatural things” due to a GPU shortage. This shortage was exacerbated by the viral success of Studio Ghibli memes generated by its GPT-4o image generator, leading the company to borrow compute capacity from research divisions and slow down certain features.
Altman humorously noted that when ChatGPT gained over one million new users in under an hour, the sudden surge in demand nearly caused the company’s GPUs to “melt.” While proud of his team, Altman emphasized the critical need to scale OpenAI’s GPU capacity from a projected 1 million to an astounding 100 million. Tom’s Hardware estimates that achieving this scale could cost OpenAI up to $3 trillion. To put this into perspective, that figure is roughly a trillion dollars shy of Microsoft’s recent market capitalization, which reached $4 trillion.
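Tom’s Hardware’s estimate is consistent with simple back-of-envelope arithmetic. The sketch below assumes a per-accelerator price of roughly $30,000 (in the range of an H100-class data center GPU); that unit price is an assumption for illustration, not a figure from the article:

```python
# Back-of-envelope cost of Altman's 100 million GPU target.
# The per-GPU price is an ASSUMPTION (~$30k, H100-class), not a reported figure.
gpu_count = 100_000_000      # Altman's stated 100 million GPU goal
price_per_gpu = 30_000       # assumed hardware cost per accelerator, in USD

total_cost = gpu_count * price_per_gpu
print(f"${total_cost / 1e12:.1f} trillion")  # → $3.0 trillion
```

Note this counts only the accelerators themselves; data center construction, power, and cooling would push the real figure higher.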
This escalating demand for computing power aligns with claims by AI safety researcher Roman Yampolskiy, director of the Cyber Security Laboratory at the University of Louisville. Yampolskiy posits that AGI could be achieved today if sufficient financial resources and computing power were readily available. The ongoing trajectory of AI suggests that its evolution will continue to necessitate ever-increasing computing power for more sophisticated advancements. It remains to be seen whether OpenAI can secure the over $3 trillion required for this massive GPU expansion and if such an investment would indeed be enough to achieve monumental feats like AGI or even superintelligence.