As the race in generative AI heats up, OpenAI may opt for an incremental update with GPT-4.5 this year, sidestepping the substantial resource demands a full leap to GPT-5 would entail.
Resource Constraints May Steer OpenAI Toward GPT-4.5 Amid Rising AI Development Costs
Although GPT-4 currently sits at the pinnacle of the increasingly competitive realm of generative artificial intelligence, rivals such as Anthropic's Claude and Meta's open-source Llama continue to improve, creating pressure for a new version of OpenAI's flagship Large Language Model (LLM). While many expect Sam Altman's company to release GPT-5 in 2024, several observers believe those expectations are unrealistic, especially given the scale of the resources required.
According to Dan Hendrycks (via Wccftech), director of the Center for AI Safety, each incremental iteration of OpenAI's GPT LLM has necessitated a tenfold increase in compute resources. As a result, if OpenAI skips GPT-4.5 and goes straight to GPT-5, it would face a 100x increase in compute requirements compared to GPT-4, roughly equivalent to running around 1 million H100 chips for three months straight.
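As a back-of-the-envelope check, the scaling claim can be expressed directly. This is a sketch based solely on the 10x-per-iteration figure attributed to Hendrycks; the multipliers are illustrative, not an OpenAI disclosure:

```python
# Each GPT iteration is said to require 10x the compute of its predecessor.
STEP = 10

gpt_4_5_factor = STEP ** 1   # GPT-4 -> GPT-4.5: one iteration
gpt_5_factor = STEP ** 2     # GPT-4 -> GPT-5: two iterations compounded

print(gpt_4_5_factor)  # 10
print(gpt_5_factor)    # 100
```

Skipping the intermediate release compounds two 10x steps into the 100x jump the article describes.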
This assumption is corroborated by Anthropic CEO Dario Amodei, who recently stated that training a cutting-edge LLM currently costs roughly $1 billion, with that figure projected to rise to between $5 billion and $10 billion by 2025/26. Notably, $1 billion in training costs corresponds to the 10x increase in computational resources that may reasonably be inferred for GPT-4.5.
AI's Power Surge: NVIDIA's New H100 Units to Match National Energy Usage Amid Rising AI Competition
According to a recent report, the NVIDIA H100 units set to be deployed this year are estimated to consume approximately 13,000 GWh of electricity per year, comparable to the annual electricity consumption of Lithuania or Guatemala. By 2027, worldwide data center power usage is projected to skyrocket to between 85 and 134 TWh (terawatt-hours).
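Putting the two figures in the same unit makes the comparison easier. A minimal sketch using the report's estimates and the standard conversion 1 TWh = 1,000 GWh:

```python
# Convert the H100 fleet's estimated annual draw to TWh so it can be
# compared against the projected 2027 worldwide data-center total.
h100_gwh_per_year = 13_000
h100_twh_per_year = h100_gwh_per_year / 1_000  # 13.0 TWh

datacenter_2027_twh = (85, 134)  # projected worldwide range for 2027

# Share of the projected total attributable to this year's H100s alone
share_low = h100_twh_per_year / datacenter_2027_twh[1]   # ~0.097
share_high = h100_twh_per_year / datacenter_2027_twh[0]  # ~0.153

print(h100_twh_per_year)  # 13.0
```

On these numbers, this year's H100 deployments alone would account for roughly 10-15% of the projected 2027 data-center total.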
GPT-4's competitors are quickly catching up. Look no further than Meta's Llama 3 LLM (70 billion parameters), now ranked fifth on the Chatbot Arena leaderboard. Critically, Llama 3 outperforms all other open-source LLMs, and that is before the planned 405-billion-parameter model even arrives.
Some experts feel that for GPT-5, OpenAI will need to alter the "original curriculum," which currently relies on "poorly curated human conversations" and a "naive" training procedure. This supports our original hypothesis that OpenAI will release an iterative GPT-4.5 model this year rather than raising the stakes entirely with GPT-5.
Photo: Jonathan Kemper/Unsplash

