As the race in generative AI heats up, OpenAI may opt for an incremental update with GPT-4.5 this year, sidestepping the substantial resource demands a full leap to GPT-5 would entail.
Resource Constraints May Steer OpenAI Toward GPT-4.5 Amid Rising AI Development Costs
Although GPT-4 currently sits at the top of the increasingly crowded field of generative artificial intelligence, competitors such as Anthropic's Claude and Meta's open-source Llama continue to improve, ratcheting up the pressure for a new version of OpenAI's flagship Large Language Model (LLM). While many expect Sam Altman's company to release GPT-5 in 2024, several observers believe those expectations are unrealistic, particularly given the scale of the resources such a model would require.
According to Dan Hendrycks (via Wccftech), director of the Center for AI Safety, each incremental iteration of OpenAI's GPT LLM has required roughly a tenfold increase in compute resources. Consequently, if OpenAI skips GPT-4.5 and goes straight to GPT-5, the two compounded tenfold jumps would amount to a 100x increase in compute requirements over GPT-4, which is comparable to running around 1 million H100 chips for three months straight.
This estimate is corroborated by Anthropic's CEO, Dario Amodei, who recently stated that training a cutting-edge LLM currently costs roughly $1 billion, with that figure projected to rise to between $5 billion and $10 billion by 2025/26. Notably, the $1 billion price tag lines up with the 10x increase in computational resources that can reasonably be inferred for GPT-4.5.
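To make the scaling argument concrete, here is a minimal back-of-envelope sketch. It assumes the tenfold-per-increment rule Hendrycks describes and a hypothetical ~$100 million GPT-4-class training cost (a commonly cited ballpark, not a figure OpenAI has confirmed):

```python
# Rough scaling math implied above: each GPT increment needs ~10x more compute,
# so skipping GPT-4.5 compounds to ~100x for GPT-5.
SCALE_PER_STEP = 10          # Hendrycks: ~10x compute per incremental version
GPT4_COST_USD = 100e6        # assumed ballpark for GPT-4-class training (unconfirmed)

for name, steps in [("GPT-4.5", 1), ("GPT-5", 2)]:
    compute_multiplier = SCALE_PER_STEP ** steps
    est_cost = GPT4_COST_USD * compute_multiplier
    print(f"{name}: ~{compute_multiplier}x GPT-4 compute, ~${est_cost / 1e9:.0f}B to train")
```

Under those assumptions, GPT-4.5 lands right around the ~$1 billion figure Amodei cites for today's frontier models, while a straight jump to GPT-5 implies roughly $10 billion, matching the upper end of his 2025/26 projection.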
AI's Power Surge: NVIDIA's New H100 Units to Match National Energy Usage Amid Rising AI Competition
According to a recent report, the NVIDIA H100 units set to be deployed this year are estimated to consume approximately 13,000 GWh of electricity per year, comparable to the annual electricity consumption of Lithuania or Guatemala. By 2027, worldwide data center power usage is projected to surge to between 85 and 134 TWh (terawatt-hours).
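Since the H100 estimate is quoted in GWh per year and the 2027 data-center projection in TWh, a quick unit conversion makes the two figures directly comparable. The numbers below are the report's estimates, not measurements:

```python
# Convert the H100 fleet estimate to TWh and compare it with the 2027 projection.
H100_FLEET_GWH_PER_YEAR = 13_000                        # estimated draw of this year's H100s
DATACENTER_2027_TWH_LOW, DATACENTER_2027_TWH_HIGH = 85, 134

h100_twh = H100_FLEET_GWH_PER_YEAR / 1_000              # 1 TWh = 1,000 GWh -> ~13 TWh
print(f"H100 fleet: ~{h100_twh:.0f} TWh/year")
print(f"Roughly {h100_twh / DATACENTER_2027_TWH_HIGH:.0%} to "
      f"{h100_twh / DATACENTER_2027_TWH_LOW:.0%} of the 2027 projection")
```

In other words, this year's H100 shipments alone would account for roughly 10 to 15 percent of the power that all of the world's data centers are projected to draw in 2027.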
Meanwhile, GPT-4's competitors are quickly catching up. Look no further than Meta's Llama 3 LLM (70 billion parameters), now ranked fifth on the Chatbot Arena leaderboard. Critically, Llama 3 outperforms every other open-source LLM, even before the planned 405-billion-parameter variant arrives.
Some experts also argue that GPT-5 will require OpenAI to rework its "original curriculum," which currently relies on "poorly curated human conversations" and a "naive" training procedure. That, too, supports our hypothesis that OpenAI will release an iterative GPT-4.5 model this year rather than raising the stakes entirely with GPT-5.
Photo: Jonathan Kemper/Unsplash