As the race in generative AI heats up, OpenAI may opt for an incremental update with GPT-4.5 this year, sidestepping the substantial resource demands a full leap to GPT-5 would entail.
Resource Constraints May Steer OpenAI Toward GPT-4.5 Amid Rising AI Development Costs
Although GPT-4 currently sits at the pinnacle of the increasingly crowded field of generative artificial intelligence, competitors such as Anthropic's Claude and Meta's open-source Llama continue to improve, putting pressure on OpenAI to ship the next version of its flagship Large Language Model (LLM). While many expect Sam Altman's company to release GPT-5 in 2024, several observers believe those expectations are unrealistic, especially given the scale of the resources required.
According to Dan Hendrycks (via Wccftech), director of the Center for AI Safety, each incremental iteration of OpenAI's GPT LLM has required roughly a tenfold increase in compute. Consequently, if OpenAI skips GPT-4.5 and jumps straight to GPT-5, it would face a roughly 100x increase in compute requirements relative to GPT-4, comparable to running around 1 million NVIDIA H100 GPUs for three months straight.
This estimate is echoed by Anthropic's CEO, Dario Amodei, who recently stated that training a cutting-edge LLM currently costs roughly $1 billion, a figure projected to rise to between $5 billion and $10 billion by 2025/26. Notably, that $1 billion price tag lines up with the 10x compute increase that can reasonably be inferred for GPT-4.5.
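To make the scaling logic concrete, the back-of-the-envelope arithmetic can be sketched in a few lines of Python. The 10x-per-iteration factor and the roughly $1 billion frontier-run cost are the figures cited above; the implied ~$100 million GPT-4 cost is simply the model's backward extrapolation, not a disclosed number.

```python
# Back-of-the-envelope sketch of the scaling claims cited above: each GPT
# iteration (4 -> 4.5 -> 5) is assumed to need ~10x the compute of the last
# (Hendrycks), and a current frontier run is pegged at ~$1B (Amodei).
COMPUTE_MULTIPLIER_PER_STEP = 10
GPT45_TRAINING_COST_USD = 1_000_000_000  # assumed GPT-4.5-class price tag

def relative_compute(steps_from_gpt4: int) -> float:
    """Compute requirement relative to GPT-4 after N half-generation steps."""
    return COMPUTE_MULTIPLIER_PER_STEP ** steps_from_gpt4

for name, steps in [("GPT-4", 0), ("GPT-4.5", 1), ("GPT-5", 2)]:
    cost = GPT45_TRAINING_COST_USD * relative_compute(steps - 1)
    print(f"{name}: ~{relative_compute(steps):.0f}x GPT-4 compute, "
          f"est. cost ~${cost / 1e9:.1f}B")
```

Under these assumptions, GPT-5 lands at roughly 100x GPT-4's compute and a ~$10 billion training bill, squarely inside Amodei's 2025/26 projection, which is exactly what makes the half-step to GPT-4.5 look attractive.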
AI's Power Surge: NVIDIA's New H100 Units to Match National Energy Usage Amid Rising AI Competition
According to a recent report, the NVIDIA H100 units being deployed this year are estimated to consume approximately 13,000 GWh of electricity per year, roughly the annual consumption of a country such as Lithuania or Guatemala. By 2027, worldwide data center power usage is projected to climb to between 85 and 134 TWh (terawatt-hours).
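For scale, those two figures can be cross-checked with a simple unit conversion (1 TWh = 1,000 GWh), using only the numbers quoted above:

```python
# Unit sanity check on the power figures quoted above.
H100_FLEET_GWH_PER_YEAR = 13_000   # estimated annual draw of this year's H100s
DATACENTER_2027_TWH = (85, 134)    # projected global data center range for 2027

h100_twh = H100_FLEET_GWH_PER_YEAR / 1_000   # 1 TWh = 1,000 GWh -> 13 TWh
low, high = (t / h100_twh for t in DATACENTER_2027_TWH)
print(f"H100 fleet: ~{h100_twh:.0f} TWh/yr; 2027 data centers: "
      f"{low:.1f}x to {high:.1f}x that figure")
```

In other words, this year's H100 fleet alone would account for roughly a tenth of the high-end 2027 projection.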
GPT-4's competitors are quickly catching up. Look no further than Meta's Llama 3 LLM (70 billion parameters), now ranked fifth on the Chatbot Arena leaderboard. Critically, Llama 3 outperforms all other open-source LLMs, even before the planned 405-billion-parameter variant arrives.
Some experts argue that for GPT-5, OpenAI will need to overhaul the "original curriculum," which currently relies on "poorly curated human conversations" and a "naive" training procedure. This supports our original thesis that OpenAI will release an iterative GPT-4.5 model this year rather than raising the stakes entirely with GPT-5.
Photo: Jonathan Kemper/Unsplash





