As the race in generative AI heats up, OpenAI may opt for an incremental update with GPT-4.5 this year, sidestepping the substantial resource demands a full leap to GPT-5 would entail.
Resource Constraints May Steer OpenAI Toward GPT-4.5 Amid Rising AI Development Costs
Although GPT-4 currently sits at the pinnacle of the increasingly crowded field of generative artificial intelligence, competitors such as Anthropic's Claude and Meta's open-source Llama continue to improve, increasing the pressure on OpenAI to ship a new version of its flagship Large Language Model (LLM). While many expect Sam Altman's company to release GPT-5 in 2024, several observers believe those expectations are unrealistic given the scale of the resources required.
According to Dan Hendrycks (via Wccftech), director of the Center for AI Safety, each incremental iteration of OpenAI's GPT LLM has required a roughly tenfold increase in compute. If OpenAI skips GPT-4.5 and goes straight to GPT-5, two such 10x jumps would compound into a 100x increase in compute relative to GPT-4, which is comparable to running around 1 million H100 GPUs for three months straight.
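As a sanity check on those numbers, here is a minimal back-of-envelope sketch. Every input is an assumption rather than a confirmed figure: GPT-4's training compute (~2e25 FLOPs) is an unofficial community estimate, and the H100 throughput and utilization values are ballpark approximations.

```python
# Back-of-envelope check of the "1 million H100s for three months" claim.
# All constants below are assumptions: GPT-4's training compute is an
# unofficial estimate, and real-world utilization varies widely.
GPT4_TRAIN_FLOPS = 2e25      # assumed GPT-4 training compute
SCALE_FACTOR = 100           # 10x per half-step, compounded twice for GPT-5
H100_FLOPS = 1e15            # ~1 PFLOP/s dense FP16/BF16 per H100
UTILIZATION = 0.3            # assumed effective utilization at cluster scale
SECONDS = 90 * 24 * 3600     # three months of wall-clock time

fleet_output = 1_000_000 * H100_FLOPS * UTILIZATION * SECONDS
gpt5_estimate = GPT4_TRAIN_FLOPS * SCALE_FACTOR
print(f"1M H100s x 3 months deliver: {fleet_output:.1e} FLOPs")
print(f"100x GPT-4 would require:    {gpt5_estimate:.1e} FLOPs")
```

Under those assumptions the two figures land within the same order of magnitude of each other, so Hendrycks' shorthand is at least internally consistent.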
This estimate is corroborated by Anthropic's CEO, Dario Amodei, who recently stated that it currently costs roughly $1 billion to train a cutting-edge LLM, with the cost projected to rise to between $5 billion and $10 billion by 2025/26. Notably, the $1 billion figure lines up with the 10x increase in compute that can reasonably be inferred for GPT-4.5.
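The same tenfold-per-step logic can be applied to the cost figures. Treating each generation as a flat 10x cost multiplier (a simplifying assumption, since hardware and algorithmic efficiency gains mean costs grow more slowly than raw compute) and starting from the widely cited ~$100 million estimate for GPT-4:

```python
# Project training costs forward assuming a flat 10x multiplier per
# generation. The $100M GPT-4 baseline is a public estimate, not an
# official figure, and real costs will not scale this cleanly.
GPT4_COST = 1e8  # ~$100M, per public estimates
for step, label in enumerate(["GPT-4", "GPT-4.5-scale", "GPT-5-scale"]):
    print(f"{label}: ~${GPT4_COST * 10**step:,.0f}")
```

The progression lands a GPT-4.5-scale run at roughly $1 billion, matching Amodei's current figure, and a GPT-5-scale run at roughly $10 billion, the top of his 2025/26 range.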
AI's Power Surge: NVIDIA's New H100 Units to Match National Energy Usage Amid Rising AI Competition
According to a recent report, the NVIDIA H100 units deployed this year are estimated to consume approximately 13,000 GWh of electricity per year, comparable to the annual electricity consumption of Lithuania or Guatemala. By 2027, worldwide data center electricity usage is projected to climb to between 85 and 134 TWh (terawatt-hours).
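To put the 13,000 GWh figure in perspective, a rough conversion implies the size of the GPU fleet behind it. The per-chip power draw and overhead multiplier below are assumptions, not reported data:

```python
# Rough conversion of the reported 13,000 GWh/year into an implied GPU
# count. The TDP and overhead multiplier are assumptions, not reported data.
H100_WATTS = 700            # H100 SXM TDP
OVERHEAD = 1.5              # assumed PUE-style multiplier for cooling/infra
ANNUAL_GWH = 13_000         # reported annual consumption of the H100 fleet
HOURS_PER_YEAR = 8760

avg_power_gw = ANNUAL_GWH / HOURS_PER_YEAR          # GWh/year -> average GW
implied_gpus = avg_power_gw * 1e9 / (H100_WATTS * OVERHEAD)
print(f"Average continuous draw: {avg_power_gw:.2f} GW")
print(f"Implied H100 count at 100% duty cycle: {implied_gpus:,.0f}")
```

That works out to an average draw of roughly 1.5 GW and an implied fleet in the low millions of GPUs, broadly consistent with reported H100 shipment volumes for the period.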
Meanwhile, GPT-4's competitors are catching up fast. Look no further than Meta's Llama 3 LLM (70 billion parameters), now ranked fifth on the Chatbot Arena leaderboard. Critically, Llama 3 already outperforms every other open-source LLM, even before the planned 405-billion-parameter variant arrives.
Some experts argue that for GPT-5, OpenAI will need to overhaul the "original curriculum," which currently relies on "poorly curated human conversations" and a "naive" training procedure. This supports our original hypothesis that OpenAI will release an iterative GPT-4.5 model this year rather than raising the stakes with a full GPT-5.
Photo: Jonathan Kemper/Unsplash

