NVIDIA Corporation (NASDAQ: NVDA) used the CES 2026 trade show in Las Vegas to reaffirm its leadership in artificial intelligence infrastructure, announcing that its next-generation Rubin data center platform is now in full production and on track for release later this year. The move highlights Nvidia’s accelerated release cycle as competition intensifies from rivals such as Advanced Micro Devices (NASDAQ: AMD) and from custom silicon developed by major cloud providers.
During his keynote address, CEO Jensen Huang revealed that all six chips in the Rubin platform have successfully returned from manufacturing partners and passed initial milestone tests. This puts the new AI accelerator systems on schedule for customer deployments in the second half of 2026. By unveiling Rubin early, Nvidia is signaling confidence in its roadmap while keeping enterprises closely aligned with its hardware ecosystem.
The Rubin GPU is designed to meet the growing demands of agentic AI models, which rely on multistep reasoning rather than simple pattern recognition. According to Nvidia, Rubin delivers 3.5 times faster AI training performance and up to 5 times higher inference performance compared to the current Blackwell architecture. The platform also introduces the new Vera CPU, featuring 88 custom cores and offering double the performance of its predecessor. Nvidia says Rubin-based systems can achieve the same results as Blackwell while using far fewer components, reducing cost per token by as much as tenfold.
Positioned as a modular “AI factory” or “supercomputer in a box,” the Rubin platform integrates the BlueField-4 DPU, which manages AI-native storage and long-term context memory. This design improves power efficiency by as much as fivefold, a critical factor for hyperscale data centers. Early adopters include Microsoft (NASDAQ: MSFT), Amazon Web Services (NASDAQ: AMZN), Alphabet’s Google Cloud (NASDAQ: GOOGL), and Oracle Cloud Infrastructure (NYSE: ORCL).
Beyond data centers, Nvidia also highlighted major advances in robotics and autonomous vehicles, calling the current period a “ChatGPT moment” for physical AI. New offerings such as the Alpamayo AI models for self-driving systems and the Jetson T4000 robotics module further underscore Nvidia’s bet that reasoning-based AI will drive a trillion-dollar infrastructure upgrade across industries.

