NVIDIA Corporation (NASDAQ: NVDA) used the CES 2026 trade show in Las Vegas to reaffirm its leadership in artificial intelligence infrastructure, announcing that its next-generation Rubin data center platform is now in full production and on track for release later this year. The move highlights Nvidia’s accelerated release cycle as competition intensifies from rivals such as Advanced Micro Devices (NASDAQ: AMD) and from custom silicon developed by major cloud providers.
During his keynote address, CEO Jensen Huang revealed that all six chips in the Rubin platform have successfully returned from manufacturing partners and passed initial milestone tests. This puts the new AI accelerator systems on schedule for customer deployments in the second half of 2026. By unveiling Rubin early, Nvidia is signaling confidence in its roadmap while keeping enterprises closely aligned with its hardware ecosystem.
The Rubin GPU is designed to meet the growing demands of agentic AI models, which rely on multistep reasoning rather than simple pattern recognition. According to Nvidia, Rubin delivers 3.5 times faster AI training performance and up to 5 times higher inference performance compared to the current Blackwell architecture. The platform also introduces the new Vera CPU, featuring 88 custom cores and offering double the performance of its predecessor. Nvidia says Rubin-based systems can achieve the same results as Blackwell while using far fewer components, reducing cost per token by as much as tenfold.
Positioned as a modular “AI factory,” or a “supercomputer in a box,” the Rubin platform integrates the BlueField-4 DPU, which manages AI-native storage and long-term context memory. The design delivers up to five times better power efficiency, a critical factor for hyperscale data centers. Early adopters include Microsoft (NASDAQ: MSFT), Amazon Web Services (NASDAQ: AMZN), Google Cloud (NASDAQ: GOOGL), and Oracle Cloud Infrastructure (NYSE: ORCL).
Beyond data centers, Nvidia also highlighted major advances in robotics and autonomous vehicles, calling the current period a “ChatGPT moment” for physical AI. New offerings such as Alpamayo AI models for self-driving systems and the Jetson T4000 robotics module further underscore Nvidia’s bet that reasoning-based AI will drive a massive, trillion-dollar infrastructure upgrade across industries.