CES 2026: AI, NVIDIA & Autonomous Tech Driving Innovation


This year in Las Vegas, the spotlight is firmly on deployable engineering solutions—technologies designed to shorten development cycles, unlock operational data, and industrialize digital services at scale. In this article, we highlight seven new CES innovations, each summed up in a single defining word, with an engineering lens on what really matters.

In a hurry? Here are the key takeaways:

  • AI is becoming the operating system of industry and mobility: From Siemens and NVIDIA’s Industrial AI OS to Samsung’s system-wide AI ecosystem, CES 2026 shows AI moving from isolated features to foundational infrastructure that runs design, production, and operations end to end.
  • Autonomous vehicles are finally being engineered for real-world conditions: Foldable steering wheels, safety-certified thermal cameras, and terahertz vision sensors signal a shift from autonomy demos to deployable, regulation-ready vehicle architectures.
  • Perception is the new battleground for safety and autonomy: Thermal (LWIR) and terahertz sensing are emerging as critical complements to lidar and radar, extending visibility in darkness, fog, rain, and glare where legacy sensors fail.
  • Engineering value now comes from integration, not invention alone: Whether it’s SDV app marketplaces, autonomous robots in public spaces, or AI-driven factories, the winning innovations are those that integrate hardware, software, and AI into scalable, production-ready systems.

1. Foundational: Siemens & NVIDIA Industrial AI Operating System

While many CES announcements focus on individual products, Siemens and NVIDIA are targeting something much deeper: the operating system of the industrial world itself.

At the show, the two companies announced a major expansion of their strategic partnership to jointly build what they describe as the Industrial AI Operating System: a platform designed to embed AI across the entire industrial value chain, from electronic design and simulation to manufacturing, operations, and supply chains.

The Technology

The core idea is to transform digital twins from static simulation tools into active, AI-driven systems. By combining NVIDIA’s AI infrastructure (accelerated computing, Omniverse libraries, CUDA-X, PhysicsNeMo, AI models) and Siemens’ industrial stack (automation, electrification, EDA, simulation, digital twins, industrial AI expertise), factories can continuously analyze their digital twins. They can test changes virtually and deploy validated optimizations directly to the shopfloor.

Siemens refers to this as an “AI Brain” for manufacturing—one that enables real-time adaptation rather than periodic optimization.

The partnership spans multiple layers of industrial engineering. These include AI-native electronic design automation (EDA), with GPU acceleration across Siemens’ simulation and verification tools, as well as adaptive manufacturing, where production systems adjust dynamically based on AI-driven insights.

Siemens plans to complete GPU acceleration across its entire simulation portfolio, targeting 2× to 10× speed-ups in key workflows.

Industry Relevance

The collaboration will produce a repeatable blueprint for next-generation AI factories. The first reference site will launch in 2026 at Siemens’ Electronics Factory in Erlangen, Germany.

Several industrial leaders—including Foxconn, HD Hyundai, KION Group, and PepsiCo—are already evaluating elements of the platform.

For industrial engineers, this announcement signals a structural shift: AI is moving from optimization tool to operational foundation. Design, simulation, production, and operations are no longer sequential phases, but part of a continuously learning, AI-driven loop.

2. Plugged-In: Sibros SDV App Marketplace

Software-defined vehicles (SDVs) promise flexibility. But in practice, extracting value from vehicle data often means long development cycles, custom integrations, and heavy engineering workloads.

At CES 2026, Sibros is addressing this gap with the launch of its SDV App Marketplace. This curated catalog of more than 50 ready-to-deploy vehicle applications is specifically designed for automakers, fleet operators, and mobility providers.

Built on the Sibros Deep Connected Platform, the marketplace delivers turnkey applications across the full vehicle lifecycle: battery health monitoring, predictive maintenance, guided diagnostics, R155/R156 cybersecurity compliance, emissions reporting, crash data automation, geofencing, and route optimization.

What makes the approach operational rather than experimental is the absence of custom integration work. Applications natively leverage Sibros’ existing SDV stack: full-vehicle OTA updates, high-frequency data logging, edge computing, remote commands, and cryptographic security.

Industry Relevance

For engineering teams, this shifts SDV development from “build everything” to “deploy and configure”, enabling faster time-to-value while keeping control over vehicle data and software architecture.

Sibros is also opening the marketplace to partners, allowing third-party providers to build and distribute specialized SDV applications on a shared, production-grade platform.

3. Self-Starter: IntBot’s Unmanned Robot Booth

Robotics demos are often tightly scripted, controlled, and carefully fenced off from real-world uncertainty. At CES 2026, IntBot is doing the opposite: for the first full day of the show, the company’s booth was operated entirely by a robot. No staff, no handlers, no fallback.

Nylo, IntBot’s flagship humanoid social robot, can autonomously greet visitors, initiate conversations, answer questions, and manage the unpredictable social flow of a major tradeshow floor.

IntBot positions Nylo as part of a new class of systems: Physical Agents—AI that does not live solely in the cloud or on screens, but is embodied in the physical world, capable of reading social cues, understanding intent, and acting accordingly. From an engineering standpoint, this is less about spectacle and more about system maturity.

Nylo’s autonomy is powered by IntEngine, IntBot’s proprietary multimodal, multi-loop social intelligence system. Vision, audio and language model inputs are synchronized to coordinate speech, facial expression, and gesture. The robot therefore behaves less like a reactive chatbot and more like a socially aware agent.

Industry Relevance

This is not Nylo’s first exposure to high-density public interaction. In 2025, the robot served as an AI information desk at NVIDIA GTC, interacting face-to-face with thousands of attendees. IntBot already operates commercial fleets in hospitality, where robots handle repetitive guest interactions and offload routine tasks from human staff.

CES 2026 marks the first time IntBot extends this capability to a fully unmanned exhibition booth—a stress test for social autonomy in one of the most chaotic environments imaginable.

4. Twistable: Autoliv & Tensor Foldable Steering Wheel

As vehicles move toward higher levels of autonomy, one question keeps resurfacing among interior, safety, and systems engineers: what happens to the steering wheel? In Level 4 autonomous driving, the steering wheel becomes a static obstacle, consuming space and constraining interior design.

At CES 2026, Autoliv and Tensor are presenting a production-ready answer with the world’s first foldable steering wheel. Co-developed for the Tensor Robocar, it is scheduled for volume production in the second half of 2026.

The foldable steering wheel supports dual-mode operation: in manual mode, it deploys as a conventional steering wheel; in L4 autonomous mode, it fully retracts, clearing the driver’s area and activating a passenger airbag integrated into the instrument panel.

Industry Relevance

What makes this CES announcement significant is that foldable steering wheels are no longer confined to concept vehicles. Autoliv and Tensor are explicitly targeting series production, bringing adaptive control and safety hardware into real-world deployment.

For engineers working on autonomous platforms, occupant safety, or vehicle interiors, this innovation signals a clear trend: vehicle architecture is beginning to physically adapt to software-defined driving modes—and safety systems must evolve in lockstep.

Autoliv Foldable Steering Wheel

5. Brainy: Samsung “Companion to AI Living” Ecosystem 

At CES 2026, Samsung did not present a single breakthrough device. Instead, it showcased something more structurally significant: AI as a system-wide design principle.

Under the banner “Your Companion to AI Living”, Samsung positions artificial intelligence as a foundational architecture spanning displays, appliances, mobile devices, health platforms, and cloud services—all interconnected through a common ecosystem.

The Technology

From an engineering perspective, the key shift is that AI is embedded across product development, operations, and user experience, rather than applied selectively. Samsung’s connected ecosystem—now exceeding 430 million SmartThings users—provides the scale needed to train, deploy, and continuously refine AI behaviors across heterogeneous hardware.

At the center of the visual display portfolio, Samsung introduced its 130-inch Micro RGB display, driven by independently controlled microscopic red, green, and blue diodes. Combined with the Micro RGB AI Engine Pro, the system enables precise color control and scene-by-scene optimization at a scale previously confined to professional environments.

The most telling software layer is Vision AI Companion (VAC), which transforms TVs from passive displays into context-aware interfaces. VAC enables multimodal interaction (voice, content recognition, and user context) to guide viewing, audio balancing, meal planning, and even device-to-device workflows.

Industry Relevance

For engineers, Samsung’s CES message is clear: the next phase of AI is not about isolated intelligence, but about building coherent, secure, and scalable systems where hardware, software, and data evolve together.

Samsung at CES 2026

6. Hotspot: Teledyne FLIR Tura LWIR Camera

Autonomous vehicles demand sensors that can see reliably in all conditions. At CES 2026, Teledyne FLIR OEM introduced the Tura, the first automotive long-wave infrared (LWIR) camera designed to meet ASIL-B functional safety standards (ISO 26262), explicitly targeting ADAS and autonomous driving applications.

The Tura camera combines:

  • 640 × 512 resolution
  • A high-sensitivity FIR sensor for detecting pedestrians, animals, and other vulnerable road users
  • Robust all-weather performance, including darkness, fog, smoke, and glare

This allows detection beyond the reach of headlights, improving pedestrian and obstacle recognition during night driving or low-visibility conditions.

Unlike conventional thermal cameras, Tura is built from the ground up for functional safety compliance, enabling its integration into ADAS functions such as nighttime pedestrian automatic emergency braking (PAEB). The ASIL-B certification ensures that both hardware and software components meet stringent automotive safety standards, a critical requirement for high-speed autonomous operations and regulatory compliance such as FMVSS 127.

Industry Relevance

For engineers and systems designers, the Tura represents a perceptive thermal sensing platform that combines safety-grade reliability, AI-optimized integration, and robust environmental performance—making it a key enabler for fully autonomous vehicles operating under real-world conditions.

Teledyne FLIR OEM

7. Clear-Sighted: Teradar Summit Terahertz Vision Sensor 

At CES 2026, Teradar unveiled Summit™, the world’s first terahertz (THz) vision sensor for automotive applications, targeting advanced driver assistance systems (ADAS) and autonomous vehicles (L2–L5).

Terahertz waves sit between radar and lidar in the electromagnetic spectrum, combining high-resolution detection with all-weather penetration. Summit exploits this to provide consistent perception in rain, fog, snow, glare, and dust, overcoming limitations of legacy sensors.
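To make the spectrum positioning concrete, here is a minimal sketch (not Teradar data; the radar, terahertz, and lidar operating points are illustrative assumptions) that uses the relation wavelength = c / frequency to show where the terahertz band sits between automotive radar and lidar:

```python
# Illustrative comparison of automotive sensing bands.
# Assumed operating points (for illustration only, not Teradar specs):
#   radar     ~77 GHz   (common automotive radar band)
#   terahertz ~1 THz    (within the ~0.1-10 THz terahertz band)
#   lidar     ~905 nm   (a common lidar wavelength, expressed as frequency)

C = 299_792_458  # speed of light in m/s

def wavelength_mm(freq_hz: float) -> float:
    """Wavelength in millimetres for a given frequency in Hz."""
    return C / freq_hz * 1e3

radar_hz = 77e9
thz_hz = 1e12
lidar_hz = C / 905e-9  # ~331 THz

for name, f in [("radar", radar_hz), ("terahertz", thz_hz), ("lidar", lidar_hz)]:
    print(f"{name:10s} {f / 1e12:10.3f} THz  ->  wavelength {wavelength_mm(f):.4f} mm")
```

Shorter wavelengths resolve finer detail, while longer wavelengths penetrate rain, fog, and dust more readily; the terahertz band’s intermediate wavelength (about 0.3 mm at 1 THz, versus roughly 3.9 mm for 77 GHz radar and under a micrometre for lidar) is what lets it combine aspects of both.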

Key specifications include a 300 m range, 0.13° native resolution, and 4D point-cloud measurements (3D position plus Doppler velocity). This enables detection of small objects at long distances, day or night, with uncompromised reliability—a critical requirement for L3–L5 autonomous driving.

Teradar estimates that widespread deployment of Summit could prevent up to 150,000 weather-related road fatalities annually. By maintaining visibility where radar, lidar, and cameras fail, Summit adds a new dimension to automotive safety and autonomy.

Industry Relevance

For engineers designing next-generation ADAS or autonomous vehicles, Teradar Summit represents a penetrative sensing layer that extends operational safety into extreme environments, unlocking capabilities previously unattainable with conventional sensors.

Teradar Summit, the world’s first terahertz vision sensor for ADAS and autonomous driving
