Intelligence Everywhere: Defining the Era of Edge and Physical AI

As CES 2026 opens, the narrative of Artificial Intelligence (AI) has shifted from cloud-only models to tangible, real-world deployments. This transition is defined by two architectural forces: Physical AI, which enables autonomous machines to reason and act in unstructured environments, and Edge AI, which brings high-performance intelligence directly to personal devices to enhance privacy and responsiveness.

The Arm compute platform serves as one foundational architecture for this revolution. By delivering high performance within the strict power and thermal constraints required for mobile and autonomous operation, Arm enables a diverse ecosystem of partners to scale AI inference across a range of environments.

We spoke with Arm to raise the curtain on CES 2026; this article examines four pioneering use cases being spotlighted at the show:

  • The Arm-powered Nuro Driver™, Nuro’s scalable AI driver designed for Level 4 autonomy in complex urban settings.
  • AGIBOT’s milestone of mass-producing 5,000 humanoid robots on Arm-powered NVIDIA platforms.
  • RelaJet’s Arm-powered AI hearing devices, delivering real-time speech enhancement to improve accessibility and everyday communication.
  • ThinkAR x Envision’s breakthrough in assistive wearables using the Arm Cortex-M55 to provide independence for the visually impaired.

Together, these collaborations demonstrate that “Intelligence Everywhere” is no longer a vision for the future—it’s reality.

Introduction – The Shift to Real-World Intelligence

For the last few years, the promise of AI was confined to the data center. However, 2025 served as the “iPhone moment” for AI, moving it from experimental chatbots to foundational technologies that drive our cars, manage our workspaces, and assist our daily lives. As we open CES 2026, the industry has converged on two defining forces: Physical AI and Edge AI.

  • Physical AI: Machines—from robotaxis to humanoids—that can see, reason, and act autonomously in unstructured human environments.
  • Edge AI: The migration of complex intelligence onto the personal devices we use every day, ensuring that intelligence is responsive, private, and always available.

The Arm compute platform is an indispensable common thread across these innovations. By delivering high-performance compute within the strict power and thermal budgets required for mobility and portability, Arm allows partners like Nuro, AGIBOT, RelaJet, and ThinkAR x Envision to bridge the gap between digital models and physical reality. Let’s explore some of the tech that will take center stage in Las Vegas.

Physical AI Part 1 – The AI-Defined Vehicle (Partner: Nuro)

Use Case: Robotaxis at Scale

Autonomous mobility is entering an era of AI-defined platforms built for real-world operations and commercial scale. Nuro is helping lead that shift with its universal autonomy platform that enables automotive and mobility companies to scale autonomy across applications, vehicles, and geographies.

At CES 2026, Nuro will present the global robotaxi it is developing with partners Lucid and Uber. On display in the NVIDIA Showcase, the vehicle features the Nuro Driver™, Nuro’s scalable AI driver designed for Level 4 autonomy in complex urban environments.

End-Customer Benefits

  • Cost-efficient path to scale: Nuro’s platform is designed for production-oriented deployments using commercially available, automotive-grade sensors and a compute architecture that reduces complexity and power consumption, supporting strong unit economics at scale.
  • AI-first scalability: Nuro’s end-to-end AI foundation model blends state-of-the-art AI with clear, verifiable safety logic, enabling the Nuro Driver to adapt quickly to new vehicles, applications, and geographies. This helps reduce integration effort and accelerate time to market while maintaining comfortable, reliable performance (a minimal sketch of this model-plus-safety-gate pattern follows this list).
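
This pairing of a learned model with verifiable safety logic can be pictured as a gate: the model proposes a plan, and a deterministic checker validates it before anything reaches the vehicle. Below is a minimal Python sketch of that general pattern; the Trajectory fields, limits, and fallback behavior are illustrative assumptions, not Nuro’s actual software.

```python
from dataclasses import dataclass

@dataclass
class Trajectory:
    """A candidate plan proposed by the learned driving model (illustrative)."""
    max_speed_mps: float   # peak speed along the plan
    min_gap_m: float       # closest predicted distance to any obstacle
    max_lat_accel: float   # peak lateral acceleration (m/s^2)

# Hypothetical hard limits a verifiable safety layer might enforce.
SPEED_LIMIT_MPS = 13.4     # ~30 mph urban limit
MIN_SAFE_GAP_M = 2.0
MAX_LAT_ACCEL = 3.0

def safety_gate(candidate: Trajectory, fallback: Trajectory) -> Trajectory:
    """Deterministically accept the model's plan only if it satisfies
    every hard constraint; otherwise use a conservative fallback."""
    ok = (
        candidate.max_speed_mps <= SPEED_LIMIT_MPS
        and candidate.min_gap_m >= MIN_SAFE_GAP_M
        and candidate.max_lat_accel <= MAX_LAT_ACCEL
    )
    return candidate if ok else fallback

# Example: the model proposes an aggressive plan; the gate rejects it.
proposed = Trajectory(max_speed_mps=16.0, min_gap_m=1.5, max_lat_accel=2.0)
safe_stop = Trajectory(max_speed_mps=5.0, min_gap_m=4.0, max_lat_accel=1.0)
print(safety_gate(proposed, safe_stop))   # -> the conservative fallback
```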

The Configuration: Arm Neoverse in NVIDIA DRIVE AGX Thor

The Nuro Driver™ is built on the NVIDIA DRIVE AGX Thor SoC, which integrates 14 Arm Neoverse V3AE CPU cores.

The Arm Advantage

  • Performance Scaling: The Neoverse V3AE delivers a 2.3x integer performance uplift over the previous generation, essential for the high-throughput decision-making required in complex urban driving scenarios.
  • Functional Safety: Architected for ISO 26262 ASIL-D standards, Arm technology supports deterministic reliability for safety-critical AV functions.

Go-to-Market Strategy

Nuro licenses the Nuro Driver to OEMs and mobility providers, supporting partner programs from integration through testing, validation, and deployment.

Physical AI Part 2 – Robotics as Indispensable Partners (Partner: AGIBOT)

Use Case: Humanoid Industrial & Commercial Services

At the show, AGIBOT will showcase its full robotics portfolio – headlined by three flagship models: the full-size humanoid A2, half-size humanoid X2, and wheeled humanoid G2. Key on-site use cases include robot group dancing, verbal and gestural interaction, and intelligent manipulation.

End-Customer Benefits

  • Seamless Integration: Humanoids like the A2 can navigate spaces designed for humans—using stairs, opening doors, and operating tools—without requiring modification to the factory or office floor.
  • Mass-Production Reliability: AGIBOT’s milestone of 5,000 mass-produced units demonstrates that these are not mere prototypes, but tested productivity partners ready for real-world applications.

The Configuration: Arm technology in Jetson Orin and Thor

AGIBOT’s advanced robots are built on two NVIDIA platforms: Jetson Orin, which features Arm Cortex-A78AE cores, and Jetson Thor, which features a 14-core Arm Neoverse V3AE CPU. These platforms provide the high-performance compute and low latency needed to run multi-modal models for vision, language, and motion planning in parallel.
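
As a hedged illustration of what running multi-modal models in parallel can look like at the software level, the Python sketch below fans simulated sensor frames out to a perception worker and a planning worker over queues; a language/interaction worker would sit alongside in the same pattern. All function and queue names are placeholders, not AGIBOT’s stack.

```python
import queue
import threading
import time

def vision_worker(frames: queue.Queue, detections: queue.Queue) -> None:
    """Placeholder perception model: consumes camera frames, emits detections."""
    while True:
        frame = frames.get()
        if frame is None:            # shutdown sentinel
            break
        time.sleep(0.002)            # stand-in for a few ms of inference
        detections.put(f"objects_in_{frame}")

def planner_worker(detections: queue.Queue, actions: list) -> None:
    """Placeholder motion planner: turns perception outputs into actions."""
    while True:
        detection = detections.get()
        if detection is None:
            break
        actions.append(f"plan_for_{detection}")

frames: queue.Queue = queue.Queue()
detections: queue.Queue = queue.Queue()
actions: list = []

vision = threading.Thread(target=vision_worker, args=(frames, detections))
planner = threading.Thread(target=planner_worker, args=(detections, actions))
vision.start()
planner.start()

for i in range(3):                   # simulate three incoming sensor frames
    frames.put(f"frame{i}")
frames.put(None)                     # stop the vision worker first...
vision.join()
detections.put(None)                 # ...then the planner
planner.join()
print(actions)                       # ['plan_for_objects_in_frame0', ...]
```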

The Arm Advantage

  • Ultra-Fast Response: The Arm-based Jetson platform enables 10 ms “glass-to-action” processing from sensor input to actuation, allowing a robot to hold a raw egg without breaking it or navigate dynamic crowds safely.
  • Energy Efficiency: Jetson Thor delivers 3.5x better energy efficiency than the previous Orin generation, extending a robot’s untethered operation time in industrial settings.

Go-to-Market Strategy

AGIBOT has officially targeted eight key application scenarios, including exhibition/event reception and guide services, entertainment and performances, smart manufacturing, and data collection for model training. These scenario-specific solutions are already entering volume deployment phases globally.

At CES, the industry is expected to showcase a new wave of AI-capable devices across the consumer landscape – from PCs to gaming systems and wearables. Many of these systems are being built on the Arm compute platform, reflecting its growing role in delivering the performance, efficiency, and scalability required for edge AI.

While much of this momentum is visible in mainstream productivity and entertainment categories, some of the most impactful edge AI applications go further. By tailoring intelligence to real-world needs, on-device AI is enabling new experiences in accessibility and assistive technology. Two such examples are highlighted below.

Edge AI Part 1 – Intelligent Hearing (Partner: RelaJet)

Use Case: On-Device AI for Assistive and Augmented Hearing

As edge AI matures, some of its most profound impacts are emerging not on screens, but in human perception itself. At CES 2026, RelaJet is showcasing AI-powered hearing devices that use real-time, on-device intelligence to help users not just hear more sound, but understand speech more clearly in complex, noisy environments.

Founded by a hearing-impaired technologist, RelaJet has built its Otoadd hearing solutions around a simple but powerful idea: AI should actively assist human communication, not passively amplify noise. By running advanced speech enhancement models directly on Arm-based processors, RelaJet enables intelligent hearing assistance that works anywhere, without reliance on the cloud.

End-Customer Benefits

  • Speech Clarity in Real Environments: RelaJet’s AI speech enhancement engine separates human voices from background noise in real time, allowing users to focus on conversations in cafés, offices, public transport, and social settings.
  • Lower Cognitive Load: By intelligently filtering sound at the source, the system reduces the mental fatigue associated with trying to follow conversations in noisy environments.
  • Privacy and Reliability: All core AI processing runs locally on the device, ensuring consistent performance without an internet connection and keeping sensitive audio data private.

The Configuration: Arm Cortex-M based Edge AI

RelaJet’s hearing devices are built on Arm Cortex-M–class processors, optimized to deliver real-time AI inference within the extreme power and size constraints of wearable form factors. The AI models are compressed and tuned to run efficiently on-device, enabling continuous speech separation, noise reduction, and feedback suppression in a compact, battery-powered system.
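
To make on-device speech enhancement concrete, here is a minimal spectral-gating sketch in Python with NumPy: estimate a per-frequency noise floor from a leading noise-only segment, then attenuate spectral bins that fall below it. It illustrates the general technique only; RelaJet’s production models are proprietary and would be compressed and compiled for Cortex-M targets.

```python
import numpy as np

def spectral_gate(audio: np.ndarray, sr: int, frame: int = 512,
                  noise_secs: float = 0.5, floor_mult: float = 1.5) -> np.ndarray:
    """Simple noise suppression: estimate a per-bin noise floor from the
    first noise_secs of audio (assumed noise-only), then zero STFT bins
    whose magnitude falls below floor_mult * that floor."""
    hop = frame // 2
    window = np.hanning(frame)
    n_noise_frames = max(1, int(noise_secs * sr) // hop)

    noise_mags, spectra = [], []
    out = np.zeros_like(audio)
    for start in range(0, len(audio) - frame, hop):
        spec = np.fft.rfft(window * audio[start:start + frame])
        spectra.append((start, spec))
        if len(noise_mags) < n_noise_frames:
            noise_mags.append(np.abs(spec))
    floor = floor_mult * np.mean(noise_mags, axis=0)

    # Attenuate bins below the floor and overlap-add the result.
    for start, spec in spectra:
        mask = np.abs(spec) >= floor
        out[start:start + frame] += np.fft.irfft(spec * mask) * window
    return out

# Usage: 0.5 s of noise followed by a noisy 440 Hz tone.
sr = 16000
t = np.arange(sr) / sr
noise = 0.1 * np.random.randn(sr)
noisy = noise + np.where(t >= 0.5, 0.8 * np.sin(2 * np.pi * 440 * t), 0.0)
cleaned = spectral_gate(noisy, sr)
```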

The Arm Advantage

  • Ultra-Low-Power AI Inference: The efficiency of the Arm architecture enables always-on audio processing without compromising battery life, a critical requirement for in-ear devices worn throughout the day.
  • Low-Latency Processing: Arm-based compute allows RelaJet’s AI models to respond instantly to changing acoustic conditions, maintaining natural conversation flow.
  • Scalable Edge Intelligence: The flexibility of the Arm ecosystem allows RelaJet to deliver medical-grade hearing assistance today while expanding toward broader personal audio and augmented hearing applications in the future.

Go-to-Market Strategy

RelaJet is commercializing its AI hearing technology through the Otoadd product line, with devices already on the market.

Edge AI Part 2 – Assistive Wearables (Partner: ThinkAR x Envision)

Use Case: Hands-Free Visual Assistance

ThinkAR and Envision are demonstrating lightweight AR glasses designed as a “visual assistant” for the blind and low-vision community. At CES, they will showcase camera-enabled glasses, powered by the Arm Cortex-M55, that stream video to Ally, Envision’s AI assistant.

End-Customer Benefits

  • Indoor Navigation: Users can navigate complex grocery stores by asking “Hey Ally, help me pick up some apples.” The glasses then play a “beacon sound” through spatial audio that the user follows to the exact location (a minimal sketch of such directional panning follows this list).
  • Real-Time Identification: Users can hold an object and ask, “Is this a red onion?” The AI identifies the item in their hand instantly.
  • Independence: The ability to read letters, sort mail, or identify a “body of water in a grassy field” provides users with the confidence to roam without worry.
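
The beacon behavior can be approximated with equal-power stereo panning driven by the target’s bearing. The Python sketch below is a hypothetical illustration of that idea, not Envision’s implementation, which would use richer spatial audio (for example, HRTF-based rendering).

```python
import math

def beacon_gains(bearing_deg: float) -> tuple[float, float]:
    """Equal-power stereo pan for a beacon at bearing_deg relative to the
    listener's facing direction (0 = dead ahead, +90 = hard right).
    Constant-power panning is the simplest stand-in that still conveys
    direction through headphones."""
    # Clamp to the frontal hemisphere and map [-90, +90] to [0, pi/2].
    b = max(-90.0, min(90.0, bearing_deg))
    theta = (b + 90.0) / 180.0 * (math.pi / 2)
    return math.cos(theta), math.sin(theta)   # (left_gain, right_gain)

# As the user turns toward the target, the beacon centers and equalizes.
for bearing in (60.0, 30.0, 0.0):
    left, right = beacon_gains(bearing)
    print(f"bearing {bearing:+5.1f} deg -> L={left:.2f} R={right:.2f}")
```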

The Configuration: Arm Cortex-M55 & M4

The ThinkAR AiLens incorporates the Arm Cortex-M55 and Cortex-M4 processors. The Cortex-M55 is one of Arm’s most AI-capable microcontrollers, specifically designed for endpoint intelligence.

The Arm Advantage

  • Helium Vector Processing: The Cortex-M55 features Arm Helium technology, delivering up to a 15x machine learning performance uplift over previous Cortex-M processors (see the sketch after this list).
  • Streaming Persistence: The efficiency of the Arm architecture enables 45 minutes of continuous live video streaming, a critical requirement for a device that must provide constant feedback.
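
Helium (the M-Profile Vector Extension) accelerates the low-precision multiply-accumulate loops at the heart of ML inference. As a language-neutral illustration, the NumPy sketch below shows the int8 dot product with a wide accumulator and requantization that such vector units are designed to speed up; it illustrates the arithmetic only, not Helium intrinsics.

```python
import numpy as np

def int8_dot_requant(x: np.ndarray, w: np.ndarray,
                     scale: float, zero_point: int = 0) -> np.int8:
    """Quantized dot product of the kind ML-oriented vector units
    accelerate: int8 inputs, a wide int32 accumulator to avoid
    overflow, then requantization back to int8."""
    acc = np.dot(x.astype(np.int32), w.astype(np.int32))  # widen, accumulate
    q = np.round(acc * scale) + zero_point                # requantize
    return np.clip(q, -128, 127).astype(np.int8)

# Example: one output neuron over a 128-element int8 activation vector.
rng = np.random.default_rng(0)
x = rng.integers(-128, 128, size=128, dtype=np.int8)
w = rng.integers(-128, 128, size=128, dtype=np.int8)
print(int8_dot_requant(x, w, scale=1e-4))
```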

Go-to-Market Strategy

ThinkAR and Envision are partnering to deliver these affordable AI glasses to the global assistive market, offering a subscription-based AI service (Ally Pro) to provide continuous software updates and new AI capabilities.

Conclusion – Physical & Edge AI Built on Arm

As the curtain rises on CES 2026, the technology landscape will be defined by one core reality: intelligence is moving to the point of action. The success of the partners featured in this article—Nuro, AGIBOT, RelaJet, and ThinkAR x Envision—highlights that advanced AI is no longer a luxury of the cloud. It is a tool for everyday independence, a driver of industrial productivity, and the brain of future mobility.

The Arm compute platform remains a primary catalyst for this shift. By providing a common, high-performance, and power-efficient architectural foundation, Arm ensures that whether the product is a 130 W robotaxi compute platform or milliwatt-class AR glasses, the intelligence powering it is reliable, safe, and ready for the real world.
