
Extend Robotics Bridges Simulation and Reality with Intuitive XR Interface, Accelerating Real-World AI Adoption with NVIDIA


Extend Robotics enables industries to rapidly deploy robots using fully immersive Extended Reality (XR) teleoperation and advanced robot training systems on a wide range of hardware platforms, leveraging NVIDIA Jetson AGX Orin, NVIDIA Isaac Lab, and the NVIDIA Isaac GR00T Blueprint.



Addressing Challenges in Embodied AI


Embodied AI is the state-of-the-art framework for real-world automation, but it faces significant hurdles. Extend Robotics' AMAS solution is designed to offer:


  • Truly remote XR teleoperation interface: AMAS provides the dexterity, precision, and speed needed for human-like robotic control beyond visual line of sight, giving users perception beyond physical presence.

  • Streamlined physical AI training platform: AMAS enables efficient, high-quality data capture from human demonstrations, reducing the labour and capital intensity of training embodied AI.

  • Versatile deployment: Compatible with a wide range of third-party robots—from collaborative arms to humanoids—AMAS eliminates vendor lock-in and accelerates deployment across diverse hardware.

 

By integrating AMAS with NVIDIA technologies, this collaboration unlocks the data flywheel that powers the next generation of intelligent robots.



AMAS: A Truly Remote XR Teleoperation Interface


Extend Robotics' AMAS is an accessible, XR-based SaaS solution that redefines how humans interact with robots. The app is available on both the Meta store and Steam (follow this link for more information). Key capabilities include:


  • 3D Volumetric Telepresence: Live streaming of fused 3D data from multiple RGB-D cameras, with hardware-accelerated 3D video compression. Combined with 6DoF head tracking, this removes motion sickness and maximises spatial perception.

  • High-Dexterity Mobile Teleoperation: Hand tracking and upper-body tracking enable intuitive, precise mobile manipulation at high degrees of freedom and ultra-low latency.

  • Dual Control: AMAS seamlessly teleoperates both physical robots and simulated robots in Isaac Lab.

  • Fleet Monitoring: A web console provides real-time oversight and instant intervention.

 

In demonstrations, AMAS showcases an avatar robot mirroring operator movements—whether plugging in components in a factory or performing the same tasks in a simulated environment in Isaac Lab.
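The idea of mirroring operator movements can be pictured with a minimal sketch. The function below is illustrative only (`hand_pose_to_ee_target`, its frames, and the scaling scheme are assumptions, not the AMAS API): it maps a tracked 6DoF hand pose to an end-effector target, with a motion-scaling factor for precision work.

```python
import numpy as np

def hand_pose_to_ee_target(hand_pos, hand_quat, origin, scale=1.0):
    """Map a tracked hand pose (headset frame) to a robot end-effector
    target (robot base frame). Positions in metres, quaternions (x, y, z, w).
    'origin' is the operator's calibrated rest position."""
    # Translate relative to the calibration origin and scale the motion
    # (scale < 1 gives finer control for precision tasks).
    ee_pos = (np.asarray(hand_pos) - np.asarray(origin)) * scale
    # Orientation passes through unchanged in this sketch; a real system
    # would also rotate between the headset and robot base frames.
    return ee_pos, np.asarray(hand_quat)

# Example: hand moved 20 cm right of the calibrated origin, at half scale.
pos, quat = hand_pose_to_ee_target([0.2, 0.0, 0.0], [0, 0, 0, 1],
                                   origin=[0.0, 0.0, 0.0], scale=0.5)
```

A real pipeline would add filtering, workspace limits, and inverse kinematics on top of this mapping.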

AMAS XR-based teleoperation interface: hand tracking (showing all 12 DoFs), elbow tracking, and live streaming of 3D data with head tracking while fusing multiple cameras
Teleoperation interface for the NVIDIA Isaac Lab avatar robot, identical to the physical robot

Breakthroughs in Synthetic Data Generation and AI Training


The pipeline leverages recent advancements in generative AI and the NVIDIA Isaac GR00T Blueprint to supercharge synthetic manipulation motion data generation:

  • GR00T-Mimic: Demonstrations captured via AMAS seed synthetic motion trajectories, amplifying a single example into thousands of training scenarios.

  • GR00T-Gen: Adds diversity to these datasets—varying object positions, lighting, and environmental factors—and augments them to the required photorealism, ensuring robustness in real-world conditions.

This synthetic data pipeline, integrated with Isaac Sim and Isaac Lab, enables rapid post-training and validation of embodied AI models such as NVIDIA Isaac GR00T N1, an open foundation model for humanoid robot reasoning.
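The amplification step can be illustrated with a toy sketch. The real blueprint does far more (subtask segmentation, trajectory retargeting, physics validation); `amplify_demo` and its linear blending scheme below are illustrative assumptions, not the GR00T-Mimic algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def amplify_demo(demo_xyz, n_variants, pos_jitter=0.05):
    """Amplify one demonstrated end-effector trajectory into many
    synthetic variants by shifting the goal towards randomised object
    positions (a toy stand-in for trajectory retargeting)."""
    variants = []
    for _ in range(n_variants):
        offset = rng.uniform(-pos_jitter, pos_jitter, size=3)
        # Blend the offset in linearly so the start pose is unchanged
        # and only the goal end of the motion shifts.
        blend = np.linspace(0.0, 1.0, len(demo_xyz))[:, None]
        variants.append(demo_xyz + blend * offset)
    return variants

# One 50-step straight-line demo becomes 1000 varied trajectories.
demo = np.linspace([0.0, 0.0, 0.0], [0.3, 0.1, 0.2], 50)
synthetic = amplify_demo(demo, n_variants=1000)
```

GR00T-Gen would then diversify the visual side of each variant (lighting, textures, object positions) rather than the motion itself.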

 


A Scalable Framework for Embodied AI Imitation and Reinforcement Learning


Extend Robotics' framework ties all of the above components together with the Extend Command Console—a web console managing data collection, AI training, and model inference with human supervision. It supports:


  • Multi-modal Dataset Preparation: The AMAS pipeline automatically prepares datasets in LeRobot, HDF5, or ROSbag formats.

  • Training: Train or fine-tune a selection of AI models such as Isaac GR00T N1.5, Action Chunking Transformers (ACT), and Pi0.

  • Inference: Models like Isaac GR00T N1 can be deployed autonomously on edge devices powered by Jetson AGX Orin modules, ensuring low communication latency.

  • Supervision: Operators remotely monitor autonomous behaviours via AMAS and can intervene in edge cases to refine models and collect additional data.

 

This closed-loop system, visualized in a high-level diagram (use case data → simulation → training → deployment → supervision → edge case data), enables continuous improvement of robotic intelligence across physical and virtual domains.
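The closed loop above can be sketched in a few lines of Python. All class and method names here are illustrative stand-ins, not part of the Extend or NVIDIA APIs; the point is only the feedback structure: demonstrations train a policy, deployment surfaces edge cases, and edge cases feed the next training round.

```python
from dataclasses import dataclass, field

@dataclass
class DataFlywheel:
    """Toy model of the closed loop: collect -> train -> deploy ->
    supervise -> collect edge cases -> retrain."""
    dataset: list = field(default_factory=list)
    model_version: int = 0

    def collect(self, demos):
        # Human demonstrations captured via XR teleoperation.
        self.dataset.extend(demos)

    def train(self):
        # Stand-in for post-training, e.g. fine-tuning a foundation
        # model on the accumulated dataset in simulation.
        self.model_version += 1

    def supervise(self, edge_cases):
        # Operator intervention during autonomous runs yields
        # new edge-case data, which re-enters the dataset.
        self.collect(edge_cases)

loop = DataFlywheel()
loop.collect(["demo_plug_msd"])
loop.train()
loop.supervise(["edge_case_misaligned_socket"])
loop.train()
```

Each pass around the loop grows the dataset and produces a new model version, which is the "continuous improvement" the diagram describes.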



The first video demonstrates our web console for data and AI, which accelerates the workflow of data collection, export, AI training, and deployment to the Extend Cortex.


Web console for data and AI: data collection, data export, AI training, and deployment of the resulting model to Cortex

The second video illustrates AI model inferencing with remote human supervision, showing autonomous robot behaviour, real-time monitoring, and operator intervention via AMAS (enabling edge-case data collection).


AI model inferencing with remote human supervision and edge-case data collection

Real-World Impact and Applications


Extend Robotics is working with industrial partners to deliver transformative automation solutions to complex, labour-intensive industries with difficult or harsh environments. Key partners include:

  • Leyland Trucks: Factory Automation:

    • Use Case 1: Pre-trained robot to insert a high-voltage MSD unit

    • AMAS is used to supervise and train the robot to plug a Master Service Disconnect (MSD) unit into a socket, triggering a high-voltage connection. The robot is simulated and validated in Isaac Sim before real-world deployment. Automating MSD fitting eliminates the need for staff to undergo expensive safety training and perform high-risk jobs, reducing costs and operational risk.

    • Use Case 2: Painting truck components with a robot. The robot is trained to perform highly variable painting of custom truck components, removing workers from hazardous paint booths. Painting automation will reduce labour costs and improve final product quality through highly precise, automated painting.

    • Content: Videos showcase these applications (Blog post link).

  • Satellite Applications Catapult: Manufacturing in Space:

    • Use Case: AMAS enables remote manufacturing, inspection, and in-orbit servicing, building on prior Satellite Applications Catapult testing (Blog post link). This reduces human exposure to hazardous aerospace environments, enhancing safety and precision.

  • AtkinsRealis: Nuclear Materials Handling:

    • Use Case: AMAS teleoperates robots in nuclear glovebox operations, handling radioactive materials safely (Blog post link). This reduces the risk of physical harm to workers and delivers reliable uptime in hazardous settings.

  • Saffron Grange: Robots for Agriculture:

    • Use Case: AMAS controls grape-harvesting robots under an Innovate UK project, featured in The Economist. Automates labour-intensive harvesting, reducing costs and scaling operations sustainably.

    • Content: Video available (Blog post link).

 

These successes highlight AMAS’s potential to play an integral role in industrial applications, from lowering costs and enhancing safety to delivering consistent performance across variable tasks.



Enablement with NVIDIA Technologies


Extend Robotics relies on NVIDIA’s robust ecosystem for:

  • Edge Computing: NVIDIA Jetson AGX Orin modules for low-latency, on-robot inference.

  • Simulation: NVIDIA Isaac Sim and Isaac Lab provide virtual environments for data collection and AI training.

  • Foundation Model: Isaac GR00T N1 drives humanoid reasoning, post-trained with AMAS data.

  • Synthetic Data Generation: GR00T-Mimic and GR00T-Gen for generating and enhancing training datasets.

  • Cloud and Training Infrastructure: AWS SageMaker with NVIDIA accelerated computing, with potential future use of NVIDIA DGX Cloud.

  • XR Acceleration: RTX GPUs power AMAS’s immersive interface.

  • Video Compression: Hardware-accelerated 3D video codecs for volumetric streaming.

 

Support from the NVIDIA Inception program for startups has been instrumental, providing resources and expertise to fuel this innovation.


Conclusion

By combining Extend Robotics’ AMAS with NVIDIA’s Isaac platform, this collaboration is redefining embodied AI. From factory floors to vineyards, we’re enabling robots to learn faster, perform better, and work safer—extending human capabilities beyond physical limits. As we continue to innovate, the future of AI-driven automation is brighter than ever.


We are proud to be featured in NVIDIA blog posts:


We appreciate the relevant coverage by other media outlets:
