Teaching Robots to “Feel” Using Haptics and AMAS XR Interface
- Chang Liu
- Jun 18
- 2 min read
Haptic feedback, which simulates the sense of touch through vibrations or motions, is revolutionising multimodal imitation learning, where AI systems learn complex tasks by mimicking human behaviour across multiple sensory inputs.
We’re excited to share that our latest research paper has been accepted to the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2025) - one of the world’s leading robotics conferences!
This work is a collaboration between Extend Robotics and the Robot Intelligence Lab at Imperial College London, led by Dr. Petar Kormushev, a leading expert in robot learning and reinforcement learning.
The paper focuses on how haptic feedback - the sensation of touch - can make imitation learning more powerful. Instead of just copying what a human sees or hears, robots can now also learn from what a human feels. This is made possible by our AMAS XR interface - a platform that blends VR, AR, and real-time haptics to let people control robots remotely while collecting rich, multi-sensory data.
This research is part of the Haptic-ACT project, which explores how touch can help robots master contact-rich tasks with more precision and flexibility.
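For readers curious what "learning from what a human feels" might look like in data terms, here is a minimal, hypothetical sketch of a multi-sensory demonstration step logged during XR teleoperation. The field names, array shapes, and structure are illustrative assumptions, not the actual Haptic-ACT data format.

```python
# Hypothetical sketch of one multi-sensory demonstration step, as might be
# logged during XR teleoperation for imitation learning. All field names and
# array shapes are illustrative assumptions, not the paper's actual format.
from dataclasses import dataclass
import numpy as np

@dataclass
class DemoStep:
    rgb: np.ndarray              # (H, W, 3) camera frame streamed to the operator
    depth: np.ndarray            # (H, W) aligned depth map
    joint_positions: np.ndarray  # (7,) arm joint angles (proprioception)
    wrist_wrench: np.ndarray     # (6,) wrist forces/torques - the haptic signal
    action: np.ndarray           # (8,) commanded joint targets plus gripper command

def to_training_pair(step: DemoStep):
    """Turn one teleoperated step into an (observation, action) pair.

    The haptic channel is treated as just another observation stream, so a
    policy trained on these pairs can condition on touch as well as vision.
    """
    observation = {
        "rgb": step.rgb,
        "depth": step.depth,
        "proprio": step.joint_positions,
        "haptic": step.wrist_wrench,
    }
    return observation, step.action
```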
Why it matters: By giving robots a sense of touch, we can teach them to handle delicate, contact-rich tasks across industries:
Agriculture: Robots trained with AMAS XR data excel at picking delicate fruits, such as strawberries or grapes, by learning to apply gentle, precise pressure based on haptic feedback, minimising crop damage.
Manufacturing: Haptic-driven learning enables robots to perform complex assembly tasks, like fitting components with tight tolerances, by replicating the touch sensitivity of human operators.
Logistics: In warehouses, robots use AMAS XR-collected data to handle contact-rich tasks, such as packing fragile items or manipulating irregularly shaped objects, with human-like dexterity.
We’re proud to present this work at IROS 2025 in Hangzhou, China this October - and to keep building toward intuitive, human-robot collaboration.
AMAS XR Interface: Bridging Simulation and Reality
The AMAS XR interface is a cloud-based, XR-native system designed for seamless teleoperation and data collection. It enables operators to control robots with precision using intuitive gestures and real-time haptic feedback, while also serving as a robust data pipeline for training AI models. Extend Robotics reports that AMAS XR achieves high accuracy in mapping user movements to robot actions, supports low-latency RGB-D streaming (under 150 ms glass-to-glass), and requires low bandwidth (3-20 Mbps), making it ideal for real-world applications.
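As a rough illustration of the teleoperation side (a sketch of the general idea, not the AMAS XR implementation), the snippet below maps a tracked XR hand position to a robot workspace target at a fixed command rate. The rate, scaling, and callback names are assumptions made for the example.

```python
# Minimal teleoperation sketch - not the AMAS XR implementation. It maps a
# tracked hand position in the headset frame to a robot workspace target at a
# fixed rate; the rate, scaling, and callbacks are illustrative assumptions.
import time
import numpy as np

COMMAND_HZ = 60.0  # assumed command rate; the actual AMAS rate is not stated here

def hand_to_target(hand_pos: np.ndarray, workspace_origin: np.ndarray,
                   scale: float = 1.0) -> np.ndarray:
    """Map a hand position (metres, XR frame) into the robot's workspace."""
    return workspace_origin + scale * hand_pos

def teleop_loop(get_hand_pos, send_target, workspace_origin, steps=1000):
    """Stream Cartesian targets to the robot controller.

    A real system would also stream orientation and gripper commands, log the
    RGB-D and haptic data for training, and render force feedback to the operator.
    """
    period = 1.0 / COMMAND_HZ
    for _ in range(steps):
        send_target(hand_to_target(get_hand_pos(), workspace_origin))
        time.sleep(period)
```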
In the Extend Robotics and Robot Intelligence Lab collaboration, AMAS XR delivers three key capabilities:
Truly remote XR teleoperation interface: AMAS provides the dexterity, precision and speed needed for human-like robotic control beyond visual line of sight, extending the operator's perception beyond their physical presence.
Streamlined Physical AI training platform: AMAS enables efficient, high-quality data capture from human demonstrations, reducing the labour and capital intensity of training embodied AI.
Versatile deployment: Compatible with a wide range of third-party robots—from collaborative arms to humanoids—AMAS eliminates vendor lock-in and accelerates deployment across diverse hardware.
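To show what vendor-agnostic deployment can look like in software, here is a hypothetical hardware-abstraction sketch; it illustrates the idea of one interface serving many robot arms and is not Extend Robotics' actual driver layer.

```python
# Hypothetical hardware-abstraction sketch: the same teleoperation and data
# pipeline can talk to any robot that implements this small interface. This is
# an illustration of the concept, not Extend Robotics' actual driver layer.
from abc import ABC, abstractmethod
import numpy as np

class RobotArm(ABC):
    @abstractmethod
    def send_cartesian_target(self, pose: np.ndarray) -> None:
        """Command an end-effector pose (x, y, z, qx, qy, qz, qw)."""

    @abstractmethod
    def read_wrench(self) -> np.ndarray:
        """Return the (6,) wrist force/torque used as haptic feedback."""

class ExampleVendorArm(RobotArm):
    """One vendor-specific adapter; other vendors would wrap their own SDKs."""

    def send_cartesian_target(self, pose: np.ndarray) -> None:
        pass  # translate the pose into the vendor SDK call here

    def read_wrench(self) -> np.ndarray:
        return np.zeros(6)  # placeholder force/torque reading
```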
