Researchers at the University of Bristol, based at the Bristol Robotics Laboratory, have designed a Bi-Touch system that allows robots to carry out manual tasks by sensing what to do from a digital helper. The system enables a bimanual robot to display tactile sensitivity close to human-level dexterity, using AI to inform its actions.
The research team developed a tactile dual-arm robotic system that learns bimanual skills through Deep Reinforcement Learning (Deep-RL). This kind of learning teaches robots to perform tasks through trial and error, much like training a dog with rewards and punishments.
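To give a flavor of that trial-and-error loop, here is a rough sketch of the generic structure of a Deep-RL training episode. This is not the Bristol team's code: the environment, the stand-in agent, and the old-style `step` interface are all illustrative assumptions.

```python
import numpy as np

class RandomAgent:
    """Stand-in policy: acts randomly; a real Deep-RL agent (e.g. SAC or PPO)
    would update its neural networks in learn()."""
    def __init__(self, action_dim):
        self.action_dim = action_dim

    def act(self, obs):
        return np.random.uniform(-1.0, 1.0, self.action_dim)

    def learn(self, obs, action, reward, next_obs, done):
        pass  # network updates would happen here

def run_episode(env, agent, max_steps=200):
    """One trial-and-error episode: try behaviors, collect rewards, learn.
    `env` is a hypothetical simulator exposing reset()/step()."""
    obs = env.reset()
    total_reward = 0.0
    for _ in range(max_steps):
        action = agent.act(obs)                           # try a behavior
        next_obs, reward, done, info = env.step(action)   # see what happens
        agent.learn(obs, action, reward, next_obs, done)  # learn from outcome
        total_reward += reward
        obs = next_obs
        if done:
            break
    return total_reward
```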
The team started their research by building a virtual world containing two robot arms equipped with tactile sensors. Next, they designed reward functions and a goal-update mechanism to encourage the robot agents to learn the bimanual tasks. They then developed a real-world tactile dual-arm robot system to which the trained agents could be directly applied.
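The paper defines the exact reward terms and goal schedule; the snippet below is only a hedged sketch of what a distance-based reward paired with a goal-update mechanism might look like for a lifting task. All names, pose conventions, and thresholds (`reward`, `advance_goal`, `GOAL_TOLERANCE`) are assumptions for illustration, not the authors' API.

```python
import numpy as np

GOAL_TOLERANCE = 0.01  # meters; illustrative threshold

def reward(object_pose, goal_pose, contact_ok):
    """Dense shaping reward: shrink the distance to the goal position
    (assumed to be the first three pose components) and penalize losing
    contact with the object."""
    dist = np.linalg.norm(object_pose[:3] - goal_pose[:3])
    r = -dist          # closer to the goal is better
    if not contact_ok:
        r -= 1.0       # punish dropping or losing grip on the object
    return r

def advance_goal(object_pose, goal_pose, goal_queue):
    """Goal update: once the current sub-goal is reached, move on to the
    next one, so the agent learns to lift the object in stages."""
    reached = np.linalg.norm(object_pose[:3] - goal_pose[:3]) < GOAL_TOLERANCE
    if reached and goal_queue:
        return goal_queue.pop(0)
    return goal_pose
```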
“With our Bi-Touch system, we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks [tailored to] the touch. And more importantly, we can directly apply these agents from the virtual world to the real world without further training,” said lead author Yijiong Lin of the University of Bristol’s Faculty of Engineering. “The tactile bimanual agent can solve tasks even under unexpected perturbations and manipulate delicate objects in a gentle way.”
In robotic manipulation, the robot learns to make decisions by attempting various behaviors to achieve designated tasks, such as lifting objects without dropping or breaking them. When it succeeds, it receives a reward; when it fails, it learns what not to do.
Over time, it figures out the best ways to grasp objects using these rewards and punishments. The AI agent is effectively blind during this learning: it relies only on tactile feedback and proprioceptive feedback, the body's ability to sense its own movement, action, and location.
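In code, a vision-free observation like this might be assembled roughly as below. The sensor images and joint readings are hypothetical placeholders for whatever the real system exposes; the point is simply that no camera input appears anywhere in the observation.

```python
import numpy as np

def build_observation(left_tactile, right_tactile,
                      joint_positions, joint_velocities):
    """Assemble a vision-free observation from tactile images and
    proprioception (joint state). All inputs are hypothetical placeholders.

    left_tactile, right_tactile: 2-D arrays from fingertip tactile sensors
    joint_positions, joint_velocities: 1-D arrays of arm joint state
    """
    tactile = np.concatenate([left_tactile.ravel(), right_tactile.ravel()])
    proprio = np.concatenate([joint_positions, joint_velocities])
    return np.concatenate([tactile, proprio])  # no camera input anywhere
```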
“Our Bi-Touch system showcases a promising approach with affordable software and hardware for learning bimanual [behaviors] with touch in simulation, which can be directly applied to the real world,” co-author Professor Nathan Lepora said. “Our developed tactile dual-arm robot simulation allows further research on more different tasks as the code will be open-source, which is ideal for developing other downstream tasks.”
Using this method, the researchers successfully enabled the dual-arm robot to safely lift items as fragile as a single Pringle chip. The development could be useful in industries such as fruit picking and domestic service, and could eventually help recreate the sense of touch in artificial limbs.
The team’s research was published in IEEE Robotics and Automation Letters.