To Enable Advanced Research on Artificial Humanoid Control, Microsoft’s Robotics Team is Releasing A Library of Pre-Trained Simulated Humanoid Control Models with Enriched Data for Training New Ones
Simulated humanoids are an intriguing platform for investigating motor intelligence because they can, in principle, mimic the whole spectrum of human motion. The acquisition and application of motor skills is an important area of study in machine learning, and physically simulating human motor skills presents significant control challenges: a controller must deal with a high-dimensional, unstable, and discontinuous system that requires precise timing and coordination to achieve the desired motion.
Current learning methods struggle to acquire complex humanoid behaviors from scratch (tabula rasa). Motion capture (MoCap) data has therefore become an integral part of humanoid control studies. MoCap trajectories are sequences of poses that the human body assumes throughout the motion in question, and as such they contain kinematic information about the motion. This information can help a simulated humanoid learn basic motor skills from MoCap demonstrations, making it easier to train complex control strategies.
Unfortunately, using MoCap data in a physics simulator requires recovering the actions (e.g., joint torques) that reproduce the sequence of kinematic poses in a given MoCap trajectory, i.e., tracking the trajectory. Finding an action sequence that makes a humanoid track a MoCap sequence is not straightforward; reinforcement learning and adversarial learning are two approaches that have been used to address this issue. Training agents to recreate hours of MoCap data also demands a lot of computing, and the computational burden of recovering these actions grows with the amount of MoCap data. As a result, even though MoCap datasets are widely available, only a few research organizations with sizable computational resources have been able to use them to advance learning-based humanoid control.
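To make the tracking objective concrete, below is a minimal sketch of a per-step tracking reward that a reinforcement-learning tracker could maximize. The quadratic pose error, the exponential weighting, and the `sim_pose`/`mocap_pose` arrays are illustrative assumptions, not the exact reward used in the paper.

```python
import numpy as np

def tracking_reward(sim_pose: np.ndarray, mocap_pose: np.ndarray, alpha: float = 10.0) -> float:
    """Reward the simulated humanoid for matching the MoCap reference pose
    at the current timestep. Hypothetical weighting, for illustration only."""
    pose_error = float(np.sum((sim_pose - mocap_pose) ** 2))
    return float(np.exp(-alpha * pose_error))

# Example: a perfect match with the reference pose yields the maximum reward of 1.0.
reference = np.zeros(56)
print(tracking_reward(reference, reference))  # -> 1.0
```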
A recent Microsoft study introduced MoCapAct, a dataset of high-quality MoCap-tracking policies for a MuJoCo-based simulated humanoid, along with a collection of rollouts from these expert policies.
Aiming to remove the current barriers to using MoCap data in humanoid control research, MoCapAct is designed to be compatible with the highly popular dm_control humanoid simulation environment. The CMU MoCap dataset is one of the largest publicly available MoCap datasets, and the policies from MoCapAct can track all 3.5 hours of that data.
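MoCapAct ships its own dataset loaders and expert policies, whose exact APIs are not shown here. As a rough illustration of the dm_control workflow the dataset builds on, the sketch below rolls out random actions in the standard dm_control humanoid environment; a trained MoCapAct expert policy would replace the random action.

```python
import numpy as np
from dm_control import suite

# Standard dm_control humanoid as a stand-in; MoCapAct targets the
# CMU humanoid tracking environments built on the same dm_control stack.
env = suite.load(domain_name="humanoid", task_name="walk")
action_spec = env.action_spec()

time_step = env.reset()
while not time_step.last():
    # A trained tracking policy would choose the action here.
    action = np.random.uniform(action_spec.minimum, action_spec.maximum,
                               size=action_spec.shape)
    time_step = env.step(action)
```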
The researchers demonstrate MoCapAct’s usefulness for learning varied motions by investigating its expert policies and by using the expert rollouts to train a single hierarchical policy that can track all of the considered MoCap clips. The low-level part of that policy is then reused for efficient RL on downstream tasks.
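The split below is a minimal PyTorch sketch of such a hierarchy: a high-level module maps the MoCap reference to a latent skill vector, and a low-level module maps that latent plus proprioception to joint actions. The class names, layer sizes, and dimensions are assumptions for illustration, not the paper's architecture.

```python
import torch
import torch.nn as nn

class HighLevelPolicy(nn.Module):
    """Maps the MoCap reference features to a latent 'skill' vector.
    Sizes are illustrative assumptions."""
    def __init__(self, ref_dim=120, latent_dim=60):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(ref_dim, 256), nn.ReLU(),
                                 nn.Linear(256, latent_dim))
    def forward(self, reference):
        return self.net(reference)

class LowLevelPolicy(nn.Module):
    """Maps (proprioceptive observation, latent skill) to joint actions.
    This is the part that can be reused for new RL tasks."""
    def __init__(self, obs_dim=300, latent_dim=60, act_dim=56):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim + latent_dim, 512), nn.ReLU(),
                                 nn.Linear(512, act_dim), nn.Tanh())
    def forward(self, obs, latent):
        return self.net(torch.cat([obs, latent], dim=-1))

# For task transfer, the low-level policy can be frozen while a new
# high-level policy is trained to emit latents for the new task.
high, low = HighLevelPolicy(), LowLevelPolicy()
obs, ref = torch.randn(1, 300), torch.randn(1, 120)
action = low(obs, high(ref))
```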
Using the dataset, the team also trained a GPT network for generative motion completion: given a motion prompt, the network generates a continuation of the motion in the MuJoCo simulator.
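As a hedged sketch of what such generative motion completion can look like, the causal-transformer model below consumes a prompt of past observations and predicts actions step by step. The `MotionGPT` name, dimensions, and architecture are assumptions and do not reflect the paper's exact GPT configuration.

```python
import torch
import torch.nn as nn

class MotionGPT(nn.Module):
    """Minimal causal-transformer sketch for motion completion.
    Positional encodings omitted for brevity; sizes are illustrative."""
    def __init__(self, obs_dim=300, act_dim=56, d_model=256, n_layers=4, n_heads=4):
        super().__init__()
        self.embed = nn.Linear(obs_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, act_dim)

    def forward(self, obs_seq):
        # obs_seq: (batch, time, obs_dim) prompt of past observations
        T = obs_seq.shape[1]
        causal_mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        h = self.backbone(self.embed(obs_seq), mask=causal_mask)
        return self.head(h)  # predicted action at each timestep

# Autoregressive completion: apply the predicted action in the simulator,
# append the resulting observation to the prompt, and repeat.
model = MotionGPT()
prompt = torch.randn(1, 32, 300)      # 32-step motion prompt
next_action = model(prompt)[:, -1]    # action for the latest timestep
```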
This dataset allows research groups to avoid the time- and energy-consuming process of learning low-level motor skills from MoCap data. It considerably lowers the barrier to entry for simulated humanoid control, opening rich possibilities for exploring multi-task learning and motor intelligence. The team believes their approach can also be used to train alternative policy architectures, such as decision transformers, or in setups like offline reinforcement learning.
This article is written as a research summary by Marktechpost staff based on the research paper 'MoCapAct: A Multi-Task Dataset for Simulated Humanoid Control'. All credit for this research goes to the researchers on this project. Check out the paper, GitHub, project page and reference article.
Tanushree Shenwai is a consulting intern at MarktechPost. She is currently pursuing her B.Tech from the Indian Institute of Technology (IIT), Bhubaneswar. She is a data science enthusiast with a keen interest in the application of artificial intelligence in various fields. She is passionate about exploring new advancements in technology and their real-life applications.