
VB-Com: Learning Vision-Blind Composite Humanoid Locomotion Against Deficient Perception

Junli Ren, Tao Huang, Huayi Wang, Zirui Wang, Qingwei Ben, Jiangmiao Pang†, Ping Luo†

Abstract—The performance of legged locomotion is closely tied to the accuracy and comprehensiveness of state observations. "Blind policies", which rely solely on proprioception, are considered highly robust due to the reliability of proprioceptive observations. However, these policies significantly limit locomotion speed and often require collisions with the terrain to adapt. In contrast, "vision policies" allow the robot to plan motions in advance and respond proactively to unstructured terrains via an online perception module. However, perception is often compromised by noisy real-world environments, potential sensor failures, and the limitations of current simulators in representing dynamic or deformable terrains. Humanoid robots, with high degrees of freedom and inherently unstable morphology, are particularly susceptible to misguidance from deficient perception, which can result in falls or termination on challenging dynamic terrains. To leverage the advantages of both vision and blind policies, we propose VB-Com, a composite framework that enables humanoid robots to determine when to rely on the vision policy and when to switch to the blind policy under perceptual deficiency. We demonstrate that VB-Com effectively enables humanoid robots to traverse challenging terrains and obstacles despite perception deficiencies caused by dynamic terrains or perceptual noise.
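The abstract does not specify how VB-Com decides between the two policies, so the following is only an illustrative sketch of the composite idea: route control to a blind (proprioception-only) policy when an estimated perception-deficiency score is high, and to the vision policy otherwise. All names, the threshold rule, and the toy stand-in policies here are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

class CompositePolicy:
    """Illustrative composite controller: choose between a vision policy and a
    blind (proprioception-only) policy from a perception-deficiency score."""

    def __init__(self, vision_policy, blind_policy, deficiency_threshold=0.5):
        # deficiency_threshold is a hypothetical tuning knob, not from the paper.
        self.vision_policy = vision_policy
        self.blind_policy = blind_policy
        self.threshold = deficiency_threshold

    def act(self, proprio, depth, deficiency_score):
        # deficiency_score in [0, 1]: higher means exteroception is less
        # reliable (e.g. sensor dropout, heavy noise, dynamic-terrain mismatch).
        if deficiency_score > self.threshold:
            return self.blind_policy(proprio)      # fall back to proprioception
        return self.vision_policy(proprio, depth)  # exploit exteroception

# Toy stand-in policies (real ones would be trained neural networks).
vision = lambda proprio, depth: proprio + depth.mean()
blind = lambda proprio: proprio * 0.5

controller = CompositePolicy(vision, blind, deficiency_threshold=0.5)
a_clean = controller.act(1.0, np.array([0.2, 0.4]), deficiency_score=0.1)
a_noisy = controller.act(1.0, np.array([0.2, 0.4]), deficiency_score=0.9)
```

With reliable perception (low score) the vision branch is used; when perception degrades (high score) the controller falls back to the blind branch, mirroring the switching behavior the framework describes at a high level.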
