

RobustDexGrasp: Robust Dexterous Grasping of General Objects from Single-view Perception

April 7, 2025
作者: Hui Zhang, Zijian Wu, Linyi Huang, Sammy Christen, Jie Song
cs.AI

Abstract

Robust grasping of various objects from single-view perception is fundamental for dexterous robots. Previous works often rely on fully observable objects, expert demonstrations, or static grasping poses, which restrict their generalization ability and adaptability to external disturbances. In this paper, we present a reinforcement-learning-based framework that enables zero-shot dynamic dexterous grasping of a wide range of unseen objects from single-view perception, while performing adaptive motions to external disturbances. We utilize a hand-centric object representation for shape feature extraction that emphasizes interaction-relevant local shapes, enhancing robustness to shape variance and uncertainty. To enable effective hand adaptation to disturbances with limited observations, we propose a mixed curriculum learning strategy, which first utilizes imitation learning to distill a policy trained with privileged real-time visual-tactile feedback, and gradually transfers to reinforcement learning to learn adaptive motions under disturbances caused by observation noise and dynamics randomization. Our experiments demonstrate strong generalization in grasping unseen objects with random poses, achieving success rates of 97.0% across 247,786 simulated objects and 94.6% across 512 real objects. We also demonstrate the robustness of our method to various disturbances, including unobserved object movement and external forces, through both quantitative and qualitative evaluations. Project Page: https://zdchan.github.io/Robust_DexGrasp/
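The "hand-centric object representation" described above can be illustrated with a minimal sketch: for each hand keypoint, encode the offset to its nearest point in the single-view object point cloud, so the features emphasize local shape near the hand rather than a global object model. The keypoint layout, feature shape, and nearest-neighbor formulation here are illustrative assumptions, not the paper's exact implementation.

```python
# Illustrative sketch (assumed, not the authors' code): hand-centric shape
# features as offsets from hand keypoints to the nearest observed object point.
import numpy as np

def hand_centric_features(hand_keypoints: np.ndarray,
                          object_points: np.ndarray) -> np.ndarray:
    """Given hand keypoints (K, 3) and a single-view object point cloud (N, 3),
    return, for each keypoint, the vector to its closest object point (K, 3).
    These local offsets are robust to unseen global shape variation."""
    # Pairwise offsets between every keypoint and every observed surface point.
    diff = object_points[None, :, :] - hand_keypoints[:, None, :]  # (K, N, 3)
    dist = np.linalg.norm(diff, axis=-1)                           # (K, N)
    nearest = np.argmin(dist, axis=-1)                             # (K,)
    return diff[np.arange(len(hand_keypoints)), nearest]           # (K, 3)
```

In practice such features would be computed per control step from the partial point cloud and fed to the policy network alongside proprioception; a k-d tree would replace the brute-force distance matrix for large clouds.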
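The mixed curriculum strategy, which starts from imitation of a privileged visual-tactile teacher and gradually hands control to reinforcement learning, can be sketched as a scheduled blend of the two losses. The linear schedule and loss names below are assumptions for illustration; the paper's actual transition schedule may differ.

```python
# Hedged sketch (assumed schedule): blend an imitation (distillation) loss
# with an RL loss, shifting weight from imitation to RL over training.

def curriculum_weight(step: int, total_steps: int) -> float:
    """Fraction of the total loss taken from imitation; decays 1 -> 0 linearly."""
    return max(0.0, 1.0 - step / total_steps)

def mixed_loss(il_loss: float, rl_loss: float,
               step: int, total_steps: int) -> float:
    """Early training is dominated by distilling the privileged teacher;
    late training is dominated by the RL objective under disturbances."""
    w = curriculum_weight(step, total_steps)
    return w * il_loss + (1.0 - w) * rl_loss
```

The design intent is that imitation provides dense supervision while observations are still reliable, and RL takes over once observation noise and dynamics randomization are injected, so the policy learns corrective, adaptive motions rather than merely copying the teacher.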

