kPAM-SC: Generalizable manipulation planning using keypoint affordance and shape completion

W Gao, R Tedrake - 2021 IEEE International Conference on Robotics and Automation (ICRA), 2021 - ieeexplore.ieee.org
While traditional approaches to manipulation planning assume known object templates, recent approaches to "category-level manipulation" aim to manipulate a category of objects with potentially unknown instances and large intra-category shape variation. In this paper we explore an object representation to enable precise category-level manipulation, capturing a notion of the object configuration and extent, while being generalizable to novel instances. Building on our previous work, kPAM [1], we combine semantic keypoints with dense geometry (a point cloud or mesh) as the interface between the perception module and the motion planner. Leveraging advances in learning-based keypoint detection and shape completion, both the dense geometry and the keypoints can be perceived from raw sensor input. Using the proposed hybrid object representation, we formulate the manipulation task as a motion planning problem that encodes both the object target configuration and physical feasibility for a category of objects. In this way, many existing manipulation planners can be generalized to categories of objects, and the resulting perception-to-action manipulation pipeline is robust to large intra-category shape variation. Extensive hardware experiments demonstrate that our pipeline can produce robot trajectories that accomplish tasks with never-before-seen objects. A video demo is available at https://sites.google.com/view/generalizable-manipulation.
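To make the abstract's "hybrid object representation" more concrete, the Python sketch below (not the authors' code; the class and function names are hypothetical) pairs semantic keypoints with a completed point cloud and evaluates a keypoint-target cost of the kind a category-level motion planner could minimize. In the paper the dense geometry would additionally feed physical-feasibility constraints such as collision avoidance, which this sketch omits.

```python
# Minimal sketch of a keypoint + dense-geometry object representation and a
# keypoint-target cost. All names here are illustrative, not from the paper.
from dataclasses import dataclass
import numpy as np


@dataclass
class HybridObjectRepresentation:
    """Keypoints capture the task-relevant configuration; the completed
    point cloud captures the full extent of the (possibly novel) instance."""
    keypoints: np.ndarray    # (K, 3) semantic keypoints in the object frame
    point_cloud: np.ndarray  # (N, 3) completed dense geometry in the object frame


def keypoint_target_cost(obj: HybridObjectRepresentation,
                         rotation: np.ndarray,
                         translation: np.ndarray,
                         target_keypoints: np.ndarray) -> float:
    """Sum of squared distances between the transformed keypoints and their
    desired locations; one possible cost term encoding the object target
    configuration inside a larger motion planning problem."""
    transformed = obj.keypoints @ rotation.T + translation
    return float(np.sum((transformed - target_keypoints) ** 2))


if __name__ == "__main__":
    # Toy example: an object with two keypoints and a completed point cloud.
    obj = HybridObjectRepresentation(
        keypoints=np.array([[0.05, 0.0, 0.0], [0.0, 0.0, 0.08]]),
        point_cloud=np.random.rand(1000, 3) * 0.1,
    )
    # Desired placement: keypoints shifted 0.3 m along x.
    targets = obj.keypoints + np.array([0.3, 0.0, 0.0])
    before = keypoint_target_cost(obj, np.eye(3), np.zeros(3), targets)
    after = keypoint_target_cost(obj, np.eye(3), np.array([0.3, 0.0, 0.0]), targets)
    print(f"cost before placement: {before:.4f}, after: {after:.4f}")
```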