Paper Title

NeuralGrasps: Learning Implicit Representations for Grasps of Multiple Robotic Hands

Paper Authors

Ninad Khargonkar, Neil Song, Zesheng Xu, Balakrishnan Prabhakaran, Yu Xiang

Paper Abstract

We introduce a neural implicit representation for grasps of objects from multiple robotic hands. Different grasps across multiple robotic hands are encoded into a shared latent space. Each latent vector is learned to decode to the 3D shape of an object and the 3D shape of a robotic hand in a grasping pose in terms of the signed distance functions of the two 3D shapes. In addition, the distance metric in the latent space is learned to preserve the similarity between grasps across different robotic hands, where the similarity of grasps is defined according to contact regions of the robotic hands. This property enables our method to transfer grasps between different grippers including a human hand, and grasp transfer has the potential to share grasping skills between robots and enable robots to learn grasping skills from humans. Furthermore, the encoded signed distance functions of objects and grasps in our implicit representation can be used for 6D object pose estimation with grasping contact optimization from partial point clouds, which enables robotic grasping in the real world.
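
The abstract describes two core ideas: a shared latent code that decodes to two signed distance functions (object and hand-in-grasp-pose), and a latent distance metric that mirrors grasp similarity defined by contact regions. As a rough illustration only, here is a minimal PyTorch sketch of both pieces; the layer sizes, the `GraspSDFDecoder` name, and the exact form of `similarity_preserving_loss` are assumptions for clarity, not the paper's actual architecture or training objective.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraspSDFDecoder(nn.Module):
    """DeepSDF-style decoder sketch (hypothetical sizes): a shared grasp
    latent code z and a 3D query point x are mapped to two signed
    distances -- one to the object surface and one to the robotic hand
    surface in its grasping pose."""

    def __init__(self, latent_dim=256, hidden_dim=512):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(latent_dim + 3, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Two output heads: SDF of the object, SDF of the hand in grasp pose.
        self.object_head = nn.Linear(hidden_dim, 1)
        self.hand_head = nn.Linear(hidden_dim, 1)

    def forward(self, z, x):
        # z: (B, latent_dim) grasp codes; x: (B, 3) query points.
        h = self.backbone(torch.cat([z, x], dim=-1))
        return self.object_head(h), self.hand_head(h)


def similarity_preserving_loss(z, grasp_similarity):
    """Hypothetical metric-learning term: push pairwise latent distances
    toward a precomputed grasp-similarity matrix (e.g. derived from
    contact regions on the object surface, as in the abstract).
    z: (N, latent_dim); grasp_similarity: (N, N) with values in [0, 1]."""
    latent_dist = torch.cdist(z, z)        # (N, N) pairwise latent distances
    target_dist = 1.0 - grasp_similarity   # similar grasps -> small distance
    return F.mse_loss(latent_dist, target_dist)


if __name__ == "__main__":
    decoder = GraspSDFDecoder()
    z = torch.randn(8, 256)                # 8 grasp latent codes
    x = torch.randn(8, 3)                  # 8 query points
    sdf_object, sdf_hand = decoder(z, x)   # each of shape (8, 1)
    loss = similarity_preserving_loss(z, torch.rand(8, 8))
```

In a full system the latent codes would be optimized jointly with the decoder (auto-decoder style) over grasps from multiple robotic hands, so that nearby codes correspond to grasps with overlapping contact regions; the random tensors above merely exercise the shapes.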
