Visual Reconstruction and Localization-Based Robust Robotic 6-DoF Grasping in the Wild
Liang, Ji1,2; Zhang, Jiguang3; Pan, Bingbing2,3; Xu, Shibiao2,3; Zhao, Guangheng1,2; Yu, Ge1; Zhang, Xiaopeng2,3
Journal: IEEE ACCESS
Year: 2021
Volume: 9, Pages: 72451-72464
Keywords: Grasping; Three-dimensional displays; Manipulators; Pose estimation; Solid modeling; Service robots; Feature extraction; Robotic grasping; manipulator; 6-DoF pose estimation; point cloud reconstruction
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3079245
Corresponding authors: Pan, Bingbing (panbingbing18@mails.ucas.ac.cn); Zhao, Guangheng (zhgh@csu.ac.cn)
Abstract: Intelligent grasping requires a manipulator to grasp objects with a high degree of freedom in a wild (unstructured) environment. Owing to limited perception of targets and environments, most industrial robots are restricted to top-down 4-DoF grasping. In this work, we propose a novel low-cost coarse-to-fine robotic grasping framework. First, we design a global-localization-based environment perception method that enables the manipulator to roughly and automatically locate the workspace. Then, constrained by this initial localization, a 3D point cloud reconstruction based 6-DoF pose estimation method is proposed so that the manipulator can further finely locate the grasping target. Finally, our framework realizes fully functional visual 6-DoF robotic grasping, including two different visual servoing and grasp planning strategies for grasping different objects. It can also integrate various state-of-the-art 6-DoF pose estimation algorithms to facilitate practical grasping applications and research. Experimental results show that our method achieves autonomous robotic grasping with a high degree of freedom in an unknown environment. In particular, for occluded, singularly shaped, or small-scale objects, our method still maintains robust grasping.
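The abstract describes a coarse-to-fine pipeline: coarse workspace localization, then point-cloud-constrained 6-DoF pose estimation, then grasp planning. A minimal sketch of that control flow is below; every function body here is a hypothetical placeholder (centroid-based localization, identity rotation, fixed top-down approach), not the authors' actual algorithms.

```python
# Hedged sketch of the coarse-to-fine 6-DoF grasping pipeline from the
# abstract. All implementations are toy placeholders for illustration only.

def coarse_localize_workspace(scene_points):
    # Coarse stage: roughly locate the workspace. Here we simply take the
    # centroid of the observed scene points (placeholder for the paper's
    # global-localization-based environment perception).
    n = len(scene_points)
    return tuple(sum(p[i] for p in scene_points) / n for i in range(3))

def fine_pose_estimate(scene_points, workspace_center, radius):
    # Fine stage: restrict attention to points near the coarse estimate,
    # then estimate the target's 6-DoF pose. A centroid translation plus an
    # identity rotation stands in for reconstruction-based estimation.
    near = [p for p in scene_points
            if sum((a - b) ** 2 for a, b in zip(p, workspace_center)) <= radius ** 2]
    n = len(near)
    t = tuple(sum(p[i] for p in near) / n for i in range(3))
    R = ((1, 0, 0), (0, 1, 0), (0, 0, 1))  # placeholder rotation
    return R, t

def plan_grasp(pose):
    # Grasp planning stub: approach straight down toward the estimated
    # position (a real planner would use the full 6-DoF pose).
    R, t = pose
    return {"approach": (0, 0, -1), "target": t}

scene = [(0.0, 0.0, 0.0), (2.0, 2.0, 2.0), (10.0, 10.0, 10.0)]
center = coarse_localize_workspace(scene)          # coarse workspace estimate
pose = fine_pose_estimate(scene, center, 6.0)      # fine 6-DoF pose near it
grasp = plan_grasp(pose)                           # grasp plan from the pose
```

The two-stage structure is the point: the coarse estimate bounds the region the fine estimator must search, which is what the abstract means by the fine stage being "constrained by the above initial localization".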
Funding projects: Prospective Topic of Technology and Engineering Center for Space Utilization, Chinese Academy of Sciences [Y8031851SY]; Open Research Fund of Key Laboratory of Space Utilization, Chinese Academy of Sciences [LSU-KFJJ-2020-04]
WOS research areas: Computer Science; Engineering; Telecommunications
Language: English
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
WOS accession number: WOS:000652043100001
Funding agencies: Prospective Topic of Technology and Engineering Center for Space Utilization, Chinese Academy of Sciences; Open Research Fund of Key Laboratory of Space Utilization, Chinese Academy of Sciences
Content type: Journal article
Source URL: [http://ir.ia.ac.cn/handle/173211/44687]
Collection: National Laboratory of Pattern Recognition_3D Visual Computing
Author affiliations:
1.Chinese Acad Sci, Technol & Engn Ctr Space Utilizat, Key Lab Space Utilizat, Beijing 100094, Peoples R China
2.Univ Chinese Acad Sci, Technol & Engn Ctr Space Utilizat, Beijing 100864, Peoples R China
3.Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
Recommended citation formats
GB/T 7714
Liang, Ji, Zhang, Jiguang, Pan, Bingbing, et al. Visual Reconstruction and Localization-Based Robust Robotic 6-DoF Grasping in the Wild[J]. IEEE ACCESS, 2021, 9: 72451-72464.
APA Liang, Ji., Zhang, Jiguang., Pan, Bingbing., Xu, Shibiao., Zhao, Guangheng., ... & Zhang, Xiaopeng. (2021). Visual Reconstruction and Localization-Based Robust Robotic 6-DoF Grasping in the Wild. IEEE ACCESS, 9, 72451-72464.
MLA Liang, Ji, et al. "Visual Reconstruction and Localization-Based Robust Robotic 6-DoF Grasping in the Wild". IEEE ACCESS 9 (2021): 72451-72464.