Optimal Grasping Strategy for Robots with a Parallel Gripper Based on Feature Sensing of 3D Object Model

Hsin Han Chiang, Jiun Kai You, Chen Chien James Hsu*, Jun Jo

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

A grasping strategy that can handle many different kinds of target objects is vital in a wide range of automated robotic manipulation tasks. Among the many potential applications for a robot arm with a parallel gripper, the key challenge for robotic grasping is to determine an optimal grasping pose relative to the target object. Previous works based on 2D grasp and 6-DoF grasp planning have been proposed to efficiently consider the physical contact between the robotic gripper and the object. However, several problems remain unsolved: partial and limited information about the detected objects, caused by their locations and geometries, reduces grasping quality and reliability. In view of these problems, this paper proposes an optimal grasping strategy that handles target objects in arbitrary poses based on their 3D models during the grasping process. Experimental results of the performance evaluation show that the proposed method outperforms the state of the art in terms of grasping success rate on the YCB-Video dataset. Moreover, we further investigate the effectiveness of the proposed method in two scenarios where the robotic manipulator works in either the collaborative or the bin-picking mode.

Original language: English
Pages (from-to): 24056-24066
Number of pages: 11
Journal: IEEE Access
Volume: 10
DOIs
Publication status: Published - 2022

Keywords

  • 3D objects
  • Robot grasping
  • Grasp planning
  • Parallel gripper
  • Point cloud

ASJC Scopus subject areas

  • General Computer Science
  • General Materials Science
  • General Engineering
