Vision-based interaction force estimation for robot grip motion without tactile/force sensor

Dae Kwan Ko, Kang Won Lee, Dong Han Lee, Soo Chul Lim

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

Humans perceive an interaction force through the kinesthetic sense or the tactile sense. When viewing an image, they can also estimate the interaction force on the basis of pseudo-haptics. The interaction force of a robot is traditionally measured using a contact-type tactile sensor or a force/torque (F/T) sensor. In this work, we propose a method for estimating the interaction force between a robot and objects during grasping and picking. The method is based on images, without involving an F/T sensor or a tactile sensor. For undeformable objects, more precise force estimation was achieved by simultaneously using RGB and depth images, the robot position, and the electrical current. We propose a deep neural network structure that combines DenseNet and a Transformer encoder/decoder for predicting the interaction force. We verified the proposed network with a generated database that records interactions with 41 objects, and we additionally compared the results obtained with different combinations of network inputs. Our model could estimate the interaction force from various input modalities for both objects known from training and unseen objects. The results clearly indicate that the proposed method outperforms the compared models, with less than 3% error in estimating the interaction force.
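The architecture described in the abstract — an image encoder feeding a Transformer that fuses visual features with robot state — can be sketched as follows. This is a minimal illustrative sketch in PyTorch, not the authors' implementation: the small CNN stands in for DenseNet, and all layer sizes, the 7-dimensional position/current vectors, and the sequence length are assumptions for illustration.

```python
import torch
import torch.nn as nn

class ForceEstimator(nn.Module):
    """Hypothetical sketch of the paper's idea: a CNN backbone (standing in
    for DenseNet) encodes RGB-D frames; per-frame features are fused with
    robot position and motor current, and a Transformer encoder regresses
    the interaction force over the grasp sequence."""

    def __init__(self, d_model=64):
        super().__init__()
        # Small CNN stand-in for the DenseNet image encoder (RGB + depth = 4 channels).
        self.cnn = nn.Sequential(
            nn.Conv2d(4, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fuse image features with assumed 7-DoF position and joint currents.
        self.fuse = nn.Linear(32 + 7 + 7, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.head = nn.Linear(d_model, 1)  # scalar grip force per time step

    def forward(self, rgbd, pos, cur):
        # rgbd: (B, T, 4, H, W); pos, cur: (B, T, 7)
        B, T = rgbd.shape[:2]
        img = self.cnn(rgbd.flatten(0, 1)).view(B, T, -1)   # per-frame features
        x = self.fuse(torch.cat([img, pos, cur], dim=-1))    # multimodal fusion
        return self.head(self.encoder(x)).squeeze(-1)        # (B, T) force estimates
```

A forward pass on a batch of two 8-frame grasp sequences, `model(torch.randn(2, 8, 4, 32, 32), torch.randn(2, 8, 7), torch.randn(2, 8, 7))`, yields a `(2, 8)` tensor of per-step force estimates; ablating an input modality, as the abstract describes, amounts to zeroing or dropping one of the concatenated feature groups.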

Original language: English
Article number: 118441
Journal: Expert Systems with Applications
Volume: 211
State: Published - Jan 2023

Keywords

  • Force estimation
  • Interaction force
  • Machine learning
  • Robot grasping
