Robust classification of grasped objects in intuitive human-robot collaboration using a wearable force-myography device

Nadav D. Kahanowich, Avishai Sintov

Research output: Contribution to journal › Article › peer-review

Abstract

Feasible human-robot collaboration requires an intuitive and fluent understanding of human motion in shared tasks. The object in hand provides the most valuable information about the intended task of a human. In this letter, we propose a simple and affordable approach in which a wearable force-myography device is used to classify objects grasped by a human. The device, worn on the forearm, incorporates 15 force sensors whose readings indicate the configuration of the hand and fingers during grasping. Hence, a classifier can be trained to identify various objects using data recorded while holding them. To augment the classifier, we propose an iterative approach in which additional signals are taken in real time to increase certainty about the predicted object. We show that the approach provides robust classification: the device can be taken off and placed back on while maintaining high accuracy. The approach also improves the performance of trained classifiers that initially produced low accuracy due to insufficient data or non-optimal hyper-parameters. A classification success rate of more than 97% is reached in a short period of time. Furthermore, we analyze the key locations of sensors on the forearm that provide the most accurate and robust classification.
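The iterative approach described above — taking additional real-time signals until the classifier is sufficiently certain — can be illustrated with a minimal sketch. The object labels, the 97% confidence threshold, and the naive-Bayes-style fusion of per-sample class probabilities below are illustrative assumptions, not the letter's actual implementation.

```python
# Hypothetical sketch: fuse per-sample class probabilities from a weak base
# classifier until one class exceeds a confidence threshold. All names and
# parameters here are assumptions for illustration only.

CLASSES = ["cup", "bottle", "pen"]  # hypothetical grasped-object labels

def fuse_until_confident(sample_stream, threshold=0.97, max_samples=50):
    """Multiply per-sample class probabilities (naive-Bayes-style fusion,
    assuming conditionally independent samples) and stop once one class
    exceeds the confidence threshold. Returns (label, confidence, n_used)."""
    posterior = [1.0 / len(CLASSES)] * len(CLASSES)  # uniform prior
    n = 0
    for n, probs in enumerate(sample_stream, start=1):
        posterior = [p * q for p, q in zip(posterior, probs)]
        total = sum(posterior)
        posterior = [p / total for p in posterior]  # renormalize
        if max(posterior) >= threshold or n >= max_samples:
            break
    best = max(posterior)
    return CLASSES[posterior.index(best)], best, n

def noisy_probs(true_idx, accuracy=0.6):
    """Simulate a weak per-sample classifier: `accuracy` probability mass on
    the true class, the remainder spread evenly over the other classes."""
    rest = (1.0 - accuracy) / (len(CLASSES) - 1)
    return [accuracy if i == true_idx else rest for i in range(len(CLASSES))]

# Example: a stream of weak (60%-confident) predictions for "bottle".
# Fusing a few samples drives the posterior past the threshold quickly.
stream = (noisy_probs(1) for _ in range(50))
label, conf, n = fuse_until_confident(stream)
print(label, round(conf, 3), n)  # → bottle 0.976 4
```

The point of the sketch is that even a per-sample accuracy of 60% compounds rapidly: four fused samples already push the posterior above 97%, which mirrors how accumulating signals in real time can rescue a classifier that is individually weak.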

Original language: English
Article number: 9350160
Pages (from-to): 1192-1199
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 6
Issue number: 2
DOIs
State: Published - Apr 2021

Keywords

  • Human-Robot collaboration
  • intention recognition

