Inferring object properties from human interaction and transferring them to new motions

Qian Zheng, Weikai Wu, Hanting Pan, Niloy Mitra, Daniel Cohen-Or, Hui Huang

Research output: Contribution to journal › Article › peer-review

Abstract

Humans regularly interact with their surrounding objects. Such interactions often result in strongly correlated motions between humans and the objects they interact with. We thus ask: "Is it possible to infer object properties from skeletal motion alone, even without seeing the interacting object itself?" In this paper, we present a fine-grained action recognition method that learns to infer such latent object properties from human interaction motion alone. This inference allows us to disentangle the motion from the object property and to transfer object properties to a given motion. We collected a large number of videos and 3D skeletal motions of performing actors using an inertial motion capture device. We analyzed similar actions and learned the subtle differences between them to reveal latent properties of the interacting objects. In particular, we learned to identify the interacting object by estimating its weight or its spillability. Our results clearly demonstrate that motions and interacting objects are highly correlated, and that related latent object properties can be inferred from 3D skeleton sequences alone, leading to new synthesis possibilities for motions involving human interaction. Our dataset is available at http://vcc.szu.edu.cn/research/2020/IT.html.
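The core idea of the abstract — classifying a latent object property (e.g., heavy vs. light) from a 3D skeleton sequence alone — can be illustrated with a minimal sketch. This is not the authors' method; it is a hypothetical toy in which synthetic skeleton sequences (frames × joints × 3D positions) are summarized by simple speed statistics and classified with a nearest-centroid rule. All function names, the feature choice, and the synthetic data are assumptions for illustration only.

```python
import numpy as np

def motion_features(seq):
    # seq: (T, J, 3) array of 3D joint positions over T frames.
    # Frame-to-frame displacements give per-joint speeds; summary
    # statistics of speed serve as a crude motion descriptor.
    vel = np.diff(seq, axis=0)            # (T-1, J, 3) displacements
    speed = np.linalg.norm(vel, axis=-1)  # (T-1, J) per-joint speeds
    return np.array([speed.mean(), speed.std(), speed.max()])

def nearest_centroid(train_feats, train_labels, query):
    # Classify a query feature vector by its nearest class centroid.
    labels = sorted(set(train_labels))
    centroids = {
        l: np.mean([f for f, y in zip(train_feats, train_labels) if y == l], axis=0)
        for l in labels
    }
    return min(labels, key=lambda l: np.linalg.norm(query - centroids[l]))

rng = np.random.default_rng(0)

def make_seq(speed_scale, T=30, J=15):
    # Synthetic skeleton: slower, smaller steps stand in for a
    # "heavy" interaction, faster steps for a "light" one.
    steps = rng.normal(scale=speed_scale, size=(T, J, 3))
    return np.cumsum(steps, axis=0)

train = [(motion_features(make_seq(0.02)), "heavy") for _ in range(5)] + \
        [(motion_features(make_seq(0.10)), "light") for _ in range(5)]
feats, labels = zip(*train)

query = motion_features(make_seq(0.09))
print(nearest_centroid(list(feats), list(labels), query))
```

The paper's learned model replaces these hand-crafted speed statistics with features learned from real captured motions, but the sketch shows the shape of the inference problem: skeleton sequence in, object property out.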

Original language: English
Pages (from-to): 375-392
Number of pages: 18
Journal: Computational Visual Media
Volume: 7
Issue number: 3
DOIs
State: Published - Sep 2021

Keywords

  • human interaction motion
  • motion analysis
  • motion synthesis
  • object property inference

