SHREC'16: Matching of deformable shapes with topological noise

Z. Lähner, E. Rodolà, M. M. Bronstein, D. Cremers, O. Burghard, L. Cosmo, A. Dieckmann, R. Klein, Y. Sahillioğlu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

A particularly challenging setting of the shape matching problem arises when the shapes being matched have topological artifacts due to the coalescence of spatially close surface regions - a scenario that frequently occurs when dealing with real data under suboptimal acquisition conditions. This track of the SHREC'16 contest evaluates shape matching algorithms that operate on 3D shapes under synthetically produced topological changes. The task is to produce a pointwise matching (either sparse or dense) between 90 pairs of shapes, representing the same individual in different poses but with different topology. A separate set of 15 shapes with ground-truth correspondence was provided as training data for learning-based techniques and for parameter tuning. Three research groups participated in the contest; this paper presents the track dataset, and describes the different methods and the contest results.

Original language: English
Title of host publication: EG 3DOR 2016 - Eurographics 2016 Workshop on 3D Object Retrieval
Editors: Alfredo Ferreira, Daniela Giorgi, Andrea Giachetti
Publisher: Eurographics Association
Pages: 55-60
Number of pages: 6
ISBN (Electronic): 9783038680048
DOIs
State: Published - 2016
Externally published: Yes
Event: 9th Eurographics Workshop on 3D Object Retrieval, 3DOR 2016 - Lisbon, Portugal
Duration: 8 May 2016 → …

Publication series

Name: Eurographics Workshop on 3D Object Retrieval, EG 3DOR
ISSN (Print): 1997-0463
ISSN (Electronic): 1997-0471

Conference

Conference: 9th Eurographics Workshop on 3D Object Retrieval, 3DOR 2016
Country/Territory: Portugal
City: Lisbon
Period: 8/05/16 → …

