Coreference Resolution without Span Representations

Yuval Kirstain, Ori Ram, Omer Levy

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

56 Scopus citations

Abstract

The introduction of pretrained language models has reduced many complex task-specific NLP models to simple lightweight layers. An exception to this trend is coreference resolution, where a sophisticated task-specific model is appended to a pretrained transformer encoder. While highly effective, the model has a very large memory footprint - primarily due to dynamically-constructed span and span-pair representations - which hinders the processing of complete documents and the ability to train on multiple instances in a single batch. We introduce a lightweight end-to-end coreference model that removes the dependency on span representations, handcrafted features, and heuristics. Our model performs competitively with the current standard model, while being simpler and more efficient.
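The abstract's key claim is that mention detection can be done without materializing a representation for every candidate span. A minimal sketch of this idea (not the authors' exact architecture, whose parameter names and scoring function are assumptions here): score each candidate span (i, j) directly from the encoder's start-token and end-token vectors with a bilinear form, so memory stays O(n·d) for token states plus an O(n²) score matrix, instead of O(n²·d) span representations.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden, n_tokens = 8, 5

# Stand-in for token representations from a pretrained transformer encoder.
tokens = rng.standard_normal((n_tokens, hidden))

# Bilinear endpoint scoring: rather than building an explicit vector for
# each of the O(n^2) candidate spans, score span (i, j) directly from its
# start token i and end token j. W_start is a hypothetical learned matrix.
W_start = rng.standard_normal((hidden, hidden))

starts = tokens @ W_start            # (n_tokens, hidden)
mention_scores = starts @ tokens.T   # (n, n): mention_scores[i, j] scores span i..j

# Only spans with start <= end are valid; mask out the rest.
valid = np.triu(np.ones((n_tokens, n_tokens), dtype=bool))
mention_scores = np.where(valid, mention_scores, -np.inf)
```

The same endpoint trick extends to span pairs: an antecedent score between two spans can be decomposed into start-start, start-end, and end-end bilinear terms, which is what lets the whole pipeline run on full documents without span-pair tensors.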

Original language: English
Title of host publication: ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference
Editors: Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Publisher: Association for Computational Linguistics (ACL)
Pages: 14-19
Number of pages: 6
ISBN (Electronic): 9781954085527, 978-1-954085-53-4
DOIs
State: Published - 2021
Event: Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, ACL-IJCNLP 2021 - Virtual, Online, Thailand
Duration: 1 Aug 2021 - 6 Aug 2021

Publication series

Name: ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference
Volume: 2

Conference

Conference: Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, ACL-IJCNLP 2021
Country/Territory: Thailand
City: Virtual, Online
Period: 1/08/21 - 6/08/21

Funding

Funders: Blavatnik Fund; Yandex Initiative in Machine Learning; Intel Corporation
