Publication
ACL 2019
Conference paper

Extracting Multiple Relations in One-Pass with Pre-Trained Transformers

Abstract

State-of-the-art solutions for extracting multiple entity relations from an input paragraph require multiple encoding passes over the input. This paper proposes a new solution that completes the multiple-relation extraction task with only one encoding pass over the input and achieves new state-of-the-art accuracy on the ACE 2005 benchmark. Our solution is built on top of pre-trained self-attentive models (Transformer). Because our method computes all relations in a single pass, it scales easily to larger datasets, which makes it more practical for real-world applications.
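The following is a minimal sketch of the one-pass idea described in the abstract: encode the paragraph once with a pre-trained transformer, then classify every entity pair against that single set of contextual representations. It is an illustration under stated assumptions, not the paper's exact architecture; the entity spans, relation label set, and mean-pooling scheme are placeholders.

```python
# Hypothetical sketch of one-pass multi-relation extraction.
# Assumes PyTorch and the HuggingFace `transformers` library; entity spans
# and the relation label set below are illustrative, not from the paper.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

RELATIONS = ["NO_REL", "PHYS", "PART-WHOLE", "ORG-AFF"]  # toy label set

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")
# Untrained pair classifier; in practice it would be fine-tuned with the encoder.
classifier = nn.Linear(2 * encoder.config.hidden_size, len(RELATIONS))

text = "John works for Acme Corp in Boston."
# Token index spans for each entity mention, assumed given by an upstream NER step.
entity_spans = [(1, 2), (4, 6), (7, 8)]  # John, Acme Corp, Boston

inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    hidden = encoder(**inputs).last_hidden_state[0]  # the single encoding pass

def span_repr(start, end):
    # Average-pool the hidden states of a mention's tokens.
    return hidden[start:end].mean(dim=0)

# Every ordered entity pair is scored against the same encoded sequence;
# nothing is re-encoded per pair, which is what makes the method one-pass.
for i, (hs, he) in enumerate(entity_spans):
    for j, (ts, te) in enumerate(entity_spans):
        if i == j:
            continue
        pair = torch.cat([span_repr(hs, he), span_repr(ts, te)])
        pred = RELATIONS[int(classifier(pair).argmax())]
        print(f"entity {i} -> entity {j}: {pred}")
```

This also shows where the scaling claim comes from: the encoding cost is paid once per paragraph rather than once per entity pair, so adding more entity pairs only adds cheap classification steps.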

Date

28 Jul 2019
