Publication
ACL-IJCNLP 2021
Short paper

A Semantics-aware Transformer Model of Relation Linking for Knowledge Base Question Answering

Abstract

Relation linking is a crucial component of Knowledge Base Question Answering systems. Existing systems rely on a wide variety of heuristics, or on ensembles of multiple systems, and depend heavily on the surface text of the question. However, the explicit semantic parse of the question is a rich source of relation information that these systems do not exploit. We propose a simple transformer-based neural model for relation linking that leverages the AMR semantic parse of a sentence. Our system significantly outperforms the state of the art on four popular benchmark datasets, which are based on either DBpedia or Wikidata, demonstrating that our approach is effective across knowledge graphs.

Date

01 Aug 2021
