Publication
NAACL 2021
Conference paper

AMR Parsing with Action-Pointer Transformer


Abstract

Abstract Meaning Representation parsing belongs to a category of sentence-to-graph prediction tasks where the target graph is not explicitly linked to the sentence tokens. However, nodes or subgraphs are semantically related to subsets of the sentence tokens, and locality between words and related nodes is often preserved. Transition-based approaches have recently shown great progress in capturing these inductive biases but still suffer from limited expressiveness. In this work, we propose a transition-based system that combines hard attention over sentences with a target-side action-pointer mechanism to decouple source tokens from node representations. We model the transitions as well as the pointer mechanism using a single Transformer model. Parser state and graph structure information are efficiently encoded using attention heads. We show that our approach leads to increased expressiveness while capitalizing on inductive biases, and attains new state-of-the-art Smatch scores on AMR 1.0 (78.5) and AMR 2.0 (81.8).
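The distinguishing ingredient is the target-side pointer: at each step the decoder can point back to one of its own earlier actions (for example, the one that created a node) in order to attach a new edge, rather than pointing into the source sentence. Below is a minimal PyTorch sketch of such a pointer head. The class name, shapes, and mask convention are illustrative assumptions and not the authors' implementation; per the abstract, the actual system models the pointer inside the same Transformer that predicts the transitions, with attention heads carrying parser-state and graph-structure information.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class ActionPointerHead(nn.Module):
    """One attention head scoring earlier actions as pointer targets.

    Illustrative sketch only; names and shapes are assumptions made for
    this example, not the paper's exact implementation.
    """

    def __init__(self, d_model: int, head_dim: int = 64):
        super().__init__()
        self.q_proj = nn.Linear(d_model, head_dim)
        self.k_proj = nn.Linear(d_model, head_dim)
        self.scale = math.sqrt(head_dim)

    def forward(self, decoder_states: torch.Tensor,
                target_mask: torch.Tensor) -> torch.Tensor:
        # decoder_states: (T, d_model) hidden states of the action decoder.
        # target_mask: (T, T) bool; True where position j is a valid pointer
        # target for step i (e.g. an earlier node-creating action). Each row
        # must contain at least one True entry.
        q = self.q_proj(decoder_states)                 # (T, head_dim)
        k = self.k_proj(decoder_states)                 # (T, head_dim)
        scores = (q @ k.transpose(0, 1)) / self.scale   # (T, T)
        scores = scores.masked_fill(~target_mask, float("-inf"))
        return F.log_softmax(scores, dim=-1)            # pointer log-probabilities
```

Pointing into the decoder's own action history, rather than into the source, is what decouples node representations from source tokens: the set of pointer targets grows with the action sequence itself, so re-entrant nodes and multiple edges to the same node remain expressible regardless of how nodes align to words.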