Does Structure Matter? Encoding Documents for Machine Reading Comprehension

Abstract

Machine reading comprehension is a challenging task, especially for querying documents with deep and interconnected contexts. Transformer-based methods have shown strong performance on this task; however, most of them still treat a document as a flat sequence of tokens. This work proposes a new Transformer-based method that reads a document as tree slices. It contains two modules, one identifying the most relevant text passage and the other the best answer span, which are not only jointly trained but also jointly consulted at inference time. Our evaluation shows that the proposed method outperforms several competitive baselines on two datasets from different domains.
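The abstract describes a joint setup: one module scores passage relevance while another extracts an answer span, and the two are trained together and combined at inference. Below is a minimal PyTorch sketch of what such a joint objective could look like; the names (JointReaderHeads, joint_loss, joint_score), the head designs, and the loss weighting are illustrative assumptions, not the paper's actual architecture.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class JointReaderHeads(nn.Module):
        # Hypothetical sketch of the two jointly trained modules: a
        # passage-relevance head and an answer-span head sharing one
        # encoder representation. Dimensions are illustrative.
        def __init__(self, hidden_size: int = 768):
            super().__init__()
            self.relevance_head = nn.Linear(hidden_size, 2)  # passage relevant vs. not
            self.span_head = nn.Linear(hidden_size, 2)       # per-token start/end logits

        def forward(self, encoder_out):  # encoder_out: (batch, seq_len, hidden_size)
            relevance_logits = self.relevance_head(encoder_out[:, 0])  # pooled first token
            start_logits, end_logits = self.span_head(encoder_out).split(1, dim=-1)
            return relevance_logits, start_logits.squeeze(-1), end_logits.squeeze(-1)

    def joint_loss(relevance_logits, start_logits, end_logits,
                   relevance_label, start_positions, end_positions, alpha=1.0):
        # Joint training: sum the passage-selection and span-extraction losses.
        loss_rel = F.cross_entropy(relevance_logits, relevance_label)
        loss_span = (F.cross_entropy(start_logits, start_positions)
                     + F.cross_entropy(end_logits, end_positions))
        return loss_rel + alpha * loss_span

    def joint_score(relevance_logits, start_logits, end_logits, start, end):
        # Joint inference: combine passage relevance with span likelihood,
        # so both modules are consulted when picking the final answer.
        rel = F.log_softmax(relevance_logits, dim=-1)[:, 1]
        return (rel + F.log_softmax(start_logits, dim=-1)[:, start]
                    + F.log_softmax(end_logits, dim=-1)[:, end])

Under this sketch, training would minimize joint_loss over labeled passages and spans, and inference would select the candidate span maximizing joint_score, mirroring the "jointly trained and jointly consulted" design the abstract describes.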

Date

06 Jun 2021

Publication

NAACL-HLT 2021 (conference paper)