Enhanced word representations for bridging anaphora resolution
Most current models of word representations (e.g., GloVe) have successfully captured fine-grained semantics. However, the semantic similarity exhibited by these word embeddings is not well suited to resolving bridging anaphora, which requires knowledge of associative similarity (i.e., relatedness) rather than the semantic similarity that holds between synonyms or hypernyms. We create word embeddings (embeddings_PP) that capture such relatedness by exploiting the syntactic structure of noun phrases. We demonstrate that using embeddings_PP alone achieves around 30% accuracy for bridging anaphora resolution on the ISNotes corpus. Furthermore, we achieve a substantial gain over the state-of-the-art system (Hou et al., 2013b) for bridging antecedent selection.
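To illustrate the idea behind embeddings_PP, the sketch below harvests noun pairs from the prepositional structure of noun phrases (e.g., "the door of the house" yields the pair door-house) and treats the two heads as each other's contexts when training vectors, so the learned space reflects relatedness rather than synonym/hypernym similarity. This is only a minimal sketch: the spaCy/gensim pipeline, the function name extract_np_pp_pairs, the toy corpus, and the skip-gram model standing in for the paper's GloVe-style training are all assumptions for illustration, not the paper's actual implementation.

```python
# Hypothetical sketch: extract (NP-head, PP-head) noun pairs from "X of Y"-style
# constructions and train relatedness-oriented vectors on them.
# Assumes spaCy (with en_core_web_sm) and gensim are installed; the paper trains
# on a large corpus with its own pipeline, so this is illustrative only.

import spacy
from gensim.models import Word2Vec

nlp = spacy.load("en_core_web_sm")


def extract_np_pp_pairs(texts):
    """Yield (noun, noun) pairs such as ('door', 'house') from 'the door of the house'."""
    for doc in nlp.pipe(texts):
        for tok in doc:
            # Pattern: noun <-prep- preposition <-pobj- noun (e.g., "door of ... house")
            if tok.dep_ == "pobj" and tok.pos_ in ("NOUN", "PROPN"):
                prep = tok.head
                if prep.dep_ == "prep" and prep.head.pos_ in ("NOUN", "PROPN"):
                    yield (prep.head.lemma_.lower(), tok.lemma_.lower())


if __name__ == "__main__":
    corpus = [
        "The door of the house was open.",
        "The residents of the building complained about the noise.",
    ]
    pairs = list(extract_np_pp_pairs(corpus))
    print(pairs)  # e.g., [('door', 'house'), ('resident', 'building')]

    # Treat each pair as a two-word "sentence" so the heads become each other's
    # contexts; skip-gram then encodes associative relatedness between them.
    model = Word2Vec(sentences=[list(p) for p in pairs],
                     vector_size=100, window=1, min_count=1, sg=1)
```

On a realistic corpus, nearest neighbours in such a space would tend to be related nouns (e.g., door and house) rather than near-synonyms, which is the property the abstract argues bridging anaphora resolution needs.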