Publication
EMNLP 2023
Conference paper

Knowledge Graph Compression Enhances Diverse Commonsense Generation

Abstract

Commonsense knowledge graphs such as ConceptNet offer broad, general knowledge that has been shown to improve the quality and diversity of "commonsense" explanations generated by models, for example, by using subgraphs around a task's concepts. However, due to the large coverage and, consequently, vast scale of ConceptNet, knowledge that is locally close in the graph does not necessarily belong to a single, common context. In this paper, we apply a differentiable graph compression algorithm to encode commonsense knowledge, which allows more relevant knowledge to be selected and injected into language models (LMs). This lets the models use more of the KG concepts in their generated outputs and also leads to longer and considerably more diverse outputs. Our experiments demonstrate this for the two tasks of generating commonsense and abductive explanations. In a few-shot setup, our approach can even improve over the large language model Vicuna-13b, in terms of both quality and diversity.
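
For illustration, below is a minimal, hypothetical sketch of the kind of differentiable compression the abstract describes: a learned scorer ranks the nodes of a ConceptNet subgraph and a straight-through top-k mask keeps only the highest-scoring concepts, so that selection stays trainable end to end. The class name, the scoring network, and the soft relaxation are assumptions made for exposition here, not the paper's actual method.

# Hypothetical sketch (PyTorch): differentiable selection of relevant
# KG nodes via learned scores and a straight-through top-k mask.
# All names below are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn


class SoftSubgraphCompressor(nn.Module):
    def __init__(self, dim: int, keep: int):
        super().__init__()
        # Small MLP that assigns a relevance score to each node embedding.
        self.scorer = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, 1)
        )
        self.keep = keep  # number of nodes retained after compression

    def forward(self, node_emb: torch.Tensor, temperature: float = 1.0):
        # node_emb: (num_nodes, dim) embeddings of concepts in the subgraph.
        scores = self.scorer(node_emb).squeeze(-1) / temperature
        # Hard top-k selection in the forward pass...
        topk = torch.topk(scores, k=min(self.keep, scores.numel())).indices
        hard_mask = torch.zeros_like(scores).scatter_(0, topk, 1.0)
        # ...with a soft mask supplying gradients (straight-through trick):
        # the forward value equals hard_mask, gradients flow via sigmoid.
        soft_mask = torch.sigmoid(scores)
        mask = hard_mask + soft_mask - soft_mask.detach()
        # Masked node embeddings form the compressed subgraph to feed the LM.
        return node_emb * mask.unsqueeze(-1), mask


# Usage: compress a 50-node subgraph with 64-d embeddings down to 10 nodes.
compressor = SoftSubgraphCompressor(dim=64, keep=10)
compressed, mask = compressor(torch.randn(50, 64))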

Date

06 Dec 2023
