Publication
ACL 2024
Conference paper

SymKGQA: Few-Shot Knowledge Graph Question Answering via Symbolic Program Generation and Execution

Abstract

Semantic parsing of natural language questions into their executable logical form (LF) has shown state-of-the-art (SOTA) performance for Knowledge Graph Question Answering (KGQA). However, these methods are not applicable to real-world applications due to the lack of annotated training data. Recent advances in the capabilities of Large Language Models (LLMs) have enabled the generation of low-level LFs such as SPARQL and S-Expression in a few-shot setting. Unfortunately, these methods: (1) assume that the underlying LLM is pre-trained on the LF to be generated, (2) perform poorly on harder, more complex benchmarks such as KQA Pro, and (3) struggle to ground the generated LF to a specific Knowledge Graph. Recently, a new LF called KoPL has been introduced that explicitly models the complex reasoning process step by step in a symbolic manner and has shown SOTA performance on KQA Pro in the fully-supervised setting. Inspired by this, we propose the SymKGQA framework, which generates the step-by-step symbolic LF, i.e., KoPL, in a few-shot in-context learning setting using an LLM. Our framework does not depend on the LLM being pre-trained on KoPL. We further build a Retrieval-Augmented Generation based Question-Aware Contextual KoPL (QUACK) resolver to ground the generated LF. Our experiments with different LLMs and few-shot settings demonstrate that SymKGQA outperforms all other few-shot KGQA approaches and even many fully-supervised ones.
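To make the notion of a step-by-step symbolic LF concrete, below is a minimal sketch that executes a small KoPL-style program over a toy knowledge graph. The triples, the example question, and the tiny interpreter are all illustrative assumptions; only the operator names Find and Relate follow the KoPL vocabulary, and nothing here reflects the actual SymKGQA or QUACK implementation.

```python
# A minimal, hypothetical sketch of executing a KoPL-style symbolic
# program step by step over a toy knowledge graph. The triples, the
# example question, and this tiny interpreter are illustrative
# assumptions; only the operator names (Find, Relate) follow the KoPL
# vocabulary. This is not the authors' implementation.

TRIPLES = [
    ("LeBron James", "member of sports team", "Los Angeles Lakers"),
    ("Los Angeles Lakers", "home venue", "Crypto.com Arena"),
]

def find(name):
    # Locate entities in the KG whose name matches exactly.
    return {e for h, _, t in TRIPLES for e in (h, t) if e == name}

def relate(entities, relation):
    # Follow an outgoing relation from each entity in the current set.
    return {t for h, r, t in TRIPLES if h in entities and r == relation}

OPS = {
    "Find": lambda state, args: find(args[0]),
    "Relate": lambda state, args: relate(state, args[0]),
}

# Question: "Where does the team LeBron James plays for hold its home games?"
# A step-by-step program an LLM might be prompted to emit in-context:
program = [
    ("Find", ["LeBron James"]),
    ("Relate", ["member of sports team"]),
    ("Relate", ["home venue"]),
]

state = set()
for op, args in program:
    state = OPS[op](state, args)

print(state)  # -> {'Crypto.com Arena'}
```

In SymKGQA, programs of this shape are generated by the LLM from a few in-context examples and then grounded to the target KG by the QUACK resolver; the toy interpreter above only illustrates why such a step-by-step program is directly executable.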
