Pin-Yu Chen, Cho-Jui Hsieh, et al.
KDD 2022
In recent years, a number of keyphrase generation (KPG) approaches have been proposed, featuring complex model architectures, dedicated training paradigms, and decoding strategies. In this work, we opt for simplicity and show how a commonly used seq2seq language model, BART, can be easily adapted to generate keyphrases from text in a single batch computation using a simple training procedure. Empirical results on five benchmarks show that our approach is on par with existing state-of-the-art KPG systems, while using a much simpler and easier-to-deploy framework.
Conrad Albrecht, Jannik Schneider, et al.
CVPR 2025
Hiroki Yanagisawa
ICML 2023
Bruce Elmegreen, Hendrik Hamann, et al.
ICR 2023