Publication
EMNLP 2023
Conference paper

Zero-Shot-BERT-Adapters: a Zero-Shot Pipeline for Unknown Intent Detection


Abstract

Intent discovery is a crucial task in natural language processing, and it is increasingly relevant for a variety of industrial applications. Identifying novel, unseen intents from user inputs remains one of the biggest challenges in this field. Herein, we propose Zero-Shot-BERT-Adapters, a two-stage method for multilingual intent discovery based on a Transformer architecture fine-tuned with Adapters: the model is first trained for Natural Language Inference (NLI) and then applied to unknown intent classification in a zero-shot setting. In our evaluation, we first analyze the quality of the model after adaptive fine-tuning on known classes. Second, we evaluate its performance when casting intent classification as an NLI task. Lastly, we test the zero-shot performance of the model on unseen classes, showing how Zero-Shot-BERT-Adapters can effectively perform intent discovery by generating intents that are semantically similar, if not identical, to the ground-truth ones. Our experiments show that Zero-Shot-BERT-Adapters outperforms a wide variety of baselines in two zero-shot settings: known intent classification and unseen intent discovery. The proposed pipeline holds the potential to be widely applied in a variety of customer-care applications. It enables automated dynamic triage using a lightweight model that, unlike large language models, can be easily deployed and scaled in a wide range of business scenarios. Zero-Shot-BERT-Adapters represents an innovative multilingual approach to intent discovery, enabling the online generation of novel intents. The pipeline is available as an installable Python package at the following link: https://anonymous.4open.science/r/zero-shot-bert-adapters-EED8/.
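
To give a concrete sense of the core idea of casting intent classification as an NLI task, the sketch below uses a generic Hugging Face zero-shot-classification pipeline with an off-the-shelf NLI checkpoint. This is a minimal illustration of the technique, not the Zero-Shot-BERT-Adapters API; the model name, utterance, candidate intents, and hypothesis template are placeholder assumptions.

```python
# Minimal sketch: intent classification via NLI entailment (generic, illustrative).
# Each candidate intent is turned into an NLI hypothesis and scored for entailment
# against the user utterance (the premise).
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="facebook/bart-large-mnli",  # placeholder: any NLI-finetuned checkpoint
)

utterance = "I never received the package I ordered last week."
candidate_intents = ["track_order", "cancel_order", "report_missing_item"]

result = classifier(
    utterance,
    candidate_labels=candidate_intents,
    hypothesis_template="The user wants to {}.",  # intent -> NLI hypothesis
)
print(result["labels"][0])  # intent with the highest entailment score
```

In the paper's pipeline, the NLI backbone is a BERT model adapted with lightweight Adapter modules rather than a fully fine-tuned model, and a second stage generates candidate intents for unseen classes instead of relying on a fixed label set.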