Improved Out-of-Scope Intent Classification with Dual Encoding and Threshold-based Re-Classification

Wael Rashwan, Hossam M. Zawbaa, Sourav Dutta, Haytham Assem

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Detecting out-of-scope user utterances is essential for task-oriented dialogues and intent classification. Current
methodologies face difficulties with the unpredictable distribution of outliers and often rely on assumptions about
data distributions. We present the Dual Encoder for Threshold-Based Re-Classification (DETER) to address these
challenges. This end-to-end framework efficiently detects out-of-scope intents without requiring assumptions
on data distributions or additional post-processing steps. The core of DETER utilizes dual text encoders, the
Universal Sentence Encoder (USE) and the Transformer-based Denoising AutoEncoder (TSDAE), to generate user
utterance embeddings, which are classified through a branched neural architecture. Further, DETER generates
synthetic outliers using self-supervision and incorporates out-of-scope phrases from open-domain datasets.
This approach ensures a comprehensive training set for out-of-scope detection. Additionally, a threshold-based
re-classification mechanism refines the model’s initial predictions. Evaluations on the CLINC-150, Stackoverflow,
and Banking77 datasets demonstrate DETER’s efficacy. Our model outperforms previous benchmarks, raising
the F1 score by up to 13% for known and 5% for unknown intents on CLINC-150 and Stackoverflow,
and by 16% for known and 24% for unknown intents on Banking77. The source code has been released at
https://github.com/Hossam-Mohammed-tech/Intent_Classification_OOS.
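
The released repository contains the authors' implementation; the snippet below is only a rough sketch of the mechanism described above, showing how two pre-computed sentence embeddings (e.g. from USE and TSDAE) could feed a small branched classifier whose low-confidence predictions are re-assigned to the out-of-scope class. The encoder dimensions, layer sizes, class count, and the 0.7 threshold are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch only; not the authors' implementation. Encoder dimensions,
# layer sizes, class count, and the 0.7 threshold are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DualEncoderClassifier(nn.Module):
    """Classify the concatenation of two pre-computed sentence embeddings
    (e.g. one from USE, one from TSDAE) with a small branched head.
    The last class index is reserved for the out-of-scope label."""

    def __init__(self, use_dim=512, tsdae_dim=768, hidden=256, num_known=150):
        super().__init__()
        self.branch_use = nn.Sequential(nn.Linear(use_dim, hidden), nn.ReLU())
        self.branch_tsdae = nn.Sequential(nn.Linear(tsdae_dim, hidden), nn.ReLU())
        self.head = nn.Linear(2 * hidden, num_known + 1)  # +1 for out-of-scope
        self.oos_id = num_known

    def forward(self, use_emb, tsdae_emb):
        h = torch.cat([self.branch_use(use_emb), self.branch_tsdae(tsdae_emb)], dim=-1)
        return self.head(h)  # raw logits over known intents + out-of-scope


def reclassify_with_threshold(logits, oos_id, threshold=0.7):
    """If the top softmax probability falls below `threshold`, override the
    prediction with the out-of-scope label (threshold-based re-classification)."""
    probs = F.softmax(logits, dim=-1)
    confidence, prediction = probs.max(dim=-1)
    prediction = prediction.clone()
    prediction[confidence < threshold] = oos_id
    return prediction


# Example usage with random tensors standing in for real encoder outputs.
model = DualEncoderClassifier()
use_emb, tsdae_emb = torch.randn(4, 512), torch.randn(4, 768)
preds = reclassify_with_threshold(model(use_emb, tsdae_emb), model.oos_id)
```

At inference, the threshold step simply overrides any prediction whose top softmax probability falls below the chosen cut-off, which mirrors the threshold-based re-classification idea described in the abstract; the actual architecture, training procedure, and threshold selection are detailed in the paper and repository.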
Original language: English (Ireland)
Title of host publication: LREC-COLING 2024
Publication status: Accepted/In press - 2024
