

Poster in Affinity Workshop: WiML Workshop 1

Targeted active semi-supervised learning for new customers in virtual assistants

Dieu Thu Le · Anna Weber


Abstract:

For virtual assistants, new customers play a special role and pose a specific challenge. New users interact with the device more naturally and conversationally, as they are not yet aware which commands work and which don't. Data from new customers is therefore especially valuable for improving spoken language understanding (SLU) systems. In this work, we improve intent classification [5] in SLU for new customers. Most studies in the literature on semi-supervised learning focus on general accuracy improvements [1] [2] [3] [4]. In contrast, we concentrate on data from new customers and use the framework to improve the experience not only for new but for all customers in a large-scale setting. We employ a self-training framework that combines targeted active and semi-supervised learning to incorporate new customers' utterances into the training set. The first step is identifying problematic utterances that new customers often use and that differ from the utterances experienced customers say frequently. To this end, we project frictional utterances from both cohorts into an embedding space using BERT and topic modelling, then apply density-based clustering with topic guidance to identify the regions that are representative of new customers' utterances. After this identification, we combine active and semi-supervised learning in two phases. In phase I (active learning), we annotate the problematic new-customer utterances identified in the first step to obtain correct interpretations. We then use the model trained on the augmented active-learning dataset as the teacher model [6] for phase II (semi-supervised learning): a retrieval module retrieves utterances similar to those selected in phase I, and the teacher model provides pseudo-labels, which are then added to the training set. Results for two languages show offline improvements of 100-300 bps and improvements in online friction rates of up to 100 bps overall for new customers.
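The phase II retrieval-plus-pseudo-labeling step can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 2-d vectors stand in for BERT sentence embeddings, the cosine-similarity threshold is an assumed retrieval criterion, and the nearest-seed "teacher" is a hypothetical stand-in for the model trained on the phase-I annotations.

```python
import numpy as np

def cosine_sim(a, b):
    """Row-wise cosine-similarity matrix between two sets of embeddings."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def retrieve_and_pseudo_label(seed_emb, pool_emb, teacher, threshold=0.9):
    """Phase II sketch: retrieve unlabeled pool utterances that lie close
    (cosine similarity >= threshold) to at least one phase-I annotated seed,
    then let the teacher model assign them pseudo-labels."""
    sims = cosine_sim(pool_emb, seed_emb)        # shape: (n_pool, n_seed)
    selected = sims.max(axis=1) >= threshold     # near at least one seed
    pseudo_labels = [teacher(e) for e in pool_emb[selected]]
    return np.flatnonzero(selected), pseudo_labels

# Toy demo with made-up intents and 2-d stand-in embeddings.
seed_emb = np.array([[1.0, 0.0],     # annotated "music" seed
                     [0.0, 1.0]])    # annotated "weather" seed
seed_labels = ["music", "weather"]

def nearest_seed_teacher(emb):
    # Hypothetical teacher: label by the most similar annotated seed.
    sims = cosine_sim(emb[None, :], seed_emb)[0]
    return seed_labels[int(np.argmax(sims))]

pool_emb = np.array([[0.9, 0.1],     # close to the "music" seed
                     [0.5, 0.5],     # ambiguous, falls below threshold
                     [0.05, 1.0]])   # close to the "weather" seed

idx, pseudo = retrieve_and_pseudo_label(seed_emb, pool_emb,
                                        nearest_seed_teacher, threshold=0.9)
```

The selected utterances and their pseudo-labels would then be appended to the training set for the student model; the ambiguous middle utterance is left out, which is the point of thresholding retrieval on similarity to the annotated seeds.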
