Q290 : Utilizing Multilingual Datasets to Provide a Deep Learning-based Architecture for Intent Detection
Thesis > Central Library of Shahrood University > Computer Engineering > MSc > 2024
Authors:
Abstract:
With the rapid expansion of artificial intelligence applications, conversational systems such as chatbots have become increasingly popular. Intent Detection (ID) plays a crucial role in these systems, enabling better responses to user interactions. Researchers have employed various approaches for intent detection, including rule-based methods, statistical language models, neural network-based techniques, and deep learning-based methods. Deep learning techniques such as Recurrent Neural Networks (RNN), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), and Transformers have been applied in this field. Each of these methods has addressed the shortcomings of its predecessors, with Transformers ultimately surpassing the others in efficiency and accuracy. Introduced by Google, the Transformer architecture improves computational speed alongside accuracy thanks to its parallel processing capabilities.
In this thesis, we employ a Transformer-based architecture for intent detection. Additionally, since data augmentation can significantly improve the final accuracy of most intelligent systems, we evaluate the effectiveness of various data augmentation techniques in the context of intent detection. Data augmentation involves generating non-identical yet semantically equivalent copies of training samples using different methods. This approach enhances the learning capacity of the model by effectively increasing the volume of training data. However, the effectiveness of different augmentation techniques varies significantly across domains (e.g., text vs. images). Even within text processing, different augmentation methods can yield varying results depending on the specific task. Therefore, we explore multiple text augmentation strategies and assess their effectiveness in intent detection.
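To illustrate two of the general augmentation techniques discussed above (synonym replacement and word shuffling), the following is a minimal, self-contained sketch. The toy synonym table and the example sentence are illustrative assumptions, not the thesis's actual implementation; a real system would draw synonyms from a lexical resource such as WordNet or use a paraphrase model.

```python
import random

# Toy synonym table (illustrative only; a real pipeline would use a
# lexical resource or a learned paraphrase model).
SYNONYMS = {
    "show": ["display", "list"],
    "flights": ["trips"],
    "cheapest": ["lowest-fare"],
}

def synonym_replace(sentence, rng):
    """Replace each word with a random synonym when one is available."""
    words = sentence.split()
    return " ".join(rng.choice(SYNONYMS[w]) if w in SYNONYMS else w
                    for w in words)

def word_shuffle(sentence, rng):
    """Randomly permute word order; the intent label is assumed unchanged."""
    words = sentence.split()
    rng.shuffle(words)
    return " ".join(words)

rng = random.Random(0)
sample = "show cheapest flights to boston"  # ATIS-style query (hypothetical)
augmented = [synonym_replace(sample, rng), word_shuffle(sample, rng)]
```

Both functions keep the training label intact while varying the surface form; note that word shuffling can distort meaning more than synonym replacement, which is one reason different techniques perform differently on the same task.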
Finally, through extensive experiments and evaluations on the ATIS dataset [1], we conclude that paraphrasing-based augmentation, which involves rewording training samples while preserving their meaning, outperforms both general augmentation techniques (such as word shuffling or synonym replacement) and translation-based augmentation in the context of intent detection.
Keywords: #Chatbot #Intent Detection #Data Augmentation #Transformer
Keeping place: Central Library of Shahrood University