Effective dialogue systems depend on accurate natural language understanding, which in turn relies on two subtasks: intent detection and slot filling. Earlier studies addressed each subtask in isolation, but the two are closely related, and jointly modeling them yields better results. A known weakness of joint models is poor generalization to unseen data, which stems from the scarcity of large annotated corpora. Recent work has shown that pretrained embeddings provide effective text representations that help mitigate this generalization problem. However, pretrained embeddings are trained on general-purpose corpora and may lack the domain-specific vocabulary required for the task at hand. To address this issue, this paper presents a joint model for intent detection and slot filling that fuses pretrained and domain-specific embeddings via canonical correlation analysis (CCA) to enhance model performance. The proposed model combines a convolutional neural network (CNN) with a bidirectional long short-term memory (BiLSTM) network for efficient joint classification. Experimental results show that the proposed model outperforms the baseline models.
Copyright © 2025