Leveraging BERT for Natural Language Understanding of Domain-Specific Knowledge

EasyChair Preprint 13203
6 pages • Date: May 6, 2024

Abstract

Natural Language Understanding (NLU) is a core task in building conversational agents: its objectives are to understand the user's goal and to extract any relevant information related to it. NLU comprises Intent Detection and Slot Filling, which together semantically parse the user's utterance. One challenge when training a Deep Learning model for domain-specific NLU is the lack of domain-specific datasets, which leads to poorly performing models. To overcome this, we experiment with fine-tuning BERT to jointly detect the user's intent and the related slots, using a custom-generated dataset built around an organization-specific knowledge base. Our results show that a well-constructed dataset leads to high detection performance, and the resulting model has the potential to enhance a future task-oriented dialogue system.

Keyphrases: BERT, intent detection, natural language understanding, slot filling, task-oriented dialogue system
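The abstract describes a joint formulation in which a single fine-tuned BERT encoder predicts both the utterance-level intent and per-token slot labels. The paper's own implementation is not shown here; the following is a minimal sketch of such a joint model using the Hugging Face transformers library, where the class name, label counts, loss combination, and example utterance are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch of joint intent detection and slot filling on top of BERT.
# Label counts, the gold labels, and the example utterance are illustrative.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

class JointBertNLU(nn.Module):
    def __init__(self, num_intents: int, num_slot_labels: int,
                 model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # Intent classifier reads the pooled [CLS] representation.
        self.intent_head = nn.Linear(hidden, num_intents)
        # Slot classifier tags every token (e.g., with BIO labels).
        self.slot_head = nn.Linear(hidden, num_slot_labels)

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        intent_logits = self.intent_head(outputs.pooler_output)   # (batch, intents)
        slot_logits = self.slot_head(outputs.last_hidden_state)   # (batch, seq, slots)
        return intent_logits, slot_logits

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = JointBertNLU(num_intents=5, num_slot_labels=9)

batch = tokenizer(["where is the HR office"], return_tensors="pt")
intent_logits, slot_logits = model(batch["input_ids"], batch["attention_mask"])

# Joint fine-tuning typically sums the two cross-entropy losses.
loss_fn = nn.CrossEntropyLoss()
intent_labels = torch.tensor([2])                    # illustrative gold intent id
slot_labels = torch.zeros_like(batch["input_ids"])   # illustrative gold slot ids
loss = (loss_fn(intent_logits, intent_labels)
        + loss_fn(slot_logits.view(-1, 9), slot_labels.view(-1)))
loss.backward()
```

Summing the two losses lets a single backward pass update the shared encoder for both tasks, which is the usual rationale for joint intent-and-slot models over two separate classifiers.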