IMPLEMENTING A CHATBOT WITH NATURAL LANGUAGE UNDERSTANDING THROUGH BERT AND NEURAL NETWORK INTEGRATION
Keywords:
Chatbot, NLP, Intent Classification, Named Entity Recognition, Neural Network
Abstract
The objective of this study is to deploy a domain-specific chatbot with the capacity for natural language comprehension. Prior research has demonstrated successful implementations, but these rely predominantly on Seq2Seq models that emphasize response generation over question comprehension; achieving true natural language understanding requires the integration of two core language-processing models: intent classification and named entity recognition (NER). Traditional vectorization methods, however, suffer from a notable drawback: they cannot accommodate unseen words. To address this limitation, we propose a novel approach that fuses BERT (Bidirectional Encoder Representations from Transformers) with neural network models. BERT's capacity to capture subword units, synonymous expressions, and inherent word properties as similarity vectors enables robust support for out-of-vocabulary cases encountered during language-processing tasks. In our experiments, we preprocess the NECTEC sightseeing dataset with BERT embeddings and evaluate the performance of various neural network models on the intent classification and NER tasks. Our findings underscore the promising efficacy of the proposed methodology in improving accuracy metrics.
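The pipeline the abstract describes, encoding each utterance with BERT and feeding the resulting vector to a neural network intent classifier, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 768-dimensional BERT [CLS] embeddings are simulated with random tensors (a real system would obtain them from a pretrained BERT encoder), and the hidden size and the five-intent label set are hypothetical placeholders rather than values from the NECTEC sightseeing dataset.

```python
import torch
import torch.nn as nn

BERT_DIM = 768     # hidden size of BERT-base embeddings
NUM_INTENTS = 5    # hypothetical intent count; the real label set comes from the dataset

class IntentClassifier(nn.Module):
    """Feed-forward head that maps a BERT sentence embedding to intent logits."""

    def __init__(self, embed_dim: int, num_intents: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(embed_dim, 256),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(256, num_intents),
        )

    def forward(self, cls_embeddings: torch.Tensor) -> torch.Tensor:
        # cls_embeddings: (batch, embed_dim) -> logits: (batch, num_intents)
        return self.net(cls_embeddings)

model = IntentClassifier(BERT_DIM, NUM_INTENTS)
batch = torch.randn(4, BERT_DIM)   # 4 simulated BERT [CLS] vectors
logits = model(batch)              # shape (4, NUM_INTENTS)
```

Because BERT builds its embeddings from subword units, a word absent from training data is still decomposed into known pieces and receives a meaningful vector, which is what lets this downstream classifier handle out-of-vocabulary input.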
License
Copyright (c) 2024 Procedia of Multidisciplinary Research
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.