IMPLEMENTING A CHATBOT WITH NATURAL LANGUAGE UNDERSTANDING THROUGH BERT AND NEURAL NETWORK INTEGRATION

Authors

  • Wuttipat Nilsiri, Kasetsart University
  • Ekachai Phaisangittisagul, Kasetsart University
  • Thepchai Supnithi, National Science and Technology Development Agency

Keywords:

Chatbot, NLP, Intent Classification, Named Entity Recognition, Neural Network

Abstract

The objective of this study is to deploy a chatbot tailored for specific domains with a capacity for natural language comprehension. While prior research has demonstrated successful implementations, it has relied predominantly on Seq2Seq models that emphasize response generation over question comprehension; achieving true natural language understanding requires the integration of two core language processing models: intent classification and named entity recognition (NER). However, traditional vectorization methods suffer from a notable drawback: their inability to accommodate unseen words. To address this limitation, we propose a novel approach leveraging the fusion of BERT (Bidirectional Encoder Representations from Transformers) and neural network models. BERT's capacity to capture subword units, synonymous expressions, and inherent word properties as similarity vectors enables robust support for out-of-vocabulary scenarios encountered during language processing tasks. In our experiments, we use the NECTEC sightseeing dataset, preprocessed with BERT embeddings, to evaluate the performance of various neural network models on intent classification and NER tasks. Our findings underscore the promising efficacy of the proposed methodology in enhancing accuracy metrics.
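The out-of-vocabulary robustness the abstract attributes to BERT comes from its WordPiece tokenizer, which splits an unseen word into known subword units rather than mapping it to a single unknown token. The following is a minimal sketch of that greedy longest-match-first segmentation, with a toy vocabulary chosen for illustration (not BERT's real vocabulary, and not the authors' implementation):

```python
# Sketch of WordPiece-style subword tokenization, illustrating how BERT
# segments an out-of-vocabulary word into known pieces. The vocabulary
# below is a toy example for demonstration only.

def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Greedy longest-match-first segmentation of a single word."""
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub  # non-initial pieces carry the ## prefix
            if sub in vocab:
                piece = sub
                break
            end -= 1
        if piece is None:
            return [unk]  # no valid segmentation exists
        tokens.append(piece)
        start = end
    return tokens

vocab = {"sight", "##see", "##ing", "play", "##er"}
print(wordpiece_tokenize("sightseeing", vocab))  # ['sight', '##see', '##ing']
print(wordpiece_tokenize("player", vocab))       # ['play', '##er']
```

Because even a word absent from the vocabulary decomposes into subword pieces that each have a learned embedding, the downstream intent classifier and NER model still receive meaningful vectors instead of a single uninformative unknown token.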

Published

2024-05-30