Indonesian Journal of Electrical Engineering and Computer Science
Vol 37, No 1: January 2025

Integrating ELECTRA and BERT models in transformer-based mental healthcare chatbot

Zeniarja, Junta
Paramita, Cinantya
Subhiyakto, Egia Rosi
Rakasiwi, Sindhu
Shidik, Guruh Fajar
Andono, Pulung Nurtantio
Savicevic, Anamarija Jurcev



Article Info

Publish Date
01 Jan 2025

Abstract

Over the past decade, the surge in mental health disorders has necessitated innovative support methods, notably artificial intelligence (AI) chatbots. These chatbots provide prompt, tailored conversations and have become crucial in mental health support. This article examines the use of sophisticated models such as convolutional neural networks (CNN), long short-term memory (LSTM), efficiently learning an encoder that classifies token replacements accurately (ELECTRA), and bidirectional encoder representations from transformers (BERT) in developing effective mental health chatbots. Despite their importance for emotional assistance, these chatbots struggle to produce precise and relevant responses to complex mental health issues. BERT, while strong in contextual understanding, is weaker at response generation. Conversely, ELECTRA shows promise in text generation but has not been fully exploited in mental health contexts. This article investigates merging ELECTRA and BERT to improve chatbot effectiveness in mental health scenarios. Leveraging an extensive mental health dialogue dataset, this integration substantially enhanced chatbot precision, surpassing 99% accuracy in mental health responses. This development marks a significant stride in advancing AI chatbot interactions and their contribution to mental health support.

Copyrights © 2025