Daniel Ruiz-Alvarado
Universidad Autónoma del Perú

Published: 1 document
Articles


Text prediction recurrent neural networks using long short-term memory-dropout Orlando Iparraguirre-Villanueva; Victor Guevara-Ponce; Daniel Ruiz-Alvarado; Saul Beltozar-Clemente; Fernando Sierra-Liñan; Joselyn Zapata-Paulini; Michael Cabanillas-Carbonell
Indonesian Journal of Electrical Engineering and Computer Science Vol 29, No 3: March 2023
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v29.i3.pp1758-1768

Abstract

Long short-term memory (LSTM) is a type of recurrent neural network (RNN) whose sequence-based models are used in text generation and prediction tasks, question answering, and classification systems because of their ability to learn long-term dependencies. The present research integrates an LSTM network with the dropout technique to generate text from an input corpus; a model is developed to find the best way to extract words from the context. The model is trained on the novel "La Ciudad y los perros", which comprises 128,600 words. The text was divided into two data sets: 38.88% for training and the remaining 61.12% for testing the model. The proposed model was tested in two variants, word importance and context, and the results were evaluated in terms of the semantic proximity of the generated text to the given context.
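The LSTM-with-dropout mechanism the abstract describes can be sketched as follows. This is a minimal numpy illustration, not the paper's implementation: the vocabulary size, hidden size, dropout rate, and random weights are all illustrative assumptions, and only the forward pass of a single LSTM cell with inverted dropout on the hidden state is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: x is the input vector, (h, c) the recurrent state."""
    n = h.shape[0]
    z = W @ x + U @ h + b          # all four gate pre-activations at once
    i = sigmoid(z[:n])             # input gate
    f = sigmoid(z[n:2*n])          # forget gate
    o = sigmoid(z[2*n:3*n])        # output gate
    g = np.tanh(z[3*n:])           # candidate cell update
    c_new = f * c + i * g          # cell state carries long-term memory
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def dropout(h, rate, training, rng):
    """Inverted dropout: zero units at random, rescale survivors."""
    if not training or rate == 0.0:
        return h
    mask = rng.random(h.shape) >= rate
    return h * mask / (1.0 - rate)

# Tiny demo: feed a short sequence of one-hot "word" vectors through the cell.
vocab, hidden = 10, 8                                  # illustrative sizes
W = rng.standard_normal((4 * hidden, vocab)) * 0.1     # input weights
U = rng.standard_normal((4 * hidden, hidden)) * 0.1    # recurrent weights
b = np.zeros(4 * hidden)

h, c = np.zeros(hidden), np.zeros(hidden)
for word_id in [3, 7, 1]:
    x = np.zeros(vocab)
    x[word_id] = 1.0
    h, c = lstm_step(x, h, c, W, U, b)
    h = dropout(h, rate=0.2, training=True, rng=rng)   # regularize hidden state
```

In a full next-word model, the final hidden state would feed a softmax layer over the vocabulary, and dropout would be disabled at generation time.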