Publication:
Text prediction recurrent neural networks using long short-term memory-dropout

dc.contributor.author: Iparraguirre-Villanueva, Orlando
dc.contributor.author: Guevara-Ponce, Victor
dc.contributor.author: Ruiz-Alvarado, Daniel
dc.contributor.author: Beltozar-Clemente, Saul
dc.contributor.author: Sierra-Liñan, Fernando Alex
dc.contributor.author: Zapata-Paulini, Joselyn E.
dc.contributor.author: Cabanillas-Carbonell, Michael A.
dc.date.accessioned: 2025-09-05T16:34:08Z
dc.description.abstract: Long short-term memory (LSTM) is a type of recurrent neural network (RNN) whose sequence-based models are used in text generation and/or prediction tasks, question answering, and classification systems due to their ability to learn long-term dependencies. The present research integrates an LSTM network with the dropout technique to generate text from a corpus given as input; a model is developed to find the best way to extract the words from the context. For training the model, the novel "La Ciudad y los perros", which comprises 128,600 words, is used as input data. The text was divided into two data sets: 38.88% for training and the remaining 61.12% for testing the model. The proposed model was tested in two variants: word importance and context. The results were evaluated in terms of the semantic proximity of the generated text to the given context. © 2022 Elsevier B.V., All rights reserved.
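The abstract describes combining an LSTM network with dropout regularization for next-word prediction over a text corpus. The following is a minimal illustrative sketch of that general technique in Python (Keras); the toy corpus, window length, layer sizes, dropout rate, and training epochs are assumptions for demonstration only and do not reproduce the authors' model, hyperparameters, or 38.88%/61.12% data split.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dropout, Dense

# Toy corpus standing in for the novel's text (illustrative only).
corpus = "los cadetes cruzaron el patio oscuro en silencio y el patio seguia en silencio"
words = corpus.split()

# Build a word -> index vocabulary (index 0 reserved for padding).
vocab = sorted(set(words))
word_to_id = {w: i + 1 for i, w in enumerate(vocab)}
id_to_word = {i: w for w, i in word_to_id.items()}
vocab_size = len(vocab) + 1

# Training pairs: a fixed-length window of word ids -> the next word id.
seq_len = 4
ids = [word_to_id[w] for w in words]
X = np.array([ids[i:i + seq_len] for i in range(len(ids) - seq_len)])
y = np.array([ids[i + seq_len] for i in range(len(ids) - seq_len)])

# LSTM-dropout architecture named in the abstract:
# embedding -> LSTM -> dropout -> softmax over the vocabulary.
model = Sequential([
    Embedding(vocab_size, 32),
    LSTM(64),
    Dropout(0.2),  # dropout rate is an assumed value
    Dense(vocab_size, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, epochs=100, verbose=0)

# Greedy next-word generation from a seed window of words.
seed = ["el", "patio", "oscuro", "en"]
for _ in range(3):
    x = np.array([[word_to_id[w] for w in seed[-seq_len:]]])
    next_id = int(np.argmax(model.predict(x, verbose=0)))
    seed.append(id_to_word[next_id])
print(" ".join(seed))
```

In this sketch the dropout layer sits after the LSTM layer; the paper's exact placement (e.g., recurrent dropout inside the LSTM cell versus a separate layer) is not specified in the abstract, so this is only one plausible arrangement of the LSTM-dropout combination.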
dc.identifier.doi: 10.11591/ijeecs.v29.i3.pp1758-1768
dc.identifier.scopus: 2-s2.0-85144397373
dc.identifier.uri: https://cris.uwiener.edu.pe/handle/001/424
dc.identifier.uuid: 96917ecd-3bf5-4758-a65b-ebb3c05c3fb8
dc.language.iso: en
dc.publisher: Institute of Advanced Engineering and Science
dc.relation.citationissue: 3
dc.relation.citationvolume: 29
dc.relation.ispartofseries: Indonesian Journal of Electrical Engineering and Computer Science
dc.relation.issn: 25024760
dc.rights: http://purl.org/coar/access_right/c_abf2
dc.title: Text prediction recurrent neural networks using long short-term memory-dropout
dc.type: http://purl.org/coar/resource_type/c_2df8fbb1
dspace.entity.type: Publication
oaire.citation.endPage: 1768
oaire.citation.startPage: 1758

Files

Collections