


Volume 20 No 15 (2022)
Generating text using long short-term memory algorithm based on incremental framework
Mehrdad Jannesar
Abstract
Over the past few decades, text generation, that is, designing intelligent systems that make a text comprehensible to a machine and produce new text, has been a complex challenge for artificial intelligence researchers. Besides being fast, such systems should efficiently employ language modeling, syntactic (grammatical) analysis, and semantic and conceptual analysis of a corpus of texts. To this end, researchers have proposed several neural-network-based algorithms. This study investigates the long short-term memory (LSTM) deep learning algorithm, a system able to memorize long sequences and learn long-term dependencies. Since combining several strong models can yield a stronger model with higher accuracy and performance, we present a hybrid algorithm comprising three LSTM-based models (a bidirectional LSTM network, an LSTM network, and a deep LSTM network) combined through a stacking ensemble learning algorithm. The components of the hybrid algorithm are trained in parallel to improve text-generation performance. Experiments on all three models and on the proposed algorithm show that the resulting structure reduces generation error and achieves higher accuracy.
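The abstract does not specify how the stacking stage combines the three base models; as a minimal illustrative sketch (with hypothetical model outputs and a simple weighted meta-learner standing in for the paper's trained stacking model), the ensemble's prediction step could look like:

```python
import numpy as np

def stack_predictions(base_probs, meta_weights):
    """Combine next-token probability distributions from several base
    models via a weighted meta-learner (a stand-in for the paper's
    stacking ensemble stage).

    base_probs:   array-like of shape (n_models, vocab_size), each row a
                  probability distribution over the vocabulary.
    meta_weights: array-like of shape (n_models,), as learned by the
                  meta-model.
    """
    weights = np.asarray(meta_weights, dtype=float)
    weights = weights / weights.sum()            # normalize to a convex combination
    combined = weights @ np.asarray(base_probs)  # weighted sum -> (vocab_size,)
    return combined / combined.sum()             # renormalize for numerical safety

# Hypothetical outputs of the three LSTM-based base models over a 4-token vocabulary.
bilstm_p    = np.array([0.1, 0.6, 0.2, 0.1])
lstm_p      = np.array([0.2, 0.5, 0.2, 0.1])
deep_lstm_p = np.array([0.1, 0.4, 0.4, 0.1])

probs = stack_predictions([bilstm_p, lstm_p, deep_lstm_p],
                          meta_weights=[0.5, 0.3, 0.2])
next_token = int(np.argmax(probs))  # greedy decoding: token 1 here
```

In a full stacking setup the meta-weights (or a richer meta-model, e.g. a logistic regression over the concatenated base outputs) would be fitted on held-out predictions from the three parallel-trained LSTM variants.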
Keywords
long short-term memory algorithm, text generation, incremental framework, ensemble learning algorithm
Copyright
Copyright © Neuroquantology
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Articles published in NeuroQuantology are available under the Creative Commons Attribution-NonCommercial-NoDerivatives License (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant NeuroQuantology the right of first publication under CC BY-NC-ND 4.0. Users have the right to read, download, copy, distribute, print, search, or link to the full texts of articles in this journal, and to use them for any other lawful purpose.