Volume 20 No 12 (2022)
IMAGE CAPTION GENERATOR USING DEEP LEARNING
Peerzada Salman Syeed, Dr. Mahmood Usman
Abstract
In the 21st century, image captioning has become one of the most needed tools. The application described here generates a caption for a given image using neural network architectures. Captioning an image entails detecting the objects it contains, along with their attributes and the relationships between them, and then producing a syntactically and semantically correct sentence. This paper presents a deep learning framework that describes images and generates captions by combining computer vision and machine translation. The goal is to detect the various objects present in an image, recognize the relationships between those objects, and generate captions. We used a dataset of over 8,000 images with accompanying captions for this project. The paper also explains the design and structure of the networks involved. Automatic generation of image captions is an important task at the intersection of Computer Vision and Natural Language Processing. Caption generators can identify properties in photos shared on social platforms, and their use can further be extended to video frames. They can also serve anyone who needs images translated into text, and they have enormous potential to assist the visually impaired.
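The encoder-decoder pipeline described above (a CNN extracts image features, an LSTM-style decoder emits words until an end token) can be sketched as follows. This is a minimal illustrative toy, not the paper's implementation: the vocabulary, weight matrices, and the simplified single-gate recurrent update are all hypothetical stand-ins for a trained CNN encoder and LSTM decoder, and the weights are random rather than learned.

```python
import numpy as np

# Hypothetical toy vocabulary and sizes; a real system would train a CNN
# encoder and an LSTM decoder on an image-caption dataset such as Flickr8k.
VOCAB = ["<start>", "<end>", "a", "dog", "runs"]
FEAT, EMBED, HIDDEN = 16, 8, 8

rng = np.random.default_rng(0)
W_img = rng.standard_normal((HIDDEN, FEAT)) * 0.1           # CNN features -> initial state
W_emb = rng.standard_normal((len(VOCAB), EMBED)) * 0.1      # word embeddings
W_h   = rng.standard_normal((HIDDEN, HIDDEN + EMBED)) * 0.1 # recurrent update
W_out = rng.standard_normal((len(VOCAB), HIDDEN)) * 0.1     # state -> word logits

def greedy_caption(cnn_features, max_len=10):
    """Greedy decoding: seed the decoder state with the image features,
    then emit the most likely word at each step until <end>."""
    h = np.tanh(W_img @ cnn_features)          # image feature vector -> h0
    word = VOCAB.index("<start>")
    caption = []
    for _ in range(max_len):
        x = np.concatenate([h, W_emb[word]])   # previous state + previous word
        h = np.tanh(W_h @ x)                   # simplified recurrent step (no gates)
        logits = W_out @ h
        word = int(np.argmax(logits))          # greedy word choice
        if VOCAB[word] == "<end>":
            break
        caption.append(VOCAB[word])
    return caption

features = rng.standard_normal(FEAT)           # stand-in for CNN encoder output
print(greedy_caption(features))
```

In a trained system the same loop runs with learned LSTM gates and a CNN such as VGG or ResNet supplying `cnn_features`; beam search is often substituted for the greedy `argmax` to improve caption quality.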
Keywords
Introduction, problem statement, proposed methodology, convolutional neural networks, long short-term memory, evaluation, result and analysis, conclusion
Copyright
Copyright © Neuroquantology

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Articles published in Neuroquantology are available under the Creative Commons Attribution Non-Commercial No Derivatives Licence (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant IJECSE the right of first publication under CC BY-NC-ND 4.0. Users have the right to read, download, copy, distribute, print, search, or link to the full texts of articles in this journal, and to use them for any other lawful purpose.