Retrieval-Based Inception V3-Net Algorithm and Invariant Data Classification using Enhanced Deep Belief Networks for Content-Based Image Retrieval
DOI: https://doi.org/10.58414/SCIENTIFICTEMPER.2024.15.spl.48

Keywords: Content-based image retrieval, Deep learning, Retrieval inception V3-NET algorithm, Enhanced deep belief networks
License
Copyright (c) 2024 The Scientific Temper

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Abstract

Content-Based Image Retrieval (CBIR) plays an evolving role in extracting knowledge from images and remains an active research area, driven by the rapid growth of digital image collections. CBIR is used to retrieve images from massive datasets: it automatically indexes and retrieves images based on their visual content, and many emerging image-mining techniques build on CBIR systems. Indexing and classification are performed on the visual characteristics of the input image, including object pattern, texture, color, shape, layout, and position. Deep learning approaches help resolve the difficulties that arise during feature extraction. A method called RIV3-NET (Retrieval-Based Inception V3) is used to classify the extracted features, and Enhanced Deep Belief Networks (EDBN) classify the invariant image data to reduce noise and improve displacement smoothness. Simulation results demonstrate improved image retrieval and are supported by parametric analysis.
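The abstract does not include code, so the following is only a minimal sketch of the general retrieval pipeline it describes: a pretrained Inception V3 used as a fixed feature extractor, with gallery images ranked by cosine similarity to the query. It is not the authors' RIV3-NET or EDBN; the torchvision weights, the `embed`/`retrieve` helpers, and the cosine-similarity ranking are illustrative assumptions.

```python
# Illustrative CBIR sketch (not the paper's RIV3-NET): a pretrained
# Inception V3 as a fixed feature extractor plus cosine-similarity ranking.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

# Inception V3 expects 299x299 inputs normalized with ImageNet statistics.
preprocess = transforms.Compose([
    transforms.Resize((299, 299)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.inception_v3(weights=models.Inception_V3_Weights.DEFAULT)
model.fc = torch.nn.Identity()  # drop the classifier head; keep 2048-d pooled features
model.eval()

@torch.no_grad()
def embed(path: str) -> torch.Tensor:
    """Return an L2-normalized 2048-d feature vector for one image."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    return F.normalize(model(x), dim=1).squeeze(0)

def retrieve(query_path, gallery_paths, top_k=5):
    """Rank gallery images by cosine similarity to the query image."""
    q = embed(query_path)
    gallery = torch.stack([embed(p) for p in gallery_paths])
    scores = gallery @ q  # dot product of unit vectors = cosine similarity
    best = torch.topk(scores, k=min(top_k, len(gallery_paths)))
    return [(gallery_paths[i], scores[i].item()) for i in best.indices]
```

In a real system the gallery embeddings would be computed once and stored in an index rather than re-extracted per query; the sketch keeps everything inline for readability.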
Similar Articles
- Muhammed Jouhar K. K., Dr. K. Aravinthan, An improved social media behavioral analysis using deep learning techniques, The Scientific Temper: Vol. 15 No. 03 (2024): The Scientific Temper
- Dileep Pulugu, Shaik K. Ahamed, Senthil Vadivu, Nisarg Gandhewar, U D Prasan, S. Koteswari, Empowering healthcare with NLP-driven deep learning unveiling biomedical materials through text mining, The Scientific Temper: Vol. 15 No. 02 (2024): The Scientific Temper
- Abhishek K Pandey, Amrita Sahu, Ajay K Harit, Manoj Singh, Nutritional composition of the wild variety of edible vegetables consumed by the tribal community of Raipur, Chhattisgarh, India, The Scientific Temper: Vol. 14 No. 01 (2023): The Scientific Temper
- S. Kumar, M. Santhanalakshmi, R. Navaneethakrishnan, Content addressable memory for energy efficient computing applications, The Scientific Temper: Vol. 14 No. 02 (2023): The Scientific Temper
- Bhaskar Pandya, Pradipsinh Zala, Vocational education and lifelong learning: Preparing a skilled workforce for the future, The Scientific Temper: Vol. 15 No. spl-2 (2024): The Scientific Temper
- Balaji V, Purnendu Bikash Acharjee, Muniyandy Elangovan, Gauri Kalnoor, Ravi Rastogi, Vishnu Patidar, Developing a semantic framework for categorizing IoT agriculture sensor data: A machine learning and web semantics approach, The Scientific Temper: Vol. 14 No. 04 (2023): The Scientific Temper
- C. S. Manikandababu, V. Rukkumani, Advanced VLSI-based digital image contrast enhancement: A novel approach with modified image pixel evaluation logic, The Scientific Temper: Vol. 15 No. 01 (2024): The Scientific Temper
- Y. Mohammed Iqbal, M. Mohamed Surputheen, S. Peerbasha, Swarm intelligence-driven HC2NN model for optimized COVID-19 detection using lung imaging, The Scientific Temper: Vol. 16 No. 03 (2025): The Scientific Temper
- Swetha Rajkumar, Jayaprasanth Devakumar, LSTM based data driven fault detection and isolation in small modular reactors, The Scientific Temper: Vol. 14 No. 01 (2023): The Scientific Temper
- Subin M. Varghese, K. Aravinthan, A robust finger detection based sign language recognition using pattern recognition techniques, The Scientific Temper: Vol. 15 No. spl-1 (2024): The Scientific Temper

