Retrieval-Based Inception V3-Net Algorithm and Invariant Data Classification using Enhanced Deep Belief Networks for Content-Based Image Retrieval
DOI: https://doi.org/10.58414/SCIENTIFICTEMPER.2024.15.spl.48

Keywords: Content-based image retrieval, Deep learning, Retrieval inception V3-NET algorithm, Enhanced deep belief networks
License
Copyright (c) 2024 The Scientific Temper

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Abstract

Content-Based Image Retrieval (CBIR) plays a steadily evolving role in extracting knowledge from images. It is also a dynamic area of research that has recently gained attention owing to the drastic growth in the volume of digital images. To retrieve images from massive datasets, experts use CBIR, which automatically indexes and retrieves images based on their content; emerging image-mining techniques are built on CBIR systems. Based on the visual characteristics of the input image (object pattern, texture, color, shape, layout, and position), classification is applied and indexing is carried out. Deep learning approaches help to resolve issues that arise during feature extraction. A method called RIV3-NET, which stands for Retrieval-based Inception V3 Network, is used to classify the features. Classifying image-invariant data using Enhanced Deep Belief Networks (EDBN) is necessary to decrease noise and improve displacement smoothness. The simulation outcomes demonstrate the improved image retrieval and parametric analysis.
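The abstract describes a CBIR pipeline in which deep features (here, RIV3-NET/Inception V3 embeddings) are extracted and the most similar database images are returned for a query. The paper's exact retrieval step is not detailed here, so the following is only a minimal sketch, assuming feature vectors have already been extracted, of nearest-neighbor retrieval by cosine similarity; the function name `cosine_retrieve` and the toy random vectors are illustrative, not the authors' method.

```python
import numpy as np

def cosine_retrieve(query_vec, index_vecs, top_k=3):
    """Rank indexed image feature vectors by cosine similarity to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    idx = index_vecs / np.linalg.norm(index_vecs, axis=1, keepdims=True)
    sims = idx @ q                        # cosine similarity per indexed image
    order = np.argsort(-sims)[:top_k]     # best matches first
    return order, sims[order]

# Toy example: four "image" embeddings; in practice these would come from
# a CNN such as Inception V3.
rng = np.random.default_rng(0)
index = rng.normal(size=(4, 8))
query = index[2] + 0.01 * rng.normal(size=8)  # near-duplicate of image 2
ranks, scores = cosine_retrieve(query, index)
print(ranks[0])  # image 2 ranks first
```

A real system would replace the random vectors with embeddings from the penultimate layer of the trained network and typically use an approximate nearest-neighbor index at scale.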
Similar Articles
- T Sowmya Priyadharshini, Rengasamy Sathya, Influence of Different Extraction Solvents and the Micronutrient Composition on the Bioactive Properties and Antimicrobial Efficacy of Spirulina Maxima Extracts, The Scientific Temper: Vol. 16 No. 12 (2025): The Scientific Temper
- A. Sathya, M. S. Mythili, MOHCOA: Multi-objective hermit crab optimization algorithm for feature selection in sentiment analysis of Covid-19 Twitter datasets, The Scientific Temper: Vol. 15 No. 03 (2024): The Scientific Temper
- Mansi Harjivan Chauhan, Divyang D. Vyas, Advancements in sentiment analysis – A comprehensive review of recent techniques and challenges, The Scientific Temper: Vol. 16 No. Spl-1 (2025): The Scientific Temper
- C. Agilan, Lakshna Arun, Optimization-based clustering feature extraction approach for human emotion recognition, The Scientific Temper: Vol. 15 No. spl-1 (2024): The Scientific Temper
- A. Jabeen, AR Mohamed Shanavas, Bradley Terry Brownboost and Lemke flower pollinated resource efficient task scheduling in cloud computing, The Scientific Temper: Vol. 16 No. 05 (2025): The Scientific Temper
- Parul Yadav, Priyanka Suryavanshi, Storage study on compositional analysis of quinoa and ragi based snacks, The Scientific Temper: Vol. 15 No. 02 (2024): The Scientific Temper
- Gautam Patil, Unnati Soni, Social Inequalities and Health Disparities among Scheduled Castes and Scheduled Tribes: A Gender and Income Perspective in Maharashtra, The Scientific Temper: Vol. 17 No. 03 (2026): The Scientific Temper
- Raja Selvaraj, Manikandasaran S. Sundari, EAM: Enhanced authentication method to ensure the authenticity and integrity of the data in VM migration to the cloud environment, The Scientific Temper: Vol. 14 No. 01 (2023): The Scientific Temper
- Hariini Chandramohan, Sethu Gunasekaran, Comparative analysis on the photocatalytic activity of titania and silica nanoparticles using dye discoloration and contact angle test, The Scientific Temper: Vol. 16 No. 01 (2025): The Scientific Temper
- Krishna P. Kalyanathaya, Krishna Prasad K, A novel method for developing explainable machine learning framework using feature neutralization technique, The Scientific Temper: Vol. 15 No. 02 (2024): The Scientific Temper

