Retrieval-Based Inception V3-Net Algorithm and Invariant Data Classification using Enhanced Deep Belief Networks for Content-Based Image Retrieval
DOI: https://doi.org/10.58414/SCIENTIFICTEMPER.2024.15.spl.48

Keywords: Content-based image retrieval, Deep learning, Retrieval inception V3-NET algorithm, Enhanced deep belief networks

License
Copyright (c) 2024 The Scientific Temper
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Abstract

In the present scenario, content-based image retrieval (CBIR) plays an evolving role in extracting knowledge from images. It remains a dynamic area of research and has recently gained attention owing to the drastic growth in the volume of digital images. To retrieve images from massive datasets, experts utilize CBIR, which automatically indexes and retrieves images according to their content; emerging image-mining techniques build on CBIR systems. Based on the visual characteristics of the input image (object pattern, texture, color, shape, layout, and position), classification is applied and indexing is carried out. Deep learning approaches help resolve issues that arise during feature extraction. A method called RIV3-NET, which stands for Retrieval-Based Inception V3, was used to classify the features. Classifying image-invariant data using enhanced deep belief networks (EDBN) is necessary to reduce noise and improve displacement smoothness. The simulation outcomes demonstrate the improved image retrieval and parametric analysis.
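The retrieval step common to CBIR pipelines like the one described above can be sketched as a nearest-neighbor search over feature vectors. This is a minimal illustration, not the authors' RIV3-NET or EDBN: it assumes each image has already been reduced to a fixed-length feature vector (e.g., by a pretrained Inception V3 backbone) and ranks a gallery by cosine similarity to the query.

```python
import numpy as np

def cosine_retrieve(query_feat, gallery_feats, top_k=3):
    """Rank gallery images by cosine similarity of their feature vectors
    to the query feature vector. Returns (indices, similarity scores)."""
    q = query_feat / np.linalg.norm(query_feat)
    g = gallery_feats / np.linalg.norm(gallery_feats, axis=1, keepdims=True)
    sims = g @ q                       # cosine similarity per gallery image
    order = np.argsort(-sims)[:top_k]  # indices of the most similar images
    return order, sims[order]

# Toy example: 4 gallery "images" with 5-D features; the query is
# deliberately close to gallery item 2.
gallery = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0, 0.0],
    [0.5, 0.5, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.0, 1.0],
])
query = np.array([0.4, 0.4, 0.9, 0.0, 0.0])
idx, scores = cosine_retrieve(query, gallery)
print(idx[0])  # index of the nearest gallery image -> 2
```

In a real system the gallery features would be computed once offline and indexed (e.g., with an approximate nearest-neighbor structure) so that queries scale to massive datasets.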
Similar Articles
- Krishna P. Kalyanathaya, Krishna Prasad K, A novel method for developing explainable machine learning framework using feature neutralization technique, The Scientific Temper: Vol. 15 No. 02 (2024): The Scientific Temper
- S. Hemalatha, N. Vanjulavalli, K. Sujith, R. Surendiran, Chaotic-based optimization, based feature selection with shallow neural network technique for effective identification of intrusion detection, The Scientific Temper: Vol. 15 No. spl-1 (2024): The Scientific Temper
- Damtew Girma, Addisalem Mebratu, Fresew Belete, Response of potato (Solanum tuberosum L.) varieties to blended NPSB fertilizer rates on tuber yield and quality parameters in Gummer district, Southern Ethiopia, The Scientific Temper: Vol. 14 No. 03 (2023): The Scientific Temper
- Vaishali Yeole, Rushikesh Yeole, Pradheep Manisekaran, Analysis and prediction of stomach cancer using machine learning, The Scientific Temper: Vol. 16 No. Spl-1 (2025): The Scientific Temper
- Archana Dhamotharan, Kanthalakshmi Srinivasan, Analog Circuits Based Fault Diagnosis using ANN and SVM, The Scientific Temper: Vol. 14 No. 02 (2023): The Scientific Temper
- M. Merla Agnes Mary, S. Britto Ramesh Kumar, DAJO: A Robust Machine Learning–Based Framework for Preprocessing and Denoising Fetal ECG Signals, The Scientific Temper: Vol. 16 No. 09 (2025): The Scientific Temper
- K. Sreenivasulu, Sampath S, Arepalli Gopi, Deepak Kartikey, S. Bharathidasan, Neelam Labhade Kumar, Advancing device and network security for enhanced privacy, The Scientific Temper: Vol. 14 No. 04 (2023): The Scientific Temper
- Divya R., Vanathi P. T., Harikumar R., An optimized cardiac risk levels classifier based on GMM with min-max model from photoplethysmography signals, The Scientific Temper: Vol. 15 No. 03 (2024): The Scientific Temper
- Sowmiya M, Banu Rekha B, Malar E, Ensemble classifiers with hybrid feature selection approach for diagnosis of coronary artery disease, The Scientific Temper: Vol. 14 No. 03 (2023): The Scientific Temper
- Merla Agnes Mary, Britto Ramesh Kumar, Hybrid GAN with KNN - SMOTE Approach for Class-Imbalance in Non-Invasive Fetal ECG Monitoring, The Scientific Temper: Vol. 16 No. 09 (2025): The Scientific Temper

