Clean Balance-Ensemble CHD: A Balanced Ensemble Learning Framework for Accurate Coronary Heart Disease Prediction
DOI: https://doi.org/10.58414/SCIENTIFICTEMPER.2025.16.10.05
Keywords: Coronary Heart Disease (CHD) Prediction, Balanced Ensemble Learning, Preprocessing, Noise Reduction, Prediction Accuracy
License
Copyright (c) 2025 The Scientific Temper

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Abstract
Coronary Heart Disease (CHD) remains one of the leading causes of death worldwide, which necessitates early and reliable prediction methods to support timely medical intervention. Traditional machine learning approaches frequently struggle with noisy and imbalanced datasets, leading to biased predictions and reduced diagnostic reliability. To address these limitations, this paper proposes the CleanBalance-EnsembleCHD algorithm, which combines data cleaning, balancing, and ensemble learning to improve prediction accuracy. The goal is to reduce noise, handle class imbalance, and combine the strengths of multiple classifiers to detect CHD more effectively. For noise reduction, the methodology employs the Edited Nearest Neighbor (ENN) and Iterative Partitioning Filter (IPF) techniques; if class imbalance persists after cleaning, the Synthetic Minority Oversampling Technique (SMOTE) is applied. Five classifiers, namely Rotation Forest, LogitBoost, Multilayer Perceptron, Logistic Model Trees (LMT), and Random Forest, were trained, and the best-performing models were chosen for weighted soft-voting ensemble integration. The experimental evaluation on a CHD dataset with an initial class imbalance (majority/minority ratio: 1.038, Gini index: 0.4998) revealed significant improvements. After ENN and IPF cleaning, the dataset was reduced from 1011 to 853 balanced instances (class counts: {1.0 = 414, 0.0 = 439}). Individual classifiers performed well, with accuracies of 97.36% (Rotation Forest), 94.72% (LogitBoost), 96.04% (Multilayer Perceptron), 97.95% (LMT), and 98.53% (Random Forest). The top three models, Random Forest, LMT, and Rotation Forest, were then combined into an ensemble that outperformed all individual models on the test set, with an accuracy of 99.42%, an F1-score of 0.9939, and an MCC of 0.9884. These findings show that CleanBalance-EnsembleCHD provides superior predictive reliability through noise-resistant and balanced decision-making. Finally, the proposed framework offers a powerful and interpretable solution for early CHD detection, with the potential to help clinicians with risk assessment and medical decision support.
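The clean-balance-ensemble pipeline summarized above can be sketched with standard Python libraries. The code below is a minimal, illustrative sketch assuming scikit-learn and imbalanced-learn are available; it is not the authors' implementation. Rotation Forest, LogitBoost, LMT, and IPF have no off-the-shelf scikit-learn equivalents, so gradient boosting and an MLP stand in for some base learners, synthetic data stands in for the CHD dataset, and the imbalance threshold and voting weights are hypothetical placeholders.

```python
# Illustrative sketch of a clean-balance-ensemble workflow (ENN cleaning,
# SMOTE re-balancing, weighted soft-voting ensemble). Assumes scikit-learn
# and imbalanced-learn; stand-in classifiers and synthetic data are used.
import numpy as np
from sklearn.datasets import make_classification  # placeholder for the CHD dataset
from sklearn.ensemble import (RandomForestClassifier,
                              GradientBoostingClassifier, VotingClassifier)
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score, matthews_corrcoef
from imblearn.under_sampling import EditedNearestNeighbours
from imblearn.over_sampling import SMOTE

# Placeholder data standing in for the mildly imbalanced CHD dataset.
X, y = make_classification(n_samples=1011, n_features=13,
                           weights=[0.51, 0.49], random_state=42)

# Step 1: noise reduction with ENN (IPF is omitted here; it has no
# imbalanced-learn implementation).
X_clean, y_clean = EditedNearestNeighbours().fit_resample(X, y)

# Step 2: re-balance with SMOTE only if the cleaned data is still imbalanced.
counts = np.bincount(y_clean)
if counts.min() / counts.max() < 0.95:  # illustrative threshold, not from the paper
    X_clean, y_clean = SMOTE(random_state=42).fit_resample(X_clean, y_clean)

X_train, X_test, y_train, y_test = train_test_split(
    X_clean, y_clean, test_size=0.3, stratify=y_clean, random_state=42)

# Step 3: weighted soft-voting ensemble over the strongest base learners
# (weights would normally reflect validation performance).
ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=42)),
        ("gb", GradientBoostingClassifier(random_state=42)),
        ("mlp", MLPClassifier(max_iter=1000, random_state=42)),
    ],
    voting="soft",
    weights=[3, 2, 1],  # hypothetical weights favoring the stronger models
)
ensemble.fit(X_train, y_train)
y_pred = ensemble.predict(X_test)

print("Accuracy:", accuracy_score(y_test, y_pred))
print("F1-score:", f1_score(y_test, y_pred))
print("MCC:", matthews_corrcoef(y_test, y_pred))
```

In soft voting, each base learner's predicted class probabilities are averaged (here with per-model weights) before the final class is chosen, which is how the framework lets the strongest validated models dominate the ensemble decision.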
Similar Articles
- Pravin P. Adivarekar, Amarnath Prabhakaran A, Sukhwinder Sharma, Divya P, Muniyandy Elangovan, Ravi Rastogi, Automated machine learning and neural architecture optimization , The Scientific Temper: Vol. 14 No. 04 (2023): The Scientific Temper
- Deepika S, Jaisankar N, A novel approach to heart disease classification using echocardiogram videos with transfer learning architecture and MVCNN integration , The Scientific Temper: Vol. 15 No. 04 (2024): The Scientific Temper
- V. Babydeepa, K. Sindhu, Piecewise adaptive weighted smoothing-based multivariate rosenthal correlative target projection for lung and uterus cancer prediction with big data , The Scientific Temper: Vol. 15 No. 03 (2024): The Scientific Temper
- Engida Admassu, Classifying enset based on their disease tolerance using deep learning , The Scientific Temper: Vol. 14 No. 03 (2023): The Scientific Temper
- Y. Mohammed Iqbal, M. Mohamed Surputheen, S. Peerbasha, Swarm intelligence-driven HC2NN model for optimized COVID-19 detection using lung imaging , The Scientific Temper: Vol. 16 No. 03 (2025): The Scientific Temper
- P S Renjeni, B Senthilkumaran, Ramalingam Sugumar, L. Jaya Singh Dhas, Gaussian kernelized transformer learning model for brain tumor risk factor identification and disease diagnosis , The Scientific Temper: Vol. 16 No. 02 (2025): The Scientific Temper
- S. Tamil Fathima, K. Fathima Bibi, Early diagnosis of cardiac disease using Xgboost ensemble voting-based feature selection, based lightweight recurrent neural network approach , The Scientific Temper: Vol. 16 No. 06 (2025): The Scientific Temper
- A. Basheer Ahamed, M. Mohamed Surputheen, M. Rajakumar, Quantitative transfer learning- based students sports interest prediction using deep spectral multi-perceptron neural network , The Scientific Temper: Vol. 15 No. spl-1 (2024): The Scientific Temper
- Kinjal K. Patel, Kiran Amin, Predictive modeling of dropout in MOOCs using machine learning techniques , The Scientific Temper: Vol. 15 No. 02 (2024): The Scientific Temper
- Ashutosh Kumar, The Effect of Noise Exposure on Cognitive Performance and Brain Activity Patterns , The Scientific Temper: Vol. 12 No. 1&2 (2021): The Scientific Temper

