Stochastic kernelized discriminant extreme learning machine classifier for big data predictive analytics
DOI: https://doi.org/10.58414/SCIENTIFICTEMPER.2024.15.spl.46
Keywords: Big data, predictive analytics, stochastic kernelized quadratic discriminant analysis, qualitative indexed extreme learning classifier, Baroni–Urbani–Buser coefficient, Hardlimit activation function.
License
Copyright (c) 2024 The Scientific Temper

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Abstract
Predictive analytics has emerged as a powerful tool for improving crop yield in agriculture by leveraging big data. Soil is a vital factor in crop production, and its attributes considerably influence crop growth, nutrient availability, and overall yield. Predictive analytics combines big soil data with weather information to estimate crop yield. Using historical data on crop performance, various machine learning (ML) and deep learning (DL) models have been developed to forecast crop yield under different scenarios. However, achieving accurate prediction in the shortest possible time remains a major challenge. A novel model called the stochastic kernelized discriminant extreme learning machine classifier (SKDELMC) is introduced for crop yield forecasting by analyzing large volumes of soil and weather big data. The SKDELMC model comprises feature selection and classification stages that identify the most relevant features and classify the data into different categories. Data covering a range of soil parameters and weather features that influence crop growth are first gathered from the dataset. Because the collected data contain a large number of features, selecting the most relevant ones is essential for predictive analytics. Stochastic kernelized quadratic discriminant analysis is applied to identify the most informative features and thereby minimize prediction time complexity. Once the relevant features are chosen, the data are classified into different categories using the qualitative indexed extreme learning classifier, a feed-forward neural network with a direct solution that requires no iterative training. The network comprises an input layer, multiple hidden layers, and an output layer. The selected features are fed to the input layer. In the hidden layer, the Baroni–Urbani–Buser coefficient is applied as a qualitative index that measures the similarity between the testing and training data. The Hardlimit activation function then evaluates the similarity value and produces the classification result. Based on the classification results, accurate prediction outcomes are obtained at the output layer. Experimental evaluation is carried out using different quantitative parameters, namely prediction accuracy, sensitivity, false-positive rate, prediction time, and space complexity. The results show that the SKDELMC model improves prediction accuracy and reduces time consumption and space complexity compared with existing prediction techniques.
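To make the hidden-layer computation described in the abstract concrete, the sketch below is a minimal, hypothetical Python illustration of the Baroni–Urbani–Buser similarity coefficient followed by a hard-limit activation on binarized feature vectors. It is not the authors' implementation: the class prototypes, the 0.5 firing threshold, and the yield labels are assumptions made purely for demonstration.

```python
import numpy as np

def baroni_urbani_buser(x, y):
    """Baroni-Urbani-Buser similarity between two binary feature vectors.

    With a = features present in both vectors, b and c = features present
    in only one of them, and d = features absent from both, the coefficient
    is (sqrt(a*d) + a) / (sqrt(a*d) + a + b + c).
    """
    x, y = np.asarray(x, dtype=bool), np.asarray(y, dtype=bool)
    a = np.sum(x & y)
    b = np.sum(x & ~y)
    c = np.sum(~x & y)
    d = np.sum(~x & ~y)
    root_ad = np.sqrt(a * d)
    denom = root_ad + a + b + c
    return float((root_ad + a) / denom) if denom > 0 else 0.0

def hardlimit(value, threshold=0.5):
    """Hard-limit activation: outputs 1 when the similarity reaches the threshold."""
    return 1 if value >= threshold else 0

# Hypothetical usage: score a test sample against binarized class prototypes
# built from training data (feature values and labels here are made up).
train_prototypes = {
    "high_yield": np.array([1, 1, 0, 1, 0, 1]),
    "low_yield":  np.array([0, 1, 1, 0, 1, 0]),
}
test_sample = np.array([1, 1, 0, 1, 1, 1])

scores = {label: baroni_urbani_buser(test_sample, proto)
          for label, proto in train_prototypes.items()}
fired = {label: hardlimit(score) for label, score in scores.items()}
predicted = max(scores, key=scores.get) if any(fired.values()) else "uncertain"
print(scores, fired, predicted)
```

In this sketch the similarity score plays the role of the hidden-layer response and the hard-limit unit decides whether a class fires; the paper's full classifier additionally learns output weights in closed form, which is omitted here.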
Similar Articles
- Swetha Rajkumar, Jayaprasanth Devakumar, LSTM based data driven fault detection and isolation in small modular reactors, The Scientific Temper: Vol. 14 No. 01 (2023): The Scientific Temper
- M. Yamunadevi, P. Ponmuthuramalingam, A review and analysis of deep learning methods for stock market prediction with variety of indicators, The Scientific Temper: Vol. 16 No. 02 (2025): The Scientific Temper
- V. Selvi, T. S. Poornappriya, R. Balasubramani, Cloud computing research productivity and collaboration: A scientometric perspective, The Scientific Temper: Vol. 15 No. spl-1 (2024): The Scientific Temper
- Sharada C, T N Ravi, S Panneer Arokiara, Lancaster sliced regressive keyword extraction based semantic analytics on social media documents, The Scientific Temper: Vol. 16 No. 08 (2025): The Scientific Temper
- S. Tamil Fathima, K. Fathima Bibi, Early diagnosis of cardiac disease using Xgboost ensemble voting-based feature selection, based lightweight recurrent neural network approach, The Scientific Temper: Vol. 16 No. 06 (2025): The Scientific Temper
- Lakshmi Priya, Anil Vasoya, C. Boopathi, Muthukumar Marappan, Evaluating dynamics, security, and performance metrics for smart manufacturing, The Scientific Temper: Vol. 14 No. 04 (2023): The Scientific Temper
- Chetna Dhull, Asha, Impact of crop insurance and crop loans on agricultural growth in Haryana: A factor analysis approach, The Scientific Temper: Vol. 15 No. 01 (2024): The Scientific Temper
- V. Karthikeyan, C. Jayanthi, Advancements in image quality assessment: a comparative study of image processing and deep learning techniques, The Scientific Temper: Vol. 15 No. spl-1 (2024): The Scientific Temper
- Jyoti Vishwakarma, Sunil Kumar, Mapping Research on ESG Disclosure and Firm Performance: A Systematic Bibliometric Analysis, The Scientific Temper: Vol. 16 No. 09 (2025): The Scientific Temper
- Sreenath M.V. Reddy, D. Annapurna, Anand Narasimhamurthy, Influence node analysis based on neighborhood influence vote rank method in social network, The Scientific Temper: Vol. 14 No. 04 (2023): The Scientific Temper

