English language analysis using pattern recognition and machine learning
DOI:
https://doi.org/10.58414/SCIENTIFICTEMPER.2023.14.3.32
Keywords:
Computer text, Handwriting data, OCR, Pattern recognition, Statistical structure
License
Copyright (c) 2023 The Scientific Temper

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Abstract
Identifying and classifying patterns in complicated systems is difficult. This study uses optical character recognition (OCR) to digitize handwritten data. OCR segments and categorizes characters, applying online and offline methods depending on the input source. Classification results for Hindi and Bangla scripts complement the linguistic analysis. Handwriting recognition systems create editable digital documents from touchscreens, electronic pens, scanners, and photographs. Statistical, structural, neural-network, and syntactic methods improve both online and offline recognition. In "English language analysis using pattern recognition and machine learning," the accuracy of several approaches is compared: a deep convolutional neural network (DCNN) achieves 98% accuracy in recognizing subtle linguistic patterns; Naïve Bayes, a reliable language-analysis approach, reaches 96.2% accuracy; and table recognition (TR) algorithms retrieve structured information at 97%. The proposed method outperforms the others with 98.4% accuracy. This strategy could improve English language analysis using state-of-the-art pattern recognition and machine learning techniques.
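The abstract mentions Naïve Bayes as one of the compared classifiers for character recognition. The paper itself does not give implementation details, so the following is only a minimal, self-contained sketch of Bernoulli Naïve Bayes applied to tiny binary "glyph" grids; the 3x3 patterns, the `fit`/`predict` names, and the two-class setup are illustrative assumptions, not the authors' method.

```python
from collections import defaultdict
import math

# Hypothetical 3x3 binary glyphs standing in for segmented OCR characters
# (1 = ink pixel, 0 = background); two classes, "I" and "O", for illustration.
TRAIN = [
    ([0,1,0, 0,1,0, 0,1,0], "I"),
    ([0,1,0, 0,1,0, 0,1,0], "I"),
    ([1,1,1, 1,0,1, 1,1,1], "O"),
    ([1,1,1, 1,0,1, 1,1,1], "O"),
]

def fit(samples, smoothing=1.0):
    """Estimate per-class pixel-on probabilities with Laplace smoothing."""
    counts = defaultdict(lambda: [0] * 9)
    totals = defaultdict(int)
    for pixels, label in samples:
        totals[label] += 1
        for i, p in enumerate(pixels):
            counts[label][i] += p
    model = {}
    for label, n in totals.items():
        model[label] = (
            math.log(n / len(samples)),  # log prior
            [(c + smoothing) / (n + 2 * smoothing) for c in counts[label]],
        )
    return model

def predict(model, pixels):
    """Return the class with the highest Bernoulli naive-Bayes log-score."""
    best, best_score = None, float("-inf")
    for label, (log_prior, probs) in model.items():
        score = log_prior
        for p, theta in zip(pixels, probs):
            score += math.log(theta if p else 1 - theta)
        if score > best_score:
            best, best_score = label, score
    return best

model = fit(TRAIN)
print(predict(model, [0,1,0, 0,1,0, 0,1,1]))  # noisy "I" — prints I
```

Even with one corrupted pixel, the smoothed per-pixel likelihoods still favor the correct class, which is why Naïve Bayes remains a cheap, robust baseline for the character-classification stage described above.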
Similar Articles
- Hemamalini V., Victoria Priscilla C, Deep learning driven image steganalysis approach with the impact of dilation rate using DDS_SE-net on diverse datasets, The Scientific Temper: Vol. 15 No. 04 (2024): The Scientific Temper
- Ayesha Shakith, L. Arockiam, EMSMOTE: Ensemble multiclass synthetic minority oversampling technique to improve accuracy of multilingual sentiment analysis on imbalance data, The Scientific Temper: Vol. 15 No. 04 (2024): The Scientific Temper
- Ayesha Shakith, L. Arockiam, Enhancing classification accuracy on code-mixed and imbalanced data using an adaptive deep autoencoder and XGBoost, The Scientific Temper: Vol. 15 No. 03 (2024): The Scientific Temper
- D. Prabakar, Santhosh Kumar D.R., R.S. Kumar, Chitra M., Somasundaram K., S.D.P. Ragavendiran, Narayan K. Vyas, Task offloading and trajectory control techniques in unmanned aerial vehicles with Internet of Things – An exhaustive review, The Scientific Temper: Vol. 14 No. 04 (2023): The Scientific Temper
- Deepika S, Jaisankar N, A novel approach to heart disease classification using echocardiogram videos with transfer learning architecture and MVCNN integration, The Scientific Temper: Vol. 15 No. 04 (2024): The Scientific Temper
- Gomathi P, Deena Rose D, Sampath Kumar R, Sathya Priya M, Dinesh S, Ramarao M, Computer vision for unmanned aerial vehicles in agriculture: applications, challenges, and opportunities, The Scientific Temper: Vol. 14 No. 03 (2023): The Scientific Temper
- Priydarshi Shireesh, Tiwari Atul Kumar, Singh Prashant, Rai Kumud, Mishra Dev Brat, Comparative Water Quality Analysis in Beso River in District Jaunpur, Azamgarh and Ghazipur Uttar Pradesh, The Scientific Temper: Vol. 12 No. 1&2 (2021): The Scientific Temper
- Amanda Q. Okronipa, Jones Y. Nyame, Adoption of health information systems in emerging economies: Evidence from Ghana, The Scientific Temper: Vol. 15 No. 03 (2024): The Scientific Temper
- Sathya R., Balamurugan P, Classification of glaucoma in retinal fundus images using integrated YOLO-V8 and deep CNN, The Scientific Temper: Vol. 15 No. 03 (2024): The Scientific Temper
- S. Hemalatha, N. Vanjulavalli, K. Sujith, R. Surendiran, Chaotic-based optimization, based feature selection with shallow neural network technique for effective identification of intrusion detection, The Scientific Temper: Vol. 15 No. spl-1 (2024): The Scientific Temper