Enhancing Kannada text-to-speech and braille conversion with deep learning for the visually impaired
DOI: https://doi.org/10.58414/SCIENTIFICTEMPER.2025.16.spl-1.06

Keywords: Kannada Text-to-Braille, Speech Synthesis, Text-to-Speech (TTS), Support Vector Machine (SVM), Tacotron2, HiFi-GAN, WaveNet, Braille Conversion
License
Copyright (c) 2025 The Scientific Temper

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Abstract

Advancements in assistive technology have greatly improved accessibility for visually impaired individuals, enabling seamless interaction with textual content. This research introduces a novel approach that converts Kannada text into both speech and Braille, promoting multilingual accessibility. The proposed system incorporates a support vector machine (SVM) for Kannada text-to-Braille conversion and a deep learning-based text-to-speech (TTS) model for speech synthesis. The Braille translation module accurately maps Kannada characters to their respective Braille representations using SVM classifiers, ensuring precise conversion. Simultaneously, the speech synthesis component utilizes Tacotron2 for converting Kannada text into mel-spectrograms, followed by WaveNet/HiFi-GAN to produce high-quality Kannada speech. A dataset containing 2000 Kannada text-Braille pairs and corresponding text-speech samples is employed for training and evaluation. Experimental findings validate the effectiveness of the proposed system in accurately translating Braille while generating clear and natural Kannada speech. The integration of machine learning and deep learning techniques enhances efficiency, scalability, and usability, making this a reliable assistive tool for visually impaired Kannada-speaking individuals.
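The text-to-Braille stage described above can be sketched in miniature. The paper trains SVM classifiers over Kannada character features; in the sketch below a lookup table stands in for the trained classifier so the end-to-end conversion to Unicode Braille cells is visible. The specific dot patterns are illustrative, Bharati-Braille-style assignments chosen for this example, not a table taken from the paper, and should be checked against an authoritative chart.

```python
# Minimal sketch of Kannada text-to-Braille conversion. A six-dot Braille
# cell is encoded in the Unicode Braille Patterns block (U+2800-U+28FF),
# where bit i of the offset corresponds to dot i+1.

def dots_to_cell(dots):
    """Convert a set of dot numbers (1-6) to a Unicode Braille character."""
    return chr(0x2800 + sum(1 << (d - 1) for d in dots))

# Hypothetical label set the SVM would predict: character -> dot pattern.
# These pairings are illustrative only.
KANNADA_TO_DOTS = {
    "ಅ": {1},           # vowel a
    "ಕ": {1, 3},        # ka
    "ಗ": {1, 2, 4, 5},  # ga
    "ನ": {1, 3, 4, 5},  # na
}

def kannada_to_braille(text):
    """Map each Kannada character to its Braille cell; pass through unknowns."""
    out = []
    for ch in text:
        if ch in KANNADA_TO_DOTS:
            out.append(dots_to_cell(KANNADA_TO_DOTS[ch]))
        else:
            out.append(ch)  # characters outside the table are left unchanged
    return "".join(out)

print(kannada_to_braille("ಕನ"))  # -> ⠅⠝
```

In the full system, the dictionary lookup would be replaced by the SVM's prediction over character features, which lets the model generalize across conjuncts and vowel signs rather than relying on an exhaustive table.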
Similar Articles
- Megha Joshi, Bhaskar Pandya, Feminist Narratology and Gendered Reimagining of the Mahabharata in Kane’s work Karna’s Wife: The Outcast’s Queen , The Scientific Temper: Vol. 17 No. 01 (2026): The Scientific Temper
- Ritu Jain, Ritesh Tiwari, Shailendra Kumar, Ajay Kumar Shukla, Manish Kumar, Awadhesh Kumar Shukla, Description of Medicinal Herb, Perfume Ginger: Hedychium spicatum (Zingiberales: Zingiberaceae) , The Scientific Temper: Vol. 13 No. 02 (2022): The Scientific Temper
- Temesgen A. Asfaw, Deep learning hyperparameter’s impact on potato disease detection , The Scientific Temper: Vol. 14 No. 03 (2023): The Scientific Temper
- K. R. R. Prakash, Kishore Kunal, Designing information systems for business administration through human and computer interaction , The Scientific Temper: Vol. 15 No. 02 (2024): The Scientific Temper
- Chaitanya A. Kulkarni, Sayali Wadhokar, Om C. Wadhokar, Medhavi Joshi, Tushar Palekar, The intersection of cervical cancer treatment and physiotherapy: Current insights and future directions , The Scientific Temper: Vol. 15 No. 04 (2024): The Scientific Temper
- Jayalakshmi K., M. Prabakaran, Feature selection in HR analytics: A hybrid optimization approach with PSO and GSO , The Scientific Temper: Vol. 15 No. spl-1 (2024): The Scientific Temper
- Santima Uchukanokkul, Bijal Zaveri, Global student mobility from Southeast Asia and South Asia: Trends, challenges, and policy interventions , The Scientific Temper: Vol. 16 No. 02 (2025): The Scientific Temper
- Olivia C. Gold, Jayasimman Lawrence, Enhanced LSTM for heart disease prediction in IoT-enabled smart healthcare systems , The Scientific Temper: Vol. 15 No. 02 (2024): The Scientific Temper
- Esther Princess G, Navigating the challenges of moonlighting: A study of employee experiences in the FMCG sector in India , The Scientific Temper: Vol. 15 No. 04 (2024): The Scientific Temper
- Manan Pathak, Dishang Trivedi, Field-effect limits and design parameters for hybrid HVDC – HVAC transmission line corridors , The Scientific Temper: Vol. 16 No. Spl-1 (2025): The Scientific Temper

