AUTHOR=Bhatia Surbhi, Devi Ajantha, Alsuwailem Razan Ibrahim, Mashat Arwa
TITLE=Convolutional Neural Network Based Real Time Arabic Speech Recognition to Arabic Braille for Hearing and Visually Impaired
JOURNAL=Frontiers in Public Health
VOLUME=10
YEAR=2022
URL=https://www.frontiersin.org/journals/public-health/articles/10.3389/fpubh.2022.898355
DOI=10.3389/fpubh.2022.898355
ISSN=2296-2565
ABSTRACT=Natural Language Processing (NLP) is a theoretically motivated range of computational techniques for analyzing and modelling naturally occurring texts at one or more levels of linguistic analysis, with the goal of achieving human-like language processing for a range of tasks and applications. People who are both hearing and visually impaired cannot see at all or have very low vision, and cannot hear at all or have great difficulty hearing. Because both hearing and vision, the primary senses for receiving information, are impaired, obtaining information is difficult. People with combined hearing and visual impairment therefore face a far greater information deficit than people with a single disability such as blindness or deafness. They cannot communicate by voice in the way that blind people can, and, unlike people with only a hearing impairment, who can still obtain visual information such as characters, images, and scenery through their eyes, they cannot receive such visual information at all. Beyond this lack of information, visually and hearing-impaired people who are unable to communicate with the outside world may experience emotional loneliness, which can lead to stress and, in severe cases, serious mental illness. Overcoming this information handicap is therefore a key issue for visually and hearing-impaired people who want to live active, independent lives in society. The major objective of this research is to recognize real-time Arabic speech input and convert it to Arabic text using Convolutional Neural Network-based algorithms before storing it on an SD card. The Arabic text is then converted to Arabic Braille characters, which are used to actuate the corresponding Braille patterns on a solenoid-driven Braille display. Visually and hearing-impaired participants who were proficient in Braille reading deciphered the Braille patterns presented to their fingertips.
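The abstract names two technical stages: a CNN that recognizes real-time Arabic speech as Arabic text, and a text-to-Braille stage that drives a solenoid-based Braille display. The abstract gives no implementation details for either stage, so the sketches below are illustrative only and are not the authors' code. The first is a minimal Keras CNN classifier over MFCC-style spectrogram patches; the input shape, layer sizes, and number of classes are assumptions, not values taken from the paper.

```python
# Illustrative sketch only (the abstract does not specify the network architecture):
# a small Keras CNN that classifies fixed-size MFCC spectrogram patches of Arabic
# speech into word/character labels. Shapes and class count are assumed for demo.
import tensorflow as tf

NUM_CLASSES = 28            # assumed label count (e.g. Arabic letters); not from the paper
INPUT_SHAPE = (40, 100, 1)  # assumed: 40 MFCC coefficients x 100 frames x 1 channel

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=INPUT_SHAPE),
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

The second sketch covers the back end the abstract describes: mapping recognized Arabic text to 6-dot Braille cells and turning each cell into per-dot solenoid states for a refreshable display. The ARABIC_TO_DOTS table is a small placeholder subset whose dot assignments should be checked against the official Arabic Braille code; the actual solenoid interface depends on the display hardware used in the paper.

```python
# Illustrative sketch (not the authors' implementation) of the Arabic-text-to-Braille
# stage: recognized Arabic text -> 6-dot Braille cells -> solenoid pin states.
from typing import Dict, List, Tuple

# Placeholder mapping: Arabic letter -> raised dots (1..6 in a Braille cell).
# Assumed values for illustration; verify against the Arabic Braille standard.
ARABIC_TO_DOTS: Dict[str, Tuple[int, ...]] = {
    "ا": (1,),          # alif
    "ب": (1, 2),        # ba
    "ت": (2, 3, 4, 5),  # ta
    " ": (),            # space -> blank cell
}

def text_to_cells(text: str) -> List[Tuple[int, ...]]:
    """Convert recognized Arabic text into a list of Braille dot tuples."""
    return [ARABIC_TO_DOTS.get(ch, ()) for ch in text]

def cell_to_solenoid_states(dots: Tuple[int, ...]) -> List[bool]:
    """Return six booleans (dots 1..6): True = solenoid energized (pin raised)."""
    return [(d in dots) for d in range(1, 7)]

def cell_to_unicode(dots: Tuple[int, ...]) -> str:
    """Render a cell as a Unicode Braille character for debugging/preview."""
    offset = sum(1 << (d - 1) for d in dots)  # dots 1-6 map to the low bits of U+2800
    return chr(0x2800 + offset)

if __name__ == "__main__":
    recognized = "ا ب ت"          # stand-in for the CNN speech-recognition output
    for dots in text_to_cells(recognized):
        print(cell_to_unicode(dots), cell_to_solenoid_states(dots))
```

The cell_to_unicode helper is only a debugging aid: dots 1-6 correspond to the low six bits of the U+2800 Braille block, which makes it easy to preview output cells in a terminal before wiring up the solenoid driver.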