A Novel Design of Audio CAPTCHA for Visually Impaired Users


Mrim Mhsn Alnfiai


CAPTCHAs are widely used by web applications for the purposes of security and privacy. However, traditional text-based CAPTCHAs are challenging even for sighted users, much less for users with visual impairments. To address this issue, this paper proposes a new CAPTCHA mechanism called HearAct, a real-time audio-based CAPTCHA that enables easy access for users with visual impairments. The user listens to the sound of something (the “sound-maker”) and must identify what the sound-maker is. HearAct then states a letter and requires the user to analyze the sound-maker’s word and determine whether it contains that letter. If the word contains the letter, the user taps; if not, the user swipes. This paper presents our HearAct pilot study, conducted with thirteen blind users. The preliminary user study results suggest that this new form of CAPTCHA has considerable potential for both blind and sighted users. The results also show that HearAct can be solved in a shorter time than text-based CAPTCHAs, because HearAct allows users to solve the challenge using gestures instead of typing. Accordingly, participants preferred HearAct over audio-based CAPTCHAs. The study results also show a success rate of 82.05% for HearAct, compared to 43.58% for the audio CAPTCHA. Usability differed significantly as well: the System Usability Scale score for HearAct was 88.07, compared to 52.11 for the audio CAPTCHA. Using gestures to solve the CAPTCHA challenge was the most preferred feature of the HearAct solution. To increase the security of HearAct, it is necessary to increase the number of sounds in the CAPTCHA. There is also a need to extend the solution to cover a wider range of users by adding a corresponding image for each sound to meet deaf users’ needs; such users would then identify the spelling of the sound-maker’s word.
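The tap/swipe verification step described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; the function names, gesture strings, and structure are assumptions for clarity.

```python
# Illustrative sketch of HearAct's letter challenge (names are assumptions,
# not taken from the paper): after the user identifies the sound-maker, the
# app states a letter; tapping means "the word contains the letter", swiping
# means it does not.

def expected_gesture(word: str, letter: str) -> str:
    """Return the gesture that correctly answers the letter challenge."""
    return "tap" if letter.lower() in word.lower() else "swipe"

def verify_response(word: str, letter: str, gesture: str) -> bool:
    """Pass the CAPTCHA only if the user's gesture matches the expected one."""
    return gesture == expected_gesture(word, letter)

# Example: the sound-maker is a "cat".
print(verify_response("cat", "a", "tap"))    # True: "cat" contains "a"
print(verify_response("cat", "z", "swipe"))  # True: "cat" lacks "z"
print(verify_response("cat", "z", "tap"))    # False: wrong gesture
```

Replacing typed answers with a binary tap/swipe gesture is what allows the challenge to be solved quickly without a keyboard, which the study credits for HearAct's shorter solving times.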

Article Details

How to Cite
Alnfiai, M. M. (2020). A Novel Design of Audio CAPTCHA for Visually Impaired Users. International Journal of Communication Networks and Information Security (IJCNIS), 12(2). https://doi.org/10.17762/ijcnis.v12i2.4529
Research Articles
Author Biography

Mrim Mhsn Alnfiai, Taif University

Mrim Alnfiai is an Assistant Professor of Information Technology at Taif University in Saudi Arabia. Her research interests are in assistive technology, human-computer interaction, and accessibility. Mrim has published several papers at assistive technology, HCI, and accessibility venues, including ASSETS, ANT, FNC, CIST, JAIHC, and ICCA. Currently, her research focuses on designing accessible tools for visually impaired people, including people with no or low vision. She has conducted several studies and experiments to understand the abilities and behaviors of visually impaired users and to design accessible systems that help them interact easily with technology. From 2012 to 2014, she was a master's student in computer science at Dalhousie University, Canada. From 2014 until 2018, she was a PhD student in computer science at Dalhousie University, Canada, where she was advised by Srini Sampalli. Mrim received the Saudi Bureau Award in 2014. Since 2018, she has been an Assistant Professor of Information Technology at Taif University, where she also serves as vice president of the IT department. Contact information: Mrim Alnfiai, Ph.D., Taif University, m.alnofiee@tu.edu.sa, +966566811101