An Automated Framework for Virtually Accessing the Keyboard Using Eye Movements

  • Arati K. Deshpande, Dr. Arati D. K., Dr. Shailesh Kumar

Abstract

Eye movement can serve as a real-time input channel for human-computer interaction, which is especially important for persons with physical limitations. To increase the reliability, portability, and usefulness of eye-tracking technology in user-computer interaction, an eye control system that provides both mouse and keyboard functions is developed. The proposed system offers simple interaction using only the user's eyes, with a user flow designed to match natural behaviour. In addition, a magnifying module is provided to facilitate precise operation. In the experiment, two interactive tasks of varying difficulty were performed to compare the proposed eye control tool with an existing machine learning-based system. Measures based on the Technology Acceptance Model are used to assess the perceived efficacy of the system. The proposed solution proves highly effective in terms of usability and interface design. The experimental analysis compares the performance of the proposed approach with that of the existing one.
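The abstract does not specify how gaze is turned into key presses or how the magnifying module is triggered. The sketch below, in Python, illustrates one common approach under assumed details: a dwell-time selection policy over an on-screen keyboard plus a region the magnifier would enlarge. The DwellKeyboard and Key classes, the thresholds, and the magnify_region helper are illustrative assumptions, not the authors' implementation.

    from dataclasses import dataclass

    # Hypothetical parameters; the paper does not state exact thresholds.
    DWELL_TIME_S = 1.0        # gaze must rest on a key this long to "press" it
    MAGNIFY_RADIUS_PX = 60    # region around the gaze point the magnifier enlarges

    @dataclass
    class Key:
        label: str
        x: int
        y: int
        w: int
        h: int

        def contains(self, gx: int, gy: int) -> bool:
            return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

    class DwellKeyboard:
        """Dwell-time selection over an on-screen keyboard driven by gaze samples."""

        def __init__(self, keys):
            self.keys = keys
            self._current = None      # key currently under the gaze
            self._dwell_start = 0.0   # time when the gaze entered that key

        def update(self, gx: int, gy: int, now: float):
            """Feed one gaze sample; return the selected key label, if any."""
            hit = next((k for k in self.keys if k.contains(gx, gy)), None)
            if hit is not self._current:
                # Gaze moved to a different key (or off the keyboard): restart the dwell timer.
                self._current = hit
                self._dwell_start = now
                return None
            if hit is not None and now - self._dwell_start >= DWELL_TIME_S:
                self._dwell_start = now  # reset so the key is not repeated immediately
                return hit.label
            return None

        def magnify_region(self, gx: int, gy: int):
            """Bounding box around the gaze point that a magnifying module could enlarge."""
            r = MAGNIFY_RADIUS_PX
            return (gx - r, gy - r, 2 * r, 2 * r)

    # Example: two gaze samples resting on the "A" key for longer than DWELL_TIME_S.
    kb = DwellKeyboard([Key("A", 0, 0, 50, 50), Key("B", 60, 0, 50, 50)])
    kb.update(10, 10, now=0.0)
    print(kb.update(12, 11, now=1.2))  # -> "A"

In this sketch the dwell timer resets whenever the gaze leaves a key, which is one simple way to reduce accidental selections; the actual system's policy may differ.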

Published
2022-07-02
How to Cite
Arati K. Deshpande, Dr. Arati D. K., Dr. Shailesh Kumar. (2022). An Automated Framework for Virtually Accessing the Keyboard Using Eye Movements. Design Engineering, (1), 3677-3688. Retrieved from http://thedesignengineering.com/index.php/DE/article/view/9520