Eye Gaze-Controlled Interfaces for Limited Mobility Environment

JeevithaShree DV
Supervisor: Dr. Pradipta Biswas
I3D Lab, CPDM, IISc Bangalore

Eye movement-based Human Computer Interaction (HCI) has become a significant area of research in computer science. Gaze-based computing devices allow a computer to be controlled using only eye movements, which can be critical when a person is unable to use a mouse or keyboard. Advances in eye tracking have focused on interactive communication and control tools based on gaze tracking, including eye typing, computer control, and gaming. Users with Severe Speech and Motor Impairment (SSMI) often communicate through a communication chart using their eye gaze or limited hand movement, and caretakers interpret their communication intent. My research focuses on developing gaze-based accessible interfaces for users with limited mobility. In particular, I aim to design and develop intelligent gaze-controlled accessible systems for people with SSMI. I further explored the use of gaze-based interfaces for situationally impaired users.

As reported by the World Health Organization (WHO), there are currently more than 2 billion people with disability in the world, accounting for 37.5% of the world's population. In India, about 2.21% of the population (2.68 crores) lives with some kind of physical or mental disability. Disability is part of the human condition. Limited mobility impairment refers to the inability of a person to use one or more of his/her extremities, or a lack of strength to walk or lift objects. It can be of two types: situational impairment and physical impairment. Situational impairment is the inability to access computers as a result of one's situation; examples include noise, insufficient lighting, distractions, and tasks that occupy the eyes or hands. Physical impairment, on the other hand, is a disability that limits a person's physical capacity to move, coordinate actions, or perform physical activities. Users with SSMI do not have control over their body movements and require assistance in almost all activities such as eating, playing and communication. Yet, eye movements may be the only movements under their voluntary control. These users depend on electronic interfaces for communication, learning and many other daily activities. However, such interfaces are often designed assuming that the preference for and ease of use of different screen regions are the same for people with SSMI as for their able-bodied counterparts. Creating accessible electronic interfaces for these users is therefore one of the most pressing problems today.

My research investigates HCI issues in limited mobility environments in order to develop intelligent eye gaze-controlled interfaces. Chapter 2 presents a detailed literature review on accessible computing, eye tracking technology, and the use of eye tracking in accessible computing for limited mobility environments.

Figure 1. User Trial with Eye Gaze Controlled Game

Due to spasticity, users with SSMI could not use assistive technologies such as trackballs or push-button switches. Chapter 3 presents a set of studies to evaluate whether these users can undertake pointing and selection tasks with eye gaze as input, to understand their preferences for and ease of use of different screen regions, and to validate the designed interfaces for users with SSMI. Results showed that users with SSMI could fixate attention despite more uncontrolled saccadic gaze movements than their able-bodied counterparts. Both user groups could fixate attention on visual stimuli within 1.5 seconds in more than 80% of cases. Further, able-bodied users followed a left-to-right, top-to-bottom visual search strategy, whereas for users with SSMI the frequencies of total selections and first selections, together with their visual search patterns, indicated a nearest-neighbourhood strategy. They fixated attention on the middle and right side of the screen more often than on the extreme top and left. Additionally, a study involving a gaze-controlled word-construction game showed that these users undertake pointing and selection tasks statistically significantly faster [F(1,195) = 31.04, p < 0.01, η² = 0.14] when screen elements are placed at their preferred positions on screen. Finally, a gaze-controlled Alternative and Augmentative Communication aid with an intelligent user interface was proposed. The interface adapts the positions of screen elements based on frequency of use and ease of selection. Thus, the proposed interfaces can help users with SSMI communicate with their peers and take part in edutainment games without the help of an assistant.
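
To illustrate the adaptation idea, the Python sketch below pairs the most frequently selected screen elements with the regions a user finds easiest to select. It is a minimal sketch of frequency-based adaptation, not the exact algorithm from Chapter 3; the item names, usage counts and ease scores are hypothetical.

def adapt_layout(items, usage_counts, region_ease):
    """Assign the most frequently used items to the easiest screen regions.

    items        -- list of screen element identifiers
    usage_counts -- dict mapping item -> number of past selections
    region_ease  -- dict mapping region name -> ease-of-selection score
                    (higher = easier, e.g. middle/right of screen for users with SSMI)
    """
    # Rank items by how often the user selects them, most used first.
    ranked_items = sorted(items, key=lambda i: usage_counts.get(i, 0), reverse=True)
    # Rank regions by ease of selection, easiest first.
    ranked_regions = sorted(region_ease, key=region_ease.get, reverse=True)
    # Pair them up so that frequent items land in easy-to-reach regions.
    return dict(zip(ranked_items, ranked_regions))

# Hypothetical example: communication symbols on an AAC screen.
items = ["yes", "no", "water", "music"]
usage = {"water": 42, "yes": 30, "no": 12, "music": 5}
ease = {"middle": 0.9, "right": 0.8, "top": 0.4, "left": 0.3}
print(adapt_layout(items, usage, ease))
# -> {'water': 'middle', 'yes': 'right', 'no': 'top', 'music': 'left'}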

Users with SSMI often use a technique called eye pointing to communicate with the outside world with the help of an assistant. Chapter 4 aims to automate this communication through electronic means and presents optimized eye gaze-controlled English and Kannada virtual keyboards that follow both static and dynamic adaptation processes. The virtual keyboards automatically adapt the layout to reduce eye gaze movement distance using a genetic algorithm, and adapt the dwell time for selection using a Markov-model-based algorithm. Results showed that the proposed English and Kannada layouts reduced total eye gaze movement distance by 13% with the optimized arrangement and by 6.04% with the orderly arrangement. Further, with the proposed English virtual keyboard layout, the average task completion time for users with SSMI was 39.44 seconds in the adaptive condition and 29.52 seconds in the non-adaptive condition. The overall typing speed was 16.9 lpm (letters per minute) for able-bodied users and 6.6 lpm for users with SSMI, without any word completion or prediction features. A case study with an elderly participant with SSMI found a typing speed of 2.54 wpm (words per minute) and 13.99 lpm (letters per minute) after four months of practice. Thus, the proposed virtual keyboards help users with SSMI type without the intervention of an assistant. Additionally, the proposed English virtual keyboard layout, with a word completion feature, was successfully integrated into a web-based interactive COVID-19 visualization dashboard, making the dashboard accessible to users with SSMI.
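
As a rough illustration of the layout optimization idea only (not the thesis implementation), the Python sketch below uses a simple genetic algorithm with swap mutation to search for a letter-to-key assignment that minimizes expected gaze travel distance. The key geometry, bigram frequencies and algorithm parameters are all assumed for illustration; a full implementation would also use crossover and a real language corpus.

import random

def expected_distance(layout, positions, bigram_freq):
    """Expected gaze travel distance for a layout (dict: letter -> key index)."""
    total = 0.0
    for (a, b), freq in bigram_freq.items():
        (x1, y1), (x2, y2) = positions[layout[a]], positions[layout[b]]
        total += freq * ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    return total

def optimize_layout(letters, positions, bigram_freq, pop_size=50, generations=200):
    """positions: dict key index -> (x, y); bigram_freq: dict (letter_a, letter_b) -> frequency."""
    def fitness(perm):
        return expected_distance(dict(zip(letters, perm)), positions, bigram_freq)

    # Each individual is a permutation assigning letters to key indices.
    population = [random.sample(range(len(letters)), len(letters)) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)                    # lower distance = fitter
        survivors = population[: pop_size // 2]         # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            child = random.choice(survivors)[:]
            i, j = random.sample(range(len(letters)), 2)
            child[i], child[j] = child[j], child[i]     # swap mutation keeps a valid permutation
            children.append(child)
        population = survivors + children
    best = min(population, key=fitness)
    return dict(zip(letters, best))

A dynamic dwell-time adaptation along the lines of the Markov-model approach would, analogously, shorten the dwell threshold for keys that are highly probable given the letters typed so far, and lengthen it for unlikely keys.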

Figure 2. Eye Gaze Controlled Virtual Keyboard

Web accessibility is one of the most important aspects of building a website. It is important for web developers to ensure that their website is accessible according to WCAG standards for people with a wide range of abilities. Chapter 5 compares ten WCAG evaluation tools on two websites in terms of how easily web developers can comprehend and interpret their results. After evaluating both websites with the ten tools, results showed that both lacked adequate color contrast between background and foreground, lacked sign-language alternatives, and opened pop-ups without proper warnings, among other issues. Further, the comparative analysis showed that no single tool is ideal in all respects; however, in my study, Utilitia Validator by Utilitia Sp. z o.o. was found to be the most feasible tool. I then demonstrated how simulation of user interaction can capture usability and accessibility issues that are not detected by syntactic analysis of website content alone. Further, a Common User Profile format was presented to compare and contrast accessibility systems and services, and to simulate and personalize interaction for users with a wide range of abilities. By rectifying the issues found in the simulation study and incorporating the alternative suggestions derived from the Common User Profile format, developers can improve both websites and make them accessible to the widest possible audience. Finally, a case study on personalizing the content of a geo-tagging website using the Common User Profile format was presented.
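
Purely as an illustration of how a machine-readable profile can drive personalization, the hypothetical Python sketch below maps a few assumed profile fields (visual acuity, contrast sensitivity, minimum target size) to rendering settings; the field names and thresholds are assumptions, not the exact Common User Profile schema presented in Chapter 5.

# A hypothetical user profile entry.
user_profile = {
    "user_id": "u001",
    "visual": {"acuity": 0.4, "contrast_sensitivity": "low"},
    "motor": {"pointing_device": "eye_gaze", "min_target_size_px": 64},
    "auditory": {"hearing_loss": False},
}

def personalize(profile):
    """Derive simple rendering settings from a user profile."""
    settings = {"font_scale": 1.0, "high_contrast": False, "button_size_px": 32}
    if profile["visual"]["acuity"] < 0.5:
        settings["font_scale"] = 1.5                    # enlarge text for low visual acuity
    if profile["visual"]["contrast_sensitivity"] == "low":
        settings["high_contrast"] = True                # switch to a high-contrast theme
    # Gaze pointing needs larger targets than a mouse, so respect the minimum target size.
    settings["button_size_px"] = max(settings["button_size_px"],
                                     profile["motor"]["min_target_size_px"])
    return settings

print(personalize(user_profile))
# -> {'font_scale': 1.5, 'high_contrast': True, 'button_size_px': 64}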

Figure 3. Personalized Web Interfaces

Figure 4. Eye Gaze Analysis in Aircraft Cockpit

Besides developing accessible gaze-controlled interfaces for users with SSMI, Chapter 6 presents my research contributions beyond physically impaired users. I investigated the use of eye gaze-controlled interfaces for situationally impaired pilots in a military aviation environment. The study focused on developing intelligent, adaptive gaze-controlled Multi-Functional Displays (MFDs) with hotspot and automatic zooming techniques based on two algorithms. I designed bespoke hardware as a working prototype of a configurable head-up display, and designed and implemented new algorithms to control an on-screen cursor through an operator's eye gaze. Results showed that participants could undertake pointing and selection tasks in an average of 2.43 seconds on the head-down display and 1.9 seconds on the head-up display. Additionally, the gaze-controlled interface statistically significantly increased the speed of interaction for secondary mission control tasks compared to touchscreen (2.8 seconds) and joystick-based (2.67 seconds) target designation systems. The gaze-controlled system was then tested with military pilots inside an aircraft, both on the ground and in different phases of flight. Results showed that the pilots could undertake representative pointing and selection tasks in less than two seconds on average. I further used eye tracking not only to improve pointing and selection times but also as a means to estimate pilots' cognitive load while performing complex tasks inside an aircraft. Ocular parameters were investigated to automatically estimate pilots' cognitive load, and the rate of fixation was found to differ significantly across flying conditions. Results from the study can further be used for real-time estimation of pilots' cognitive load, for providing suitable warnings and alerts in the cockpit, and for training military pilots in cognitive load management during operational missions.

I then used eye tracking to understand the reading ability of children of different age groups with learning disabilities, and to understand how aging affects the user experience of mobile phones during complex tasks, estimating cognitive workload from ocular parameters. In the reading study, students with poor reading ability took significantly longer (2.1 minutes) to read text of the same length than other students (0.37 minutes). A significant correlation between the number of fixations, backward eye gaze movements and subjective rating scores indicated the students' reading abilities. The proposed analysis can subsequently be used to detect early signs of dyslexia among students. In the mobile phone study, I noted that aging has a large effect on performance irrespective of the complexity of the task. Participants aged between 50 and 60+ years had difficulty completing tasks and showed an increased fixation rate (1.78 seconds) and cognitive workload. They exhibited longer fixation durations (592.04 seconds) on tasks that involved copy-paste operations. The study also identified design implications and provided design recommendations for designers and manufacturers.
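
Several of these studies rely on fixation-derived metrics such as fixation rate and fixation duration. The Python sketch below shows one simplified, dispersion-threshold way of extracting fixations from raw gaze samples and computing a fixation rate; the thresholds are illustrative assumptions, and this is not the exact pipeline used in the simulator and in-flight studies.

def detect_fixations(samples, dispersion_px=30, min_duration_s=0.1):
    """samples: list of (timestamp_s, x_px, y_px) gaze points, in time order.
    Returns a list of (start_time, duration) tuples, one per detected fixation."""
    fixations, start = [], 0
    while start < len(samples):
        end = start
        # Grow the window while the gaze points stay within the dispersion threshold.
        while end + 1 < len(samples):
            window = samples[start:end + 2]
            xs, ys = [p[1] for p in window], [p[2] for p in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_px:
                break
            end += 1
        duration = samples[end][0] - samples[start][0]
        if duration >= min_duration_s:
            fixations.append((samples[start][0], duration))
        start = end + 1
    return fixations

def fixation_rate(samples):
    """Fixations per second over the recording, a candidate workload indicator."""
    if len(samples) < 2:
        return 0.0
    total_time = samples[-1][0] - samples[0][0]
    return len(detect_fixations(samples)) / total_time if total_time > 0 else 0.0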

Finally, Chapter 7 summarizes the dissertation and discusses directions for future work. I conclude that eye tracking serves as a feasible and comfortable input modality for most users with limited mobility. By analysing users' ocular parameters, adaptable and accessible interfaces can be developed around their needs, thereby addressing most of the HCI issues discussed in this dissertation. Additionally, eye tracking can be used to analyse visual behaviour, help diagnose perceptual and learning disabilities, and estimate users' cognitive processes, revealing information such as learning patterns and modes of social interaction across a wide range of users. I close with proposals for future post-doctoral work.


Publications

  1. JeevithaShree DV, P Jain, A Mukhopadhyay, KPS Saluja, P Biswas, Eye Gaze Controlled Adaptive Virtual Keyboard for Users with SSMI, Technology and Disability, IOS Press, 2021
  2. Jeevithashree, D. V., Saluja, K. S., & Biswas, P., A case study of developing gaze-controlled interface for users with severe speech and motor impairment, Technology and Disability, 2019
  3. JeevithaShree DV, KPS Saluja, LRD Murthy and P. Biswas, Operating different displays in military fast jets using eye gaze tracker, Journal of Aviation Technology and Engineering, 2018
  4. Jeevithashree DV, K. P. Saluja and P. Biswas, Gaze Controlled Interface for Limited Mobility Environment, ACM International Conference on Designing Interactive Systems (DIS), 2018
  5. Jeevithashree D. V., Pallavi Ray, P. Natarajan, and Pradipta Biswas, Automating the process of gaze tracking data using soft clustering in Intelligent Computing, Instrumentation and Control Technologies (ICICICT), 2017
  6. S. Kumar, JeevithaShree DV, and P. Biswas, Comparing ten WCAG Tools for accessibility evaluation of websites, Technology and Disability, IOS Press 2021
  7. M. D. Babu, JeevithaShree DV, G. Prabhakar, KPS Saluja and P. Biswas, Estimating Pilots’ Cognitive Load From Ocular Parameters Through Simulation and In-Flight Studies, Journal of Eye Movement Research, Bern Open Publishing, 2019
  8. P. Biswas, Jeevithashree DV, G. Prabhakar, Eye Gaze Controlled Remotely Piloted Vehicle, IEEE Drone Computing, 2017
  9. P. Biswas and Jeevithashree DV, Eye Gaze Controlled MFD for Military Aviation, ACM International Conference on Intelligent User Interfaces (IUI), 2018 (Acceptance rate 23%)
  10. KPS Saluja, Jeevithashree DV, S. Arjun and P. Biswas, Analyzing Eye Gaze Movement of Users with Different Reading Abilities due to Learning Disability, 3rd International Conference on Graphics and Signal Processing (ICGSP), 2019
  11. M D Babu, JeevithaShree DV, G. Prabhakar and P. Biswas, Using Eye Gaze Tracker to Automatically Estimate Pilots’ Cognitive Load, 50th International Symposium of the Society for Flight Test Engineer (SFTE) 2019
  12. A. Agarwal, JeevithaShree DV, K S Saluja, A Sahay, P Mounika, A Sahu, R Bhaumik, V K Rajendran and P. Biswas, Comparing Two Webcam based Eye Gaze Trackers for Users with Severe Speech and Motor Impairment, International Conference on Research into Design (ICoRD), 2019 [Distinguished Paper Award]
  13. A William, JeevithaShree DV, K S Saluja, A Mukhopadhyay, R Murugash, P Biswas, Eye Tracking to Understand Impact of Aging on Mobile Phone Applications, International Conference on Research into Design (ICoRD), 2021
  14. P Biswas, KPS Saluja, S Arjun, LRD Murthy, G Prabhakar, VK Sharma and Jeevitha Shree DV, COVID 19 Data Visualization through Automatic Phase Detection, ACM Digital Government: Research and Practice (DGOV) 2020
  15. MD Babu, P. Biswas, G. Prabhakar, DV JeevithaShree and LRD Murthy, Eye Gaze Tracking in Military Aviation, Indian Air Force AVIAMAT, 2018
  16. A. Mukhopadhaya, V. Yellheti, S. Arjun, P. Thomas, M. D. Babu, K. P. S. Saluja, JeevithaShree DV and P. Biswas, Evaluating Accuracy of Eye Gaze Controlled Interface in Military Aviation Environment, Indian Control Conference, 2019