Research

Our current research investigates new modalities of interaction, in particular eye-gaze-controlled interfaces. We are developing new target prediction and multimodal fusion algorithms to improve the accuracy and response times of gaze-controlled interfaces. We are investigating automotive and military aviation environments, aiming to improve human-machine interaction for secondary and mission control tasks. In parallel, we are working on detecting users' cognitive load by analysing their ocular parameters using low-cost, off-the-shelf eye-gaze trackers.
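
As one illustration of the kind of ocular parameter that can be analysed, pupil diameter is a widely used indicator of cognitive load. The sketch below is a minimal, hypothetical example (not our deployed algorithm) of turning pupil-diameter samples from a low-cost tracker into a simple load index; the function name and the sample values are illustrative assumptions.

```python
from statistics import mean, stdev

def cognitive_load_index(baseline_mm, task_mm):
    """Very rough cognitive-load indicator: standardized pupil dilation.

    baseline_mm -- pupil diameter samples (mm) recorded during a rest period
    task_mm     -- pupil diameter samples (mm) recorded during the task

    Returns the task-evoked change in mean pupil diameter expressed in
    baseline standard deviations; larger values suggest higher cognitive
    load. This is an illustrative heuristic only.
    """
    mu, sigma = mean(baseline_mm), stdev(baseline_mm)
    if sigma == 0:
        return 0.0
    return (mean(task_mm) - mu) / sigma

# Hypothetical samples from an off-the-shelf eye tracker
rest = [3.1, 3.0, 3.2, 3.1, 3.0, 3.1]
task = [3.6, 3.7, 3.5, 3.8, 3.6, 3.7]
print(f"load index: {cognitive_load_index(rest, task):.2f}")
```

Other ocular parameters, such as fixation durations and blink rates, can be treated in a similar fashion and combined into a richer estimate.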

In the automotive domain, we are working both on facilitating HMI while operating a dashboard and on developing technology to automatically detect driver distraction. Our patent on a Multimodal Gaze Controlled Projected Dashboard proposes a Head-Up Display that projects the dashboard onto the windscreen, so drivers need not take their eyes off the road while undertaking secondary tasks, and the display itself can be operated using the drivers' eye gaze and finger movements. For driver distraction detection, we are investigating different sensors to track head and eye-gaze movements and relating them to the complexity of driving tasks.
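
To make the distraction-detection idea concrete, one simple proxy is the fraction of recent gaze samples that fall away from the road scene. The sketch below assumes gaze samples have already been classified as on-road or off-road by the tracking pipeline; it is an illustrative heuristic, not our actual detection system, and the threshold is a placeholder.

```python
def eyes_off_road_ratio(gaze_on_road, window):
    """Fraction of the last `window` gaze samples that were off the road.

    gaze_on_road -- sequence of booleans, True when gaze was on the road
    window       -- number of most recent samples to consider

    A high ratio over a sliding window is one simple proxy for visual
    distraction; in a real system the threshold would be tuned against
    the complexity of the driving task.
    """
    recent = gaze_on_road[-window:]
    if not recent:
        return 0.0
    return sum(1 for on_road in recent if not on_road) / len(recent)

# Hypothetical 2-second window at 30 Hz
samples = [True] * 40 + [False] * 20
if eyes_off_road_ratio(samples, window=60) > 0.25:
    print("possible distraction: consider issuing an alert")
```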

Our research in the automotive domain transferred readily to military aviation, in particular to fast jets. We have an operational flight simulator in our lab, integrated with a Multimodal Gaze Controlled HUD. We invented new target prediction technologies for eye-gaze and finger-movement tracking systems to reduce pointing and selection times in the HUD. The HUD can also be integrated with brain-computer interfaces and voice-controlled systems, and we have integrated a gaze-controlled system with a high-end flight simulator at the National Aerospace Laboratory. We are currently optimizing input and output modalities for different mission control tasks.
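
As a rough illustration of endpoint-based target prediction, the sketch below extrapolates the most recent pointer velocity (from a gaze or finger tracker) and snaps to the nearest HUD target. The function, coordinates and lookahead value are hypothetical assumptions for illustration; real predictors typically use richer movement models.

```python
import math

def predict_target(trajectory, targets, lookahead=0.2):
    """Predict the intended on-screen target from a partial pointing movement.

    trajectory -- list of (t, x, y) samples from a gaze or finger tracker
    targets    -- list of (x, y) centres of selectable HUD items
    lookahead  -- seconds over which to extrapolate the current velocity

    A deliberately simple scheme: linearly extrapolate the last observed
    velocity and return the target nearest to the predicted endpoint.
    """
    (t0, x0, y0), (t1, x1, y1) = trajectory[-2], trajectory[-1]
    dt = max(t1 - t0, 1e-6)
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    px, py = x1 + vx * lookahead, y1 + vy * lookahead
    return min(targets, key=lambda c: math.hypot(c[0] - px, c[1] - py))

# Hypothetical samples: pointer moving towards the middle target
traj = [(0.00, 100, 200), (0.05, 130, 202), (0.10, 161, 205)]
buttons = [(120, 400), (420, 210), (600, 50)]
print("predicted target:", predict_target(traj, buttons))
```

Selecting the predicted target early, or expanding it on screen, is what shortens pointing and selection times compared with waiting for the pointer to arrive.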

We also actively pursue research on Assistive Technology and work with spastic children. Earlier, we investigated eye-gaze tracking data from people with visual impairment, audiogram data from people with hearing impairment and hand strength data from people with motor impairment, and developed the inclusive user modelling system to promote inclusive design of electronic interfaces. The system has found applications in a wide variety of contexts, including a digital TV framework for elderly users, a weather monitoring system, an Augmentative and Alternative Communication aid for spastic children, an electronic agricultural advisory system and a smartphone-based emergency warning system.

With seed funding from Bosch, we initiated research on smart manufacturing. Presently we are developing IoT modules for environment and posture tracking. We have developed a visualization module that displays both spatial and temporal data: by holding a semi-transparent screen in front of an electronic screen, a user can augment spatial information with temporal trends, or vice versa. The system also supports interaction through an interactive laser pointer and is integrated with an early warning system that automatically sends an email via Gmail if it detects any deviation in sensor readings in the context of environment and people tracking.
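
As a minimal sketch of how such an early warning could be wired up, the example below sends an email through Gmail's SMTP service when a reading deviates from its expected value by more than a tolerance. The addresses, credentials, sensor names and thresholds are placeholders, and the integration details of the actual system are not shown here.

```python
import smtplib
from email.message import EmailMessage

# Placeholder values -- in a real deployment these would come from
# configuration and a Gmail app password, not hard-coded strings.
SMTP_HOST, SMTP_PORT = "smtp.gmail.com", 465
SENDER, PASSWORD = "lab.sensor@gmail.com", "app-password"
RECIPIENT = "supervisor@example.com"

def check_and_alert(sensor_name, reading, expected, tolerance):
    """Send an e-mail alert when a sensor reading deviates beyond tolerance."""
    if abs(reading - expected) <= tolerance:
        return False
    msg = EmailMessage()
    msg["Subject"] = f"Early warning: {sensor_name} deviation"
    msg["From"], msg["To"] = SENDER, RECIPIENT
    msg.set_content(
        f"{sensor_name} reading {reading} deviates from the expected "
        f"value {expected} by more than {tolerance}."
    )
    with smtplib.SMTP_SSL(SMTP_HOST, SMTP_PORT) as server:
        server.login(SENDER, PASSWORD)
        server.send_message(msg)
    return True

# Example: temperature sensor expected at 25 degrees with a 3-degree tolerance
check_and_alert("shop-floor temperature", reading=31.5, expected=25.0, tolerance=3.0)
```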