(October 2016 onwards)
Aviation
1. Study Project on Advanced HMI Systems, Aeronautical Development Agency, MoD, India (INR 47.5L, 2019-2020)
2. Reducing pilots' cognitive load by facilitating human-machine interaction in military aviation environment, Aeronautical Research and Development Board, MoD, India (INR 47.3L, 2017-2019)
3. ERP PhD student sponsored by CSIR National Aerospace Laboratories
Our aviation related projects are developing an eye gaze controlled Multi-Function Display (MFD) and Head Mounted Display System (HMDS) that can simultaneously estimate pilots' cognitive workload from ocular parameters. As part of this research, we collected ocular parameters during air-to-ground attack dives and -1G to +5G constant G-load manoeuvres in combat aircraft. We also developed and demonstrated eye gaze controlled virtual and real flight simulators at Aero India 2019 and Defence Expo 2020.
Recording Ocular Parameters at +5G
o We reported that pilots can undertake representative pointing and selection tasks in less than 2 s on average, and that the accuracy of a COTS eye gaze tracking glass is within 5° of visual angle up to +3G, although it is less accurate at -1G and +5G. We used these results to develop a multimodal head and eye gaze movement based HMDS and evaluated it in a user study involving a standard ISO 9241 pointing task. The average pointing and selection times were significantly lower in the proposed system than in a traditional TDS.
o Ocular parameters such as the rate of fixation differ significantly across flying conditions, and correlate significantly with the rate of descent during air-to-ground dive training tasks, with the normal load factor (G) of the aircraft during constant-G level turn manoeuvres, and with the pilot's control inceptor movement and tracking error in simulation tasks.
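The rate-of-fixation metric above can be computed from raw gaze samples with a standard dispersion-threshold (I-DT) fixation detector. A minimal sketch, assuming (time, x, y) samples in seconds and degrees of visual angle; the 1° dispersion and 100 ms duration thresholds are illustrative assumptions, not the values used in our studies:

```python
# Minimal dispersion-threshold (I-DT) fixation detector.
# Thresholds are illustrative, not the values from the in-flight studies.

def fixation_rate(samples, max_dispersion=1.0, min_duration=0.1):
    """samples: time-sorted list of (t_seconds, x_deg, y_deg).
    Returns fixations per second over the whole recording."""
    fixations = 0
    i, n = 0, len(samples)
    while i < n:
        j = i
        xs, ys = [], []
        # Grow the window while gaze stays within the dispersion threshold.
        while j < n:
            xs.append(samples[j][1]); ys.append(samples[j][2])
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        if j > i and samples[j - 1][0] - samples[i][0] >= min_duration:
            fixations += 1
            i = j        # consume the fixation window
        else:
            i += 1       # slide past a non-fixation sample
    total_time = samples[-1][0] - samples[0][0]
    return fixations / total_time if total_time > 0 else 0.0
```

The rate can then be compared across flying conditions (dive, level turn, simulation) as in the findings above.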
1. M. D. Babu, JeevithaShree DV, G. Prabhakar, KPS Saluja and P. Biswas, Estimating Pilots' Cognitive Load From Ocular Parameters Through Simulation and In-Flight Studies, Journal of Eye Movement Research, Bern Open Publishing, 12 (3), 2019
2. JeevithaShree DV, KPS Saluja, LRD Murthy and P. Biswas, Operating different displays in military fast jets using eye gaze tracker, Journal of Aviation Technology and Engineering 8(1), Purdue University Press, 2018
3. P. Biswas and Jeevithashree DV, Eye Gaze Controlled MFD for Military Aviation, ACM International Conference on Intelligent User Interfaces (IUI) 2018 (Acceptance rate 23%)
4. Murthy LRD, Multimodal Interaction for Real and Virtual Environments, ACM International Conference on Intelligent User Interfaces (IUI) Doctoral Consortium 2020 (Acceptance rate 30%)
5. Jeevithashree DV, K. P. Saluja and P. Biswas, Gaze Controlled Interface for Limited Mobility Environment, ACM International Conference on Designing Interactive Systems (DIS) 2018
6. LRD Murthy, A. Mukhopadhyay, V Yelleti, S Arjun, P Thomas, MD Babu, KPS Saluja, JeevithaShree DV and P. Biswas, Evaluating Accuracy of Eye Gaze Controlled Interface in Military Aviation Environment, IEEE Aerospace 2020
7. J. Rajesh and P. Biswas, Eye-gaze tracker as input modality for Military Aviation Environment, IEEE International Conference on Intelligent Computing, Instrumentation and Control Technologies
8. M D Babu, JeevithaShree DV, G. Prabhakar and P. Biswas, Using Eye Gaze Tracker to Automatically Estimate Pilots' Cognitive Load, 50th International Symposium of the Society of Flight Test Engineers (SFTE), sponsored by Airbus
9. M D Babu, P. Biswas, G. Prabhakar, JeevithaShree DV and LRD Murthy, Eye Gaze Tracking in Military Aviation, Indian Air Force AVIAMAT 2018
Screenshot of Interactive HUD
Automotive
1. Intelligent Dashboard Control System (IDCS), Distraction and Cognitive Load Detection, Faurecia Services Groupe, France (INR 57.3L, 2018-20)
2. Ride Quality Monitoring of Drivers and Passengers of Autonomous Vehicles, Wipro as part of the WIRIN project (INR 10L, 2018-19)
3. Ashok Leyland Course on Automotive User Interface (INR 2L, 2018)
We have developed a finger movement and eye gaze controlled interactive Head Up Display (HUD) for cars that can also simultaneously estimate drivers' cognitive load using a new machine learning algorithm based on ocular parameters. We validated the interactive HUD through simulation and in-car studies. We have also extended this work to autonomous vehicles by integrating the cognitive load estimation module with object detection Convolutional Neural Network (CNN) models. As part of this study, we compared existing CNNs such as RetinaNet, YOLO and Mask R-CNN on an Indian road dataset with respect to the accuracy and latency of detecting different traffic participants.
o User studies show that the interactive HUD improves driving performance in terms of mean deviation from lane in an ISO 26022 lane changing task compared to a touchscreen system, and that participants can undertake ISO 9241 pointing tasks in less than 2 s on average inside a Toyota Etios car.
o Average velocities of saccadic intrusions increase with increasing cognitive load, and recording saccadic intrusions and eye blinks over a 6 s duration can predict drivers' instantaneous perception of developing road hazards.
o The accuracy of Mask R-CNN is significantly higher than that of YOLOv3 and RetinaNet when these models are evaluated on novel classes of objects such as animals, autorickshaws and caravans in the Indian road dataset. We also tested the latency of the CNN models and found that YOLOv3 has significantly lower latency than the other two models, and that RetinaNet is significantly faster than Mask R-CNN.
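The saccadic-intrusion velocity feature above can be sketched as a windowed computation over gaze samples: point-to-point angular velocities are computed, samples above a velocity threshold are treated as saccadic movement, and their mean over the last 6 s is the feature. The 30°/s threshold below is an illustrative assumption, not the value used in the published studies:

```python
# Sketch: mean velocity of high-velocity gaze movements in a 6 s window.
# The 30 deg/s velocity threshold is an illustrative assumption.
import math

def mean_saccadic_velocity(samples, window_s=6.0, threshold=30.0):
    """samples: time-sorted list of (t_seconds, x_deg, y_deg).
    Returns the mean velocity (deg/s) of above-threshold movements
    within the last window_s seconds, or 0.0 if there are none."""
    t_end = samples[-1][0]
    velocities = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        if t1 <= t0 or t1 < t_end - window_s:
            continue  # skip zero-duration pairs and samples outside the window
        v = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
        if v > threshold:
            velocities.append(v)
    return sum(velocities) / len(velocities) if velocities else 0.0
```

A cognitive load estimator would consume this value (together with blink counts over the same window) as an input feature.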
1. P. Biswas, Exploring the use of Eye Gaze Controlled Interfaces in Automotive Environments, Springer 2016, ISBN: 978-3-319-40708-1
2. G. Prabhakar, A. Ramakrishnan, L. R. D Murthy, V. K. Sharma, M. Madan, S. Deshmukh and P. Biswas, Interactive Gaze & Finger controlled HUD for Cars, Journal on Multimodal User Interfaces, Springer, 2019
3. P. Biswas and G. Prabhakar, Detecting Drivers' Cognitive Load from Saccadic Intrusion, Transportation Research Part F: Traffic Psychology and Behaviour 54, Elsevier, 2018
4. G. Prabhakar and P. Biswas, Eye Gaze Controlled Projected Display in Automotive and Military Aviation Environments. Multimodal Technologies and Interaction, 2(1), 2018
5. P. Biswas, S. Deshmukh, G. Prabhakar, M. Modiksha and A. Mukhopadhyay, System and Method for Monitoring Cognitive Load of a Driver of a Vehicle, Indian Patent Application No.: 201941052358
6. P. Biswas, S. Deshmukh, G. Prabhakar, M. Modiksha, V. K. Sharma and A. Ramakrishnan, A System for Man-Machine Interaction in Vehicles, Indian Patent Application No.: 201941009219, PCT International Application No. PCT/IB2020/050253
7. G. Prabhakar and P. Biswas, Wearable Laser Pointer for Automotive User Interface, Application No.: 201741035044, PCT International Application No. PCT/IB2018/057680
8. P. Biswas, Interactive Gaze Controlled Projected Display, Indian Patent Application No.: 201641037828
9. P. Biswas, G. Prabhakar, J. Rajesh, K. Pandit and A. Halder, Improving Eye Gaze Controlled Car Dashboard using Simulated Annealing, Proceedings of the 31st British Human Computer Interaction Conference (British HCI 17, selected to exhibit in the Interactions Gallery)
10. G. Prabhakar, N. Madhu and P. Biswas, Comparing Pupil Dilation, Head Movement, and EEG for Distraction Detection of Drivers, Proceedings of the 32nd British Human Computer Interaction Conference 2018 (British HCI 18)
11. P. Biswas, S. Twist and S. Godsill, Intelligent Finger Movement Controlled Interface for Automotive Environment, Proceedings of the 31st British Human Computer Interaction Conference 2017 (British HCI 17)
12. P. Biswas, Multimodal Gaze Controlled Dashboard, ACM AutomotiveUI 2016
13. G. Prabhakar and P. Biswas, Evaluation of Laser Pointer as a Pointing Device in Automotive, IEEE International Conference on Intelligent Computing, Instrumentation and Control Technologies [Best Paper Award]
14. G. Prabhakar, J. Rajesh and P. Biswas, Comparison of three Hand Movement Tracking Sensors as Cursor Controllers, IEEE International Conference on Control, Instrumentation, Communication & Computational Technologies (ICCICCT-2016) [Best Paper Award]
15. A. Ramakrishnan, G. Prabhakar, M. Madan, S. Deshmukh, P. Biswas, Eye Gaze Controlled HUD, International Conference on ICT for Sustainable Development 2019
Assistive Technology
1. DST DUO Fellowship with University of Barcelona, Spain (€6000, 2020)
2. Gaze Controlled Applications for eInclusion of users with SSMI, Microsoft Research India (INR 10L, 2019-2020)
3. Social Networking Application for Children with Severe Speech and Motor Impairment (SSMI), Microsoft AI for Accessibility Grant ($15000, 2019-2020)
4. Cognitive and Auditory Rehabilitation for Elderly, Department of Biotechnology (DBT, Govt of India) (INR 10L, 2018-2021)
5. IT4All, DST SERB Early Career Fellowship, (INR 14.5L, 2017-2020)
Our projects on Assistive Technology and inclusive design cover a wide range of abilities, including users with severe speech and motor impairment, students with learning disabilities and elderly users with hearing impairment. We deployed eye trackers, a set of eye gaze controlled assistive software and a robotic manipulator at the Spastic Society of India, Chennai, India. We compared ocular parameters of students with learning disabilities with those of their able-bodied counterparts, deployed an online multilingual cognitive and auditory test battery, and investigated website and smartphone usage patterns of elderly users. We work with the International Telecommunication Union to promote accessibility of audio-visual media by chairing an Inter-Rapporteur Group on Audio Visual Media Accessibility (RG AVA).
Users with severe speech and motor impairment operating a robotic manipulator through webcam based eye gaze tracker
o Users with SSMI select correct elements on screen in an eye gaze controlled interface statistically significantly faster when elements are organized at their preferred screen positions than in a random organization.
o Using our bespoke eye gaze controlled interface, able-bodied users can select one of nine screen regions in a median time of less than 2 s, and users with SSMI can do so in a median time of 4 s. Using the eye gaze controlled human-robot AR display, users with SSMI could undertake a representative pick-and-drop task in less than 15 s on average, and could reach a randomly designated target within 60 s using a COTS eye tracker and in 2 min on average using the webcam based eye gaze tracker.
o The total numbers of eye gaze fixations and regressions (backward gaze movements) were statistically significantly correlated with the reading ability of students.
1. Jeevithashree DV, KPS Saluja and P. Biswas, A case study of developing gaze controlled interface for users with severe speech and motor impairment, Technology and Disability 31(1), IOS Press, 2019
2. N. Blow and P. Biswas, A pointing facilitation system for motor-impaired users combining polynomial smoothing and time weighted gradient target prediction models, Assistive Technology 29 (1), ISSN: 1040-0435, Taylor and Francis, 2016
3. P. Biswas and P. O. Looms, From disability to capability: strategies for reconciling solutions for all, Encyclopaedia of Computer Science and Technology (2nd Edition), Taylor and Francis 2018
4. P. Biswas and P. Langdon, Applications of Intelligent and Multimodal Eye Gaze Controlled Interfaces, Wiley Handbook of Human-Computer Interaction 2018
5. P. Biswas and M. Springett, Inclusive User Modelling, Wiley Handbook of Human-Computer Interaction 2018
6. Jeevithashree DV, K. P. Saluja and P. Biswas, Gaze Controlled Interface for Limited Mobility Environment, ACM International Conference on Designing Interactive Systems (DIS) 2018
7. S. Arjun, Personalizing data visualization and interaction, ACM International conference on User Modeling and Adaptive Personalization (UMAP 2018) Doctoral Consortium
8. A. Sahay and P. Biswas, Webcam based eye gaze tracking using a landmark detector, ACM Compute 2017, Annual Conference of ACM India
9. KPS Saluja, Jeevithashree DV, S. Arjun and P. Biswas, Analyzing Eye Gaze Movement of Users with Different Reading Abilities due to Learning Disability, 3rd International Conference on Graphics and Signal Processing (ICGSP 2019) [Best Presentation Award]
10. K. Maheswary, S. Arjun and P. Biswas, Inclusive Crowd Sourcing based Disaster Reporting System, IEEE International Conference on Control, Instrumentation, Communication & Computational Technologies (ICCICCT-2016)
11. A. Agarwal, JeevithaShree DV, K S Saluja, A Sahay, P Mounika, A Sahu, R Bhaumik, V K Rajendran and P. Biswas, Comparing Two Webcam based Eye Gaze Trackers for Users with Severe Speech and Motor Impairment, International Conference on Research into Design (ICoRD 2019) [Distinguished Paper Award]
12. P. Mounika, D. Karia D., K. Sharma, and P. Biswas, Accessibility evaluation of three important Indian websites, International Conference on Research into Design (ICoRD 2019)
13. P. Biswas, A. Halder, K. Maheswary and S. Arjun, Inclusive Personalization of User Interfaces, 6th International Conference on Research into Design (ICoRD 2017)
Virtual Digital Twin with Sensor Dashboard of Smart Factory
Smart Manufacturing
2. Multimodal CoBot, Department of Heavy Industries, Common Engineering Facility Centre (CEFC) (INR 10L, 2019-2020)
3. Smart Manufacturing Test Bed, Robert Bosch Centre for Cyber-Physical Systems (INR 15L, 2016-2017)
5. NIFT Course on Artificial Intelligence and Machine Learning (INR 1.5L, 2019)
We are developing an interactive sensor dashboard and a multimodal CoBot to support the Industry 4.0 initiative. We are investigating different visualization techniques for sensor readings in 2D and 3D environments. The multimodal CoBot allows users to control a robotic manipulator through multiple modalities such as eye gaze, gesture and speech. Earlier, we also developed image processing algorithms for automated inspection of rubber sheets for a shoe-making MSME and of Printed Circuit Boards (PCBs), detecting the correct orientation and type of IC chips using a webcam and shape detection algorithms. We have also developed a prototype system using a Kinect that detects bad leaning posture by measuring the difference between the neck-spine and hip-spine joints in the Kinect skeleton figure. Following RULA guidelines, it raises an alert when the neck or torso is inclined more than 60°.
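The posture check can be sketched as a geometric test on skeleton joints: neck and torso inclinations are the angles between joint-to-joint vectors and the vertical axis, with an alert past 60° per RULA. The joint coordinates in the usage below are illustrative stand-ins for the Kinect skeleton stream, not actual sensor output:

```python
# Sketch of the Kinect-based posture check: inclination of the
# spine->neck (neck) and hip->spine (torso) segments from vertical.
import math

def inclination_deg(lower, upper):
    """Angle in degrees between the vector lower->upper (3-D joint
    positions) and the vertical y axis."""
    vx, vy, vz = (u - l for u, l in zip(upper, lower))
    norm = math.sqrt(vx * vx + vy * vy + vz * vz)
    cos_angle = max(-1.0, min(1.0, vy / norm))  # clamp for float safety
    return math.degrees(math.acos(cos_angle))

def bad_posture(hip, spine, neck, threshold_deg=60.0):
    """Alert when the neck or torso segment leans past the threshold,
    following the RULA-based rule described above."""
    return (inclination_deg(spine, neck) > threshold_deg or
            inclination_deg(hip, spine) > threshold_deg)
```

For example, an upright skeleton with hip, spine and neck stacked vertically passes the check, while a neck joint displaced far forward of the spine triggers the alert.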
o We evaluated shape detection algorithms under different lighting conditions (indoor, outdoor and controlled light source); the best shape detection results were obtained in the YCbCr color space using bounding box shape descriptors at 2500 lux under LED lighting.
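YCbCr separates luminance (Y) from the chrominance channels (Cb, Cr), which is one reason thresholding in it tends to be more robust to lighting changes than in RGB. For reference, a per-pixel full-range BT.601 RGB-to-YCbCr conversion (the actual inspection pipeline is not shown here):

```python
# Full-range BT.601 RGB -> YCbCr conversion for 8-bit inputs.
# Y carries luminance; Cb and Cr carry blue- and red-difference chroma.

def rgb_to_ycbcr(r, g, b):
    y  =       0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr
```

Thresholding on Cb/Cr (which stay near 128 for neutral greys regardless of brightness) then isolates coloured regions before applying the bounding box shape descriptors.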
1. A. Mukhopadhyay, I. Mukherjee and P. Biswas, Comparing Shape Descriptor Methods for Different Color Space and Lighting Conditions, Artificial Intelligence for Engineering Design, Analysis and Manufacturing, Cambridge University Press 33 (4) 2019
2. V. K. Sharma and P. Biswas, System for Operating Joystick, Indian Patent Application no. 201941044740
3. P. Biswas, S. Roy, G. Prabhakar, J. Rajesh, S. Arjun, M. Arora, B. Gurumoorthy and A. Chakrabarti, Interactive Sensor Visualization for Smart Manufacturing System, Proceedings of the 31st British Human Computer Interaction Conference 2017 (British HCI 17)
4. A. Mukhopadhyay, LRD Murthy, M. Arora, A. Chakrabarti, I. Mukherjee and P. Biswas, PCB Inspection in the context of Smart Manufacturing, International Conference on Research into Design (ICoRD 2019)
5. K. Puneeth, A. Sahay, G. Shreya, P. Biswas, M. Arora, A. Chakrabarti, Designing an Affordable System for Early Defect Detection Using Image Processing, Advances in Manufacturing Technology XXXIII: Proceedings of the 17th International Conference on Manufacturing Research, incorporating the 34th National Conference on Manufacturing Research, 2019