Automotive

  • Intelligent Dashboard Control System (IDCS), Distraction and Cognitive Load Detection, Faurecia Services Groupe, France (INR 57.3L, 2018-20)
  • Automatic Driver Activity Detection, PulseLabs.ai, USA (USD 4,000, 2021)
  • Ride Quality Monitoring of Drivers and Passengers of Autonomous Vehicles, Wipro, as part of the WIRIN project (INR 10L, 2018-19)
  • Ashok Leyland Course on Automotive User Interface (INR 2L, 2018)

We have developed a finger-movement and eye-gaze controlled interactive Head-Up Display (HUD) for cars that can also simultaneously estimate drivers' cognitive load using a new machine learning algorithm based on ocular parameters. We validated the interactive HUD through simulation and in-car studies. The work has also been extended to autonomous vehicles by integrating the cognitive load estimation module with object detection Convolutional Neural Network (CNN) models. As part of the study, we compared existing CNNs such as RetinaNet, YOLO and Mask R-CNN on an Indian road dataset with respect to accuracy and latency in detecting different traffic participants. Our recent study developed a prototype of an Augmented Reality (AR) and Mixed Reality (MR) assistance system and compared mixed reality with augmented reality displays in terms of response time to a take-over request.
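The idea of fusing several ocular parameters into a single cognitive-load estimate can be sketched as below. This is an illustrative sketch only: the feature set, normalisation ranges and weights are assumptions for demonstration, not the published DNN model.

```python
# Illustrative sketch of fusing ocular parameters into a cognitive-load
# score. Feature ranges and weights are placeholder assumptions, not the
# fitted values from the published model.
from dataclasses import dataclass

@dataclass
class OcularFeatures:
    pupil_dilation: float    # normalised 0..1
    fixation_rate: float     # fixations per second
    saccade_velocity: float  # mean saccadic-intrusion velocity, deg/s
    blink_rate: float        # blinks per minute

def cognitive_load_score(f: OcularFeatures) -> float:
    """Weighted fusion of ocular parameters into a 0..1 load score."""
    w = {"pupil": 0.4, "fixation": 0.2, "saccade": 0.3, "blink": 0.1}
    # Normalise each feature against an assumed plausible upper bound.
    sacc = min(f.saccade_velocity / 100.0, 1.0)
    fix = min(f.fixation_rate / 5.0, 1.0)
    blink = min(f.blink_rate / 30.0, 1.0)
    return (w["pupil"] * f.pupil_dilation + w["fixation"] * fix
            + w["saccade"] * sacc + w["blink"] * blink)

# Example: a relaxed driver vs. a highly loaded driver.
relaxed = OcularFeatures(0.2, 1.0, 20.0, 10.0)
loaded = OcularFeatures(0.8, 3.5, 80.0, 25.0)
```

In the actual system a trained model replaces the hand-set weights; the sketch only shows the shape of the feature-fusion step.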

Main Results

  • User studies show that the interactive HUD improves driving performance, in terms of mean deviation from the lane in an ISO 26022 lane changing task, compared to a touchscreen system, and that participants can undertake ISO 9241 pointing tasks in less than 2 s on average inside a Toyota Etios car.
  • The average velocity of saccadic intrusions increases with cognitive load, and recording saccadic intrusions and eye blinks over a 6 s window can predict drivers' instantaneous perception of developing road hazards.
  • A DNN-based machine learning model combining different ocular parameters is more accurate at predicting drivers' affective states than individual ocular parameters alone.
  • The accuracy of Mask R-CNN is significantly higher than that of YOLOv3 and RetinaNet when these models are evaluated on novel object classes such as animals, autorickshaws and caravans in an Indian road dataset. We also tested the latency of the CNN models and found that YOLOv3 is significantly faster than the other two models, and that RetinaNet is significantly faster than Mask R-CNN.
  • A machine learning model combining a CNN with an encoder-decoder system detects lanes in an Indian road dataset more accurately than state-of-the-art (SoTA) models.
  • A touchscreen-based video see-through Augmented Reality ADAS is faster and easier to operate than a Mixed Reality headset.
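The latency comparison among detectors can be reproduced with a simple per-frame timing harness like the sketch below, assuming each detector is wrapped as a callable. The stand-in "models" are placeholders to show the harness shape, not the actual CNNs from the study.

```python
# Minimal timing-harness sketch for comparing per-frame detector latency.
# The stand-in detectors below are placeholders, not the CNN models
# (YOLOv3, RetinaNet, Mask R-CNN) evaluated in the study.
import time
import statistics

def benchmark(detector, frames, warmup=2):
    """Return the median per-frame latency of `detector` in milliseconds."""
    for f in frames[:warmup]:  # warm-up runs, excluded from timing
        detector(f)
    times = []
    for f in frames:
        t0 = time.perf_counter()
        detector(f)
        times.append((time.perf_counter() - t0) * 1000.0)
    return statistics.median(times)

# Stand-in "models" that just sleep, to demonstrate the harness.
def fast_model(frame): time.sleep(0.001)
def slow_model(frame): time.sleep(0.005)

frames = [object()] * 10
fast_ms = benchmark(fast_model, frames)
slow_ms = benchmark(slow_model, frames)
```

Using the median rather than the mean makes the comparison robust to occasional scheduling spikes, which matters when latency differences between models are small.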

Video Demonstrations


Publications

  • A Mukhopadhyay, VK Sharma, PG Tatyarao, AK Shah, AMC Rao, PR Subin, P Biswas, A comparison study between XR interfaces for driver assistance in take over request, Transportation Engineering 11, 2023, 100159, Elsevier, ISSN 2666-691X, https://doi.org/10.1016/j.treng.2022.100159
  • A. Mukhopadhyay, LRD Murthy, I. Mukherjee and P. Biswas, A Hybrid Lane Detection Model for Wild Road Conditions, IEEE Transactions on Artificial Intelligence, 2022
  • LRD Murthy, G. Kumar, M. Modiksha, S. Deshmukh and P. Biswas, Efficient Interaction with Automotive Head Up Displays using Appearance-based Gaze Tracking, ACM Automotive User Interfaces (AutoUI) 2022
  • A. Mukhopadhyay, V. K. Sharma, P. T. Gaikwad, A. K. Sandula, and P. Biswas, Exploring the Use of XR Interfaces for Driver Assistance in Take Over Request, ACM Automotive UI (AutoUI) 2022
  • A. Mukhopadhyay, S. Agarwal, P. D. Zwick, and P. Biswas, To show or not to show: Redacting sensitive text from videos of electronic displays, ACM Automotive UI (AutoUI) 2022
  • LRD Murthy, A. Mukhopadhyay, K. Anand, S. Aggarwal and P. Biswas, PARKS-Gaze - An Appearance-based Gaze Estimation Dataset in Wilder Conditions, ACM International Conference on Intelligent User Interfaces (IUI 2022)
  • LRD Murthy, A. Mukhopadhyay and P. Biswas, Distraction Detection in Automotive Environment using Appearance-based Gaze Estimation, ACM International Conference on Intelligent User Interfaces (IUI 2022)
  • G. Prabhakar, P. Rajkhowa and P. Biswas, A Wearable Virtual Touch System for Car Dashboard, Journal on Multimodal User Interfaces, Springer, 2021
  • P. Biswas, Exploring the use of Eye Gaze Controlled Interfaces in Automotive Environments Springer 2016, ISBN: 978-3-319-40708-1
  • G. Prabhakar, A. Ramakrishnan, L. R. D Murthy, V. K. Sharma, M. Madan, S. Deshmukh and P. Biswas, Interactive Gaze & Finger controlled HUD for Cars, Journal on Multimodal User Interfaces, Springer, 2019
  • G. Prabhakar and P. Biswas, A Brief Survey on Interactive Automotive UI, Transportation Engineering, Elsevier 2021
  • G. Prabhakar, A. Mukhopadhyay, LRD Murthy, M. Madan, S. Deshmukh and P. Biswas, Cognitive load estimation using Ocular Parameters in Automotive, Transportation Engineering, Elsevier 2020
  • P. Biswas and G. Prabhakar, Detecting Drivers’ Cognitive Load from Saccadic Intrusion, Transportation Research Part F: Traffic Psychology and Behaviour 54, Elsevier, 2018
  • G. Prabhakar and P. Biswas, Eye Gaze Controlled Projected Display in Automotive and Military Aviation Environments. Multimodal Technologies and Interaction, 2(1), 2018
  • P. Biswas, S. Deshmukh, G. Prabhakar, M. Modiksha and A. Mukhopadhyay, System and Method for Monitoring Cognitive Load of a Driver of a Vehicle, Indian Patent Application No.: 201941052358, PCT Application No.: PCT/IB2020/062016
  • P. Biswas, S. Deshmukh, G. Prabhakar, M. Modiksha, V. K. Sharma and A. Ramakrishnan, A System for Man-Machine Interaction in Vehicles, Indian Patent Application No.: 201941009219, PCT International Application No.: PCT/IB2020/050253
  • G. Prabhakar and P. Biswas, Wearable Laser Pointer for Automotive User Interface, Indian Patent Application No.: 201741035044, PCT International Application No.: PCT/IB2018/057680
  • P. Biswas, Interactive Gaze Controlled Projected Display, Indian Patent Application No.: 201641037828
  • P. Biswas, G. Prabhakar, J. Rajesh, K. Pandit and A. Halder, Improving Eye Gaze Controlled Car Dashboard using Simulated Annealing, Proceedings of the 31st British Human Computer Interaction Conference (British HCI 17, selected to exhibit in the Interactions Gallery)
  • G. Prabhakar, N. Madhu and P. Biswas, Comparing Pupil Dilation, Head Movement, and EEG for Distraction Detection of Drivers, Proceedings of the 32nd British Human Computer Interaction Conference 2018 (British HCI 18)
  • P. Biswas, S. Twist and S. Godsill, Intelligent Finger Movement Controlled Interface for Automotive Environment, Proceedings of the 31st British Human Computer Interaction Conference 2017 (British HCI 17)
  • A. Mukhopadhyay, A. Agarwal, I. Mukherjee and P. Biswas, Performance Comparison of Different CNN models for Indian Road Dataset, 3rd International Conference on Graphics and Signal Processing (ICGSP 2019, Best Presentation Award)
  • A. Mukhopadhyay and P. Biswas, Comparing CNNs for Non-Conventional Traffic Participants, ACM Automotive UI 2019 [Recipient of Global Fellowship]
  • P. Biswas, Multimodal Gaze Controlled Dashboard, ACM AutomotiveUI 2016
  • G. Prabhakar and P. Biswas, Evaluation of Laser Pointer as a Pointing Device in Automotive, IEEE International Conference on Intelligent Computing, Instrumentation and Control Technologies [Best Paper Award]
  • G. Prabhakar, J. Rajesh and P. Biswas, Comparison of three Hand Movement Tracking Sensors as Cursor Controllers, IEEE International Conference on Control, Instrumentation, Communication & Computational Technologies (ICCICCT-2016) [Best Paper Award]
  • A. Ramakrishnan, G. Prabhakar, M. Madan, S. Deshmukh, P. Biswas, Eye Gaze Controlled HUD, International Conference on ICT for Sustainable Development 2019