CHI 2021: Adaptive Accessible AR/VR Systems

Augmented, virtual and mixed reality technologies offer new ways of interacting with digital media. However, such technologies are not well explored for people with different ranges of abilities beyond a few specific navigation and gaming applications. While new standardization activities are investigating accessibility issues with existing AR/VR systems, commercial systems remain confined to specialized hardware and software, limiting their widespread adoption among people with disabilities. This workshop takes a novel approach by exploring the application of user-model-based personalization for AR/VR systems. The workshop will be organized by experienced researchers in the fields of human-computer interaction, robotics control, assistive technology, and AR/VR systems, and will consist of peer-reviewed papers and hands-on demonstrations. Keynote speeches and demonstrations will cover the latest accessibility research at Microsoft, Google, Verizon and leading universities.


ACM TACCESS Special Issue

Authors of selected papers will be invited to submit extended versions of their papers to a special issue of the ACM Transactions on Accessible Computing (TACCESS).

All participants must register for the ACM CHI conference and add this workshop during registration with the code AccessW18.

Workshop Schedule

    Day 1, 7th May

  • UTC 1300, Opening Plenary, Introductions
  • UTC 1400, Spatial Audio, Dr Swami Manohar, Microsoft Research
  • UTC 1500, Paper id 3: Privacy Concerns of Bystanders with Augmented Reality Assistive Technologies
  • UTC 1530, Paper id 4: Exploring Augmented Reality Games in Accessible Learning: A Systematic Review
  • UTC 1600, The XR Access Initiative - Real Accessibility in Virtual Environments, Dr Larry Goldberg, Verizon

    Day 2, 8th May

  • UTC 1300, Paper id 1: VR360 Subtitles
  • UTC 1330, Paper id 6: Eye Gaze Controlled CoBot for SSMI
  • UTC 1400, The birth of augmented reality, Prof Peter Robinson, University of Cambridge
  • UTC 1430, Paper id 2: Impact of Visualisation in Virtual Environment
  • UTC 1500, Designing experiences for blind and partially sighted people, Dr Sonali Rai, RNIB, UK
  • UTC 1600, Personal, Accessible, Immersive – my choice matters, Mr Andy Quested, European Broadcasting Union

    Day 3, 9th May

  • UTC 1300, Augmented Reality for Assistive Robots, Ms Kavita Krishnaswamy, University of Maryland
  • UTC 1330, Paper id 7: Hands-on Hand-in-Hand
  • UTC 1400, The Augmented Employee, Dr Anasol Pena-Rios, British Telecom
  • UTC 1440, Gradually Including Potential Users
  • UTC 1500, Immersive Captions, developing with the community, Mr Christopher Patnoe, Google
  • UTC 1600, Group Discussion and Closing Remarks

All times are given in UTC. UTC 1300 corresponds to IST 1830, BST 1400, CEST 1500, EDT 0900, PDT 0600.

Workshop Objectives


  • Disseminate knowledge on accessibility services available for AR/VR/MR systems
  • Identify gaps in state-of-the-art of immersive media with respect to accessibility
  • Enhance usability of immersive media
  • Identify new applications for immersive media (e.g. Human Robot Interaction for rehabilitation, remote monitoring and control of semi-autonomous vehicles and so on)
  • Extend applications of user modelling and interface personalization to immersive media

What Attendees Get

  • Exposure to state-of-the-art accessibility systems for immersive media
  • Hands-on demonstrations of eye-gaze-controlled AR/VR systems
  • New applications of AR/VR systems

Research Questions

  • What value do immersive technologies add over traditional computer monitors for users with different ranges of abilities?
  • How can immersive media (AR/VR/MR) be adapted to enhance the interaction experience of users with different ranges of abilities?
  • How can a personalization framework based on user modelling automatically adjust interfaces and interaction for traditional and immersive media?
  • What are affordable ways to deliver immersive media to users with different ranges of abilities at large scale?
  • How can algorithms be developed for affordable new interaction technologies such as eye gaze trackers, haptic feedback systems and 3D audio?
  • What combination and configuration of output modalities are most effective for conveying information to users with different ranges of abilities in AR/VR applications?
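To make the user-modelling question above concrete, a minimal sketch of how a stored user profile might drive interface adaptation is shown below. All names, fields and thresholds here are hypothetical illustrations, not taken from any system presented at the workshop:

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    """Hypothetical user model: perceptual and motor parameters."""
    min_font_pt: float        # smallest comfortable font size, in points
    contrast_ratio: float     # preferred foreground/background contrast ratio
    pointing_error_mm: float  # typical pointing inaccuracy, in millimetres

def adapt_interface(profile: UserProfile) -> dict:
    """Derive rendering settings for an AR/VR scene from the user model."""
    return {
        # never render text smaller than the user's comfortable minimum
        "font_pt": max(12.0, profile.min_font_pt),
        # switch to a high-contrast theme above a WCAG-AAA-like threshold
        "high_contrast": profile.contrast_ratio >= 7.0,
        # enlarge interaction targets in proportion to pointing inaccuracy
        "target_size_mm": 10.0 + 2.0 * profile.pointing_error_mm,
    }

settings = adapt_interface(
    UserProfile(min_font_pt=18, contrast_ratio=7.5, pointing_error_mm=4)
)
```

The same profile could feed both a traditional desktop interface and an immersive one, which is the core appeal of personalization through user modelling: the adaptation logic, not the user, absorbs the differences between media.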


Important Dates

Paper Submission: 28th February 2021 (Extended)

Notification: 31st March 2021

Final Camera Ready Copy: 15th April 2021

Submission Link: Submit here

Papers should be between 5000 and 8000 words in length, in ACM format.