Lessons from Two Decades of Designing for the Visually Impaired
- Nilotpal Biswas
- Jun 18
- 2 min read

In today's blog, we will discuss a review paper: "Two Decades of Touchable and Walkable Virtual Reality for Blind and Visually Impaired People: A High-Level Taxonomy" by Julian Kreimeier and Timo Götzelmann. The paper systematically reviews two decades of research on audio-haptic virtual reality (VR) designed specifically for blind and visually impaired people (VIP), highlighting technological advancements, application scenarios, and usability evaluations.
The authors define VR for VIP as systems that enable spatial exploration via haptic (touch-based) and audio feedback, categorizing VR environments into three scales: small, medium, and large. Small-scale VR uses grounded force-feedback devices within arm's reach, allowing tactile exploration of objects such as charts and miniature maps. Medium-scale VR involves room-sized, physically walkable spaces, often employing wearable feedback devices such as virtual white canes. Large-scale VR extends beyond physical constraints, allowing users to navigate extensive environments through avatar-based controls using joysticks or other controllers.
A significant finding from the review is that grounded haptic feedback devices such as the Phantom have been studied extensively for their precise, intuitive tactile feedback, but they remain limited by their stationary setup and high cost. Conversely, wearable and non-grounded devices are gaining prominence due to their affordability and ease of integration with consumer technology like smartphones.
The review underscores the crucial role of audio feedback, typically implemented as spatial sound, in helping VIP mentally construct accurate spatial models of virtual environments. Additionally, the choice of interaction perspective, exocentric (exploring an object from the outside) versus egocentric (exploring from the user's own viewpoint), plays a pivotal role in usability and cognitive load.
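To make the egocentric perspective concrete, here is a minimal sketch of how a spatial-audio cue might be derived from a user's pose (the function name and the 2-D geometry are illustrative assumptions, not details from the paper): it converts a virtual object's world position into a distance and a relative azimuth that an audio engine could use for panning.

```python
import math

def egocentric_cue(user_pos, user_heading_deg, obj_pos):
    """Distance and relative azimuth of a virtual object in the
    user's egocentric frame (illustrative sketch, 2-D only).

    user_pos, obj_pos: (x, y) in metres; heading 0 deg = +y axis.
    Returns (distance, azimuth) with azimuth in (-180, 180],
    positive to the user's right.
    """
    dx = obj_pos[0] - user_pos[0]
    dy = obj_pos[1] - user_pos[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy))  # world-frame bearing
    # Re-express relative to where the user is facing, wrapped to (-180, 180]
    azimuth = (bearing - user_heading_deg + 180) % 360 - 180
    return distance, azimuth
```

A spatializer would then attenuate volume with `distance` and pan the sound by `azimuth`, so an object straight ahead yields azimuth 0 and one to the user's right yields +90.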
Applying these insights to designing VR shopping for visually impaired users, several takeaways emerge:
- Multimodal Feedback Integration: Combining audio and wearable haptic feedback provides users with a comprehensive spatial understanding, essential for virtual shopping scenarios.
- Egocentric Perspective for Immersion: Adopting an egocentric approach in VR shopping environments improves users' spatial orientation and their ability to navigate virtual stores independently, enhancing immersion and realism.
- Accessibility through Consumer Technology: Leveraging widely available consumer-grade VR hardware (such as smartphones and simple controllers) could dramatically reduce barriers, making VR shopping more accessible and cost-effective for visually impaired users.
- Dynamic Scaling and Customization: Allowing users to dynamically scale and customize the virtual shopping environment could accommodate various visual impairments and personal preferences, improving user autonomy and overall satisfaction.
- Realistic Simulations with Practical Constraints: Striking a balance between realism (detailed store layouts, realistic product presentations) and practical usability constraints (avoiding cognitive overload, simplifying navigation) is critical for user-friendly VR shopping experiences.
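The first and fourth takeaways can be sketched together: a small, hypothetical example of dispatching combined audio and haptic cues for a touched shelf item, modulated by per-user settings (all names, fields, and the attenuation rule are assumptions for illustration, not from the paper).

```python
from dataclasses import dataclass

@dataclass
class UserPrefs:
    # Hypothetical per-user accessibility settings
    world_scale: float = 1.0      # dynamic scaling of the virtual store
    haptic_strength: float = 0.8  # vibration intensity ceiling, 0..1
    audio_volume: float = 1.0     # speech/sound volume, 0..1

def on_product_touched(product_name, distance_m, prefs):
    """Return combined audio + haptic cue parameters for a touched
    product, instead of driving real hardware (illustrative only)."""
    scaled_distance = distance_m * prefs.world_scale
    # Nearer objects vibrate more strongly; clamp to [0, 1]
    intensity = max(0.0, min(1.0,
        prefs.haptic_strength * (1.0 - scaled_distance / 2.0)))
    announcement = f"{product_name}, {scaled_distance:.1f} metres ahead"
    return {"haptic": intensity,
            "speech": announcement,
            "volume": prefs.audio_volume}
```

The point of the sketch is the design choice: every interaction event fans out to both modalities, and the user's preferences (scale, intensity) are applied at dispatch time rather than baked into the scene.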
In conclusion, the insights derived from Kreimeier and Götzelmann’s comprehensive taxonomy can significantly inform the development of accessible, immersive, and effective VR shopping solutions, fostering greater independence and inclusivity for visually impaired individuals.
Reference
Kreimeier, J. and Götzelmann, T., 2020. Two decades of touchable and walkable virtual reality for blind and visually impaired people: A high-level taxonomy. Multimodal Technologies and Interaction, 4(4), p.79.