Locomotion That Makes Sense When You Can’t See the Aisles
- Nilotpal Biswas
- May 8
- 3 min read

A trip through a virtual supermarket is only empowering when every shopper can move with confidence. For blind and low-vision consumers, that means locomotion methods must speak or vibrate to them just as clearly as printed aisle markers do for sighted visitors. A recent software-level review [1] covering 330 of the most-used commercial VR titles provides timely evidence on which travel techniques work, how often they appear in real apps, and where accessibility still falls short. Translating those findings to a retail setting shows that the goal is less about inventing exotic mechanics and more about adapting mainstream ones with smart audio and haptic scaffolding.
Teleportation, still present in almost a third of today’s applications, minimises motion sickness because movement is instantaneous, yet its default targeting arc is a purely visual cue. The review notes that point-and-teleport only becomes usable for visually impaired people (VIP) when developers pair it with non-visual feedback such as a spoken confirmation of the chosen landing spot and a brief controller buzz that “locks” the destination. In a VR shop this could sound like “Bakery, four metres ahead,” while a high-frequency vibration warns if a shelf blocks the landing.
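To make the idea concrete, here is a minimal sketch of that confirmation layer. The speak() and vibrate() helpers are hypothetical stand-ins for whatever text-to-speech and controller-haptics calls a given engine exposes, and the frequencies and durations are illustrative, not values from the review:

```python
# Minimal sketch of an audio/haptic teleport confirmation layer.
# speak() and vibrate() are hypothetical placeholders for the engine's
# text-to-speech and controller-haptics APIs.

from dataclasses import dataclass


@dataclass
class TeleportTarget:
    label: str          # e.g. "Bakery"
    distance_m: float   # distance from the shopper to the landing spot
    blocked: bool       # True if a shelf or obstacle overlaps the landing area


def speak(text: str) -> None:
    """Placeholder for the engine's text-to-speech output."""
    print(f"[TTS] {text}")


def vibrate(hand: str, frequency_hz: int, duration_ms: int) -> None:
    """Placeholder for a controller haptic pulse."""
    print(f"[HAPTIC] {hand}: {frequency_hz} Hz for {duration_ms} ms")


def confirm_teleport(target: TeleportTarget) -> bool:
    """Announce the candidate landing spot and 'lock' it with a buzz."""
    if target.blocked:
        # High-frequency warning buzz: the arc ends inside a shelf.
        vibrate("right", frequency_hz=320, duration_ms=150)
        speak(f"{target.label} blocked, aim elsewhere")
        return False
    # Spoken confirmation plus a short low buzz that locks the destination.
    speak(f"{target.label}, {target.distance_m:.0f} metres ahead")
    vibrate("right", frequency_hz=120, duration_ms=60)
    return True


if __name__ == "__main__":
    confirm_teleport(TeleportTarget("Bakery", 4.0, blocked=False))
    confirm_teleport(TeleportTarget("Dairy", 2.5, blocked=True))
```

The point is simply that the accessible layer sits on top of the existing teleport arc; nothing about the underlying mechanic has to change.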
Continuous joystick sliding, usually delivered alongside 30-degree snap turns, is now supported in more than half of commercial titles. For VIP shoppers, the comfort problem is less visual overload and more vection-related nausea, so the widespread use of dynamic field-of-view reduction (the “tunnelling” vignette) is welcome. Spoken orientation cues every few seconds, such as “facing dairy section, south”, help maintain a mental map while the vignette does its work. Each snap-turn step should trigger a brief tick in the left or right controller so users can count rotations, a practice drawn from the review’s observation that haptic cues are still under-explored but highly valued by VIP testers.
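A rough sketch of that comfort scaffolding might look like the following. The set_vignette(), speak() and tick() stubs stand in for the engine’s post-processing, text-to-speech and haptic APIs, and the five-second cue cadence is an assumption chosen for illustration:

```python
# Sketch of comfort scaffolding around joystick sliding: a speed-linked
# vignette, a periodic spoken heading, and a per-step snap-turn tick.
# All output helpers below are hypothetical stubs.

import time


def speak(text): print(f"[TTS] {text}")
def tick(hand): print(f"[HAPTIC] {hand} tick")
def set_vignette(strength): print(f"[FOV] vignette strength {strength:.2f}")


class SlidingComfort:
    HEADING_INTERVAL_S = 5.0   # assumed cadence for spoken orientation cues

    def __init__(self):
        self._last_cue = 0.0

    def on_slide(self, speed_m_s, heading_label):
        """Called every frame while the joystick is pushed forward."""
        # Tunnelling vignette: the faster the glide, the narrower the view.
        set_vignette(min(1.0, speed_m_s / 3.0))
        # A spoken heading every few seconds keeps the mental map intact.
        now = time.monotonic()
        if now - self._last_cue > self.HEADING_INTERVAL_S:
            speak(f"facing {heading_label}")
            self._last_cue = now

    def on_snap_turn(self, direction):
        """30-degree snap turn; a tick in the matching hand lets users count turns."""
        tick(direction)   # "left" or "right"


if __name__ == "__main__":
    comfort = SlidingComfort()
    comfort.on_slide(speed_m_s=1.5, heading_label="dairy section, south")
    comfort.on_snap_turn("right")
```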
True room-scale walking, taking real steps inside a tracked play area, is the single most common technique in industry apps, appearing in three-quarters of them. Living-room space is too small for an entire mall, but room-scale walking works well for short moves, such as stepping closer to a fresh-produce display to inspect it. Designers can place gentle audio “beacons” at aisle entrances and a low rumble near the boundary so a shopper knows when to stop.
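A small sketch of that boundary cue, with play_beacon() and rumble() as stand-ins for the engine’s spatial-audio and haptic calls and a half-metre warning margin chosen purely for illustration:

```python
# Sketch of boundary-aware room-scale cues: a looping audio beacon at an
# aisle entrance plus a low rumble that ramps up near the play-area edge.
# play_beacon() and rumble() are hypothetical stubs.

def play_beacon(label, position): print(f"[AUDIO] beacon '{label}' at {position}")
def rumble(strength): print(f"[HAPTIC] boundary rumble {strength:.2f}")


WARN_DISTANCE_M = 0.5   # assumed distance at which the boundary warning begins


def update_boundary_rumble(head_xz, half_extents_xz):
    """Low rumble that grows as the headset nears the edge of the play area."""
    margin = min(half_extents_xz[0] - abs(head_xz[0]),
                 half_extents_xz[1] - abs(head_xz[1]))
    if margin < WARN_DISTANCE_M:
        rumble(1.0 - max(margin, 0.0) / WARN_DISTANCE_M)


if __name__ == "__main__":
    # A looping beacon marks the produce aisle entrance; the rumble fires
    # once the shopper drifts within half a metre of the tracked boundary.
    play_beacon("Fresh produce", (1.0, 0.0))
    update_boundary_rumble(head_xz=(1.3, 0.2), half_extents_xz=(1.5, 1.5))
```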
One of the fastest-growing techniques in commercial VR is grab-based locomotion, such as climbing or hand-walking along virtual rails, which jumped from 3% of titles in 2016 to over 30% since 2020. In a supermarket this method can double as an invisible handrail: when the user’s virtual palm touches a shelf edge, the controller pulses; pulling the hand glides the avatar laterally with centimetre-level precision, perfect for comparing similar jars. Because the motions demand fine arm tracking, accessible variants should offer adjustable grab strength and an option to switch back to joystick sliding.
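One way to sketch that handrail behaviour, with a user-adjustable grab threshold and a hypothetical pulse() helper for the contact buzz:

```python
# Sketch of the "invisible handrail": a pulse when the virtual palm meets a
# shelf edge, then lateral gliding proportional to hand pull while the grip
# is held. The tracking and haptic calls are hypothetical stubs.

def pulse(hand): print(f"[HAPTIC] {hand} rail-contact pulse")


class RailGlide:
    """Glide the avatar along a shelf rail while the grip is held."""

    def __init__(self, grab_threshold=0.6):
        # Lower thresholds help users with limited grip strength.
        self.grab_threshold = grab_threshold
        self.grabbing = False
        self.anchor_x = 0.0

    def on_hand_touch_rail(self, hand):
        pulse(hand)   # confirm the rail is under the virtual palm

    def update(self, grip_value, hand_x, avatar_x):
        """Return the avatar's new lateral position along the shelf."""
        if grip_value >= self.grab_threshold and not self.grabbing:
            self.grabbing, self.anchor_x = True, hand_x
        elif grip_value < self.grab_threshold:
            self.grabbing = False
        if self.grabbing:
            # While grabbing, the avatar moves opposite to the hand,
            # as if pulling yourself along a fixed rail.
            avatar_x -= hand_x - self.anchor_x
            self.anchor_x = hand_x
        return avatar_x


if __name__ == "__main__":
    rail = RailGlide(grab_threshold=0.4)   # easier grip for this user
    rail.on_hand_touch_rail("right")
    x = rail.update(grip_value=0.8, hand_x=0.00, avatar_x=2.0)   # grab starts
    x = rail.update(grip_value=0.8, hand_x=-0.05, avatar_x=x)    # pull left 5 cm
    print(f"[MOVE] avatar now at x = {x:.2f} m")                 # moved 5 cm right
```

A joystick fallback then simply bypasses this class whenever the user prefers sliding.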
Across all categories the review highlights a surprising omission: not one audited title relied on voice commands for primary locomotion. Yet spoken shortcuts such as “Go to checkout” or “Turn right ninety degrees” would remove the need for complex gestures and free one hand for a real-world cane or support-dog leash. Adding voice as a parallel channel, rather than a replacement, can therefore unlock the existing teleport, sliding and grab routines for a broader audience.
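A minimal sketch of such a parallel channel could route a few recognised phrases onto routines the app already has. The recogniser itself and the teleport_to()/snap_turn() hooks are assumed here, and spelled-out numbers like “ninety” would need an extra word-to-number step:

```python
# Sketch of voice as a parallel locomotion channel: recognised phrases are
# routed to existing teleport and snap-turn routines. teleport_to() and
# snap_turn() are hypothetical hooks into the app's current locomotion code.

import re


def teleport_to(landmark): print(f"[MOVE] teleport to {landmark}")
def snap_turn(degrees): print(f"[MOVE] snap turn {degrees} degrees")


# Each entry maps a spoken pattern onto a locomotion routine the app already has.
COMMANDS = [
    (re.compile(r"go to (?P<place>.+)", re.I),
     lambda m: teleport_to(m["place"])),
    (re.compile(r"turn (left|right) (\d+) degrees", re.I),
     lambda m: snap_turn(int(m[2]) * (-1 if m[1].lower() == "left" else 1))),
]


def handle_utterance(text):
    """Route a recognised phrase to an existing locomotion routine."""
    for pattern, action in COMMANDS:
        match = pattern.fullmatch(text.strip())
        if match:
            action(match)
            return True
    return False   # unrecognised: say nothing, controllers keep working


if __name__ == "__main__":
    handle_utterance("Go to checkout")
    handle_utterance("turn right 90 degrees")
```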
Let us string these ideas into a single shopping journey. A blind user enters the mall and says, “Find bread aisle.” The system casts an audio-annotated teleport, vibrates when a clear landing is found and places the shopper beside the bakery counter. They slide the joystick forward; tunnelling keeps nausea at bay while footstep-synchronised haptics and periodic verbal headings anchor orientation. Reaching out, their controller buzzes at the shelf rail; they hand-walk along loaves until a product name is spoken aloud, then squeeze the grip to add it to the basket. “Checkout,” they say, and a snap-turn-assisted teleport aligns them with the payment kiosk.
What the data ultimately show is that inclusive VR retail does not require brand-new locomotion paradigms. Teleport, joystick steering, room-scale steps and grab-and-pull already dominate the industry; the task for designers is to lace these familiar methods with speech, vibration and optional comfort tweaks so that every shopper—sighted or not—can move with the same certainty. By investing in those details now, developers ensure that the next virtual mall feels as orderly to a blind customer as the best-designed physical one.
Reference
[1] Anderton, C., Creed, C., Sarcar, S. and Theil, A., 2025. From teleportation to climbing: A review of locomotion techniques in the most used commercial virtual reality applications. International Journal of Human–Computer Interaction, 41(4), pp. 1946-1966.