
A Tool to Untangle the Web for Visually Impaired Users

  • Writer: Nilotpal Biswas
  • Sep 19
  • 3 min read

Online shopping has become a daily convenience for many, but for blind individuals who rely on screen readers, it can often be a frustrating maze. E-commerce sites, in an effort to provide comprehensive information, spread product details like descriptions, technical specifications, and user reviews across different sections and even multiple pages. While a sighted user can quickly scan a page to find what they need, a screen reader user must navigate this content linearly, a process that is often tedious, time-consuming, and overwhelming. This complex navigation, involving constant toggling back and forth between product lists and detail pages, creates a significant usability gap, leading to interaction fatigue and the risk of missing out on the best deals.


To address this challenge, researchers from Old Dominion University and Michigan State University have developed InstaFetch, a browser extension designed to revolutionize the online shopping experience for blind users [1]. InstaFetch works by consolidating all essential product information into a single, accessible, pop-up interface. When a user browsing a list of products wants to know more about an item, they can activate an "Options" button added by the extension. This opens an overlay that neatly presents the product's description, specifications, and reviews in separate tabs, eliminating the need to navigate away from the main product list. This design streamlines the process, allowing users to quickly access and compare information without getting lost in the website's structure.
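For readers curious how such an overlay might be wired up, the sketch below shows the general pattern in TypeScript: a content script injects an "Options" button into each product card and opens an ARIA-labelled dialog with tabbed panels. This is a minimal illustration of the idea, not InstaFetch's actual source; the ".product-card" selector and the placeholder tab content are assumptions.

```typescript
// Minimal content-script sketch of the overlay pattern described above.
// Illustrative only, not InstaFetch's real code; ".product-card" is an
// assumed selector and the tab content is placeholder text.

interface ProductDetails {
  description: string;
  specifications: string;
  reviews: string;
}

function buildOverlay(details: ProductDetails): HTMLElement {
  const overlay = document.createElement("div");
  overlay.setAttribute("role", "dialog");
  overlay.setAttribute("aria-modal", "true");
  overlay.setAttribute("aria-label", "Product details");

  const tabList = document.createElement("div");
  tabList.setAttribute("role", "tablist");
  overlay.appendChild(tabList);

  const entries: Array<[string, string]> = [
    ["Description", details.description],
    ["Specifications", details.specifications],
    ["Reviews", details.reviews],
  ];

  entries.forEach(([label, text], i) => {
    const tab = document.createElement("button");
    tab.setAttribute("role", "tab");
    tab.textContent = label;

    const panel = document.createElement("div");
    panel.setAttribute("role", "tabpanel");
    panel.textContent = text;
    panel.hidden = i !== 0; // only the first panel is visible initially

    // Clicking a tab reveals its panel and hides the others.
    tab.addEventListener("click", () => {
      overlay
        .querySelectorAll<HTMLElement>("[role=tabpanel]")
        .forEach((p) => (p.hidden = p !== panel));
    });

    tabList.appendChild(tab);
    overlay.appendChild(panel);
  });

  return overlay;
}

// Add an "Options" button to every product card on the listing page,
// so details can be read without leaving the list.
document.querySelectorAll<HTMLElement>(".product-card").forEach((card) => {
  const button = document.createElement("button");
  button.textContent = "Options";
  button.addEventListener("click", () => {
    const overlay = buildOverlay({
      description: "(fetched description)",
      specifications: "(fetched specifications)",
      reviews: "(fetched reviews)",
    });
    document.body.appendChild(overlay);
    overlay.querySelector<HTMLElement>("[role=tab]")?.focus();
  });
  card.appendChild(button);
});
```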


Another key feature of InstaFetch is its ability to handle natural language queries. Instead of manually sifting through text, a user can simply ask a direct question, such as "How many HDMI ports does this TV have?" or "What do reviewers say about the sound quality?" The system uses a language model to understand the query, retrieve the relevant details from the product page, and provide an instant, coherent answer. This direct query function empowers users to pinpoint the exact information they need, dramatically reducing the effort and time spent on manual navigation.
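A rough sketch of that query flow appears below. The endpoint and response shape are placeholders, since the paper does not describe a public API; the point is simply that the page's own description, specification, and review text is bundled into a prompt that grounds the model's answer.

```typescript
// Hedged sketch of the natural-language query flow. LLM_API_URL and the
// { answer } response shape are placeholders, not a real service.
const LLM_API_URL = "https://example.com/answer";

async function answerQuery(question: string, pageText: string): Promise<string> {
  // Ground the model in the product page's own text to keep answers factual.
  const prompt =
    "Answer the shopper's question using only the product information below.\n\n" +
    `Product information:\n${pageText}\n\nQuestion: ${question}`;

  const res = await fetch(LLM_API_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  const { answer } = await res.json();
  return answer;
}

// Collect the same sections the overlay presents; these selectors are assumed.
const pageText = Array.from(
  document.querySelectorAll("#description, #specifications, #reviews")
)
  .map((el) => el.textContent ?? "")
  .join("\n");

answerQuery("How many HDMI ports does this TV have?", pageText).then((answer) =>
  console.log(answer)
);
```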


The effectiveness of InstaFetch was confirmed in a user study with 14 blind participants. When using InstaFetch, participants spent significantly less time and pressed far fewer keyboard shortcuts per item than with a standard screen reader alone, or even with SaIL, a state-of-the-art accessibility tool. This efficiency allowed them to explore nearly twice as many products in the same timeframe, reducing interaction fatigue and increasing their confidence in finding "advantageous deals." Participants appreciated the centralized information and the convenience of the query feature, which made the entire shopping process less frustrating and more enjoyable.


The core takeaways from the InstaFetch study offer valuable insights for designing more inclusive digital experiences, particularly for a VR shopping application aimed at users with partial visual impairment. The study's central finding, that reducing navigational complexity and consolidating information significantly lessens cognitive load and fatigue, is directly applicable to a 3D virtual environment. A VR shopping app should avoid forcing users to physically "travel" long virtual distances or navigate complex menu systems to compare products. Instead, inspired by InstaFetch, it could feature an accessible "information hub" or an intelligent virtual assistant that can be summoned on command. This would allow a user to instantly pull up detailed product specifications, reviews, and comparisons in a clear, high-contrast interface without losing their place in the virtual store. By minimizing the need for cumbersome back-and-forth navigation and enabling direct, query-based information retrieval, a VR experience can become less disorienting and more empowering, ensuring that users can focus on their shopping goals rather than struggling with the interface.
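To make that idea concrete, here is one way such a summonable hub could be organized. The rendering helpers are hypothetical stubs, since the right VR toolkit will vary by project, and nothing in this sketch comes from the paper itself.

```typescript
// Framework-agnostic sketch of a summonable VR "information hub".
// showPanel/hidePanel are hypothetical stubs for whatever UI layer
// (WebXR, a game-engine bridge, etc.) the application actually uses.

interface ProductInfo {
  description: string;
  specifications: string;
  reviews: string;
}

function showPanel(opts: {
  anchor: "head-locked";        // panel follows the user's view
  highContrast: boolean;        // legibility for partial visual impairment
  tabs: Record<string, string>;
}): void {
  console.log("Panel shown with tabs:", Object.keys(opts.tabs).join(", "));
}

function hidePanel(): void {
  console.log("Panel hidden");
}

class InfoHub {
  private visible = false;

  constructor(
    private fetchInfo: (productId: string) => Promise<ProductInfo>
  ) {}

  // Summon details in front of the user: no virtual travel required,
  // and the user never loses their place in the store.
  async summon(productId: string): Promise<void> {
    const info = await this.fetchInfo(productId);
    showPanel({
      anchor: "head-locked",
      highContrast: true,
      tabs: {
        Description: info.description,
        Specifications: info.specifications,
        Reviews: info.reviews,
      },
    });
    this.visible = true;
  }

  dismiss(): void {
    if (!this.visible) return;
    hidePanel();
    this.visible = false;
  }
}
```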


Reference

  1. Prakash, Y., Nayak, A.K., Sunkara, M., Jayarathna, S., Lee, H.N. and Ashok, V., 2024. All in one place: Ensuring usable access to online shopping items for blind users. Proceedings of the ACM on Human-Computer Interaction, 8(EICS), pp.1-25.

