This is our first blog post, in which we share what we have learned from a few research papers we studied and analyzed. We will publish detailed findings soon.
Shopping is inherently a multisensory experience, with most individuals relying heavily on visual cues to navigate aisles, appreciate colorful displays, and select items based on visual appeal. For Visually Impaired People (VIP), however, shopping is primarily a tactile and auditory journey, requiring touch, sound, and often the assistance of others. What is typically a routine activity, such as grocery shopping, becomes a complex challenge for VIP, involving meticulous preparation, navigation, and organization, supported by assistive technologies.
Understanding these unique experiences and challenges in offline shopping is critical as we transition toward crafting inclusive virtual shopping environments. This deeper comprehension enables designers and developers to ensure that emerging virtual retail spaces are accessible, enjoyable, and empowering for all users, irrespective of their abilities.
Behaviors in Offline Shopping
Preparation The shopping journey for VIP begins well before stepping out of the house. VIP use tools like braille labels, audio recorders, or smartphones to create detailed shopping lists, ensuring efficiency during their shopping trips [2][3].
Navigation In stores, VIP depend on assistive tools like smartphones with screen readers, barcode scanners, and sometimes photo recognition apps. Assistance from others is often required for navigation and product identification. Memory and step counting are frequently used to locate specific aisles or shelves [2][3].
Post-Shopping Tasks VIP carefully label items, organize groceries, and manage inventories at home using tactile markers or audio tags [3]. This systematic approach ensures accessibility and reduces dependence on others. In some cases, VIP rely on family members or caregivers to help unpack and arrange their purchased items. Additionally, they memorize the specific locations where they store items at home to maintain accessibility and reduce dependence in their daily lives.
Challenges Faced
Navigational Difficulties
Limited tactile markers or auditory cues in stores make independent navigation challenging [2].
Without proper infrastructure, VIP may need assistance to find their way, making a simple task more complex [3].
Once inside, they often have trouble locating specific shelves or aisles [2].
Relying on memory or manual step-counting can be imprecise and ineffective, especially when store layouts change [3].
Identifying Items
VIP struggle to identify desired items due to the lack of visual information, especially when distinguishing between similar-looking products [1][3].
Reading product labels for details like price or expiry is difficult without visual aids, particularly when labels or packaging aren’t tactilely distinguishable [3].
Existing tools, like barcode scanners, are time-consuming and inefficient, requiring individual scans for each item [1][3].
Photo recognition tools also struggle to identify multiple products simultaneously, further slowing down the process [1].
Communication delays or latency while using these tools hinder the ability to receive timely feedback, leading to frustration [1].
Picking Items from Shelves
VIP face difficulty picking the right item from shelves that are tightly packed, especially when products are similar in shape or size [1].
The dense arrangement of products increases the likelihood of selecting the wrong product [1].
Without proper guidance, VIP are forced to rely on trial and error, making the process time-consuming and inefficient [1].
Dependency and Cost Barriers
Assistance from others can compromise privacy and independence [3].
High costs and steep learning curves of wearable devices discourage their adoption [1].
Recommendations for Improvement
Smartphone-Based Systems Use widely accessible smartphones to provide speech-based navigation guidance and item-picking assistance [1].
Hybrid Assistive Technologies Combine computer vision with RFID for efficient product identification and navigation. Computer vision can be used to recognize and describe products, while RFID technology can provide accurate and immediate identification of items by scanning tags [2].
Holistic Design Address the entire shopping process, including pre-shopping tasks like list creation and post-shopping activities like inventory management [3].
Participatory Design Involve VIP in co-designing solutions to ensure alignment with their real-world needs [3].
Strength-Based Design Leverage the enhanced haptic and auditory senses of VIP for intuitive and empowering solutions [3].
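To make the hybrid recommendation above more concrete, here is a minimal sketch of how a vision description and an RFID tag read might be fused into one spoken-friendly result. This is purely illustrative: the catalog, confidences, and function names are hypothetical, and a real system would replace the mocked vision step with an actual recognition model and a real RFID reader.

```python
# Illustrative sketch only: the vision step is mocked and the RFID
# catalog is invented. Function names here are hypothetical, not from
# any of the cited systems.

RFID_CATALOG = {
    "tag-001": {"name": "Tomato Soup 400g", "price": 1.99},
    "tag-002": {"name": "Chicken Soup 400g", "price": 2.19},
}

def vision_candidates(image):
    """Mock computer-vision step: return (label, confidence) candidates.

    A real system would run an object-recognition model on the image.
    """
    return [("soup can", 0.85), ("beverage can", 0.10)]

def identify_item(image, rfid_tag):
    """Fuse the coarse vision description with the exact RFID catalog entry."""
    best_label, confidence = max(vision_candidates(image), key=lambda c: c[1])
    product = RFID_CATALOG.get(rfid_tag)
    if product is None:
        # Fall back to the (less precise) vision description alone.
        return f"Unrecognized tag; looks like a {best_label} ({confidence:.0%} confidence)"
    # The exact name and price come from the tag; vision adds a sanity check.
    return f"{product['name']}, ${product['price']:.2f} (looks like a {best_label})"

print(identify_item(None, "tag-001"))
```

The design point is that each sensor covers the other's weakness: RFID gives an exact product identity without a line of sight, while vision supplies a human-readable description and a cross-check when the tag is missing or unreadable.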
Conclusion
Understanding the offline shopping challenges faced by people with visual impairments is essential for creating more inclusive virtual retail environments. As retail spaces continue to evolve digitally, it is crucial that we prioritize universal design principles to ensure accessibility for all. By learning from the experiences of VIP in offline settings, we can create digital environments that bridge the accessibility gap and foster greater independence and inclusion.
References
Wang, Z., Guo, B., Wang, Q., Zhang, D. and Yu, Z., 2023. MobiEye: An Efficient Shopping-Assistance System for the Visually Impaired With Mobile Phone Sensing. IEEE Transactions on Human-Machine Systems.
Elgendy, M., Sik-Lanyi, C. and Kelemen, A., 2019. Making shopping easy for people with visual impairment using mobile assistive technologies. Applied Sciences, 9(6), p.1061.
Yuan, C.W., Hanrahan, B.V., Lee, S., Rosson, M.B. and Carroll, J.M., 2019. Constructing a holistic view of shopping with people with visual impairment: a participatory design approach. Universal Access in the Information Society, 18, pp.127-140.