Vision Sciences Society Annual Meeting Abstract | September 2024
Visual Cues in Nonvisual Cooking: Assessing the Role of Tactile and AI-Assisted Technologies
Author Affiliations & Notes
  • Lily M. Turkstra
    University of California, Santa Barbara
  • Alexa Van Os
    University of California, Santa Barbara
  • Tanya Bhatia
    University of California, Santa Barbara
  • Michael Beyeler
    University of California, Santa Barbara
  • Footnotes
    Acknowledgements: This work was partially supported by the University of California, Santa Barbara.
Journal of Vision September 2024, Vol. 24, 1457. https://doi.org/10.1167/jov.24.10.1457

Lily M. Turkstra, Alexa Van Os, Tanya Bhatia, Michael Beyeler; Visual Cues in Nonvisual Cooking: Assessing the Role of Tactile and AI-Assisted Technologies. Journal of Vision 2024;24(10):1457. https://doi.org/10.1167/jov.24.10.1457.

Abstract

Understanding how individuals who are blind navigate instrumental activities of daily living can provide crucial insights into the indispensable role of visual cues in these tasks. Cooking, a complex and multi-step process, relies heavily on visual information, from selecting ingredients to gauging the readiness of a dish. While alternative senses and assistive technologies offer some aid, the specific visual cues that guide the cooking process have not been extensively studied. To address this, we present an observational analysis of nonvisual cooking, highlighting the visual cues integral to the task and examining the interaction between these cues and assistive technologies, particularly smartphone-based applications. Eight participants who were either legally or totally blind (35-74 years of age) were trained to navigate a kitchen and its appliances using tactile tools (i.e., Wikki Stix, bump dots) and to use an AI-based smartphone app (either Microsoft Seeing AI or Google Lookout). Participants were instructed to bake a pizza under two task conditions: relying solely on tactile tools, or combining tactile tools with a smartphone app. Their verbalized thoughts and requests for researcher assistance were recorded, and question frequency and topics were used to gauge the importance of different visual cues. Participants exhibited high independence, rarely asking for researcher assistance and predominantly relying on tactile aids over smartphone apps, even when the digital tools were designed specifically for the task. Apps were used primarily when tactile tools were inadequate for acquiring crucial visual cues, such as selecting the correct pizza topping or distinguishing between similarly packaged ingredients. Participants requested researcher assistance only when both tactile tools and apps failed, as in rolling out the dough. Our findings highlight a diverse range of user preferences and app usage patterns, providing valuable insights for the development of more effective assistive tools.
