Abstract
Understanding how individuals who are blind navigate instrumental activities of daily living can provide crucial insights into the indispensable role of visual cues in these tasks. Cooking, a complex and multi-step process, relies heavily on visual information, from selecting ingredients to gauging the readiness of a dish. While alternative senses and assistive technologies offer some aid, the specific visual cues that guide the cooking process have not been extensively studied. To address this, we present an observational analysis of nonvisual cooking, highlighting the visual cues integral to the task and examining the interaction between these cues and assistive technologies, particularly smartphone-based applications. Eight participants who were legally or totally blind (35-74 years of age) were trained to navigate a kitchen and its appliances using tactile tools (i.e., Wikki Stix, bump dots) and to use an AI-based smartphone app (either Microsoft Seeing AI or Google Lookout). Participants were instructed to bake a pizza under two task conditions: relying solely on tactile tools, or combining tactile tools with a smartphone app. Their verbalized thoughts and requests for researcher assistance were recorded, with question frequency and topics used to gauge the importance of different visual cues. Participants exhibited high independence, rarely requesting researcher assistance and predominantly relying on tactile aids over smartphone apps, even when the digital tools were designed specifically for the task. Apps were used primarily when tactile tools were inadequate for acquiring crucial visual cues, such as selecting the correct pizza topping or distinguishing similarly packaged ingredients. Participants requested researcher assistance only when both tactile tools and apps failed, as in tasks like rolling out the dough.
Our findings highlight a diverse range of user preferences and app usage patterns, providing valuable insights for the development of more effective assistive tools.