DOI: 10.1145/3441852.3471215
Research Article · Open Access

Non-Visual Cooking: Exploring Practices and Challenges of Meal Preparation by People with Visual Impairments

Published: 17 October 2021

Abstract

Reliance on vision for tasks related to cooking and healthy eating can present barriers to preparing meals for oneself and achieving proper nutrition. However, there has been little research exploring the cooking practices of, and challenges faced by, people with visual impairments. We present a content analysis of 122 YouTube videos that highlights how visually impaired people cook, describing detailed practices for 12 different cooking activities (e.g., cutting and chopping, measuring, testing food for doneness). Building on these practices, we conducted semi-structured interviews with 12 visually impaired people with cooking experience, which revealed existing challenges, concerns, and risks in cooking (e.g., tracking the status of tasks in progress, verifying whether items are peeled or cleaned thoroughly). We further discuss opportunities to support current practices and improve the independence of people with visual impairments in cooking (e.g., zero-touch interactions for cooking). Overall, our findings provide guidance for future research on assistive technologies that help people cook without relying on vision.

Supplementary Material

VTT File (1686.vtt)
MP4 File (1686.mp4): Presentation video

    Information

    Published In

    ASSETS '21: Proceedings of the 23rd International ACM SIGACCESS Conference on Computers and Accessibility
    October 2021
    730 pages
    ISBN: 9781450383066
    DOI: 10.1145/3441852
    This work is licensed under a Creative Commons Attribution 4.0 International License.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 17 October 2021

    Author Tags

    1. accessibility
    2. activity of daily living
    3. assistive technology
    4. blind
    5. cooking
    6. people with visual impairments

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    ASSETS '21

    Acceptance Rates

    ASSETS '21 Paper Acceptance Rate: 36 of 134 submissions, 27%
    Overall Acceptance Rate: 436 of 1,556 submissions, 28%

    Bibliometrics & Citations

    Article Metrics

    • Downloads (Last 12 months): 984
    • Downloads (Last 6 weeks): 158

    Reflects downloads up to 16 Nov 2024

    Cited By
    • (2024) Please Understand My Disability: An Analysis of YouTubers' Discourse on Disability Challenges. Proceedings of the ACM on Human-Computer Interaction 8, CSCW2, 1-25. DOI: 10.1145/3686946. Online publication date: 8-Nov-2024.
    • (2024) A Recipe for Success? Exploring Strategies for Improving Non-Visual Access to Cooking Instructions. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-15. DOI: 10.1145/3663548.3675662. Online publication date: 27-Oct-2024.
    • (2024) CookAR: Affordance Augmentations in Wearable AR to Support Kitchen Tool Interactions for People with Low Vision. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-16. DOI: 10.1145/3654777.3676449. Online publication date: 13-Oct-2024.
    • (2024) Toward Building Design Empathy for People with Disabilities Using Social Media Data: A New Approach for Novice Designers. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 3145-3160. DOI: 10.1145/3643834.3660687. Online publication date: 1-Jul-2024.
    • (2024) Sign Language-Based versus Touch-Based Input for Deaf Users with Interactive Personal Assistants in Simulated Kitchen Environments. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1-9. DOI: 10.1145/3613905.3651075. Online publication date: 11-May-2024.
    • (2024) A Contextual Inquiry of People with Vision Impairments in Cooking. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-14. DOI: 10.1145/3613904.3642233. Online publication date: 11-May-2024.
    • (2024) BubbleCam: Engaging Privacy in Remote Sighted Assistance. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3613904.3642030. Online publication date: 11-May-2024.
    • (2023) Practices and Barriers of Cooking Training for Blind and Low Vision People. Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, 1-5. DOI: 10.1145/3597638.3614494. Online publication date: 22-Oct-2023.
    • (2023) Embodied Exploration: Facilitating Remote Accessibility Assessment for Wheelchair Users with Virtual Reality. Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, 1-17. DOI: 10.1145/3597638.3608410. Online publication date: 22-Oct-2023.
    • (2023) Empowering Autonomy and Agency: Exploring and Augmenting Accessible Cyber-Physical Systems. Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, 263-266. DOI: 10.1145/3594739.3610770. Online publication date: 8-Oct-2023.
