DOI: 10.1145/3098279.3125434 · MobileHCI Conference Proceedings · Demonstration

EatAR tango: portion estimation on mobile devices with a depth sensor

Published: 04 September 2017

Abstract

The accurate assessment of nutrition information is a challenging task, but it is crucial for people with certain diseases, such as diabetes. An important part of this assessment is portion estimation, i.e., volume estimation. Given the volume and the food type, the nutrition information can be computed from the food-type-specific nutrition density. Recently, mobile devices with depth sensors have become available to the public (Google's Project Tango platform). In this work, we present an app for mobile devices with a depth sensor that assists users in portion estimation. Furthermore, we present the design of a user study for the app and preliminary results.
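The volume-to-nutrition computation described in the abstract can be sketched as a simple lookup-and-multiply step. This is an illustrative sketch, not code from the paper; all food names and density/carbohydrate values below are hypothetical placeholders.

```python
# Illustrative sketch (not from the paper): given an estimated food volume,
# nutrition values follow from food-type-specific densities.
# All entries below are hypothetical placeholder values.
FOOD_DATA = {
    # food type: (mass density in g/cm^3, carbohydrates in g per 100 g)
    "rice_cooked": (0.9, 28.0),
    "apple": (0.8, 14.0),
}

def estimate_carbs(food_type: str, volume_cm3: float) -> float:
    """Convert an estimated food volume to grams of carbohydrate."""
    density, carbs_per_100g = FOOD_DATA[food_type]
    mass_g = volume_cm3 * density          # volume -> mass via density
    return mass_g * carbs_per_100g / 100.0  # mass -> carbohydrate content

print(estimate_carbs("rice_cooked", 150.0))  # carbs for 150 cm^3 of cooked rice
```

The hard part the app addresses is obtaining `volume_cm3` accurately, which is where the depth sensor comes in; the nutrition lookup itself is straightforward once the volume and food type are known.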


Cited By

  • (2024) Autocleandeepfood: auto-cleaning and data balancing transfer learning for regional gastronomy food computing. The Visual Computer. DOI: 10.1007/s00371-024-03560-7. Online publication date: 9 July 2024.
  • (2023) Evaluating machine learning technologies for food computing from a data set perspective. Multimedia Tools and Applications 83(11): 32041-32068. DOI: 10.1007/s11042-023-16513-4. Online publication date: 19 September 2023.
  • (2018) PHARA. Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct: 339-345. DOI: 10.1145/3236112.3236161. Online publication date: 3 September 2018.
  • (2017) EatAR Tango: Results on the Accuracy of Portion Estimation. 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct): 284-287. DOI: 10.1109/ISMAR-Adjunct.2017.90. Online publication date: October 2017.


Published In

MobileHCI '17: Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services
September 2017, 874 pages
ISBN: 9781450350754
DOI: 10.1145/3098279
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. augmented reality
  2. mobile
  3. portion estimation

Qualifiers

  • Demonstration

Funding Sources

  • Österreichische Forschungsförderungsgesellschaft

Conference

MobileHCI '17

Acceptance Rates

MobileHCI '17 paper acceptance rate: 45 of 224 submissions (20%)
Overall acceptance rate: 202 of 906 submissions (22%)

