DOI: 10.1145/3099023.3099107

Psychomotor Learning in Martial Arts: an Opportunity for User Modeling, Adaptation and Personalization

Published: 09 July 2017

Abstract

Psychomotor learning is crucial for many kinds of tasks that involve the acquisition of motor skills, such as practicing martial arts. In the past two decades, diverse technological solutions have been developed to support the learning of the corresponding motor skills. However, the UMAP (User Modeling, Adaptation and Personalization) community has not taken part in those research efforts, and thus the resulting systems do not adapt or personalize their responses to users' needs. This paper discusses the main features of existing systems for learning martial arts and identifies research opportunities to be included in the future agenda of UMAP research.
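
To make the kind of adaptation the abstract argues for more concrete, the following minimal sketch (hypothetical, not taken from the paper; the class name, methods, and thresholds are all illustrative) shows in Python how a simple user model could track a learner's recent accuracy on a martial-arts movement, for example derived from pose or inertial sensing, and personalize how often corrective feedback is delivered.

# Hypothetical illustration only: a minimal user model that personalizes the
# frequency of corrective feedback for one martial-arts technique, based on
# the learner's recent error history. Names and thresholds are invented.
from collections import deque


class PsychomotorLearnerModel:
    """Tracks recent attempt outcomes and adapts the feedback schedule."""

    def __init__(self, window: int = 10):
        # Keep only the most recent attempts (sliding window).
        self.recent_errors = deque(maxlen=window)

    def record_attempt(self, error: bool) -> None:
        # error=True means the attempt deviated from the reference movement.
        self.recent_errors.append(1 if error else 0)

    def error_rate(self) -> float:
        if not self.recent_errors:
            return 1.0  # no data yet: treat the learner as a novice
        return sum(self.recent_errors) / len(self.recent_errors)

    def feedback_interval(self) -> int:
        # Struggling learners get feedback on every attempt; proficient
        # learners get sparser feedback so they rely less on external cues.
        rate = self.error_rate()
        if rate > 0.5:
            return 1
        if rate > 0.2:
            return 3
        return 5


# Example: feed per-attempt correctness into the model and query the schedule.
model = PsychomotorLearnerModel()
for err in (True, True, False, True, False, False, False, False):
    model.record_attempt(err)
print(model.feedback_interval())  # -> 3 with this error history

Gradually reducing feedback frequency as competence grows is just one adaptation dimension such a system could personalize; the choice of feedback modality (visual, auditory, haptic) could be modeled along the same lines.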




Information

Published In

UMAP '17: Adjunct Publication of the 25th Conference on User Modeling, Adaptation and Personalization
July 2017
456 pages
ISBN: 9781450350679
DOI: 10.1145/3099023
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 09 July 2017


Author Tags

  1. adaptation
  2. martial arts
  3. motor skills
  4. personalization
  5. psychomotor learning
  6. user modeling

Qualifiers

  • Research-article

Conference

UMAP '17

Acceptance Rates

Overall Acceptance Rate 162 of 633 submissions, 26%



Bibliometrics & Citations

Article Metrics

  • Downloads (Last 12 months): 11
  • Downloads (Last 6 weeks): 1

Reflects downloads up to 21 Nov 2024

Cited By

  • (2024) Mastering Mind and Movement. ACM UMAP 2024 Tutorial on Modeling Intelligent Psychomotor Systems (M3@ACM UMAP 2024). Adjunct Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization, 9-12. DOI: 10.1145/3631700.3658534. Online publication date: 27-Jun-2024.
  • (2024) Exploring the Impact of Partial Occlusion on Emotion Classification From Facial Expressions: A Comparative Study of XR Headsets and Face Masks. IEEE Access, 12, 44613-44627. DOI: 10.1109/ACCESS.2024.3380439. Online publication date: 2024.
  • (2024) Exploring raw data transformations on inertial sensor data to model user expertise when learning psychomotor skills. User Modeling and User-Adapted Interaction, 34(4), 1283-1325. DOI: 10.1007/s11257-024-09393-2. Online publication date: 1-Sep-2024.
  • (2021) Toward Modeling Psychomotor Performance in Karate Combats Using Computer Vision Pose Estimation. Sensors, 21(24), 8378. DOI: 10.3390/s21248378. Online publication date: 15-Dec-2021.
  • (2021) Punch Anticipation in a Karate Combat with Computer Vision. Adjunct Proceedings of the 29th ACM Conference on User Modeling, Adaptation and Personalization, 61-67. DOI: 10.1145/3450614.3461688. Online publication date: 21-Jun-2021.
  • (2021) KUMITRON: Artificial Intelligence System to Monitor Karate Fights that Synchronize Aerial Images with Physiological and Inertial Signals. Companion Proceedings of the 26th International Conference on Intelligent User Interfaces, 37-39. DOI: 10.1145/3397482.3450730. Online publication date: 14-Apr-2021.
  • (2018) MyShikko. Adjunct Publication of the 26th Conference on User Modeling, Adaptation and Personalization, 217-218. DOI: 10.1145/3213586.3225225. Online publication date: 2-Jul-2018.
  • (2017) Modeling Psychomotor Activity. Adjunct Publication of the 25th Conference on User Modeling, Adaptation and Personalization, 305-310. DOI: 10.1145/3099023.3099083. Online publication date: 9-Jul-2017.
  • (2017) Towards Personalized Vibrotactile Support for Learning Aikido. Data Driven Approaches in Digital Education, 593-597. DOI: 10.1007/978-3-319-66610-5_70. Online publication date: 5-Sep-2017.
