Liang et al., 2023 - Google Patents
Deep-learning model for the prediction of lower-limb joint moments using single inertial measurement unit during different locomotive activities
- Document ID
- 12355954201184467501
- Author
- Liang W
- Wang F
- Fan A
- Zhao W
- Yao W
- Yang P
- Publication year
- 2023
- Publication venue
- Biomedical Signal Processing and Control
Snippet
The estimation of lower-limb joint moments during locomotive activities can provide valuable feedback in joint-injury risk evaluation and clinical diagnosis. The use of inertial measurement units (IMUs) in joint moment estimation has drawn considerable attention …
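The snippet only names the technique (a deep-learning model that regresses lower-limb joint moments from a single IMU). As a rough illustration of what such a pipeline can look like, the sketch below assumes a 6-channel IMU window (3-axis accelerometer plus 3-axis gyroscope), a small CNN-LSTM regressor, and three sagittal-plane moment outputs; none of these choices are taken from the cited paper.

```python
# Illustrative sketch only: the paper's actual architecture, sensor placement, and
# preprocessing are not given in the snippet, so every choice below (window length,
# channel count, CNN-LSTM layout, three moment outputs) is an assumption.
import torch
import torch.nn as nn


class IMUJointMomentRegressor(nn.Module):
    """Maps a window of single-IMU signals (3-axis accel + 3-axis gyro)
    to lower-limb joint moments (here: hip, knee, ankle, sagittal plane)."""

    def __init__(self, in_channels: int = 6, n_moments: int = 3):
        super().__init__()
        # 1-D convolutions extract short-time features from the raw IMU window.
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # An LSTM aggregates the feature sequence over the gait-cycle window.
        self.lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
        # A linear head regresses the (typically body-mass-normalized) moments.
        self.head = nn.Linear(64, n_moments)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels); Conv1d expects (batch, channels, time).
        z = self.conv(x.transpose(1, 2))        # (batch, 64, time)
        z, _ = self.lstm(z.transpose(1, 2))     # (batch, time, 64)
        return self.head(z[:, -1, :])           # moment estimate at window end


if __name__ == "__main__":
    model = IMUJointMomentRegressor()
    window = torch.randn(8, 100, 6)  # 8 windows of 100 samples x 6 IMU channels
    print(model(window).shape)       # torch.Size([8, 3])
```

In practice such a model would be trained against joint moments computed by inverse dynamics from optical motion capture and force-plate data, which is the usual ground-truth source for IMU-based kinetics estimation.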
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
- A61B5/1122—Determining geometric values, e.g. centre of rotation or angular range of movement of movement trajectories
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1112—Global tracking of patients, e.g. by using GPS
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/1036—Measuring load distribution, e.g. podologic studies
- A61B5/1038—Measuring plantar pressure during gait
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computer systems based on biological models
- G06N3/02—Computer systems based on biological models using neural network models
- G06N3/08—Learning methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7232—Signal processing specially adapted for physiological signals or for diagnostic purposes involving compression of the physiological signal, e.g. to extend the signal recording period
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/45—For evaluating or diagnosing the musculoskeletal system or teeth
- A61B5/4528—Joints
Similar Documents
Publication | Title
---|---
Su et al. | A CNN-based method for intent recognition using inertial measurement units and intelligent lower limb prosthesis
Dorschky et al. | CNN-based estimation of sagittal plane walking and running biomechanics from measured and simulated inertial sensor data
Sethi et al. | A comprehensive survey on gait analysis: History, parameters, approaches, pose estimation, and future work
Molinaro et al. | Subject-independent, biological hip moment estimation during multimodal overground ambulation using deep learning
Hossain et al. | Deepbbwae-net: A cnn-rnn based deep superlearner for estimating lower extremity sagittal plane joint kinematics using shoe-mounted imu sensors in daily living
Khodabandelou et al. | A fuzzy convolutional attention-based GRU network for human activity recognition
Kang et al. | Subject-independent continuous locomotion mode classification for robotic hip exoskeleton applications
Sun et al. | Continuous estimation of human knee joint angles by fusing kinematic and myoelectric signals
Rai et al. | Mode-free control of prosthetic lower limbs
Siu et al. | Ankle torque estimation during locomotion from surface electromyography and accelerometry
Yang et al. | Inertial sensing for lateral walking gait detection and application in lateral resistance exoskeleton
Liang et al. | Deep-learning model for the prediction of lower-limb joint moments using single inertial measurement unit during different locomotive activities
Duong et al. | Ecological validation of machine learning models for spatiotemporal gait analysis in free-living environments using instrumented insoles
Hollinger et al. | The influence of gait phase on predicting lower-limb joint angles
Molinaro et al. | Anticipation and Delayed Estimation of Sagittal Plane Human Hip Moments using Deep Learning and a Robotic Hip Exoskeleton
Moghadam et al. | The effect of imu sensor location, number of features, and window size on a random forest model's accuracy in predicting joint kinematics and kinetics during gait
Zarshenas et al. | Ankle torque forecasting using time-delayed neural networks
KR102323818B1 (en) | Method and system for artificial intelligence-based brain disease diagnosis using human dynamic characteristics information
Cornish et al. | Hip contact forces can be predicted with a neural network using only synthesised key points and electromyography in people with hip osteoarthritis
Narayan et al. | A comparative performance analysis of backpropagation training optimizers to estimate clinical gait mechanics
Na et al. | Deep domain adaptation, pseudo-labeling, and shallow network for accurate and fast gait prediction of unlabeled datasets
Kumar et al. | Prediction of lower limb kinematics from vision-based system using deep learning approaches
Lee et al. | A review on neural network based gait estimation methods
Monica et al. | Efficient Gait Analysis Using Deep Learning Techniques
Alemayoh et al. | A Neural Network-Based Lower Extremity Joint Angle Estimation from Insole Data