Liang et al., 2023 - Google Patents
Deep-learning model for the prediction of lower-limb joint moments using single inertial measurement unit during different locomotive activities
- Document ID
- 12355954201184467501
- Author
- Liang W
- Wang F
- Fan A
- Zhao W
- Yao W
- Yang P
- Publication year
- 2023
- Publication venue
- Biomedical Signal Processing and Control
Snippet
The estimation of lower-limb joint moments during locomotive activities can provide valuable feedback in joint-injury risk evaluation and clinical diagnosis. The use of inertial measurement units (IMUs) in joint moment estimation has drawn considerable attention …
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
- A61B5/1122—Determining geometric values, e.g. centre of rotation or angular range of movement of movement trajectories
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1112—Global tracking of patients, e.g. by using GPS
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/1036—Measuring load distribution, e.g. podologic studies
- A61B5/1038—Measuring plantar pressure during gait
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computer systems based on biological models
- G06N3/02—Computer systems based on biological models using neural network models
- G06N3/08—Learning methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7232—Signal processing specially adapted for physiological signals or for diagnostic purposes involving compression of the physiological signal, e.g. to extend the signal recording period
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/45—For evaluating or diagnosing the musculoskeletal system or teeth
- A61B5/4528—Joints
Similar Documents
Publication | Title |
---|---|
Su et al. | A CNN-based method for intent recognition using inertial measurement units and intelligent lower limb prosthesis |
Hernandez et al. | Lower body kinematics estimation from wearable sensors for walking and running: A deep learning approach |
Molinaro et al. | Subject-independent, biological hip moment estimation during multimodal overground ambulation using deep learning |
Martinez-Hernandez et al. | Probabilistic identification of sit-to-stand and stand-to-sit with a wearable sensor |
Hossain et al. | DeepBBWAE-Net: A CNN-RNN based deep superlearner for estimating lower extremity sagittal plane joint kinematics using shoe-mounted IMU sensors in daily living |
Kang et al. | Subject-independent continuous locomotion mode classification for robotic hip exoskeleton applications |
Sun et al. | Continuous estimation of human knee joint angles by fusing kinematic and myoelectric signals |
Liang et al. | Synergy-based knee angle estimation using kinematics of thigh |
Yang et al. | Inertial sensing for lateral walking gait detection and application in lateral resistance exoskeleton |
Hollinger et al. | The influence of gait phase on predicting lower-limb joint angles |
Moghadam et al. | The effect of IMU sensor location, number of features, and window size on a random forest model's accuracy in predicting joint kinematics and kinetics during gait |
Molinaro et al. | Anticipation and delayed estimation of sagittal plane human hip moments using deep learning and a robotic hip exoskeleton |
Liang et al. | Deep-learning model for the prediction of lower-limb joint moments using single inertial measurement unit during different locomotive activities |
Cornish et al. | Hip contact forces can be predicted with a neural network using only synthesised key points and electromyography in people with hip osteoarthritis |
Badura et al. | Automatic Berg Balance Scale assessment system based on accelerometric signals |
Na et al. | Deep domain adaptation, pseudo-labeling, and shallow network for accurate and fast gait prediction of unlabeled datasets |
Zarshenas et al. | Ankle torque forecasting using time-delayed neural networks |
Li et al. | 3D knee and hip angle estimation with reduced wearable IMUs via transfer learning during yoga, golf, swimming, badminton, and dance |
Narayan et al. | A comparative performance analysis of backpropagation training optimizers to estimate clinical gait mechanics |
Sugai et al. | LSTM network-based estimation of ground reaction forces during walking in stroke patients using markerless motion capture system |
Ren et al. | PDCHAR: Human activity recognition via multi-sensor wearable networks using two-channel convolutional neural networks |
Alemayoh et al. | A neural network-based lower extremity joint angle estimation from insole data |
Zhang et al. | Estimation of normal ground reaction forces in multiple treadmill skiing movements using IMU sensors with optimized locations |
Negi et al. | A standalone real-time gait phase detection using fuzzy-logic implementation in Arduino Nano |
Carter et al. | Estimating joint moments during treadmill running using various consumer-based wearable sensor locations |