
CN116370259A - Human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion - Google Patents


Info

Publication number
CN116370259A
CN116370259A (application CN202310118404.XA)
Authority
CN
China
Prior art keywords
upper limb
patient
limb rehabilitation
human
training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310118404.XA
Other languages
Chinese (zh)
Inventor
谢龙汉
黄国威
黄双远
陈彦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lizhi Medical Technology Guangzhou Co ltd
Original Assignee
Lizhi Medical Technology Guangzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lizhi Medical Technology Guangzhou Co ltd
Priority to CN202310118404.XA
Publication of CN116370259A
Legal status: Pending

Classifications

    • A61H 1/0274: Stretching or bending or torsioning apparatus for exercising the upper limbs
    • A63F 13/211: Input arrangements for video game devices using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F 13/212: Input arrangements for video game devices using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F 13/213: Input arrangements for video game devices comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/54: Controlling the output signals based on the game progress involving acoustic signals, e.g. reverberation against a virtual wall
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • G06F 3/015: Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V 40/20: Movements or behaviour, e.g. gesture recognition
    • G16H 15/00: ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • G16H 20/30: ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H 40/67: ICT specially adapted for the operation of medical equipment or devices for remote operation
    • A61H 2201/5007: Control means thereof, computer controlled
    • A61H 2201/5058: Control means thereof, sensors or detectors
    • A61H 2230/605: Muscle strain, e.g. electromyography [EMG], used as a control parameter for the apparatus
    • A61H 2230/625: Posture used as a control parameter for the apparatus
    • A63F 2300/1012: Input arrangements involving biosensors worn by the player, e.g. for measuring heart beat or limb activity
    • A63F 2300/105: Input arrangements using inertial sensors, e.g. accelerometers, gyroscopes
    • A63F 2300/1087: Input arrangements comprising photodetecting means, e.g. a camera
    • A63F 2300/6081: Sound processing generating an output signal, e.g. under timing constraints, for spatialization
    • A63F 2300/8082: Virtual reality
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • General Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Dermatology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Neurosurgery (AREA)
  • General Business, Economics & Management (AREA)
  • Rehabilitation Therapy (AREA)
  • Business, Economics & Management (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pain & Pain Management (AREA)
  • Veterinary Medicine (AREA)
  • Neurology (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Cardiology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Acoustics & Sound (AREA)
  • Rehabilitation Tools (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion, comprising an exoskeleton upper limb rehabilitation robot, a data glove, a binocular camera, an augmented virtual reality game center and an upper computer control center. The invention is applied to the upper limb rehabilitation training process. The exoskeleton upper limb rehabilitation robot drives the affected side through its mechanical arm to perform rehabilitation training; the data glove acquires the affected side's posture and muscle electrical signals in real time; the binocular camera collects joint position information in real time and recognizes compensatory behavior during training; the upper computer control center handles communication, control and data processing for the equipment and ensures the safety of rehabilitation training; the virtual reality game center enables interaction with the game through the exoskeleton upper limb rehabilitation robot and generates an evaluation report. The system as a whole makes the upper limb rehabilitation training process scientific, safe, effective, data-coupled, engaging and data-visualized.

Description

Human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion
Technical Field
The invention relates to the field of upper limb rehabilitation, in particular to a man-machine interaction upper limb rehabilitation system based on multi-sensor information fusion.
Background
With rising living standards and medical levels, the proportion of the aged population is continuously increasing. Stroke has a high incidence among the elderly, and the hemiplegia it causes greatly affects patients' daily lives, so rehabilitation of the hemiplegic upper limb is a matter of common concern. According to neuroplasticity theory, exercise rehabilitation therapy plays a key role in the recovery of hemiplegic limbs. However, because the number of stroke patients is huge, rehabilitation therapists are in serious shortage, and traditional manual physiotherapy cannot meet the enormous demand for rehabilitation. An upper limb rehabilitation robot can effectively relieve the shortage of rehabilitation therapists, improve the efficiency and effect of rehabilitation training, and has great prospects for clinical application. Through scientific training with a human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion, the range of motion, muscle strength, flexibility and so on of a hemiplegic patient's affected limb can be restored, improving the patient's capacity for self-care in daily life. Traditionally, treatment plans drawn up from rehabilitation experience differ even for the same patient exercising the same action. Meanwhile, evaluation of a patient's rehabilitation outcome depends on a therapist's experience or on rating scales, and carries large errors. Existing upper limb rehabilitation robots lack real-time detection of patients' physiological states and compensatory behaviors, cannot provide timely feedback to improve rehabilitation training, and cannot safely monitor or effectively suppress compensatory movements, which in turn affects the safety and effectiveness of upper limb rehabilitation training.
In addition, the rehabilitation process is often tedious, and patients' mood after illness tends to be low, so poor motivation and low cooperation are common, which affects the rehabilitation outcome.
Disclosure of Invention
In order to solve at least one of the technical defects in the prior art, the invention discloses a human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion. Through fused processing of the physiological, posture and motion information acquired by multiple sensors, the upper limb rehabilitation training process is made scientific, safe, effective, data-coupled, engaging and data-visualized.
The invention provides a human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion, which comprises an exoskeleton upper limb rehabilitation robot, a data glove, a binocular camera, an augmented virtual reality game center and an upper computer control center.
The data glove is worn on the patient's arm. It acquires the posture information of the patient's upper limb and the muscle electrical signals in real time, serves as a basis for controlling the exoskeleton upper limb rehabilitation robot, and monitors the patient's physiological state in real time, thereby ensuring the safety of the rehabilitation process;
the binocular camera monitors the patient's joint point position information in real time through a human skeleton recognition algorithm, recognizes and detects trunk compensatory motions through a compensation recognition algorithm, reminds the patient through voice and visual feedback to correct abnormal postures in time during training, and transmits the data to the upper computer control center, thereby ensuring the effectiveness of the rehabilitation process;
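The patent does not disclose the compensation recognition algorithm itself. Purely as an illustrative sketch, trunk compensation is commonly flagged by thresholding the lean of the trunk axis reconstructed from skeleton keypoints; all names, coordinate conventions and the 15-degree threshold below are assumptions, not taken from the patent:

```python
import math

# Illustrative threshold; a clinical system would tune this per patient.
TRUNK_LEAN_LIMIT_DEG = 15.0

def trunk_lean_deg(l_shoulder, r_shoulder, l_hip, r_hip):
    """Angle between the trunk axis (mid-hip -> mid-shoulder) and vertical.

    Keypoints are 3D camera-frame coordinates in metres; +y is assumed "up"
    after extrinsic calibration.
    """
    mid_sh = [(a + b) / 2 for a, b in zip(l_shoulder, r_shoulder)]
    mid_hip = [(a + b) / 2 for a, b in zip(l_hip, r_hip)]
    vx, vy, vz = (s - h for s, h in zip(mid_sh, mid_hip))
    norm = math.sqrt(vx * vx + vy * vy + vz * vz)
    cos_t = vy / norm
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

def is_compensating(keypoints):
    """Return (flag, lean) for one frame of skeleton output."""
    lean = trunk_lean_deg(keypoints["l_shoulder"], keypoints["r_shoulder"],
                          keypoints["l_hip"], keypoints["r_hip"])
    return lean > TRUNK_LEAN_LIMIT_DEG, lean
```

When `is_compensating` fires for several consecutive frames, the system would trigger the voice and visual reminders described above.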
the upper computer control center realizes communication, control and data processing among the exoskeleton upper limb rehabilitation robot, the data glove and the binocular camera, ensures coordination and fusion of physiological and motion information acquired by the multiple sensors, and finally enables a patient to perform effective rehabilitation training under safe and reliable conditions. The rehabilitation therapist can view the real-time data of the system, remotely observe the physiological state of the patient and update the treatment scheme in time. Realizing the data coupling of the rehabilitation process;
the exoskeleton upper limb rehabilitation robot is worn on the patient's affected side to assist the patient's affected side in exercise training;
the augmented virtual reality game center stores patient information and, by receiving position data from the exoskeleton rehabilitation robot, enables interaction with the augmented virtual reality game. It displays the rehabilitation training scene, the patient's real-time movement postures and the compensation evaluation results, interacts with the patient through visual display and voice prompts during the game, and finally generates a game report.
Further, the binocular camera is mounted above the display and tilted downward by 30 degrees toward the training scene, effectively avoiding occlusion during training.
Furthermore, the binocular camera collects environmental image data at 60 fps, obtaining accurate spatial information within its field of view with positioning accuracy reaching the millimeter level.
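The positioning accuracy of a binocular camera rests on the standard stereo triangulation relation Z = f·B/d, where depth error shrinks as focal length and baseline grow. The patent gives no camera parameters; the sketch below uses the textbook pinhole relation with purely illustrative values:

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth of a matched point from a rectified stereo pair.

    focal_px     : focal length in pixels (illustrative, not from the patent)
    baseline_m   : distance between the two camera centres in metres
    disparity_px : horizontal pixel offset of the point between the two views
    """
    # Classic pinhole stereo relation: Z = f * B / d
    return focal_px * baseline_m / disparity_px
```

For example, with an assumed 700 px focal length and 0.12 m baseline, a 42 px disparity places the point at `stereo_depth_m(700.0, 0.12, 42.0)` = 2.0 m; at that range a one-pixel disparity error shifts the depth by only a few centimetres, and sub-pixel matching brings the error toward the millimeter level claimed above.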
Further, the rehabilitation training comprises active, passive, active-passive and other training modes. Meanwhile, the force sensors and inertial sensors mounted on the exoskeleton upper limb rehabilitation robot monitor the state and position information of the affected limb in real time, making the rehabilitation process scientific.
Further, the virtual reality game center comprises a plurality of active and passive games, and the patient selects a rehabilitation game according to the rehabilitation prescription. The patient is placed in the virtual reality environment, interacts through the affected-limb position data collected by the exoskeleton upper limb rehabilitation robot, and completes games and training tasks in the augmented virtual reality environment.
Further, the virtual reality game center stores the patient's basic information and training information and serves as the visualization center for the data collected by the rehabilitation system's sensors. It displays the rehabilitation training scene, the patient's real-time motion postures and the compensation evaluation results, interacts with the patient through visual display and voice prompts during the game, and finally generates a game report from which rehabilitation therapists, the patient and the patient's family can intuitively understand the rehabilitation situation, making the rehabilitation process engaging and its data visible.
Furthermore, the myoelectric sensors built into the data glove collect, in real time and according to the corresponding training actions, the muscle electrical signals of the corresponding parts of the patient's affected side, comprehensively monitoring the patient's physiological state as a basis for the rehabilitation therapist's treatment of the patient.
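The patent does not specify how the muscle electrical signals are processed. A common first step when monitoring physiological state from raw sEMG is a moving-RMS envelope, which tracks contraction intensity; the sketch below assumes that conventional approach, and the window length would in practice be tuned to the sensor's sample rate:

```python
import math

def moving_rms(signal, window):
    """RMS envelope of a raw sEMG sample stream.

    signal : sequence of raw (zero-mean) sEMG samples
    window : envelope window in samples, e.g. ~100-250 ms of data
    Returns one RMS value per full window position.
    """
    out = []
    for i in range(len(signal) - window + 1):
        seg = signal[i:i + window]
        out.append(math.sqrt(sum(x * x for x in seg) / window))
    return out
```

The resulting envelope can then be compared against per-patient activation thresholds, e.g. to confirm that the targeted muscle is actually engaged during an active-training action.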
Furthermore, three inertial sensors are built into the data glove to collect the patient's motion information in real time and reconstruct the patient's motion state; the motion information collected by the data glove and by the binocular camera complement and verify each other to restore the patient's true state. The motion information serves as a descriptive characteristic of the rehabilitation training actions completed by the patient, so that the rehabilitation therapist can track the patient's rehabilitation in time.
Furthermore, the affected-limb joint position information acquired by the data glove and the binocular camera is fused through Kalman filtering and used as an input source for the upper computer control center. The fused data improves positioning accuracy, and the real-time, accurate joint position information is fed back to drive the motion of the virtual character in the augmented virtual reality game, so that the virtual character flexibly follows the patient's motion, making the rehabilitation process immersive.
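Purely as an illustration of such a fusion step, a minimal 1-D constant-velocity Kalman filter can treat the glove-derived and camera-derived positions of the same joint as two simultaneous measurements; the noise covariances and the assumption that the glove is noisier than the camera are mine, not the patent's:

```python
import numpy as np

DT = 1 / 60.0                            # camera frame period (60 fps)
F = np.array([[1.0, DT], [0.0, 1.0]])    # state transition for [pos, vel]
Q = np.diag([1e-5, 1e-3])                # process noise (assumed)
H = np.array([[1.0, 0.0], [1.0, 0.0]])   # both sensors observe position only
R = np.diag([4e-4, 1e-4])                # glove noisier than camera (assumed)

def kalman_fuse(z_glove, z_camera, x, P):
    """One predict/update cycle fusing two scalar position measurements.

    x : state estimate [position, velocity], P : its covariance.
    Returns the updated (x, P).
    """
    # Predict forward one frame
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the stacked two-sensor measurement
    z = np.array([z_glove, z_camera])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Because each sensor's contribution is weighted by its noise covariance, the fused position settles between the two measurements with a smaller variance than either alone, which is what lets the virtual character track the patient smoothly.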
Compared with the prior art, the invention has the following advantages and technical effects:
the invention comprises an exoskeleton upper limb rehabilitation robot, a data glove, a binocular camera, an augmented virtual reality game center and an upper computer control center. The invention is applied to the upper limb rehabilitation training process. The data glove is worn on the arm of the affected side, acquires muscle electrical signals and posture information of the affected side of the patient in real time, and transmits the data to the control center of the upper computer to serve as a basis for controlling the exoskeleton upper limb rehabilitation robot; the binocular camera monitors the joint point position information of a patient in real time through a human skeleton recognition algorithm, recognizes the torso compensatory motion through a compensatory recognition algorithm, and transmits data to an upper computer control center; the upper computer control center comprehensively evaluates and processes the information of the patient training process, realizes communication, control and data processing among the exoskeleton upper limb rehabilitation robot, the data glove and the binocular camera, ensures coordination and fusion of physiological, posture and motion information acquired by the multiple sensors, and finally enables the patient to perform effective rehabilitation training under the safe condition. 
The rehabilitation therapist remotely observes the physiological state of the patient through the system data generated in real time and timely updates the treatment scheme; the exoskeleton upper limb rehabilitation robot is worn on the patient's affected side to drive the upper limb to perform rehabilitation training and participate in rehabilitation games, so that multiple training modes such as active rehabilitation training, passive rehabilitation training, active and passive training on the patient's affected side are realized; the augmented virtual reality game center stores patient information, meanwhile, interaction with the augmented virtual reality game is achieved through receiving position data of the exoskeleton rehabilitation robot, results of rehabilitation training scenes, real-time motion postures of patients and compensation evaluation are displayed, the game center interacts with the patients through visual display and voice prompts in the game process, finally, game reports are generated, and rehabilitation therapists, patients and family members of the patients can intuitively know rehabilitation conditions according to the reports. The whole system realizes the scientificalness, the safety, the effectiveness, the data coupling, the interestingness and the data visualization of the upper limb rehabilitation training process.
Drawings
FIG. 1 is a block diagram of a human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion;
FIG. 2 is a block diagram of steps of a human-computer interaction upper limb rehabilitation system patient using an augmented virtual reality game center based on multi-sensor information fusion;
fig. 3 is a composition and functional block diagram of a man-machine interaction upper limb rehabilitation system data glove based on multi-sensor information fusion.
The figure shows: 1-data glove, 2-binocular camera, 3-upper computer control center, 4-exoskeleton upper limb rehabilitation robot and 5-augmented virtual reality game center;
11-register or login, 12-power on/initialize, 13-set IP and port number, 14-select training mode, 15-select game type, 16-start game, 17-view game record, 18-view assessment report;
21-data glove, 22-myoelectric sensor, 23-inertial sensor, 24-physiological state, 25-motion information.
Detailed Description
The invention will be further illustrated with reference to specific examples, but is not limited thereto.
As shown in fig. 1, the human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion provided by the invention comprises a data glove 1, a binocular camera 2, an upper computer control center 3, an exoskeleton upper limb rehabilitation robot 4, and an augmented virtual reality game center 5.
The data glove 1 is worn on the arm of the patient's affected side; it collects the muscle electrical signals and posture information of the affected side in real time and transmits the data to the upper computer control center 3 as a basis for controlling the exoskeleton upper limb rehabilitation robot 4. The binocular camera 2 monitors the three-dimensional positions of the patient's joint points in real time through a human skeleton recognition algorithm and recognizes trunk compensatory movements through a compensation recognition algorithm, so that abnormal postures can be corrected promptly during training, ensuring the effectiveness of the rehabilitation process; it likewise transmits its data to the upper computer control center 3. The upper computer control center 3 comprehensively evaluates and processes the information of the patient's training process; it handles communication, control, and data processing among the exoskeleton upper limb rehabilitation robot 4, the data glove 1, and the binocular camera 2, ensures the coordination and fusion of the physiological, posture, and motion information acquired by the multiple sensors, and ultimately enables the patient to perform effective rehabilitation training under safe conditions.
The rehabilitation therapist remotely observes the patient's physiological state through system data generated in real time and updates the treatment scheme promptly, realizing the data coupling of the rehabilitation process. The exoskeleton upper limb rehabilitation robot 4 is worn on the patient's affected side to drive the upper limb through rehabilitation training and rehabilitation games, providing multiple training modes for the affected side such as active training, passive training, and combined active-passive training. The augmented virtual reality game center 5 stores patient information and, by receiving position data from the exoskeleton rehabilitation robot, interacts with the augmented virtual reality game; it displays the rehabilitation training scene, the patient's real-time motion posture, and the compensation evaluation results, interacts with the patient through visual display and voice prompts during play, and finally generates a game report from which rehabilitation therapists, patients, and their family members can intuitively understand the rehabilitation status. The whole system makes the upper limb rehabilitation training process scientific, safe, effective, data-coupled, engaging, and visualizable.
In some embodiments of the present invention, the human skeleton recognition algorithm employs the OpenPose human skeleton recognition algorithm. An MLP-based compensation recognition algorithm is obtained by training on a compensatory-movement data set of hemiplegic patients, realizing the classification and recognition of trunk compensatory movements. The OpenPose human skeleton recognition algorithm and the MLP-based compensation recognition algorithm are both existing algorithms and are not described in detail here.
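Conceptually, such an MLP maps a flattened skeleton frame to a compensation class. The following is only an illustrative forward-pass sketch in NumPy: the layer sizes, class labels, keypoint count, and random weights are assumptions, since the actual trained network is not disclosed in the description.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    """Single-hidden-layer MLP: ReLU hidden layer, softmax over classes."""
    h = np.maximum(0.0, x @ W1 + b1)
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

n_features = 75              # e.g. 25 skeleton keypoints x (x, y, z) from OpenPose
n_hidden, n_classes = 32, 3  # assumed classes: 0 none, 1 trunk lean, 2 shoulder elevation
W1 = rng.normal(scale=0.1, size=(n_features, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_classes))
b2 = np.zeros(n_classes)

frame = rng.normal(size=(1, n_features))    # one flattened skeleton frame
probs = mlp_forward(frame, W1, b1, W2, b2)  # class probabilities per frame
label = int(probs.argmax())                 # predicted compensation class
```

In the real system the weights would come from training on the hemiplegic-patient data set, and the per-frame label would feed the posture-correction prompts described above.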
In some embodiments of the present invention, the data glove 1 collects the posture information and muscle electrical signals of the upper limb on the patient's affected side in real time; these serve as a basis for controlling the exoskeleton upper limb rehabilitation robot and allow the patient's physiological state to be monitored in real time, ensuring the safety of the rehabilitation process.
In some embodiments of the invention, the binocular camera 2 is mounted above the display and tilted downward 30 degrees toward the training scene, effectively avoiding occlusion during training. The binocular camera collects environmental image data at 60 fps, obtains accurate spatial information within its field of view, and achieves millimeter-level positioning accuracy.
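Millimeter-level positioning with a calibrated stereo pair follows from disparity-based triangulation. A minimal sketch of that geometry, where the focal length, principal point, baseline, and pixel coordinates are illustrative assumptions rather than the camera's actual calibration:

```python
def stereo_point(u_l, v_l, u_r, fx, fy, cx, cy, baseline_m):
    """Recover a 3-D point (meters) from a matched pixel pair of a
    rectified stereo rig, using depth Z = fx * B / disparity."""
    disparity = u_l - u_r          # horizontal pixel shift between views
    if disparity <= 0:
        raise ValueError("point at infinity or bad match")
    z = fx * baseline_m / disparity
    x = (u_l - cx) * z / fx        # back-project through the left camera
    y = (v_l - cy) * z / fy
    return (x, y, z)

# Illustrative values: 700 px focal length, 12 cm baseline, 40 px disparity.
p = stereo_point(680.0, 400.0, 640.0, 700.0, 700.0, 640.0, 360.0, 0.12)
```

Each skeleton keypoint recovered this way gives the three-dimensional joint position that the compensation recognition algorithm consumes.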
In some embodiments of the present invention, the exoskeleton upper limb rehabilitation robot 4 is worn on the patient's affected side to assist the affected limb in exercise training, and supports multiple training modes such as active training, passive training, and combined active-passive training. The exoskeleton is equipped with a force sensor and an inertial sensor that monitor the state and position of the affected limb in real time. Under the condition that the affected limb's safety is guaranteed, force and position feedback enable flexible interaction between the mechanical arm and the affected limb during active training, and the measured interaction force provides a numerical reference for resistance training and passive training, making the rehabilitation process scientific.
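The "flexible interaction" through force and position feedback can be pictured as an admittance-style control loop: the measured interaction force shifts the commanded joint position, and a stiffness setting determines how much the arm yields. This is only a sketch of the general technique with assumed gains and joint limits; the robot's actual control law is not disclosed.

```python
def admittance_step(q_cmd, force_n, stiffness=200.0, dt=0.01,
                    q_min=0.0, q_max=1.6):
    """Shift the commanded joint angle (rad) in the direction the patient
    pushes. Higher stiffness -> less yielding (resistance training);
    lower stiffness -> more yielding (assisted active training)."""
    dq = (force_n / stiffness) * dt           # compliant displacement this cycle
    return min(max(q_cmd + dq, q_min), q_max)  # never exceed joint limits

q = 0.8
q = admittance_step(q, force_n=10.0)  # patient pushes: the arm yields slightly
```

The joint-limit clamp is what keeps the interaction safe even when the measured force spikes.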
The augmented virtual reality game center 5 stores a number of active and passive games, and the patient selects a rehabilitation game according to the rehabilitation prescription. The patient is placed in the virtual reality environment, interacts through the affected-limb position information acquired by the exoskeleton upper limb rehabilitation robot 4, and completes games and training tasks in the augmented virtual reality environment. The game center 5 stores the patient's basic information and training information and serves as the visualization center for the data collected by the rehabilitation system's sensors; it displays the rehabilitation training scene, the patient's real-time motion posture, and the compensation evaluation results, interacts with the patient through visual display and voice prompts during play, and finally generates a game report from which rehabilitation therapists, patients, and their family members can intuitively understand the rehabilitation status, making the rehabilitation process engaging and the data visualizable.
In addition, the invention provides an easy-to-operate virtual reality game center. Referring to fig. 2, the procedure by which a patient uses the augmented virtual reality game center of the human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion comprises the following steps: register or log in 11, power on/initialize 12, set IP and port number 13, select training mode 14, select game type 15, start game 16, view game record 17, and view assessment report 18.
In the register or log in step 11, a patient using the system for the first time registers; after successful registration the patient can log in, and subsequent uses require only logging in. Login can be authenticated with a user account and password, or by scanning a two-dimensional code with a mobile phone.
The steps power on/initialize 12, set IP and port number 13, and select training mode 14 configure the mechanical arm. Before training, the mechanical arm is powered on and initialized 12, waiting for the motors to start and the arm to reach its initial pose. In step 13, the IP and port number are set, the connection mode is selected, and clicking the connect button establishes a connection to the mechanical arm program via TCP communication. After the connection is completed, the training mode 14 is selected according to the patient's condition: first the left or right hand is chosen, then one of several modes such as active mode, passive mode, assistance mode, impedance mode, and data glove mode. When any mode is exited, the exoskeleton upper limb rehabilitation robot returns to its initial position. After the mode is confirmed, a security code is entered to confirm the selection.
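The TCP connection step can be sketched as a plain socket client. The IP address, port, and command format shown here are assumptions for illustration; the robot arm program's actual protocol is not specified in the description.

```python
import socket

def connect_arm(ip, port, timeout_s=3.0):
    """Open a TCP connection to the mechanical-arm program (step 13)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout_s)  # fail fast if the arm program is unreachable
    s.connect((ip, port))
    return s

# Hypothetical usage once the arm program is listening:
# sock = connect_arm("192.168.1.10", 5000)
# sock.sendall(b"MODE:PASSIVE\n")  # assumed command format, not the real protocol
```

A timeout on the connect keeps the user interface responsive if the IP or port was entered incorrectly.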
The steps select game type 15 and start game 16 belong to the game center. The patient can select the game type 15 according to the game center's classification: by action type (active, passive, all, and others), by joint type (shoulder joint, elbow joint, wrist joint, multi-joint, and all), and by difficulty (easy, medium, and hard). After clicking the corresponding classification, several games can be chosen, such as whack-a-mole, gold miner, lightning, and cognitive selection. The patient selects the corresponding classification according to the rehabilitation game diagnosis and treatment list and starts the game 16, performing immersive interaction with the rehabilitation game with the assistance of the mechanical arm.
In the view game record step 17, the patient can view their own game history after each session, including the game's action type, difficulty, training date, score, and other details; a visualized display can be selected so that the patient's training scores are shown intuitively in tabular form.
The view assessment report step 18 presents an assessment report generated from the physician's diagnosis and the game training results, including various scales assessing the patient's motor and cognitive abilities; a visualized form can be selected, which maps the patient's assessment results into a chart.
As shown in fig. 3, the data glove 1 comprises a myoelectric sensor 22 and an inertial sensor 23; the myoelectric sensor 22 acquires the patient's physiological state 24, and the inertial sensor 23 acquires the patient's motion information 25.
Myoelectric sensors 22 are arranged at the corresponding positions in the data glove 1 and collect the myoelectric signals of the corresponding parts of the patient's affected side in real time according to the training action being performed. This allows the patient's physiological state to be comprehensively monitored, ensures the safety of the rehabilitation training process, and serves as a basis for the rehabilitation therapist to evaluate the patient's recovery.
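One common way such monitoring is done (a sketch of the general technique, not the system's disclosed method) is to compare the windowed RMS amplitude of the sEMG signal against a safety threshold before continuing the training cycle; the threshold and sample values below are illustrative.

```python
import math

def emg_rms(samples):
    """Root-mean-square amplitude of one window of sEMG samples (mV)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def muscle_overloaded(window, rms_limit=0.6):
    """True if the windowed RMS exceeds the configured safety limit,
    signalling the control center to pause or soften the training."""
    return emg_rms(window) > rms_limit

calm = [0.1, -0.2, 0.15, -0.1]   # relaxed muscle: low RMS
tense = [0.9, -1.1, 1.0, -0.8]   # sustained high activation: high RMS
```

In the described system this per-window check would run in the upper computer control center 3 on the glove's live sEMG stream.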
In some embodiments of the present invention, three inertial sensors 23 are provided in the data glove 1 to collect the motion information of the patient's affected limb in real time and restore its motion state.
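Claim 10 below fuses the joint positions from the data glove and the binocular camera by Kalman filtering. A one-dimensional sketch of that measurement-update step follows, with illustrative noise variances: glove IMUs drift, so the glove estimate is given the larger variance, while the camera is precise but can be occluded.

```python
def fuse(x_glove, var_glove, x_cam, var_cam):
    """Kalman measurement update for one joint coordinate: treat the glove
    estimate as the prior and the camera estimate as the measurement.
    Returns the fused estimate and its (reduced) variance."""
    k = var_glove / (var_glove + var_cam)  # Kalman gain
    x = x_glove + k * (x_cam - x_glove)    # pull prior toward measurement
    var = (1.0 - k) * var_glove            # fused uncertainty shrinks
    return x, var

# Illustrative values: glove says 0.50 m (variance 0.04), camera 0.44 m (0.01).
x, v = fuse(0.50, 0.04, 0.44, 0.01)
```

The fused variance is smaller than either input's, which is why the claim feeds the fused position, rather than either raw stream, to the upper computer control center.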
Variations and modifications to the above will be obvious to persons skilled in the art from the foregoing description and teachings. Therefore, the invention is not limited to the specific embodiments disclosed and described above; modifications and changes of this kind should also fall within the scope of the claims of the invention.

Claims (10)

1. A human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion, characterized by comprising a data glove (1), a binocular camera (2), an upper computer control center (3), an exoskeleton upper limb rehabilitation robot (4), and an augmented virtual reality game center (5), wherein
the data glove (1) is used for being worn on an arm at a patient side, collecting muscle electric signals and posture information of the patient side in real time, and transmitting the data to the upper computer control center (3) as a basis for controlling the exoskeleton upper limb rehabilitation robot (4);
the binocular camera (2) is used for monitoring the patient's joint point position information in real time through a human skeleton recognition algorithm, realizing the recognition of trunk compensatory motions through a compensation recognition algorithm, and transmitting the data to the upper computer control center (3);
the upper computer control center (3) is used for realizing communication, control and data processing among the exoskeleton upper limb rehabilitation robot (4), the data glove (1) and the binocular camera (2), so that a patient can perform rehabilitation training under a safe condition, and system data generated in real time can assist a therapist to remotely observe the physiological state of the patient and update a treatment scheme in time;
the exoskeleton upper limb rehabilitation robot (4) is used for being worn on the patient's affected side to drive the upper limb to perform rehabilitation training and participate in rehabilitation games;
the augmented virtual reality game center (5) is used for storing patient information, and simultaneously realizing interaction with an augmented virtual reality game by receiving position data of the exoskeleton rehabilitation robot (4), displaying a rehabilitation training scene, a real-time movement gesture of a patient and a compensation evaluation result, interacting with the patient through visual display and voice prompt in the game process, and finally generating a game report.
2. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, wherein the human-computer interaction upper limb rehabilitation system is characterized in that: the binocular camera is mounted above the display.
3. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 2, wherein the human-computer interaction upper limb rehabilitation system is characterized in that: the binocular camera is tilted 30 degrees down towards the training scene.
4. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, wherein the human-computer interaction upper limb rehabilitation system is characterized in that: the rehabilitation training comprises active training, passive training, combined active-passive training, and other training modes.
5. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, wherein the human-computer interaction upper limb rehabilitation system is characterized in that: the exoskeleton upper limb rehabilitation robot (4) is provided with a force sensor and an inertial sensor and is used for monitoring the state and position information of the affected limb in real time.
6. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, wherein the human-computer interaction upper limb rehabilitation system is characterized in that: the augmented virtual reality game center (5) comprises a plurality of active games and passive games, a patient selects a rehabilitation game according to a rehabilitation prescription, the patient is placed in a virtual reality environment, affected limb position data acquired by the exoskeleton upper limb rehabilitation robot (4) are interacted, and games and training tasks are completed in the augmented virtual reality environment.
7. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, wherein the human-computer interaction upper limb rehabilitation system is characterized in that: the augmented virtual reality game center (5) stores basic information and training information of a patient, is used as a visual center for collecting data by various sensors of a rehabilitation system, displays a rehabilitation training scene, real-time motion gesture and compensation evaluation results of the patient, interacts with the patient through visual display and voice prompt in the game process, and finally generates a game report.
8. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, wherein the human-computer interaction upper limb rehabilitation system is characterized in that: the myoelectric sensors arranged in the data glove (1) collect the myoelectric signals of the corresponding parts of the affected side of the patient in real time according to the corresponding training actions, and comprehensively monitor the physiological state of the patient.
9. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, wherein the human-computer interaction upper limb rehabilitation system is characterized in that: three inertial sensors are arranged in the data glove (1) to collect motion information of a patient in real time, the motion information is used for restoring the motion state of the patient, and meanwhile, the motion information collected by the data glove (1) and the binocular camera (2) are mutually complemented and verified to restore the real state of the patient.
10. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to any one of claims 1-9, wherein the human-computer interaction upper limb rehabilitation system is characterized in that: the position information of the affected limb joints acquired by the data glove (1) and the binocular camera (2) is subjected to information fusion through Kalman filtering and is used as an input source of an upper computer control center.
CN202310118404.XA 2023-02-15 2023-02-15 Human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion Pending CN116370259A (en)

Publications (1)

Publication Number Publication Date
CN116370259A true CN116370259A (en) 2023-07-04


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117894428A (en) * 2024-01-15 2024-04-16 Shenyang University of Technology Rehabilitation robot control method based on multi-sensor data fusion
CN117894428B (en) * 2024-01-15 2024-08-09 Shenyang University of Technology Rehabilitation robot control method based on multi-sensor data fusion


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination