CN116370259A - Human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion - Google Patents
- Publication number: CN116370259A (application CN202310118404.XA)
- Authority: CN (China)
- Prior art keywords: upper limb, patient, limb rehabilitation, human, training
- Prior art date: 2023-02-15
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A61H1/0274—Stretching or bending or torsioning apparatus for exercising for the upper limbs
- A63F13/211—Input arrangements for video game devices using inertial sensors, e.g. accelerometers or gyroscopes
- A63F13/212—Input arrangements for video game devices using sensors worn by the player, e.g. for measuring heart beat or leg activity
- A63F13/213—Input arrangements for video game devices comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/54—Controlling the output signals based on the game progress involving acoustic signals
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06V40/10—Recognition of human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
- G06V40/20—Recognition of movements or behaviour, e.g. gesture recognition
- G16H15/00—ICT specially adapted for medical reports, e.g. generation or transmission thereof
- G16H20/30—ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
- G16H40/67—ICT specially adapted for the operation of medical equipment or devices for remote operation
- A61H2201/5007—Control means thereof, computer controlled
- A61H2201/5058—Control means thereof: sensors or detectors
- A61H2230/605—Muscle strain measured on the user, e.g. electromyography [EMG], used as a control parameter for the apparatus
- A61H2230/625—Posture used as a control parameter for the apparatus
- A63F2300/1012—Input arrangements converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat or limb activity
- A63F2300/105—Input arrangements converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
- A63F2300/1087—Input arrangements converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
- A63F2300/6081—Methods for processing data by generating or executing the game program for sound processing, generating an output signal, e.g. under timing constraints, for spatialization
- A63F2300/8082—Features of games specially adapted for executing virtual reality
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The invention discloses a human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion, comprising an exoskeleton upper limb rehabilitation robot, a data glove, a binocular camera, an augmented virtual reality game center and an upper computer control center, applied to the upper limb rehabilitation training process. The exoskeleton upper limb rehabilitation robot drives the affected side through its mechanical arm to perform rehabilitation training; the data glove acquires the posture of the affected side and its muscle electrical signals in real time; the binocular camera collects joint position information in real time and recognizes compensatory behavior during training; the upper computer control center handles communication, control and data processing for the equipment and ensures the safety of rehabilitation training; the virtual reality game center enables interaction with games through the exoskeleton upper limb rehabilitation robot and generates an evaluation report. As a whole, the system makes the upper limb rehabilitation training process scientific, safe, effective, data-coupled, engaging and visualized.
Description
Technical Field
The invention relates to the field of upper limb rehabilitation, and in particular to a human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion.
Background
With rising living standards and improved medical care, the proportion of elderly people in the population continues to grow. Stroke has a high incidence among the elderly, and the hemiplegia it causes severely affects patients' daily lives, so rehabilitation of the hemiplegic upper limb is a widespread concern. According to the theory of neural plasticity, exercise-based rehabilitation therapy plays a key role in the recovery of hemiplegic limbs. However, because the number of stroke patients is so large, rehabilitation therapists are in serious shortage, and traditional manual physical therapy cannot meet the enormous demand for rehabilitation. Upper limb rehabilitation robots can effectively relieve the shortage of therapists and improve the efficiency and effect of rehabilitation training, giving them great prospects for clinical application. Through scientific training with a human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion, the range of motion, muscle strength and flexibility of the hemiplegic patient's affected limb can be restored, improving the patient's capacity for self-care in daily life. Traditionally, treatment regimens have been developed from a therapist's rehabilitation experience and therefore differ even for the same patient exercising the same action. Likewise, evaluation of a patient's rehabilitation progress depends on a therapist's experience or on clinical scales, and carries large errors. Existing upper limb rehabilitation robots lack real-time detection of the patient's physiological state and compensatory behavior; they cannot provide timely feedback to improve rehabilitation training, nor can they safely monitor and effectively suppress compensatory movements, which compromises the safety and effectiveness of upper limb rehabilitation training. Moreover, the rehabilitation process is often tedious and patients' post-illness mood is frequently low, leading to poor enthusiasm for rehabilitation and low cooperation, which further affects the rehabilitation outcome.
Disclosure of Invention
To address at least one of the shortcomings of the prior art, the invention discloses a human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion. By fusing the physiological, posture and motion information collected by multiple sensors, it makes the upper limb rehabilitation training process scientific, safe, effective, data-coupled, engaging and visualized.
The invention provides a human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion, which comprises an exoskeleton upper limb rehabilitation robot, a data glove, a binocular camera, an augmented virtual reality game center and an upper computer control center.
The data glove is worn on the patient's arm. It acquires the posture information of the patient's upper limb and the muscle electrical signals in real time, serves as a basis for controlling the exoskeleton upper limb rehabilitation robot, and monitors the patient's physiological state in real time, ensuring the safety of the rehabilitation process;
the binocular camera monitors the positions of the patient's joint points in real time through a human skeleton recognition algorithm, recognizes and detects trunk compensatory motion through a compensation recognition algorithm, reminds the patient through voice and visual feedback to correct abnormal postures promptly during training, and transmits the data to the upper computer control center, ensuring the effectiveness of the rehabilitation process;
the upper computer control center realizes communication, control and data processing among the exoskeleton upper limb rehabilitation robot, the data glove and the binocular camera, ensures coordination and fusion of physiological and motion information acquired by the multiple sensors, and finally enables a patient to perform effective rehabilitation training under safe and reliable conditions. The rehabilitation therapist can view the real-time data of the system, remotely observe the physiological state of the patient and update the treatment scheme in time. Realizing the data coupling of the rehabilitation process;
the exoskeleton upper limb rehabilitation robot is worn on the patient's affected side to assist the patient's affected side in exercise training;
the augmented virtual reality game center is used for storing patient information, and meanwhile, interaction with an augmented virtual reality game is achieved through receiving position data of the exoskeleton rehabilitation robot, a rehabilitation training scene, real-time movement postures of a patient and compensation evaluation results are displayed, the patient interacts with the patient through visual display and voice prompt in the game process, and finally a game report is generated.
Further, the binocular camera is mounted above the display and tilted downward by 30 degrees toward the training scene, effectively avoiding occlusion during training.
Further, the binocular camera collects environmental image data at 60 fps, so that accurate spatial information within the field of view can be obtained with millimeter-level positioning accuracy.
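As an illustration of how a binocular camera can turn a pair of image detections into a 3-D joint position, the following is a minimal depth-from-disparity sketch for a rectified stereo pair. The focal length, baseline and pixel coordinates are illustrative assumptions, not parameters disclosed by the invention.

```python
import numpy as np

def triangulate(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Recover a 3-D point (meters, camera frame) for one joint detected in
    a rectified stereo pair. u_left/u_right are the horizontal pixel
    coordinates in the two images, v the shared vertical coordinate,
    focal_px the focal length in pixels, baseline_m the camera separation,
    and (cx, cy) the principal point -- all assumed values."""
    disparity = u_left - u_right            # pixels; larger means closer
    z = focal_px * baseline_m / disparity   # depth along the optical axis
    x = (u_left - cx) * z / focal_px
    y = (v - cy) * z / focal_px
    return np.array([x, y, z])

# Hypothetical wrist detection in a 1280x720 rectified pair
wrist = triangulate(u_left=710.0, u_right=682.0, v=388.0,
                    focal_px=700.0, baseline_m=0.12, cx=640.0, cy=360.0)
print(wrist)  # -> [0.3, 0.12, 3.0], i.e. about 3 m in front of the camera
```

At 60 fps this computation is cheap enough to run per frame for every tracked joint; the millimeter-level accuracy claimed above would in practice depend on calibration quality and detection noise rather than on the arithmetic itself.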
Further, the rehabilitation training includes active, passive, active-passive and other training modes. Meanwhile, the force sensor and inertial sensor mounted on the exoskeleton upper limb rehabilitation robot monitor the state and position of the affected limb in real time, making the rehabilitation process scientific.
Further, the virtual reality game center includes a number of active and passive games, and the patient selects a rehabilitation game according to the rehabilitation prescription. The patient, placed in the virtual reality environment, interacts through the affected-limb position data collected by the exoskeleton upper limb rehabilitation robot and completes the games and training tasks in the augmented virtual reality environment.
Further, the virtual reality game center stores the patient's basic information and training information and serves as the visualization center for the data collected by the system's sensors. It displays the rehabilitation training scene, the patient's real-time motion posture and the compensation evaluation results, interacts with the patient through visual display and voice prompts during the game, and finally generates a game report from which rehabilitation therapists, patients and their family members can intuitively understand the rehabilitation situation, making the rehabilitation process engaging and its data visualized.
Further, the myoelectric sensors built into the data glove collect, in real time and according to the corresponding training actions, the myoelectric signals from the corresponding parts of the patient's affected side, comprehensively monitoring the patient's physiological state as a basis for the rehabilitation therapist's assessment of the patient's recovery.
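The patent does not specify how the glove's myoelectric signals are processed; one common choice, shown here only as a hedged sketch, is a moving RMS envelope compared against a clinician-set activation threshold. The sampling rate, window length and threshold below are assumptions.

```python
import numpy as np

def emg_rms_envelope(samples, fs=1000, window_ms=100):
    """Moving RMS envelope of one raw sEMG channel (fs and window assumed)."""
    win = int(fs * window_ms / 1000)
    squared = np.square(samples - np.mean(samples))   # remove the DC offset
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(squared, kernel, mode="same"))

def over_exertion(envelope, threshold=0.8):
    """Flag samples whose activation exceeds a safety threshold expressed
    relative to a prior maximum voluntary contraction (assumed scale)."""
    return envelope > threshold

# Hypothetical 1 s of normalized sEMG containing a burst of high activation
rng = np.random.default_rng(0)
emg = 0.1 * rng.standard_normal(1000)
emg[400:600] += 1.5 * np.sin(np.linspace(0, 40 * np.pi, 200))
env = emg_rms_envelope(emg)
print(over_exertion(env).any())   # True -> alert the upper computer control center
```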
Further, three inertial sensors arranged in the data glove collect the patient's motion information in real time to reconstruct the patient's motion state; at the same time, this motion information and the data collected by the binocular camera complement and cross-verify each other to restore the patient's true state. The motion information also serves as a descriptive feature of the rehabilitation training actions the patient has completed, enabling the rehabilitation therapist to track the patient's progress in time.
Further, the affected-limb joint position information collected by the data glove and the binocular camera is fused through Kalman filtering and used as an input source for the upper computer control center. The fused data improves positioning accuracy, and the accurate real-time joint positions are fed back to drive the motion of the virtual character in the augmented virtual reality game, so that the virtual character flexibly follows the patient's motion and the rehabilitation process becomes immersive.
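To make the fusion step described above concrete, below is a minimal one-axis constant-velocity Kalman filter that merges a camera measurement and a glove measurement of the same joint coordinate. The 60 Hz step and the noise magnitudes are illustrative assumptions; the actual filter dimensions and tuning are not disclosed in the patent.

```python
import numpy as np

class JointKalman1D:
    """Constant-velocity Kalman filter fusing two position sources for one
    joint axis. Noise magnitudes are illustrative assumptions, not values
    from the patent."""

    def __init__(self, dt=1 / 60, q=1e-3, r_camera=4e-4, r_glove=1e-4):
        self.x = np.zeros(2)                      # state: [position, velocity]
        self.P = np.eye(2)                        # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])
        self.Q = q * np.eye(2)                    # process noise
        self.H = np.array([[1.0, 0.0]])           # we only measure position
        self.R = {"camera": r_camera, "glove": r_glove}

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, source):
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R[source]
        K = self.P @ self.H.T / S                 # Kalman gain (S is 1x1)
        self.x = self.x + (K * y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P

kf = JointKalman1D()
kf.predict()
kf.update(np.array([0.31]), "camera")   # binocular-camera measurement (m)
kf.update(np.array([0.30]), "glove")    # glove-derived measurement (m)
print(kf.x[0])                          # fused joint position estimate
```

Because the glove is assumed less noisy than the camera in this sketch, its updates pull the estimate harder; swapping the measurement variances reverses that weighting.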
Compared with the prior art, the invention has the following advantages and technical effects:
the invention comprises an exoskeleton upper limb rehabilitation robot, a data glove, a binocular camera, an augmented virtual reality game center and an upper computer control center. The invention is applied to the upper limb rehabilitation training process. The data glove is worn on the arm of the affected side, acquires muscle electrical signals and posture information of the affected side of the patient in real time, and transmits the data to the control center of the upper computer to serve as a basis for controlling the exoskeleton upper limb rehabilitation robot; the binocular camera monitors the joint point position information of a patient in real time through a human skeleton recognition algorithm, recognizes the torso compensatory motion through a compensatory recognition algorithm, and transmits data to an upper computer control center; the upper computer control center comprehensively evaluates and processes the information of the patient training process, realizes communication, control and data processing among the exoskeleton upper limb rehabilitation robot, the data glove and the binocular camera, ensures coordination and fusion of physiological, posture and motion information acquired by the multiple sensors, and finally enables the patient to perform effective rehabilitation training under the safe condition. The rehabilitation therapist remotely observes the physiological state of the patient through the system data generated in real time and timely updates the treatment scheme; the exoskeleton upper limb rehabilitation robot is worn on the patient's affected side to drive the upper limb to perform rehabilitation training and participate in rehabilitation games, so that multiple training modes such as active rehabilitation training, passive rehabilitation training, active and passive training on the patient's affected side are realized; the augmented virtual reality game center stores patient information, meanwhile, interaction with the augmented virtual reality game is achieved through receiving position data of the exoskeleton rehabilitation robot, results of rehabilitation training scenes, real-time motion postures of patients and compensation evaluation are displayed, the game center interacts with the patients through visual display and voice prompts in the game process, finally, game reports are generated, and rehabilitation therapists, patients and family members of the patients can intuitively know rehabilitation conditions according to the reports. The whole system realizes the scientificalness, the safety, the effectiveness, the data coupling, the interestingness and the data visualization of the upper limb rehabilitation training process.
Drawings
FIG. 1 is a block diagram of a human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion;
FIG. 2 is a block diagram of the steps by which a patient uses the augmented virtual reality game center of the human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion;
FIG. 3 is a composition and functional block diagram of the data glove of the human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion.
The figure shows: 1-data glove, 2-binocular camera, 3-upper computer control center, 4-exoskeleton upper limb rehabilitation robot and 5-augmented virtual reality game center;
11-register or login, 12-power on/initialize, 13-set IP and port number, 14-select training mode, 15-select game type, 16-start game, 17-view game record, 18-view assessment report;
21-data glove, 22-myoelectric sensor, 23-inertial sensor, 24-physiological state, 25-motion information.
Detailed Description
The invention will be further illustrated with reference to specific examples, but is not limited thereto.
As shown in FIG. 1, the human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion provided by the invention comprises a data glove 1, a binocular camera 2, an upper computer control center 3, an exoskeleton upper limb rehabilitation robot 4 and an augmented virtual reality game center 5.
The data glove 1 is worn on the arm of the affected side; it collects the patient's muscle electrical signals and posture information in real time and transmits the data to the upper computer control center 3 as a basis for controlling the exoskeleton upper limb rehabilitation robot 4. The binocular camera 2 monitors the three-dimensional positions of the patient's joint points in real time through a human skeleton recognition algorithm, recognizes trunk compensatory motion through a compensation recognition algorithm so that the patient's abnormal postures can be corrected promptly during training to keep the rehabilitation effective, and transmits the data to the upper computer control center 3. The upper computer control center 3 comprehensively evaluates and processes the information from the patient's training process; it handles communication, control and data processing among the exoskeleton upper limb rehabilitation robot 4, the data glove 1 and the binocular camera 2, ensuring the coordination and fusion of the physiological, posture and motion information collected by the multiple sensors so that the patient ultimately performs effective rehabilitation training under safe conditions. The rehabilitation therapist remotely observes the patient's physiological state through the system data generated in real time and updates the treatment scheme promptly, achieving data coupling of the rehabilitation process. The exoskeleton upper limb rehabilitation robot 4, worn on the patient's affected side, drives the upper limb to perform rehabilitation training and participate in rehabilitation games, providing multiple training modes for the affected side such as active, passive and active-passive training. The augmented virtual reality game center 5 stores patient information and, by receiving position data from the exoskeleton rehabilitation robot, enables interaction with the augmented virtual reality game; it displays the rehabilitation training scene, the patient's real-time motion posture and the compensation evaluation results, interacts with the patient through visual display and voice prompts during the game, and finally generates a game report from which rehabilitation therapists, patients and their family members can intuitively understand the rehabilitation situation. As a whole, the system makes the upper limb rehabilitation training process scientific, safe, effective, data-coupled, engaging and visualized.
In some embodiments of the present invention, the human skeleton recognition algorithm is the OpenPose human skeleton recognition algorithm. An MLP-based compensation recognition algorithm is obtained by training on a data set of compensatory movements of hemiplegic patients, realizing the classification and recognition of trunk compensatory motion. Both the OpenPose human skeleton recognition algorithm and the MLP-based compensation recognition algorithm are existing algorithms and are not described in detail here.
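As a hedged sketch of the pipeline just described, the fragment below trains an MLP on flattened OpenPose-style keypoints and classifies each incoming skeleton frame. The feature layout, label set and synthetic training data are stand-in assumptions, since the patent's compensatory-movement data set is not public.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Assumed feature layout: (x, y, confidence) for 8 upper-body OpenPose
# keypoints flattened to 24 values; assumed labels: 0 = normal,
# 1 = trunk lean, 2 = shoulder elevation.
rng = np.random.default_rng(42)
X_train = rng.standard_normal((300, 24))   # stand-in for real keypoint data
y_train = rng.integers(0, 3, size=300)     # stand-in for clinician labels

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0),
)
clf.fit(X_train, y_train)

frame = rng.standard_normal((1, 24))       # one incoming skeleton frame
label = clf.predict(frame)[0]
if label != 0:
    print("compensation detected: trigger voice and visual feedback")
```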
In some embodiments of the present invention, the data glove 1 acquires the posture information of the upper limb on the patient's affected side and the muscle electrical signals in real time, serving as a basis for controlling the exoskeleton upper limb rehabilitation robot and for monitoring the patient's physiological state in real time, ensuring the safety of the rehabilitation process.
In some embodiments of the invention, the binocular camera 2 is mounted above the display and tilted downward by 30 degrees toward the training scene, effectively avoiding occlusion during training. The binocular camera collects environmental image data at 60 fps, so that accurate spatial information within the field of view can be obtained with millimeter-level positioning accuracy.
In some embodiments of the present invention, the exoskeleton upper limb rehabilitation robot 4 is worn on the patient's affected side to assist it in exercise training, and can run multiple training modes such as active, passive and active-passive training. The exoskeleton is equipped with a force sensor and an inertial sensor that monitor the state and position of the affected limb in real time. While ensuring the safety of the affected limb, force and position feedback enables compliant interaction between the mechanical arm and the affected limb during active training and provides interaction-force references for resistance and passive training, making the rehabilitation process scientific.
The augmented virtual reality game center 5 stores a number of active and passive games, and the patient selects a rehabilitation game according to the rehabilitation prescription. The patient, placed in the virtual reality environment, interacts through the affected-limb position information collected by the exoskeleton upper limb rehabilitation robot 4 and completes the games and training tasks in the augmented virtual reality environment. The game center 5 stores the patient's basic information and training information and serves as the visualization center for the data collected by the system's sensors; it displays the rehabilitation training scene, the patient's real-time motion posture and the compensation evaluation results, interacts with the patient through visual display and voice prompts during the game, and finally generates a game report from which rehabilitation therapists, patients and their family members can intuitively understand the rehabilitation situation, making the rehabilitation process engaging and its data visualized.
In addition, the invention provides an easy-to-operate virtual reality game center. Referring to FIG. 2, the procedure by which a patient uses the augmented virtual reality game center of the human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion comprises the following steps: log in or register 11, power on/initialize 12, set IP and port number 13, select training mode 14, select game type 15, start game 16, view game record 17, and view assessment report 18.
In the log in or register step 11, a patient using the system for the first time registers; after successful registration the patient can log in, and subsequent use only requires logging in. Login can be authenticated with a user account and password, or by scanning a QR code with a mobile phone.
The power on/initialize 12, set IP and port number 13 and select training mode 14 steps configure the mechanical arm. Before training, the mechanical arm must be powered on/initialized 12, waiting for the motors to start and the arm to reach its initial pose. The IP and port number are then set 13, the connection mode is selected, and clicking the connect button links to the mechanical arm program over TCP communication. After the connection is established, the training mode 14 is selected according to the patient's condition: first left or right hand, then one of several modes such as active, passive, assisted, impedance and data glove mode; on exiting any mode the exoskeleton upper limb rehabilitation robot returns to its initial position. After the mode is confirmed, a security code is entered to confirm the selection.
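A minimal sketch of the TCP link the connect button would establish is shown below. The controller address, port and the `MODE:<name>:<code>` message framing are invented for illustration; the real mechanical-arm program defines its own protocol.

```python
import socket

ROBOT_IP = "192.168.1.10"   # assumed address of the arm controller
ROBOT_PORT = 8080           # assumed port; both values are placeholders

def connect_arm(ip=ROBOT_IP, port=ROBOT_PORT, timeout_s=3.0):
    """Open the TCP link that the 'connect' button would establish."""
    return socket.create_connection((ip, port), timeout=timeout_s)

def select_mode(sock, mode, security_code):
    """Send a mode selection; the 'MODE:<name>:<code>' framing is invented
    here purely for illustration."""
    sock.sendall(f"MODE:{mode}:{security_code}\n".encode())
    return sock.recv(64).decode().strip()   # e.g. an acknowledgement string

# Usage sketch (requires a listening controller at the assumed address):
# sock = connect_arm()
# print(select_mode(sock, "passive", "1234"))
# sock.close()
```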
The select game type 15 and start game 16 steps belong to the game center. The patient can select the game type 15 according to the game center's classification: by action type (active, passive, all, and others), by joint type (shoulder joint, elbow joint, wrist joint, multi-joint, and all), or by difficulty (easy, medium and hard). After clicking the corresponding category, one of several games can be chosen, such as whack-a-mole, gold miner, lightning, and cognitive selection. The patient selects the appropriate category according to the rehabilitation game prescription and starts the game, performing immersive interaction with the rehabilitation game assisted by the mechanical arm.
In the view game record step 17, the patient can review his or her game history after each session, including the game's action type, difficulty, training date, score and other details; a visualized score view can be selected, presenting the patient's training scores intuitively in tabular form.
The view assessment report step 18 presents an assessment report generated from the physician's diagnosis and the game training results, including various scales assessing the patient's motor and cognitive abilities; a visualized view can be selected, which maps the patient's assessment results onto a chart.
As shown in FIG. 3, the data glove 1 comprises myoelectric sensors 22 and inertial sensors 23, where the myoelectric sensors 22 acquire the patient's physiological state 24 and the inertial sensors 23 acquire the patient's motion information 25.
The myoelectric sensors 22 arranged at the corresponding positions in the data glove 1 collect, in real time and according to the corresponding training actions, the myoelectric signals from the corresponding parts of the patient's affected side. This comprehensively monitors the patient's physiological state, ensures the safety of the rehabilitation training process, and serves as a basis for the rehabilitation engineer to evaluate the patient's progress.
In some embodiments of the present invention, three inertial sensors 23 are provided in the data glove 1 to collect the motion information of the patient's affected limb in real time and reconstruct its motion state.
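The patent does not state how the inertial data are converted into a motion state; one conventional possibility, sketched here under assumed rates and gain, is a complementary filter that blends the integrated gyroscope rate with the accelerometer-derived angle for each sensed segment.

```python
import math

def accel_to_pitch(ax, ay, az):
    """Pitch angle (rad) of the sensor inferred from the gravity direction."""
    return math.atan2(-ax, math.hypot(ay, az))

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt=0.01, alpha=0.98):
    """Blend the integrated gyro rate (smooth but drifting) with the
    accelerometer angle (noisy but drift-free); dt and alpha are assumed."""
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle

# Hypothetical single update at 100 Hz for one glove IMU
angle = 0.0                                  # previous pitch estimate (rad)
gyro_rate = 0.5                              # rad/s reported by the gyroscope
accel_angle = accel_to_pitch(1.2, 0.0, 9.6)  # m/s^2 accelerometer reading
angle = complementary_filter(angle, gyro_rate, accel_angle)
print(angle)                                 # updated forearm pitch estimate
```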
Variations and modifications of the above will be apparent to persons skilled in the art from the foregoing description and teachings. The invention is therefore not limited to the specific embodiments disclosed and described above, and such modifications and changes are intended to fall within the scope of the claims of the invention.
Claims (10)
1. A human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion, characterized by comprising a data glove (1), a binocular camera (2), an upper computer control center (3), an exoskeleton upper limb rehabilitation robot (4) and an augmented virtual reality game center (5), wherein:
the data glove (1) is worn on the arm of the affected side, collects the patient's muscle electrical signals and posture information in real time, and transmits the data to the upper computer control center (3) as a basis for controlling the exoskeleton upper limb rehabilitation robot (4);
the binocular camera (2) monitors the positions of the patient's joint points in real time through a human skeleton recognition algorithm, recognizes trunk compensatory motion through a compensation recognition algorithm, and transmits the data to the upper computer control center (3);
the upper computer control center (3) handles communication, control and data processing among the exoskeleton upper limb rehabilitation robot (4), the data glove (1) and the binocular camera (2), so that the patient performs rehabilitation training under safe conditions, and the system data generated in real time assist the therapist in remotely observing the patient's physiological state and updating the treatment scheme promptly;
the exoskeleton upper limb rehabilitation robot (4) is worn on the patient's affected side to drive the upper limb to perform rehabilitation training and participate in rehabilitation games; and
the augmented virtual reality game center (5) stores patient information and, by receiving position data from the exoskeleton rehabilitation robot (4), enables interaction with an augmented virtual reality game; it displays the rehabilitation training scene, the patient's real-time motion posture and the compensation evaluation results, interacts with the patient through visual display and voice prompts during the game, and finally generates a game report.
2. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, characterized in that the binocular camera is mounted above the display.
3. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 2, characterized in that the binocular camera is tilted downward by 30 degrees toward the training scene.
4. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, characterized in that the exercise training comprises active, passive, active-passive and other training modes.
5. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, characterized in that the exoskeleton upper limb rehabilitation robot (4) is provided with a force sensor and an inertial sensor for monitoring the state and position of the affected limb in real time.
6. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, characterized in that the augmented virtual reality game center (5) comprises a number of active and passive games; the patient selects a rehabilitation game according to the rehabilitation prescription, is placed in the virtual reality environment, interacts through the affected-limb position data collected by the exoskeleton upper limb rehabilitation robot (4), and completes the games and training tasks in the augmented virtual reality environment.
7. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, characterized in that the augmented virtual reality game center (5) stores the patient's basic information and training information and serves as the visualization center for the data collected by the system's sensors; it displays the rehabilitation training scene, the patient's real-time motion posture and the compensation evaluation results, interacts with the patient through visual display and voice prompts during the game, and finally generates a game report.
8. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, characterized in that the myoelectric sensors arranged in the data glove (1) collect, in real time and according to the corresponding training actions, the myoelectric signals from the corresponding parts of the patient's affected side and comprehensively monitor the patient's physiological state.
9. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to claim 1, characterized in that three inertial sensors are arranged in the data glove (1) to collect the patient's motion information in real time and reconstruct the patient's motion state, and the motion information collected by the data glove (1) and the binocular camera (2) complement and cross-verify each other to restore the patient's true state.
10. The human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion according to any one of claims 1-9, characterized in that the affected-limb joint position information collected by the data glove (1) and the binocular camera (2) is fused through Kalman filtering and used as an input source of the upper computer control center.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202310118404.XA | 2023-02-15 | 2023-02-15 | Human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion |
Publications (1)

| Publication Number | Publication Date |
| --- | --- |
| CN116370259A | 2023-07-04 |

Family ID: 86962314

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202310118404.XA (CN116370259A, pending) | Human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion | 2023-02-15 | 2023-02-15 |

Country Status (1)

| Country | Link |
| --- | --- |
| CN | CN116370259A (en) |
Cited By (2)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| CN117894428A * | 2024-01-15 | 2024-04-16 | Shenyang University of Technology | Rehabilitation robot control method based on multi-sensor data fusion |
| CN117894428B * | 2024-01-15 | 2024-08-09 | Shenyang University of Technology | Rehabilitation robot control method based on multi-sensor data fusion |
Similar Documents

| Publication | Publication Date | Title |
| --- | --- | --- |
| US20220338761A1 | | Remote Training and Practicing Apparatus and System for Upper-Limb Rehabilitation |
| Bouteraa et al. | | Training of hand rehabilitation using low cost exoskeleton and vision-based game interface |
| Lin et al. | | Shared autonomous interface for reducing physical effort in robot teleoperation via human motion mapping |
| CN104524742A | | Cerebral palsy child rehabilitation training method based on Kinect sensor |
| CN110125909B | | Multi-information fusion human body exoskeleton robot control protection system |
| Baldi et al. | | Design of a wearable interface for lightweight robotic arm for people with mobility impairments |
| Tröbinger et al. | | A dual doctor-patient twin paradigm for transparent remote examination, diagnosis, and rehabilitation |
| CN116370259A | | Human-computer interaction upper limb rehabilitation system based on multi-sensor information fusion |
| Dragusanu et al. | | Design, development, and control of a tendon-actuated exoskeleton for wrist rehabilitation and training |
| Lin et al. | | Intuitive, efficient and ergonomic tele-nursing robot interfaces: Design evaluation and evolution |
| Zongxing et al. | | Human-machine interaction technology for simultaneous gesture recognition and force assessment: A Review |
| Bouteraa et al. | | Fuzzy logic-based connected robot for home rehabilitation |
| Szczurek et al. | | Enhanced human–robot interface with operator physiological parameters monitoring and 3D mixed reality |
| Erickson et al. | | Characterizing multidimensional capacitive servoing for physical human–robot interaction |
| Baldi et al. | | Exploiting intrinsic kinematic null space for supernumerary robotic limbs control |
| Gardner et al. | | An unobtrusive vision system to reduce the cognitive burden of hand prosthesis control |
| CN113730190A | | Upper limb rehabilitation robot system with three-dimensional space motion |
| Nia et al. | | Reinforcement learning-based grasp pattern control of upper limb prosthetics in an AI platform |
| Robson et al. | | Creating a virtual perception for upper limb rehabilitation |
| Kolsanov et al. | | Augmented Reality application for hand motor skills rehabilitation |
| Kwok et al. | | A Reliable Kinematic Measurement of Upper Limb Exoskeleton for VR Therapy with Visual-inertial Sensors |
| Naser et al. | | Internet-Based Smartphone System for After-Stroke Hand Rehabilitation |
| Bouteraa et al. | | Robot-assisted remote rehabilitation |
| Wołczowski et al. | | Concept of a system for training of bioprosthetic hand control in one side handless humans using virtual reality and visual and sensory biofeedback |
| Chen | | Design and evaluation of a human-computer interface based on electrooculography |
Legal Events

| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |