CN108804246A - The usability evaluation method of upper limb rehabilitation robot - Google Patents
- Publication number
- CN108804246A CN108804246A CN201810592333.6A CN201810592333A CN108804246A CN 108804246 A CN108804246 A CN 108804246A CN 201810592333 A CN201810592333 A CN 201810592333A CN 108804246 A CN108804246 A CN 108804246A
- Authority
- CN
- China
- Prior art keywords
- subject
- upper limb
- limb rehabilitation
- rehabilitation robot
- angle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 210000001364 upper extremity Anatomy 0.000 title claims abstract description 87
- 238000011156 evaluation Methods 0.000 title claims abstract description 47
- 238000012360 testing method Methods 0.000 claims abstract description 57
- 230000004424 eye movement Effects 0.000 claims abstract description 36
- 230000033001 locomotion Effects 0.000 claims abstract description 31
- 238000012549 training Methods 0.000 claims abstract description 30
- 238000000034 method Methods 0.000 claims abstract description 22
- 238000002474 experimental method Methods 0.000 claims abstract description 12
- 210000000811 metacarpophalangeal joint Anatomy 0.000 claims description 10
- 230000008569 process Effects 0.000 claims description 10
- 210000003813 thumb Anatomy 0.000 claims description 10
- 210000002310 elbow joint Anatomy 0.000 claims description 8
- 210000000707 wrist Anatomy 0.000 claims description 8
- 238000010219 correlation analysis Methods 0.000 claims description 6
- 230000000694 effects Effects 0.000 claims description 6
- 210000003857 wrist joint Anatomy 0.000 claims description 6
- 210000001664 manubrium Anatomy 0.000 claims description 5
- 230000009471 action Effects 0.000 claims description 4
- 238000012545 processing Methods 0.000 claims description 4
- 210000000038 chest Anatomy 0.000 claims description 3
- 238000010831 paired-sample T-test Methods 0.000 claims description 3
- 210000000245 forearm Anatomy 0.000 claims 1
- 239000000700 radioactive tracer Substances 0.000 abstract description 6
- 238000005516 engineering process Methods 0.000 abstract description 5
- 230000006872 improvement Effects 0.000 abstract description 3
- 238000010586 diagram Methods 0.000 description 6
- 238000004458 analytical method Methods 0.000 description 3
- 230000003287 optical effect Effects 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 2
- 238000005259 measurement Methods 0.000 description 2
- 238000007427 paired t-test Methods 0.000 description 2
- 238000011056 performance test Methods 0.000 description 2
- 238000011160 research Methods 0.000 description 2
- 206010008190 Cerebrovascular accident Diseases 0.000 description 1
- 208000006011 Stroke Diseases 0.000 description 1
- 238000013459 approach Methods 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000002490 cerebral effect Effects 0.000 description 1
- 230000007812 deficiency Effects 0.000 description 1
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 210000003414 extremity Anatomy 0.000 description 1
- 230000035876 healing Effects 0.000 description 1
- 238000002372 labelling Methods 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 230000008520 organization Effects 0.000 description 1
- 238000011897 real-time detection Methods 0.000 description 1
- 210000000323 shoulder joint Anatomy 0.000 description 1
- 238000007619 statistical method Methods 0.000 description 1
- 230000009897 systematic effect Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/008—Reliability or availability analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Prevention of errors by analysis, debugging or testing of software
- G06F11/3668—Testing of software
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Rehabilitation Tools (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The present invention provides a usability evaluation method for an upper-limb rehabilitation robot, comprising the steps of: building a motion capture platform; recording the subject information and basic body dimensions of a subject; attaching marker points to multiple preset positions on the subject; performing an upper-limb rehabilitation training test with the training system of a target upper-limb rehabilitation robot while capturing the motion information of each marker point in real time; building an eye-tracker experiment platform; performing a software operation interface performance test with the software system of the target upper-limb rehabilitation robot; having the subject fill in and save a preset subjective evaluation questionnaire; and forming a usability test report from the motion information of the marker points, the eye-movement data and the subjective evaluation questionnaire. The usability evaluation method of the present invention evaluates the usability of an upper-limb rehabilitation robot using motion capture technology and eye-tracking technology, making it easy for testers to propose specific improvement ideas.
Description
Technical field
The present invention relates to the field of medical device usability testing, and in particular to a usability evaluation method for an upper-limb rehabilitation robot.
Background technology
Usability is a basic, intrinsic quality of a product: it is the combination of factors that influence how users experience a product or system, and it describes the degree to which the product can be used by end users. Introducing the concept of usability into the medical device field is particularly important; ISO 13485:2016, published by the International Organization for Standardization, explicitly requires that usability testing be carried out for all medical devices.
As a robot for treating upper-limb motor impairment in stroke patients, the upper-limb rehabilitation robot has the potential to improve rehabilitation outcomes, raise rehabilitation efficiency and reduce the cost of rehabilitation training, and has therefore been widely developed and used in recent years. However, usability testing and evaluation of upper-limb rehabilitation robots is rarely performed, and China's usability test case repositories contain few test cases for upper-limb rehabilitation robots. Performing usability evaluation on upper-limb rehabilitation robots has thus become an urgent need.
Optical motion capture is a technology based on computer vision principles, in which multiple high-speed cameras monitor and track target feature points from different angles to capture motion. The marker points are detected in real time from different angles by multiple motion capture cameras, and the data are transmitted in real time to a data processing workstation, which computes the spatial coordinates of the marker points according to the principle of triangulation and then derives the various movements of the human body from biomechanical principles. Because an optical motion capture system offers high precision, a high sampling rate, accurate capture, fast and flexible operation, low cost, markers that can be freely added and arranged, and a very wide range of applications, it has been widely used in the rehabilitation medicine field. By capturing the information of markers attached to key points of the human body, an optical motion capture system can analyse data such as the distance, displacement, velocity, acceleration, angle, angular velocity and angular acceleration of each joint of the human body, providing important evidence for human factors (ergonomic) evaluation.
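Purely as an illustration, and not part of the original disclosure, the following Python sketch shows how velocity, acceleration and speed might be derived from a captured marker trajectory by finite differences; the 100 Hz sampling rate and the synthetic trajectory are assumptions introduced for the example.

```python
import numpy as np

def marker_kinematics(positions, fs=100.0):
    """Velocity and acceleration of one marker trajectory by finite differences.
    positions: (n_frames, 3) array of x, y, z coordinates; fs: sampling rate in Hz."""
    dt = 1.0 / fs
    velocity = np.gradient(positions, dt, axis=0)       # first derivative per axis
    acceleration = np.gradient(velocity, dt, axis=0)    # second derivative per axis
    speed = np.linalg.norm(velocity, axis=1)            # scalar speed per frame
    return velocity, acceleration, speed

# Synthetic wrist-marker trajectory: a 0.25 m circular arc traced over 2 s at 100 Hz
t = np.linspace(0.0, 2.0, 200)
trajectory = np.stack([0.25 * np.cos(t), 0.25 * np.sin(t), np.full_like(t, 0.9)], axis=1)
vel, acc, speed = marker_kinematics(trajectory)
print(speed.mean(), np.linalg.norm(acc, axis=1).mean())
```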
Eye-tracking technology uses the user's gaze movement as its basis of measurement and is mainly suited to the evaluation of visual user interfaces. Early eye-tracking techniques were first applied in psychological research; in recent years, with progress in eye-movement theory and the appearance of precise gaze-tracking devices, eye tracking has gradually been applied to the field of usability testing, providing many researchers with a new approach to studying the usability of software interfaces. Eye tracking can objectively and scientifically evaluate the visibility and meaningfulness of software interface elements and the layout of the software interface, and can reveal the user's thought processes during operation, so as to assess the usability level of the software interface.
Invention content
In view of the above deficiencies in the prior art, the present invention provides a usability evaluation method for an upper-limb rehabilitation robot. Motion capture technology is used to perform usability evaluation of the training system of the upper-limb rehabilitation robot in terms of safety, validity and comfort, while eye-tracking technology is used to test the performance of the software operation interface of the upper-limb rehabilitation robot's software system and to verify the effectiveness, efficiency and user satisfaction of that interface.
To achieve the above objects, the present invention provides a usability evaluation method for an upper-limb rehabilitation robot, comprising the steps of:
S1: Build a Qualisys motion capture platform;
S2: Record the subject information and basic body dimensions of a subject;
S3: Attach marker points to multiple preset positions on the subject;
S4: Perform an upper-limb rehabilitation training test with the training system of a target upper-limb rehabilitation robot and capture the motion information of each marker point in real time;
S5: Build a Tobii eye-tracker experiment platform;
S6: Perform a software operation interface performance test with the software system of the target upper-limb rehabilitation robot and record the eye-movement data;
S7: Have the subject fill in and save a preset subjective evaluation questionnaire;
S8: Judge whether there are subjects who have not yet been tested; if so, return to step S1, otherwise continue with the subsequent steps;
S9: Obtain angle parameter information and workspace parameter information from the motion information of the marker points, and obtain a first usability evaluation result from the angle parameter information and the workspace parameter information; obtain a second usability evaluation result from the eye-movement data and the subjective evaluation questionnaire;
S10: Form a usability test report from the first usability evaluation result and the second usability evaluation result.
Preferably, the Qualisys motion capture platform comprises a plurality of infrared high-speed cameras for capturing the passively reflective marker points; the microcomputer system inside each infrared high-speed camera receives the feedback information of the marker points and transmits it in real time to the software platform provided with the Qualisys motion capture platform for real-time data display and subsequent data processing.
Preferably, the subject information includes gender, age, height and weight; the basic body dimensions include shoulder width, chest depth, sitting height, upper-arm length and forearm length.
Preferably, the marker points are reflective marker balls with a diameter of 14 mm.
Preferably, the preset positions of the marker points include: the upper end of the manubrium, the left acromion, the right acromion, the left elbow joint, the outside of the left wrist joint and the metacarpophalangeal joint of the left thumb.
Preferably, the Tobii eye-tracker experiment platform comprises an eye tracker and a Tobii Studio software platform, the Tobii Studio software platform being used to record and save the eye-movement data.
Preferably, the eye-movement data include fixation time, fixation count, a fixation-time heat map, an excess page-turn deviation and an eye-movement video recording;
the fixation time is the sum of the fixation times on all interfaces from the moment the subject starts operating until the test task is completed;
the fixation count is the total number of fixation points on all interfaces from the moment the subject starts operating until the test task is completed;
the fixation-time heat map reflects the distribution of fixation points and gaze activity on the screen;
the excess page-turn deviation is the difference between the actual number of page turns and the optimal number of page turns; the actual number of page turns is the number of pages the subject clicks through while completing the test task; the optimal number of page turns is the minimum number of pages required for the subject to complete the test task;
the eye-movement video recording is used to reproduce the test user's gaze movement and operation path while using the mobile phone.
Preferably, step S9 further comprises the steps of:
S91: Obtain angle parameter information and workspace parameter information from the motion information of the marker points; the angle parameter information includes the shoulder adduction-abduction angle, the shoulder flexion-extension angle, the elbow flexion-extension angle and the wrist flexion-extension angle; the workspace parameter information includes the distance from the left acromion to the metacarpophalangeal joint of the left thumb;
S92: Place the angle parameter information and the theoretical angle data displayed in real time by the corresponding software system on the same time axis, perform a paired-sample t-test with statistical analysis software, analyse the correlation between the angle parameter information and the corresponding theoretical angle data, and obtain a correlation analysis result;
the theoretical angle data include the theoretical shoulder adduction-abduction angle, the theoretical shoulder flexion-extension angle, the theoretical elbow flexion-extension angle and the theoretical wrist flexion-extension angle;
S93: Evaluate the validity of the training system according to the correlation analysis result; at the same time, compare the workspace parameter information with the sitting-posture workspace model of human factors engineering to verify the safety and comfort of the training system; obtain the first usability evaluation result;
S95: Identify usability problems from the eye-movement data;
S96: Evaluate the effectiveness, efficiency and user satisfaction of the software system according to usability indices, and obtain the second usability evaluation result;
the usability indices include: operating time, task completion rate, number of operating errors and subjective evaluation score;
the operating time is the time from the moment the subject starts operating until the test task is completed, excluding the time of the subject's failed operations;
the task completion rate is, for each operation task, the proportion of subjects who complete the test task successfully;
an operating error is any operation inconsistent with the correct path for completing the test task; within the same test task, the same error is counted only once;
the subjective evaluation score is the score of each preset item in the subjective evaluation questionnaire.
By adopting the above technical solution, the present invention has the following advantageous effects:
Through the use of the Qualisys motion capture platform, the present invention acquires the subject's angle parameter information and workspace parameters during upper-limb rehabilitation training, and, using the paired t-test statistical analysis method together with the sitting-posture workspace model of human factors engineering, can perform usability evaluation of the training system of the target upper-limb rehabilitation robot in terms of safety, validity and comfort. At the same time, through the use of the Tobii eye-tracker experiment platform, eye-tracking technology is used to test the performance of the software operation interface of the target upper-limb rehabilitation robot, and the subject's eye-movement data during operation of the software interface are acquired; by extracting and analysing multiple kinds of eye-movement data, the effectiveness, efficiency and user satisfaction of the software operation interface of the target upper-limb rehabilitation robot's software system can be verified.
Description of the drawings
Fig. 1 is a flow chart of the usability evaluation method of the upper-limb rehabilitation robot according to the embodiment of the present invention;
Fig. 2 is a schematic diagram of the system structure of the Qualisys motion capture platform of the embodiment of the present invention;
Fig. 3 is a schematic diagram of the preset positions of the marker points on the subject's upper limb in the embodiment of the present invention;
Fig. 4 is a schematic diagram of the software operation interface performance test of the embodiment of the present invention;
Fig. 5 to Fig. 8 are schematic diagrams of the reference positions for the angle parameter information of the embodiment of the present invention.
Specific implementation mode
Preferred embodiments of the present invention are described in detail below with reference to Fig. 1 to Fig. 8, so that the functions and features of the present invention can be better understood.
Referring to Fig. 1 to Fig. 8, a usability evaluation method for an upper-limb rehabilitation robot according to an embodiment of the present invention comprises the steps of:
S1: Build a Qualisys motion capture platform 1;
The Qualisys motion capture platform 1 comprises a plurality of infrared high-speed cameras 11 for capturing the passively reflective marker points 3; the microcomputer system inside each infrared high-speed camera 11 receives the feedback information of the marker points 3 and transmits it in real time to the software platform 12 provided with the Qualisys motion capture platform 1 for real-time data display and subsequent data processing.
S2: Record the subject information and basic body dimensions of a subject 4;
The subject information includes gender, age, height and weight; the basic body dimensions include shoulder width, chest depth, sitting height, upper-arm length and forearm length.
S3: Attach marker points 3 to multiple preset positions on the subject 4;
In this embodiment, the marker points 3 are reflective marker balls with a diameter of 14 mm. The preset positions include: the upper end of the manubrium, the left acromion, the right acromion, the left elbow joint, the outside of the left wrist joint and the metacarpophalangeal joint of the left thumb.
S4: Perform an upper-limb rehabilitation training test with the training system 2 of a target upper-limb rehabilitation robot and capture the motion information of each marker point 3 in real time;
S5: Build a Tobii eye-tracker experiment platform;
The Tobii eye-tracker experiment platform comprises an eye tracker and a Tobii Studio software platform, the Tobii Studio software platform being used to record and save the eye-movement data.
The eye-movement data include fixation time, fixation count, a fixation-time heat map, the excess page-turn deviation and an eye-movement video recording;
the fixation time is the sum of the fixation times on all interfaces from the moment subject 4 starts operating until the test task is completed;
the fixation count is the total number of fixation points on all interfaces from the moment subject 4 starts operating until the test task is completed;
the fixation-time heat map reflects the distribution of fixation points and gaze activity on the screen;
the excess page-turn deviation is the difference between the actual number of page turns and the optimal number of page turns; the actual number of page turns is the number of pages subject 4 clicks through while completing the test task; the optimal number of page turns is the minimum number of pages required for subject 4 to complete the test task;
the eye-movement video recording is used to reproduce the test user's gaze movement and operation path while using the mobile phone.
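As an illustrative sketch only, and not part of the patent disclosure, the following Python snippet shows one way the per-task eye-movement metrics defined above could be aggregated from exported fixation records; the record structure and field names are assumptions made for this example.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    interface: str        # interface/page on which the fixation occurred
    duration_ms: float    # fixation duration in milliseconds

def fixation_metrics(fixations, actual_page_turns, optimal_page_turns):
    """Aggregate the eye-movement indices for one test task."""
    total_fixation_time = sum(f.duration_ms for f in fixations)   # fixation time over all interfaces
    fixation_count = len(fixations)                                # number of fixation points
    excess_page_turn_deviation = actual_page_turns - optimal_page_turns
    return {
        "fixation_time_ms": total_fixation_time,
        "fixation_count": fixation_count,
        "excess_page_turn_deviation": excess_page_turn_deviation,
    }

# Example: one subject's task with three fixations and one unnecessary page turn
records = [Fixation("main_menu", 320), Fixation("training_setup", 510), Fixation("training_setup", 275)]
print(fixation_metrics(records, actual_page_turns=5, optimal_page_turns=4))
```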
S6: Perform a software operation interface performance test with the software system 5 of the target upper-limb rehabilitation robot and record the eye-movement data;
S7: Have subject 4 fill in and save a preset subjective evaluation questionnaire;
S8: Judge whether there are subjects 4 who have not yet been tested; if so, return to step S1, otherwise continue with the subsequent steps;
S9: Obtain angle parameter information and workspace parameter information from the motion information of the marker points 3, obtain a first usability evaluation result from the angle parameter information and the workspace parameter information, and obtain a second usability evaluation result from the eye-movement data and the subjective evaluation questionnaire;
Step S9 further comprises the steps of:
S91: Obtain angle parameter information and workspace parameter information from the motion information of the marker points 3; the angle parameter information includes the shoulder adduction-abduction angle a, the shoulder flexion-extension angle b, the elbow flexion-extension angle c and the wrist flexion-extension angle d; the workspace parameter information includes the distance from the left acromion to the metacarpophalangeal joint of the left thumb;
S92: Place the angle parameter information and the theoretical angle data displayed in real time by the corresponding software system 5 on the same time axis, perform a paired-sample t-test with statistical analysis software, analyse the correlation between the angle parameter information and the corresponding theoretical angle data, and obtain a correlation analysis result;
the theoretical angle data include the theoretical shoulder adduction-abduction angle, the theoretical shoulder flexion-extension angle, the theoretical elbow flexion-extension angle and the theoretical wrist flexion-extension angle;
S93: Evaluate the validity of the training system 2 according to the correlation analysis result; at the same time, compare the workspace parameter information with the sitting-posture workspace model of human factors engineering to verify the safety and comfort of the training system 2; obtain the first usability evaluation result;
S95: Identify usability problems from the eye-movement data;
S96: Evaluate the effectiveness, efficiency and user satisfaction of the software system 5 of the target upper-limb rehabilitation robot according to usability indices, and obtain the second usability evaluation result;
the usability indices include: operating time, task completion rate, number of operating errors and subjective evaluation score;
the operating time is the time from the moment subject 4 starts operating until the test task is completed, excluding the time of subject 4's failed operations;
the task completion rate is, for each operation task, the proportion of subjects 4 who complete the test task successfully;
an operating error is any operation inconsistent with the correct path for completing the test task; within the same test task, the same error is counted only once;
the subjective evaluation score is the score of each preset item in the subjective evaluation questionnaire.
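As an illustration only and not part of the disclosure, a minimal Python sketch of how the usability indices above might be aggregated across subjects follows; the task-log structure and field names are assumptions for the example.

```python
def usability_indices(task_logs):
    """task_logs: one dict per subject for a single test task, e.g.
       {"operating_time_s": 42.0, "completed": True,
        "errors": {"wrong_menu"}, "subjective_scores": [4, 5, 3]}"""
    n = len(task_logs)
    completion_rate = sum(1 for log in task_logs if log["completed"]) / n
    mean_operating_time = sum(log["operating_time_s"] for log in task_logs) / n
    # a distinct error is counted only once per subject within the same task, hence a set per log
    mean_error_count = sum(len(log["errors"]) for log in task_logs) / n
    mean_subjective_score = sum(
        sum(log["subjective_scores"]) / len(log["subjective_scores"]) for log in task_logs) / n
    return {
        "task_completion_rate": completion_rate,
        "mean_operating_time_s": mean_operating_time,
        "mean_operating_errors": mean_error_count,
        "mean_subjective_score": mean_subjective_score,
    }

logs = [
    {"operating_time_s": 41.2, "completed": True, "errors": {"wrong_menu"}, "subjective_scores": [4, 5, 4]},
    {"operating_time_s": 58.7, "completed": False, "errors": {"wrong_menu", "missed_save"}, "subjective_scores": [3, 3, 4]},
]
print(usability_indices(logs))
```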
S10: Form a usability test report from the first usability evaluation result and the second usability evaluation result.
For example, the usability evaluation method of the upper-limb rehabilitation robot according to the embodiment of the present invention is carried out with 15 subjects 4.
First, the tester builds the Qualisys motion capture platform 1 by placing six infrared high-speed cameras 11 around the upper-limb rehabilitation robot of the training system 2 of the target upper-limb rehabilitation robot (as shown in Fig. 2), at positions from which all marker points 3 on the upper limb of subject 4 can be fully captured. Once the positions of the infrared high-speed cameras 11 have been determined, the cameras are kept fixed, the motion capture system is calibrated, and the system error is recorded.
Before the experiment begins, the tester explains the main procedure of the experiment to subject 4 and answers any questions subject 4 raises.
Before each test, the subject information and basic body dimensions of subject 4 are recorded; the measurement reference positions for the basic body dimensions of subject 4 are shown in Table 1.
Table 1. Measurement reference positions for the basic body dimensions
Next, the tester attaches six marker points 3 to subject 4, at the upper end of the manubrium, the left and right acromions, the left elbow joint, the outside of the left wrist joint and the metacarpophalangeal joint of the left thumb; the positions of the marker points 3 on the upper limb of subject 4 are shown in Fig. 3.
The tester then starts the upper-limb rehabilitation robot, enters the software system 5 of the target upper-limb rehabilitation robot and has the robot perform a self-test.
Based on the subject information and basic body dimensions of subject 4, the tester creates a patient record in the upper-limb rehabilitation robot's software system and adjusts the length of the robot's exoskeleton arm to fit subject 4.
Subject 4 is seated on the seat of the upper-limb rehabilitation robot, the tester adjusts the seat to a height comfortable for subject 4, and the left upper limb of subject 4 is then strapped to the exoskeleton arm of the upper-limb rehabilitation robot.
The tester sets up a training program for subject 4 and executes the rehabilitation training; during this process subject 4 completes the corresponding training actions according to the training program created by the tester, while the Qualisys motion capture platform 1 acquires the marker point 3 information of subject 4 in real time.
After the training is completed, the tester releases subject 4 from the exoskeleton arm of the upper-limb rehabilitation robot and removes the marker points 3 from subject 4 so that they can be used for the next subject 4.
The tester saves the motion capture data of this subject 4's rehabilitation training under the subject 4's number.
After the usability test of the training system 2 of the target upper-limb rehabilitation robot is completed, the tester builds the Tobii eye-tracker experiment platform. As shown in Fig. 4, subject 4 sits in front of the software operation interface of the software system 5 of the target upper-limb rehabilitation robot and the Tobii eye-tracker experiment platform, with a distance of 50-70 cm between subject 4 and the eye tracker. The tester adjusts the seat of subject 4 to a suitable height so that the eye tracker can capture the eyes of subject 4, and then calibrates the eye tracker; during calibration, subject 4 moves his or her gaze to follow five points that appear in sequence on the computer screen. After calibration is completed, the eye-movement test of the software operation interface of the software system 5 can begin.
Subject 4 performs the corresponding software operations according to the test tasks given by the tester; during this process the tester records the time taken to complete each test task and the problems subject 4 encounters during the test.
After all test tasks have been completed, subject 4 fills in a subjective evaluation questionnaire.
The tester saves the eye-movement data and the subjective evaluation questionnaire of subject 4 under the subject's number.
The 15 subjects 4 carry out the usability tests of the training system and the software operation interface of the upper-limb rehabilitation robot according to the above steps.
The motion capture data of the 15 subjects 4 are processed: the six marker points 3 are connected by rigid-body definitions to obtain a model of the left upper limb of subject 4. The three marker points 3 at the upper end of the manubrium and the left and right acromions define a reference plane; the reference plane, the left acromion marker point 3 and the left elbow joint marker point 3 determine the angle of the shoulder joint of subject 4; the left acromion marker point 3, the left elbow joint marker point 3 and the left wrist outside marker point 3 determine the angle of the elbow joint of subject 4; the left elbow joint marker point 3, the left wrist outside marker point 3 and the left thumb metacarpophalangeal joint marker point 3 determine the angle of the wrist joint of subject 4; and the distance between the left acromion marker point 3 and the left thumb metacarpophalangeal joint marker point 3 determines the workspace of subject 4.
The shoulder adduction-abduction angle a, the shoulder flexion-extension angle b, the elbow flexion-extension angle c, the wrist flexion-extension angle d and the workspace data are selected and exported as a data table.
The angle data displayed in real time by the software system 5 of the target upper-limb rehabilitation robot, i.e. the theoretical angle data, are also imported into the data table, and the motion capture data are initialized so that the two groups of data share the same reference angles; the angle reference positions are listed in Table 2, and schematic diagrams of the joint angle reference positions are shown in Fig. 5 to Fig. 8.
Table 2. Joint angle reference positions
Shoulder adduction-abduction angle | 0° with the arm hanging vertically in front of the body
Shoulder flexion-extension angle | 0° with the arm hanging vertically downward
Elbow flexion-extension angle | 0° with the arm fully extended
Wrist flexion-extension angle | 0° with the palm fully extended
The two groups of angle data are aligned on the same time axis and a paired-sample t-test is performed with statistical analysis software, with the significance level set to 0.05. The correlation between the two groups of angle data is analysed by comparing the P value with 0.05, and the validity of the training system 2 of the target upper-limb rehabilitation robot is evaluated according to the analysis result.
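A minimal illustration (not part of the disclosure) of such a paired comparison with SciPy is shown below; the two angle series are synthetic stand-ins for the captured and theoretical shoulder flexion-extension angles.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
theoretical = np.linspace(0.0, 60.0, 500)                 # angle displayed by the software system
captured = theoretical + rng.normal(0.0, 1.5, size=500)   # angle measured by motion capture

t_stat, p_value = stats.ttest_rel(captured, theoretical)  # paired-sample t-test on the same time axis
alpha = 0.05
print(f"t = {t_stat:.3f}, p = {p_value:.3f}, "
      f"{'no significant difference' if p_value > alpha else 'significant difference'} at alpha = {alpha}")
```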
The workspace parameter is compared with the sitting-posture workspace model of human factors engineering to judge whether the distance from the left acromion to the metacarpophalangeal joint of the left thumb stays within the normal range and the maximum range of the human sitting-posture workspace, thereby verifying the safety and comfort of the training system 2 of the target upper-limb rehabilitation robot.
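The following fragment sketches such a range check; the normal and maximum reach thresholds are placeholder assumptions introduced for the example, not figures from the patent or from any ergonomic standard.

```python
def workspace_check(reach_m, normal_max_m=0.60, absolute_max_m=0.75):
    """Classify the acromion-to-thumb distance against assumed sitting-posture reach limits."""
    if reach_m <= normal_max_m:
        return "within normal sitting-posture workspace (comfortable)"
    if reach_m <= absolute_max_m:
        return "within maximum workspace but outside the normal range"
    return "outside the maximum sitting-posture workspace (potential safety issue)"

print(workspace_check(0.58))
print(workspace_check(0.78))
```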
The eye-movement data of the 15 subjects 4 are processed: the fixation times, fixation counts and page-turn counts are tallied, the subjective evaluation scores and the excess page-turn deviations are calculated, and the fixation-time heat maps and video recordings are used to analyse the thought processes and psychological behaviour of subjects 4 during the eye-movement experiment, from which usability problems are identified.
The effectiveness, efficiency and user satisfaction of the software operation interface of the software system 5 of the target upper-limb rehabilitation robot are evaluated according to four usability indices: operating time, task completion rate, number of operating errors and subjective evaluation score.
A usability test report is produced from the usability evaluation results of the training system 2 and the software system 5 of the target upper-limb rehabilitation robot, and specific improvement suggestions are proposed.
The usability evaluation method of the upper-limb rehabilitation robot according to the embodiment of the present invention uses motion capture technology to acquire the angle parameter information and workspace parameters of subject 4 during upper-limb rehabilitation training, and uses the paired t-test statistical analysis method together with the sitting-posture workspace parameters of human factors engineering to perform usability evaluation of the training system 2 of the target upper-limb rehabilitation robot in terms of safety, validity and comfort. Eye-tracking technology is used to test the performance of the software operation interface of the software system 5 of the target upper-limb rehabilitation robot, and the eye-movement data of subject 4 during operation of the software interface are acquired; by extracting and analysing multiple kinds of eye-movement data, the effectiveness, efficiency and user satisfaction of the software operation interface of the software system 5 of the target upper-limb rehabilitation robot can be verified. Through the usability test of the upper-limb rehabilitation robot, a usability test report can be produced that points out the usability problems of the target upper-limb rehabilitation robot and proposes improvement suggestions.
The present invention has been described in detail above with reference to the accompanying drawings, and those skilled in the art can make many variations of the present invention based on the above description. Therefore, certain details of the embodiments should not be construed as limiting the present invention, and the scope of the present invention shall be the scope defined by the appended claims.
Claims (8)
1. A usability evaluation method of an upper-limb rehabilitation robot, comprising the steps of:
S1: building a Qualisys motion capture platform;
S2: recording the subject information and basic body dimensions of a subject;
S3: attaching marker points to multiple preset positions on the subject;
S4: performing an upper-limb rehabilitation training test with the training system of a target upper-limb rehabilitation robot and capturing the motion information of each of the marker points in real time;
S5: building a Tobii eye-tracker experiment platform;
S6: performing a software operation interface performance test with the software system of the target upper-limb rehabilitation robot and recording eye-movement data;
S7: having the subject fill in and save a preset subjective evaluation questionnaire;
S8: judging whether there are subjects who have not yet been tested; if so, returning to step S1, otherwise continuing with the subsequent steps;
S9: obtaining angle parameter information and workspace parameter information from the motion information of the marker points, obtaining a first usability evaluation result from the angle parameter information and the workspace parameter information, and obtaining a second usability evaluation result from the eye-movement data and the subjective evaluation questionnaire;
S10: forming a usability test report from the first usability evaluation result and the second usability evaluation result.
2. The usability evaluation method of an upper-limb rehabilitation robot according to claim 1, wherein the Qualisys motion capture platform comprises a plurality of infrared high-speed cameras for capturing the passively reflective marker points, and the microcomputer system inside each infrared high-speed camera receives the feedback information of the marker points and transmits it in real time to the software platform provided with the Qualisys motion capture platform for real-time data display and subsequent data processing.
3. The usability evaluation method of an upper-limb rehabilitation robot according to claim 1, wherein the subject information includes gender, age, height and weight, and the basic body dimensions include shoulder width, chest depth, sitting height, upper-arm length and forearm length.
4. The usability evaluation method of an upper-limb rehabilitation robot according to claim 1, wherein the marker points are reflective marker balls with a diameter of 14 mm.
5. The usability evaluation method of an upper-limb rehabilitation robot according to claim 1, wherein the preset positions of the marker points include: the upper end of the manubrium, the left acromion, the right acromion, the left elbow joint, the outside of the left wrist joint and the metacarpophalangeal joint of the left thumb.
6. The usability evaluation method of an upper-limb rehabilitation robot according to claim 1, wherein the Tobii eye-tracker experiment platform comprises an eye tracker and a Tobii Studio software platform, the Tobii Studio software platform being used to record and save the eye-movement data.
7. The usability evaluation method of an upper-limb rehabilitation robot according to claim 1, wherein the eye-movement data include fixation time, fixation count, a fixation-time heat map, an excess page-turn deviation and an eye-movement video recording;
the fixation time is the sum of the fixation times on all interfaces from the moment the subject starts operating until the test task is completed;
the fixation count is the total number of fixation points on all interfaces from the moment the subject starts operating until the test task is completed;
the fixation-time heat map reflects the distribution of fixation points and gaze activity on the screen;
the excess page-turn deviation is the difference between the actual number of page turns and the optimal number of page turns, the actual number of page turns being the number of pages the subject clicks through while completing the test task, and the optimal number of page turns being the minimum number of pages required for the subject to complete the test task;
the eye-movement video recording is used to reproduce the test user's gaze movement and operation path while using the mobile phone.
8. The usability evaluation method of an upper-limb rehabilitation robot according to claim 7, wherein step S9 further comprises the steps of:
S91: obtaining angle parameter information and workspace parameter information from the motion information of the marker points, the angle parameter information including the shoulder adduction-abduction angle, the shoulder flexion-extension angle, the elbow flexion-extension angle and the wrist flexion-extension angle, and the workspace parameter information including the distance from the left acromion to the metacarpophalangeal joint of the left thumb;
S92: placing the angle parameter information and the theoretical angle data displayed in real time by the corresponding software system on the same time axis, performing a paired-sample t-test with statistical analysis software, analysing the correlation between the angle parameter information and the corresponding theoretical angle data, and obtaining a correlation analysis result, the theoretical angle data including the theoretical shoulder adduction-abduction angle, the theoretical shoulder flexion-extension angle, the theoretical elbow flexion-extension angle and the theoretical wrist flexion-extension angle;
S93: evaluating the validity of the training system according to the correlation analysis result, comparing the workspace parameter information with the sitting-posture workspace model of human factors engineering to verify the safety and comfort of the training system, and obtaining the first usability evaluation result;
S95: identifying usability problems from the eye-movement data;
S96: evaluating the effectiveness, efficiency and user satisfaction of the software system according to usability indices and obtaining the second usability evaluation result;
the usability indices including: operating time, task completion rate, number of operating errors and subjective evaluation score;
the operating time being the time from the moment the subject starts operating until the test task is completed, excluding the time of the subject's failed operations;
the task completion rate being, for each operation task, the proportion of subjects who complete the test task successfully;
an operating error being any operation inconsistent with the correct path for completing the test task, the same error being counted only once within the same test task;
the subjective evaluation score being the score of each preset item in the subjective evaluation questionnaire.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810592333.6A CN108804246A (en) | 2018-06-11 | 2018-06-11 | The usability evaluation method of upper limb rehabilitation robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810592333.6A CN108804246A (en) | 2018-06-11 | 2018-06-11 | The usability evaluation method of upper limb rehabilitation robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108804246A true CN108804246A (en) | 2018-11-13 |
Family
ID=64088166
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810592333.6A Pending CN108804246A (en) | 2018-06-11 | 2018-06-11 | The usability evaluation method of upper limb rehabilitation robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108804246A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112395174A (en) * | 2019-08-15 | 2021-02-23 | 中移(苏州)软件技术有限公司 | Data processing method and device, equipment and storage medium |
CN112949031A (en) * | 2021-01-27 | 2021-06-11 | 国家体育总局体育科学研究所 | Upper limb movement space range calculation system, construction method and use method thereof |
CN115670465A (en) * | 2022-09-27 | 2023-02-03 | 上汽通用五菱汽车股份有限公司 | Vehicle dynamic and static user experience evaluation method and device, electronic equipment and medium |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1637749A (en) * | 2003-12-30 | 2005-07-13 | 现代自动车株式会社 | Usability evaluation system for driver information system of vehicles |
CN102075796A (en) * | 2010-12-14 | 2011-05-25 | 中山大学 | Set top box-based usability testing method |
CN103505179A (en) * | 2012-06-27 | 2014-01-15 | 庄臣及庄臣视力保护公司 | Free form custom lens design manufacturing apparatus, system and business method |
WO2014015100A2 (en) * | 2012-07-18 | 2014-01-23 | Curlin Medical Inc. | Systems and methods for validating treatment instructions |
CN103713728A (en) * | 2014-01-14 | 2014-04-09 | 东南大学 | Method for detecting usability of complex system human-machine interface |
CN105147248A (en) * | 2015-07-30 | 2015-12-16 | 华南理工大学 | Physiological information-based depressive disorder evaluation system and evaluation method thereof |
CN106063699A (en) * | 2016-05-23 | 2016-11-02 | 上海理工大学 | A kind of medical apparatus and instruments description usability evaluation method based on eye movement technique |
CN107951487A (en) * | 2017-12-08 | 2018-04-24 | 上海理工大学 | A kind of multi-parameter collecting system for aiding in pressure relief ball rehabilitation training |
CN108261197A (en) * | 2018-03-19 | 2018-07-10 | 上海理工大学 | Upper limb healing evaluation system and method based on surface myoelectric and motion module |
- 2018
  - 2018-06-11 CN CN201810592333.6A patent/CN108804246A/en active Pending
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1637749A (en) * | 2003-12-30 | 2005-07-13 | 现代自动车株式会社 | Usability evaluation system for driver information system of vehicles |
CN102075796A (en) * | 2010-12-14 | 2011-05-25 | 中山大学 | Set top box-based usability testing method |
CN103505179A (en) * | 2012-06-27 | 2014-01-15 | 庄臣及庄臣视力保护公司 | Free form custom lens design manufacturing apparatus, system and business method |
WO2014015100A2 (en) * | 2012-07-18 | 2014-01-23 | Curlin Medical Inc. | Systems and methods for validating treatment instructions |
CN103713728A (en) * | 2014-01-14 | 2014-04-09 | 东南大学 | Method for detecting usability of complex system human-machine interface |
CN105147248A (en) * | 2015-07-30 | 2015-12-16 | 华南理工大学 | Physiological information-based depressive disorder evaluation system and evaluation method thereof |
CN106063699A (en) * | 2016-05-23 | 2016-11-02 | 上海理工大学 | A kind of medical apparatus and instruments description usability evaluation method based on eye movement technique |
CN107951487A (en) * | 2017-12-08 | 2018-04-24 | 上海理工大学 | A kind of multi-parameter collecting system for aiding in pressure relief ball rehabilitation training |
CN108261197A (en) * | 2018-03-19 | 2018-07-10 | 上海理工大学 | Upper limb healing evaluation system and method based on surface myoelectric and motion module |
Non-Patent Citations (2)
Title |
---|
ADSO FERNANDEZ-BAENA 等: "Biomechanical Validation of Upper-body and Lower-body Joint Movements of Kinect Motion Capture Data for Rehabilitation Treatments", 《2012 FOURTH INTERNATIONAL CONFERENCE ON INTELLIGENT NETWORKING AND COLLABORATIVE SYSTEMS》 * |
丁竹 等: "基于光学运动捕捉的上肢康复机器人可用性测试研究" [Usability testing research on an upper-limb rehabilitation robot based on optical motion capture], 《生物医学工程学进展》 [Progress in Biomedical Engineering] * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112395174A (en) * | 2019-08-15 | 2021-02-23 | 中移(苏州)软件技术有限公司 | Data processing method and device, equipment and storage medium |
CN112949031A (en) * | 2021-01-27 | 2021-06-11 | 国家体育总局体育科学研究所 | Upper limb movement space range calculation system, construction method and use method thereof |
CN112949031B (en) * | 2021-01-27 | 2023-05-12 | 国家体育总局体育科学研究所 | Upper limb action space range calculation system, construction method and use method thereof |
CN115670465A (en) * | 2022-09-27 | 2023-02-03 | 上汽通用五菱汽车股份有限公司 | Vehicle dynamic and static user experience evaluation method and device, electronic equipment and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Slade et al. | An open-source and wearable system for measuring 3D human motion in real-time | |
US20220005577A1 (en) | Systems, apparatus and methods for non-invasive motion tracking to augment patient administered physical rehabilitation | |
D’Antonio et al. | Validation of a 3D markerless system for gait analysis based on OpenPose and two RGB webcams | |
CN110200601B (en) | Device and system for obtaining pulse condition | |
Webster et al. | Experimental evaluation of Microsoft Kinect's accuracy and capture rate for stroke rehabilitation applications | |
CN105832343B (en) | Multidimensional visual hand function rehabilitation quantitative evaluation system and evaluation method | |
CN105031908B (en) | One kind balance correction-type trainer | |
US11426099B2 (en) | Mobile device avatar generation for biofeedback to customize movement control | |
US10722165B1 (en) | Systems and methods for reaction measurement | |
US20170035330A1 (en) | Mobility Assessment Tool (MAT) | |
US20200129109A1 (en) | Mobility Assessment Tracking Tool (MATT) | |
Wang et al. | Literature review on wearable systems in upper extremity rehabilitation | |
US10433725B2 (en) | System and method for capturing spatially and temporally coherent eye gaze and hand data during performance of a manual task | |
CN108804246A (en) | The usability evaluation method of upper limb rehabilitation robot | |
CN111598134B (en) | Test analysis method for gymnastics movement data monitoring | |
CN105902273A (en) | Hand function rehabilitation quantitative evaluation method based on hand ulnar deviation motion | |
Cunha et al. | Real-time evaluation system for top taekwondo athletes: project overview | |
TWI509556B (en) | Goal - oriented Rehabilitation Auxiliary System and Its Work Setting Method | |
Szücs et al. | Improved algorithms for movement pattern recognition and classification in physical rehabilitation | |
Zhu et al. | A protocol for digitalized collection of Traditional Chinese Medicine (TCM) pulse information using bionic pulse diagnosis equipment | |
US20210267494A1 (en) | Analysis system and method of joint movement | |
KR20140132864A (en) | easy measuring meathods for physical and psysiological changes on the face and the body using users created contents and the service model for healing and wellness using these techinics by smart devices | |
Luo et al. | An interactive therapy system for arm and hand rehabilitation | |
Rovini et al. | Vision optical-based evaluation of SensHand accuracy for Parkinson’s disease motor assessment | |
Luo et al. | A virtual reality system for arm and hand rehabilitation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20181113 |
RJ01 | Rejection of invention patent application after publication |