
CN115985153A - Operation and maintenance personnel training system based on image processing and behavior recognition - Google Patents


Info

Publication number
CN115985153A
Authority
CN
China
Prior art keywords: training, data, model, module, maintenance personnel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310026362.7A
Other languages
Chinese (zh)
Inventor
梁懿
苏江文
黄志勇
王从
李云凡
孙治书
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fuzhou University
Fujian Yirong Information Technology Co Ltd
Original Assignee
Fuzhou University
Fujian Yirong Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuzhou University and Fujian Yirong Information Technology Co Ltd
Priority to CN202310026362.7A
Publication of CN115985153A
Legal status: Pending (current)

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y04: INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
    • Y04S: SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
    • Y04S 10/00: Systems supporting electrical power generation, transmission or distribution
    • Y04S 10/50: Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of image processing and discloses an operation and maintenance personnel training system based on image processing and behavior recognition, which comprises: a modeling module for extracting modeling information and generating a simulation training model based on the modeling information; a training data acquisition unit for acquiring training data during the training of the operation and maintenance personnel; a training data preprocessing unit for processing the acquired training data to obtain a current training model; a model preprocessing module for preprocessing the current training model and the reference training model; and a scoring module for scoring the training of the operation and maintenance personnel. The invention performs operation and maintenance training simulation in a virtual reality environment, collects physiological data, action data and image data as training data, processes the heterogeneous training data through a comprehensive data processing method, and evaluates the level of the operation and maintenance personnel across the physiological, psychological and knowledge dimensions.

Description

Operation and maintenance personnel training system based on image processing and behavior recognition
Technical Field
The invention relates to the technical field of image processing, in particular to an operation and maintenance personnel training system based on image processing and behavior recognition.
Background
Owing to factors such as complex substation site conditions and uneven personnel quality, substation operation and maintenance work still carries safety risks introduced by human factors. Evaluating the comprehensive level of power transformation operation and maintenance personnel is therefore important and can effectively reduce these risks. The conventional way of evaluating that level is a rote written examination, whose content is narrow and deviates considerably from actual operation, so it cannot assess the operation and maintenance level of the personnel under real conditions.
Disclosure of Invention
The invention provides an operation and maintenance personnel training system based on image processing and behavior recognition, which solves the technical problem that the operation and maintenance level of power transformation operation and maintenance personnel under a real condition cannot be evaluated in the related technology.
The invention provides an operation and maintenance personnel training system based on image processing and behavior recognition, which comprises:
the modeling module is used for extracting modeling information and generating a simulation training model based on the modeling information;
the personnel information management module is used for managing personnel information of operation and maintenance personnel;
the training starting module is used for starting training;
the training data acquisition unit is used for acquiring training data in the training process of the operation and maintenance personnel; the training data comprises physiological data, action data and image data;
the training data preprocessing unit is used for processing the acquired training data to obtain a current training model;
a training database for storing a reference training model, the reference training model including a plurality of action nodes and the data slice sets corresponding to the action nodes;
the model preprocessing module is used for preprocessing the current training model and the reference training model;
a reference model selection module for calculating the similarity S1 between the first node set of the current training model and the first node set of each reference training model, and selecting the reference training model with the largest similarity S1 to the current training model as the final reference training model;
a scoring module for scoring the training of the operation and maintenance personnel, wherein the score is in direct proportion to the similarity S2 between the current training model and the final reference training model;
and the teaching module recommends teaching contents for the operation and maintenance personnel based on the training scores of the operation and maintenance personnel.
Furthermore, the modeling module comprises a modeling data acquisition module and a model generation module, the modeling data acquisition module is used for acquiring modeling data, and the model generation module generates a simulation training model based on the modeling data.
Further, the personnel information management module comprises:
the personnel information storage module is used for storing personnel information of operation and maintenance personnel;
the personnel information updating module is used for updating the personnel information stored by the personnel information storage module;
and the personnel identification and verification module is used for matching the corresponding personnel information based on the input biometric information; if no corresponding personnel information can be matched, the verification fails and the personnel's access is refused.
Further, the training starting module comprises a training item storage module and a training item selection module, wherein the training item storage module is used for storing training items;
the training item selection module is used for selecting a training item;
the training item selection module may select the training item based on a request input by an operation and maintenance person or a system administrator through a human-machine interface.
And a training environment generation module which generates a training environment based on the scene model, the equipment model and the tool model corresponding to the training item selected by the training item selection module.
Further, each training item is matched with more than one scene model, more than one equipment model and more than one tool model, and the training environment comprises the scene model, the equipment model and the personnel model.
Furthermore, the training data preprocessing unit judges the time points of the operation and maintenance personnel's actions based on the action data and takes them as action nodes, and extracts the image data and physiological data at the time of each action node as a data slice set; all data slice sets of the operation and maintenance personnel are extracted as the current training model;
the training data preprocessing unit judges the action nodes by the following steps:
recording the action data at the moment the operation and maintenance worker starts training as the initial action data, traversing the action data backward frame by frame from the initial action data until the traversal termination condition is reached, and recording the time point at which the traversal terminates as a new action node;
then iteratively executing the following process: traversing the action data backward frame by frame starting from the action data corresponding to the most recently recorded action node until the traversal termination condition is reached, and recording the time point at which the traversal terminates as a new action node; the traversal termination condition is that the distance d between the action data reached by the traversal and the action data corresponding to the previous action node exceeds a set first distance threshold;
the termination condition of the iteration is that all action data have been traversed once.
Further, the step of preprocessing the current training model and the reference training model comprises:
step 101, generating a corresponding slice set object for each data slice set of the training model, wherein the attributes of the slice set object are generated based on the action data and the physiological data;
step 102, generating N first nodes, wherein each first node maps to a two-dimensional coordinate point, the two-dimensional coordinate points of all the first nodes are distributed in a matrix array, the number of two-dimensional coordinate points in each column is the same as the number in each row, and the minimum distance between adjacent two-dimensional coordinate points is 1;
step 103, randomly selecting N slice set objects of the training model as the first slice set objects, and mapping the first slice set objects to the first nodes one by one;
step 104, randomly selecting a slice set object from the training model as the second slice set object, calculating the similarity s between the second slice set object and each first slice set object, selecting the first slice set object with the largest similarity s to the second slice set object, and updating the attribute values of that first slice set object and of the first slice set objects associated with it;
the attribute values of the first slice set object and of its associated first slice set objects are updated by a formula that appears only as an image in the original publication; in that formula, a'_j denotes the value of the jth attribute of a first slice set object after the update, a_j the value of the jth attribute of that first slice set object before the update, b_j the value of the jth attribute of the second slice set object, and t the number of times all the first slice set objects have been updated;
step 105, step 104 is executed iteratively until all slice set objects have been selected as the second slice set object.
Further, the method of calculating the similarity S1 comprises:
given a weighted bipartite graph G whose vertex sets are X and Y, wherein the set X is the first node set of the current training model and the set Y is the first node set of the reference training model, solving the maximum-weight perfect matching of the weighted bipartite graph G by the Kuhn-Munkres algorithm;
when solving the maximum-weight perfect matching of the weighted bipartite graph G, the initial weight of an edge is the similarity s of the slice set objects mapped to its two first nodes;
the sum of the weights of the matched edges obtained from the maximum-weight perfect matching is taken as the similarity S1.
Further, the method of calculating the similarity S2 comprises:
generating a second node for each data slice set of the current training model and of the final reference training model;
given a weighted bipartite graph U whose vertex sets are A and B, wherein the set A is the second node set of the current training model and the set B is the second node set of the final reference training model, solving the maximum-weight perfect matching of the weighted bipartite graph U by the Kuhn-Munkres algorithm;
when solving the maximum-weight perfect matching of the weighted bipartite graph U, the initial weight of an edge is the similarity w of its two second nodes;
the sum of the weights of the matched edges obtained from the maximum-weight perfect matching is taken as the similarity S2;
wherein the similarity w of two second nodes is calculated by a formula that appears only as an image in the original publication; in that formula, d_p denotes the distance between the physiological data corresponding to the two second nodes, s_v denotes the similarity of the image data corresponding to the two second nodes, and α and β are a defined first weight and a defined second weight that adjust how strongly the physiological data and the image data contribute to the similarity w.
Further, the score H is calculated by a formula, appearing only as an image in the original publication, in which H is in direct proportion to S2, the similarity between the current training model and the final reference training model.
The invention has the beneficial effects that:
the invention carries out operation and maintenance training simulation based on the virtual reality environment, collects physiological data, action data and image data as training data, processes heterogeneous training data through a comprehensive data processing method, and evaluates the level of operation and maintenance personnel from multiple dimensions of physiology, psychology and knowledge.
Drawings
FIG. 1 is a block diagram of an operation and maintenance personnel training system based on image processing and behavior recognition according to the present invention;
FIG. 2 is a block schematic diagram of a modeling module of the present invention;
FIG. 3 is a block diagram of a personnel information management module of the present invention;
FIG. 4 is a block diagram of a training initiation module of the present invention;
FIG. 5 is a flow chart of a method of pre-processing a current training model and a reference training model of the present invention.
In the figure: the system comprises a modeling module 101, a modeling data acquisition module 1011, a model generation module 1012, a personnel information management module 102, a personnel information storage module 1021, a personnel information updating module 1022, a personnel identification verification module 1023, a training starting module 103, a training item storage module 1031, a training item selection module 1032, a training environment generation module 104, a training data acquisition unit 105, a training data preprocessing unit 106, a training database 107, a model preprocessing module 108, a reference model selection module 109, a scoring module 110 and a teaching module 111.
Detailed Description
The subject matter described herein will now be discussed with reference to example embodiments. It is to be understood that such embodiments are merely discussed to enable those skilled in the art to better understand and thereby implement the subject matter described herein, and that changes may be made in the function and arrangement of the elements discussed without departing from the scope of the present disclosure. Various examples may omit, substitute, or add various procedures or components as needed. In addition, features described with respect to some examples may also be combined in other examples.
Example one
As shown in fig. 1 to 5, an operation and maintenance personnel training system based on image processing and behavior recognition includes:
a modeling module 101 for extracting modeling information and generating a simulation training model based on the modeling information;
in one embodiment of the invention, the simulation training model comprises a scene model, an equipment model, a tool model and a personnel model, wherein the equipment model comprises a normal equipment model and a fault equipment model.
In an embodiment of the present invention, the modeling module 101 includes a modeling data acquisition module 1011 and a model generation module 1012, the modeling data acquisition module 1011 is configured to acquire modeling data, and the model generation module 1012 generates a simulation training model based on the modeling data;
the personnel information management module 102 is used for managing personnel information of operation and maintenance personnel;
in one embodiment of the invention, the personnel information of the operation and maintenance personnel comprises personnel information such as personnel ID, personnel gender, personnel age, personnel working age, personnel height, personnel weight, personnel biological identification information and the like.
In one embodiment of the present invention, the personnel information management module 102 includes:
a personnel information storage module 1021 for storing personnel information of the operation and maintenance personnel;
a personnel information updating module 1022 for updating the personnel information stored by the personnel information storage module 1021;
and a person identification and verification module 1023, which matches the corresponding person information based on the input biometric information; if no corresponding person information can be matched, the verification fails and access is denied.
The input of the biometric information may be through an external biometric information collection device.
A training starting module 103, wherein the training starting module 103 is used for starting training;
in an embodiment of the present invention, the training initiation module 103 includes a training item storage module 1031 and a training item selection module 1032, where the training item storage module 1031 is used to store training items, and each training item matches more than one scene model, more than one equipment model, and more than one tool model;
the training item selection module 1032 is used to select a training item;
the selection of a training item by the training item selection module 1032 may be selected based on a request input by an operation and maintenance person or a system administrator through a human-machine interface.
And a training environment generation module 104 that generates a training environment based on the scene model, the device model, and the tool model corresponding to the training item selected by the training item selection module 1032.
The training environment comprises a scene model, an equipment model and a personnel model. Wherein the number of the person models is the same as the number of persons participating in the training program.
For example, if the same training environment accommodates three persons A, B and C training together, three personnel models corresponding to A, B and C are arranged in the training environment; if only one person is training, only one personnel model needs to be generated in the training environment.
A training data acquisition unit 105, configured to acquire training data of an operation and maintenance worker during a training process;
the training data comprises physiological data, action data and image data;
in one embodiment of the invention, the physiological data represents physical characteristic indexes of the operation and maintenance personnel, such as heart rate, blood pressure, body temperature and the like.
In one embodiment of the invention, the image data is a virtual image acquired in a virtual training environment, and at least one image acquisition point is arranged in one training environment;
in one embodiment of the invention, the motion data is collected by a wearable motion detection device that includes sensors worn on the person.
A training data preprocessing unit 106, configured to process the acquired training data to obtain a current training model;
the training data preprocessing unit judges the time points of the operation and maintenance personnel's actions based on the action data and takes them as action nodes, and extracts the image data and physiological data at the time of each action node as a data slice set; all data slice sets of the operation and maintenance personnel are extracted as the current training model;
the data slice set is essentially a multi-source heterogeneous data set extracted at the time point of a specific action node. In one embodiment of the invention, a method for judging the action nodes is provided, in which the distance d between the action data corresponding to two adjacent action nodes exceeds a set first distance threshold:
starting from the action data corresponding to the previous action node, the action data are traversed backward frame by frame; the traversal termination condition is that the distance d between the action data reached by the traversal and the action data corresponding to the previous action node exceeds the set first distance threshold, and the time point of the action data at which the traversal terminates is taken as the next action node;
the action data at the moment the operation and maintenance worker starts training is taken as the first action node, and the traversal starts from that node.
Specifically, the action data when the operation and maintenance worker starts training is recorded as the initial action data, the action data are traversed backward frame by frame from the initial action data until the traversal termination condition is reached, and the time point at which the traversal terminates is recorded as a new action node;
then the following process is executed iteratively: the action data are traversed backward frame by frame starting from the action data corresponding to the most recently recorded action node until the traversal termination condition is reached, and the time point at which the traversal terminates is recorded as a new action node;
the termination condition of the iteration is that all action data have been traversed once.
In one embodiment of the invention, the distance between two pieces of action data is calculated by treating each piece of action data as an object and applying a distance measure such as the Euclidean distance or the cosine distance;
specifically, the action data include the wrist joint angle, the knee joint angle, the elbow joint angle and the head angle, and the distance d between two pieces of action data can be calculated by a formula that appears only as an image in the original publication, in which the two quantities being compared are the ith body angles of the two pieces of action data respectively;
in one embodiment of the invention, the length of a time frame is equal to the sampling interval of the action data; for example, if the action data are sampled every 0.5 s, the time frame is correspondingly set to 0.5 s.
A training database 107 for storing reference training models, each including a plurality of action nodes and the data slice sets corresponding to the action nodes;
as with the current training model, the data slice sets of a reference training model also contain image data and physiological data;
in one embodiment of the invention, a reference training model is generated from the training record of an operation and maintenance operation performed by a tester according to the standard operating procedure, and serves as a comparison reference; for one training item, different testers can generate a plurality of reference training models from different training records, so an appropriate reference training model needs to be selected as the reference during comparison.
A model preprocessing module 108 for preprocessing the current training model and the reference training model;
the pretreatment step comprises:
step 101, generating a corresponding patch object for a data patch set of a training model, wherein the attribute of the patch object is generated based on motion data and physiological data;
for example, attributes of the patch objects include wrist joint angle, knee joint angle, elbow joint angle, head angle, heart rate, blood pressure, body temperature.
102, generating N first nodes, wherein each first node maps a two-dimensional coordinate point, the two-dimensional coordinate points of all the first nodes are distributed in a matrix array manner, the number of the two-dimensional coordinate points in each column is the same as that of the two-dimensional coordinate points in each row, and the minimum distance between every two adjacent two-dimensional coordinate points is 1;
n is proportional to the total number of action nodes of the training model, and in one embodiment of the invention, the formula for N is as follows:
N=
Figure DEST_PATH_IMAGE046
and S is the total number of action nodes of the training model.
Step 103, randomly selecting N slice set objects of the training model as first slice set objects, and mapping the first slice set objects to first nodes one by one;
step 104, randomly selecting a piece set object from the training model as a second piece set object, and calculating the similarity between the second piece set object and the first piece set object
Figure 592565DEST_PATH_IMAGE008
Selecting a degree of similarity ^ with a second slice set object>
Figure 210628DEST_PATH_IMAGE008
The largest first slice set object and updating the attribute values of the first slice set object and its associated first slice set object;
the formula for updating the attribute values of the first slice set object and the first slice set object associated therewith is as follows:
Figure DEST_PATH_IMAGE010A
wherein
Figure 697717DEST_PATH_IMAGE012
Value representing the jth attribute of the first slice set object after an update, based on the value of the jth attribute, and based on the value of the jth attribute>
Figure 401231DEST_PATH_IMAGE014
The value of the jth attribute representing the first slice set object before the update, <' >>
Figure 87427DEST_PATH_IMAGE016
The value of the jth attribute representing the second slice object, and t represents the number of times all the first slice objects have been updated.
In one embodiment of the invention, the similarity s of two slice set objects is calculated by a formula that appears only as an image in the original publication; in that formula, x_z and y_z denote the zth attribute values of the two slice set objects respectively, and p is the total number of attributes of a slice set object.
Of course, the cosine similarity can also be adopted to calculate the similarity s of slice set objects.
Step 105, step 104 is executed iteratively until all slice set objects have been selected as the second slice set object;
the iterative process does not select the same slice set object twice as the second slice set object.
A reference model selection module 109 for calculating the similarity S1 between the current training model and each reference training model, and selecting the reference training model with the largest similarity S1 to the current training model as the final reference training model;
in one embodiment of the invention, the method of calculating the similarity S1 comprises:
given a weighted bipartite graph G whose vertex sets are X and Y, wherein the set X is the first node set of the current training model and the set Y is the first node set of the reference training model, solving the maximum-weight perfect matching of the weighted bipartite graph G by the Kuhn-Munkres algorithm;
when solving the maximum-weight perfect matching of the weighted bipartite graph G, the initial weight of an edge is the similarity s of the slice set objects mapped to its two first nodes;
the sum of the weights of the matched edges obtained from the maximum-weight perfect matching is taken as the similarity S1.
In this way, the invention selects the most suitable reference training model to participate in scoring the operation and maintenance personnel, taking into account aspects such as the personnel's physical data and number of action nodes.
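A sketch of the S1 computation under the stated Kuhn-Munkres maximum-weight perfect matching; scipy.optimize.linear_sum_assignment is used here as an equivalent assignment solver, which is a choice of this sketch rather than something the patent prescribes. It assumes both first node sets have the same size, as a perfect matching requires.

import numpy as np
from scipy.optimize import linear_sum_assignment

def model_similarity_s1(current_first, reference_first, slice_similarity):
    # current_first, reference_first: attribute vectors of the first slice set objects mapped to the
    # first node sets X (current model) and Y (reference model); edge weights are their similarities s
    weights = np.array([[slice_similarity(x, y) for y in reference_first] for x in current_first])
    rows, cols = linear_sum_assignment(weights, maximize=True)        # Kuhn-Munkres-equivalent matching
    return float(weights[rows, cols].sum())                          # S1 = sum of matched edge weights

def select_final_reference(current_first, reference_models, slice_similarity):
    # the reference training model with the largest S1 becomes the final reference training model
    return max(reference_models, key=lambda ref: model_similarity_s1(current_first, ref, slice_similarity))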
A scoring module 110 for scoring the training of the operation and maintenance personnel, wherein the score is in direct proportion to the similarity S2 between the current training model and the final reference training model.
The method of calculating the similarity S2 comprises:
generating a second node for each data slice set of the current training model and of the final reference training model;
given a weighted bipartite graph U whose vertex sets are A and B, wherein the set A is the second node set of the current training model and the set B is the second node set of the final reference training model, solving the maximum-weight perfect matching of the weighted bipartite graph U by the Kuhn-Munkres algorithm;
when solving the maximum-weight perfect matching of the weighted bipartite graph U, the initial weight of an edge is the similarity w of its two second nodes;
the sum of the weights of the matched edges obtained from the maximum-weight perfect matching is taken as the similarity S2;
wherein the similarity w of two second nodes is calculated by a formula that appears only as an image in the original publication; in that formula, d_p denotes the distance between the physiological data corresponding to the two second nodes, s_v denotes the similarity of the image data corresponding to the two second nodes, and α and β are a defined first weight and a defined second weight that adjust how strongly the physiological data and the image data contribute to the similarity w.
The similarity of the image data may be calculated by methods such as histogram matching or feature-point-based image similarity calculation.
The distance d_p of the physiological data may be calculated with a distance measure such as the Euclidean distance or the cosine distance.
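The formula combining the physiological-data distance and the image-data similarity into the second-node similarity w appears only as an image in the original, so the combination below is an assumption; it merely illustrates the stated ingredients, using a Euclidean physiological distance and OpenCV histogram matching for the image similarity.

import math
import cv2
import numpy as np

def physiological_distance(p1, p2) -> float:
    # Euclidean distance between two physiological vectors (heart rate, blood pressure, body temperature, ...)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

def image_similarity(img1: np.ndarray, img2: np.ndarray) -> float:
    # histogram matching, as mentioned in the description, on grayscale images
    h1 = cv2.calcHist([img1], [0], None, [64], [0, 256])
    h2 = cv2.calcHist([img2], [0], None, [64], [0, 256])
    return float(cv2.compareHist(h1, h2, cv2.HISTCMP_CORREL))

def second_node_similarity(p1, p2, img1, img2, alpha: float, beta: float) -> float:
    # assumed combination of the first weight alpha and the second weight beta: larger w when the
    # physiological data are close (small d_p) and the image data are alike (large s_v)
    d_p = physiological_distance(p1, p2)
    s_v = image_similarity(img1, img2)
    return alpha * (1.0 / (1.0 + d_p)) + beta * s_v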
A teaching module 111, which recommends teaching content for the operation and maintenance personnel based on their training scores.
If the training score of an operation and maintenance worker is lower than the score threshold of the current training item, teaching content is recommended to that worker.
The teaching content includes a teaching video generated based on the image data of the final reference training model.
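A small sketch of the scoring and recommendation logic: the score H is kept in direct proportion to S2 (the actual proportionality formula appears only as an image in the original, so the constant here is a placeholder), and teaching content is recommended only when the score falls below the current training item's threshold.

def training_score(s2: float, scale: float = 100.0) -> float:
    # score H in direct proportion to the similarity S2; 'scale' is an assumed constant of proportionality
    return scale * s2

def recommend_teaching(score: float, score_threshold: float, reference_image_data):
    # teaching content (a video generated from the final reference training model's image data)
    # is recommended only when the trainee's score is below the threshold of the current training item
    if score < score_threshold:
        return list(reference_image_data)
    return []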
The above describes embodiments of the present invention, but the present invention is not limited to these specific embodiments, which are merely illustrative and not restrictive; those skilled in the art may devise many other forms without departing from the spirit of the present invention, and all such forms fall within the protection scope of the present invention.

Claims (10)

1. An operation and maintenance personnel training system based on image processing and behavior recognition is characterized by comprising:
the modeling module is used for extracting modeling information and generating a simulation training model based on the modeling information;
the personnel information management module is used for managing personnel information of operation and maintenance personnel;
the training starting module is used for starting training;
the training data acquisition unit is used for acquiring training data in the training process of the operation and maintenance personnel; the training data comprises physiological data, action data and image data;
the training data preprocessing unit is used for processing the acquired training data to obtain a current training model;
the training database is used for storing a reference training model, and the reference training model comprises a plurality of action nodes and the data slice sets corresponding to the action nodes;
the model preprocessing module is used for preprocessing the current training model and the reference training model;
a reference model selection module for calculating the similarity S1 between the first node set of the current training model and the first node set of each reference training model, and selecting the reference training model with the largest similarity S1 to the current training model as the final reference training model;
a scoring module for scoring the training of the operation and maintenance personnel, wherein the score is in direct proportion to the similarity S2 between the current training model and the final reference training model;
and the teaching module recommends teaching contents for the operation and maintenance personnel based on the training scores of the operation and maintenance personnel.
2. The operation and maintenance personnel training system based on image processing and behavior recognition is characterized in that the modeling module comprises a modeling data acquisition module and a model generation module, the modeling data acquisition module is used for acquiring modeling data, and the model generation module is used for generating a simulation training model based on the modeling data.
3. The operation and maintenance personnel training system based on image processing and behavior recognition is characterized in that the personnel information management module comprises:
the personnel information storage module is used for storing personnel information of operation and maintenance personnel;
the personnel information updating module is used for updating the personnel information stored by the personnel information storage module;
and the personnel identification and verification module is used for matching the corresponding personnel information based on the input biometric information; if no corresponding personnel information can be matched, the verification fails and the personnel's access is refused.
4. The operation and maintenance personnel training system based on image processing and behavior recognition is characterized in that the training starting module comprises a training item storage module and a training item selection module, wherein the training item storage module is used for storing training items;
the training item selection module is used for selecting a training item;
the training item selection module selects the training item based on a request input by an operation and maintenance person or a system administrator through a human-computer interaction interface;
and a training environment generation module which generates a training environment based on the scene model, the equipment model and the tool model corresponding to the training item selected by the training item selection module.
5. The operation and maintenance personnel training system based on image processing and behavior recognition is characterized in that each training item is matched with more than one scene model, more than one equipment model and more than one tool model, and the training environment comprises the scene model, the equipment model and the personnel model.
6. The operation and maintenance personnel training system based on image processing and behavior recognition is characterized in that the training data preprocessing unit judges the time points of the operation and maintenance personnel's actions based on the action data and takes them as action nodes, and extracts the image data and the physiological data at the time of each action node as a data slice set; all data slice sets of the operation and maintenance personnel are extracted as the current training model;
the training data preprocessing unit judges the action nodes by the following steps:
recording the action data at the moment the operation and maintenance worker starts training as the initial action data, traversing the action data backward frame by frame from the initial action data until the traversal termination condition is reached, and recording the time point at which the traversal terminates as a new action node;
then iteratively executing the following process: traversing the action data backward frame by frame starting from the action data corresponding to the most recently recorded action node until the traversal termination condition is reached, and recording the time point at which the traversal terminates as a new action node; the traversal termination condition is that the distance d between the action data reached by the traversal and the action data corresponding to the previous action node exceeds a set first distance threshold;
the termination condition of the iteration is that all action data have been traversed once.
7. The operation and maintenance personnel training system based on image processing and behavior recognition as claimed in claim 1, wherein the step of preprocessing the current training model and the reference training model comprises:
step 101, generating a corresponding slice set object for each data slice set of the training model, wherein the attributes of the slice set object are generated based on the action data and the physiological data;
step 102, generating N first nodes, wherein each first node maps to a two-dimensional coordinate point, the two-dimensional coordinate points of all the first nodes are distributed in a matrix array, the number of two-dimensional coordinate points in each column is the same as the number in each row, and the minimum distance between adjacent two-dimensional coordinate points is 1;
step 103, randomly selecting N slice set objects of the training model as the first slice set objects, and mapping the first slice set objects to the first nodes one by one;
step 104, randomly selecting a slice set object from the training model as the second slice set object, calculating the similarity s between the second slice set object and each first slice set object, selecting the first slice set object with the largest similarity s to the second slice set object, and updating the attribute values of that first slice set object and of the first slice set objects associated with it;
the attribute values of the first slice set object and of its associated first slice set objects are updated by a formula that appears only as an image in the original publication; in that formula, a'_j denotes the value of the jth attribute of a first slice set object after the update, a_j the value of the jth attribute of that first slice set object before the update, b_j the value of the jth attribute of the second slice set object, and t the number of times all the first slice set objects have been updated;
step 105, step 104 is executed iteratively until all slice set objects have been selected as the second slice set object.
8. The operation and maintenance personnel training system based on image processing and behavior recognition as claimed in claim 1, wherein the method of calculating the similarity S1 comprises:
given a weighted bipartite graph G whose vertex sets are X and Y, wherein the set X is the first node set of the current training model and the set Y is the first node set of the reference training model, solving the maximum-weight perfect matching of the weighted bipartite graph G by the Kuhn-Munkres algorithm;
when solving the maximum-weight perfect matching of the weighted bipartite graph G, the initial weight of an edge is the similarity s of the slice set objects mapped to its two first nodes;
the sum of the weights of the matched edges obtained from the maximum-weight perfect matching is taken as the similarity S1.
9. The operation and maintenance personnel training system based on image processing and behavior recognition as claimed in claim 1, wherein the method of calculating the similarity S2 comprises:
generating a second node for each data slice set of the current training model and of the final reference training model;
given a weighted bipartite graph U whose vertex sets are A and B, wherein the set A is the second node set of the current training model and the set B is the second node set of the final reference training model, solving the maximum-weight perfect matching of the weighted bipartite graph U by the Kuhn-Munkres algorithm;
when solving the maximum-weight perfect matching of the weighted bipartite graph U, the initial weight of an edge is the similarity w of its two second nodes;
the sum of the weights of the matched edges obtained from the maximum-weight perfect matching is taken as the similarity S2;
wherein the similarity w of two second nodes is calculated by a formula that appears only as an image in the original publication; in that formula, d_p denotes the distance between the physiological data corresponding to the two second nodes, s_v denotes the similarity of the image data corresponding to the two second nodes, and α and β are a defined first weight and a defined second weight that adjust how strongly the physiological data and the image data contribute to the similarity w.
10. The operation and maintenance personnel training system based on image processing and behavior recognition is characterized in that the score H is calculated by a formula, appearing only as an image in the original publication, in which H is in direct proportion to S2, the similarity between the current training model and the final reference training model.
CN202310026362.7A 2023-01-09 2023-01-09 Operation and maintenance personnel training system based on image processing and behavior recognition Pending CN115985153A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310026362.7A CN115985153A (en) 2023-01-09 2023-01-09 Operation and maintenance personnel training system based on image processing and behavior recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310026362.7A CN115985153A (en) 2023-01-09 2023-01-09 Operation and maintenance personnel training system based on image processing and behavior recognition

Publications (1)

Publication Number Publication Date
CN115985153A true CN115985153A (en) 2023-04-18

Family

ID=85963006

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310026362.7A Pending CN115985153A (en) 2023-01-09 2023-01-09 Operation and maintenance personnel training system based on image processing and behavior recognition

Country Status (1)

Country Link
CN (1) CN115985153A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090150919A1 (en) * 2007-11-30 2009-06-11 Lee Michael J Correlating Media Instance Information With Physiological Responses From Participating Subjects
CN104298713A (en) * 2014-09-16 2015-01-21 北京航空航天大学 Fuzzy clustering based image retrieval method
CN110688921A (en) * 2019-09-17 2020-01-14 东南大学 Method for detecting smoking behavior of driver based on human body action recognition technology
CN111437583A (en) * 2020-04-10 2020-07-24 哈尔滨工业大学 Badminton basic action auxiliary training system based on Kinect
CN113282840A (en) * 2021-07-22 2021-08-20 光谷技术有限公司 Comprehensive training acquisition management platform
CN114237391A (en) * 2021-12-10 2022-03-25 上海工程技术大学 Urban rail transit dispatching virtual training test system and method thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李昌华; 李智杰; 高阳: "Research on the application of graph spectra and the Kuhn-Munkres algorithm in graph matching" (图谱和Kuhn-Munkres算法在图匹配中的应用研究), 计算机工程与科学 (Computer Engineering & Science), no. 10, 15 October 2017 (2017-10-15) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117953738A (en) * 2023-10-11 2024-04-30 公安部道路交通安全研究中心 Action specification training system
CN117953738B (en) * 2023-10-11 2024-09-20 公安部道路交通安全研究中心 Action specification training system
CN118607642A (en) * 2024-07-01 2024-09-06 北京大学 Self-adaptive training and reasoning performance optimization system of multi-mode large model


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination