
CN116467739B - Big data storage system and method for computer - Google Patents


Info

Publication number
CN116467739B
CN116467739B
Authority
CN
China
Prior art keywords: information, state, eye, face, time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310328714.4A
Other languages
Chinese (zh)
Other versions
CN116467739A (en)
Inventor
李义庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chizhou Guihong Information Technology Co ltd
Original Assignee
Chizhou Guihong Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chizhou Guihong Information Technology Co ltd filed Critical Chizhou Guihong Information Technology Co ltd
Priority to CN202310328714.4A
Publication of CN116467739A
Application granted
Publication of CN116467739B
Legal status: Active
Anticipated expiration

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/30 Monitoring
    • G06F11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3438 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Bioethics (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Medical Informatics (AREA)
  • Quality & Reliability (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Debugging And Monitoring (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a computer big data storage system and method, belonging to the technical field of data security. The system comprises a data acquisition module, a data processing module, a function calling module and a data storage module. The data acquisition module collects face information, mouse input information and keyboard input information and sends the information to the data processing module. The data processing module analyzes and calculates the information transmitted by the data acquisition module, determines state information according to the current state and the calculation result, and sends the state information to the function calling module. The function calling module calls external equipment and a prompt window, sets the state time, and protects sensitive operations on important files by delaying and backing them up. The data storage module adjusts the thresholds referenced in system judgments so that they better match the user's habits, and backs up and stores all information.

Description

Big data storage system and method for computer
Technical Field
The invention relates to the technical field of data security, in particular to a computer big data storage system and a method.
Background
With the continuous development of internet technology and the continuous expansion of application scenarios, big data has become a core resource of the information age. Big data storage is an important foundation of big data processing: massive data is usually stored in database servers in real time and routinely maintained and managed by database administrators. In daily operations, database administrators often face complex database structures, large data volumes and changing business requirements, which can leave them in a poor working state and lead to operational errors.
Studies have shown that long-term work pressure and fatigue negatively affect a worker's attention, reaction speed, decision-making ability and judgment, which can lead to errors by database administrators, for example sending non-desensitized files to unsafe recipients and causing data disclosure, or mistakenly overwriting tables or deleting fields. Such errors easily compromise data security and, in serious cases, cause irrecoverable losses. Therefore, in big data storage and daily operation and maintenance, monitoring the user's state and protecting sensitive operations on important files in real time improve the working efficiency of database administrators, reduce the risk of operational errors, and further improve the stability and security of the database system.
Disclosure of Invention
The invention aims to provide a computer big data storage system and method so as to solve the problems set forth in the background art.
In order to solve the above technical problems, the invention provides the following technical scheme: a computer big data storage system comprising a data acquisition module, a data processing module, a function calling module and a data storage module;
the data acquisition module can collect face information, mouse input information and keyboard input information and send the information to the data processing module; the data processing module analyzes and calculates the information transmitted by the data acquisition module, gradually judges whether the triggering condition of state change is met or not, and confirms or releases the protection state; the function calling module is used for calling functions of external equipment, calling a prompt window, setting state time and protecting sensitive operation of important files; the data storage module is used for adjusting the threshold value which needs to be referred in system judgment and carrying out backup storage on all information.
By the technical scheme, the real-time monitoring of the user state is realized, the setting of the system state is determined according to the user state, and the sensitive operation of the important file is automatically protected under the protection state.
The data acquisition module comprises an input information acquisition unit and an image information acquisition unit. The input information acquisition unit acquires data through test software installed in the computer, the data comprise mouse displacement speed information and keyboard input speed information, and the system sends the acquired information to the data processing module. The image information acquisition unit acquires face information of a user through the external camera, and the system sends acquired information to the data processing module. Wherein:
When the mouse displacement speed and keyboard input speed acquired by the test software are both zero, or the camera cannot acquire face information, the user is not in front of the computer at that moment; the system then enters the freezing time. The freezing time does not count toward the state time, and once user input or face information is detected again, the freezing time ends and the state time starts being timed.
Because a user does not move the mouse or tap the keyboard continuously, data acquisition generally averages the mouse displacement speed and keyboard tapping speed over a period of time, which reduces the influence of instantaneous speed fluctuations.
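As an illustration of this windowed averaging, the following Python sketch polls two hypothetical sampling callbacks (stand-ins for the test software's interface, not part of the patent) and returns the averaged speeds; a pair of zero averages would correspond to the freezing time described above.

```python
import time

def average_input_speeds(sample_mouse_speed, sample_key_speed,
                         window_s=10.0, interval_s=0.5):
    """Average the mouse displacement speed (pixels/second) and keyboard input
    speed (letters/minute) over a sampling window to damp instantaneous spikes."""
    mouse_samples, key_samples = [], []
    end = time.time() + window_s
    while time.time() < end:
        mouse_samples.append(sample_mouse_speed())   # pixels moved since the last poll, per second
        key_samples.append(sample_key_speed())       # keystrokes since the last poll, scaled to per minute
        time.sleep(interval_s)
    avg_mouse = sum(mouse_samples) / len(mouse_samples)
    avg_key = sum(key_samples) / len(key_samples)
    # If both averages are zero (and no face is detected), the system would
    # enter the freezing time instead of judging the state.
    return avg_mouse, avg_key
```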
Through the technical scheme, the required information can be collected through various devices, and data support is provided for the subsequent data processing module.
The data processing module comprises a primary trigger judging unit, a secondary trigger judging unit and a state setting unit.
The primary trigger judging unit is used for analyzing the input information transmitted by the data acquisition module. It sends instruction information to the external equipment unit of the function calling module and starts the test software to acquire the mouse displacement speed and keyboard input speed. Each value is then compared with the corresponding threshold reserved in the system: when the value is larger than the threshold the condition is normal; when the value is larger than zero but smaller than or equal to the threshold the condition is abnormal, and the mouse displacement speed or keyboard input speed is recorded.
The secondary trigger judging unit is used for analyzing the face information transmitted by the data acquisition module. First, it sends instruction information to the external equipment unit of the function calling module and starts the camera to collect face information, records the eye-closing frequency over a period of time, and analyzes and calculates the eye positions and the pupil coordinate set. Second, it sends instruction information to the prompt window unit of the function calling module to open a popup window; based on the eye data collected by the camera, it records the time at which the size of the pupil coordinate set changes by more than the error threshold after the popup appears, and takes the difference between this time and the popup prompt time as the attention transfer time. Finally, the eye-closing frequency and the attention transfer time are substituted into a formula to calculate the face state degree, which is compared with the face state degree threshold reserved in the system: if the face state degree is smaller than or equal to the threshold, the condition is normal; if it is greater than the threshold, the condition is abnormal.
The state setting unit is used for collecting the results of the primary and secondary trigger judging units and setting the state in combination with the current state of the system. The state setting information includes state change information and state duration information. The state change information comprises a normal state and an abnormal state: in the normal state the system performs no operation, while in the abnormal state the system automatically enters the protection state, in which sensitive operations on important files are automatically delayed and forcibly backed up. The state duration information is set according to the fixed time reserved in the system. After the setting is completed, the state change information and the state duration information are sent, in the form of instructions, to the state protection unit and the state time unit of the function calling module. Wherein:
The minimum state duration is not lower than the time needed for the test software and the camera to collect data. The administrator sets the state duration according to how long the user's fatigue state typically lasts and so as not to disturb the user's normal work. After the state is set, the state does not change and no data is collected during the state duration; data collection resumes once the state duration has elapsed.
The judgment of the system state first obtains the system's own state information, and two cases are distinguished:
If the current state is normal: when the primary trigger judging unit reports a normal condition, the system is set to the normal state; when it reports an abnormal condition, the result of the secondary trigger judging unit is analyzed next; if the secondary result is normal the system is set to the normal state, and if it is abnormal the system is set to the abnormal state and automatically enters the protection state.
If the current state is abnormal: when the primary trigger judging unit reports an abnormal condition, the system stays in the abnormal state; when it reports a normal condition, the result of the secondary trigger judging unit is analyzed next; if the secondary result is abnormal the system stays in the abnormal state, and if it is normal the system is set to the normal state and automatically exits the protection state.
In addition, to prevent the state from changing too frequently, the system suspends detection for a period of time after each state change; during this period the primary and secondary trigger judging units make no judgments and the system state is not changed. This period is the state duration set by the state time unit in the function calling module; once it has elapsed, the primary and secondary trigger judging units judge again and the system state may change. Finally, the state change information and the state duration information are sent, in the form of instructions, to the state protection unit and the state time unit of the function calling module.
Through the technical scheme, the judgment of the user state is realized, and the state setting is performed by combining the calculation result condition of each judgment unit with the current state of the system.
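The two-case decision described above can be summarised in a short Python sketch; the function and argument names are illustrative, and the secondary (face-state) check is passed in as a callback so that it is only invoked when the primary input-speed result alone is not decisive.

```python
def next_state(current_state, primary_normal, secondary_is_normal):
    """Decide the new system state from the two-stage trigger results.

    current_state: "normal" or "abnormal" (the system's own state)
    primary_normal: result of the primary (input speed) trigger judging unit
    secondary_is_normal: callback returning True if the face state check is
        normal; it is only invoked when the primary result is not decisive.
    """
    if current_state == "normal":
        if primary_normal:
            return "normal"                                         # no camera check needed
        return "normal" if secondary_is_normal() else "abnormal"    # abnormal -> enter protection
    else:
        if not primary_normal:
            return "abnormal"                                        # stay in protection
        return "normal" if secondary_is_normal() else "abnormal"    # normal -> exit protection
```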
The function calling module comprises an external device unit, a prompt window unit, a state protection unit and a state time unit.
The external equipment unit is used for receiving instruction information transmitted by the data processing module; starting or closing test software, wherein the test software is used for collecting mouse displacement speed information and keyboard input speed information; and starting or closing a camera, wherein the camera is used for collecting face information.
The prompt window unit is used for receiving instruction information transmitted by the data processing module and calling the popup function. When the camera is started, the system automatically calls the popup; the popup position is set at a certain distance from the current mouse position or keyboard input position, and the popup page should be easily recognizable, contrasting clearly with its surroundings so that it readily draws the user's attention. The popup content is a prompt about the user's current state and carries a verification code picture, a verification code input box and a confirmation button. After the user enters the verification code shown in the picture and clicks the confirmation button, the popup disappears; at the same time, instruction information is sent to the external equipment unit of the function calling module to close the test software and the camera. The displacement speed of the mouse moving to the verification code input box and the keyboard input speed while entering the verification code are automatically recorded, and this speed information is sent to the threshold adjusting unit of the data storage module.
The state protection unit is used for receiving the state change information transmitted by the state setting unit of the data processing module and adjusting the protection state accordingly: the protection state is confirmed under an abnormal condition and released under a normal condition. In the protection state, sensitive operations on important files are delayed, the operated object is automatically backed up during the delay period, and all operations performed in the protection state are recorded in a log.
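A minimal sketch of the delay-and-backup behaviour in the protection state is given below, assuming a local backup directory and a fixed delay; the function names, delay value and log file are illustrative rather than the patent's concrete implementation.

```python
import logging
import shutil
import time
from pathlib import Path

logging.basicConfig(filename="protection_state.log", level=logging.INFO)

def guarded_sensitive_operation(target, operation, delay_s=30.0, backup_dir="backups"):
    """In the protection state: force a backup of the operated object, delay the
    sensitive operation, log everything, and only then let the operation run."""
    src = Path(target)
    backup = Path(backup_dir) / f"{src.name}.{int(time.time())}.bak"
    backup.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, backup)                                  # forced backup of the operated object
    logging.info("protection: backed up %s to %s, delaying %.0fs", src, backup, delay_s)
    time.sleep(delay_s)                                        # delay window for the sensitive operation
    result = operation(src)                                    # the original sensitive operation
    logging.info("protection: executed %s on %s", getattr(operation, "__name__", "operation"), src)
    return result
```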
The state time unit is used for receiving the state duration information transmitted by the state setting unit of the data processing module and setting the state time; within the set time the data acquisition module does not acquire data, and it resumes data acquisition once the set time has elapsed.
By the technical scheme, the camera and the test software are started or closed, and the system state and the state duration are set.
The data storage module comprises a threshold adjustment unit and a data storage unit.
The threshold adjusting unit is used for receiving the speed information transmitted by the prompt window unit of the function calling module, namely the displacement speed of the mouse moving to the verification code input box and the keyboard input speed while the verification code is entered; each speed is averaged with the corresponding speed threshold reserved in the system to obtain the adjusted threshold.
The data storage unit is used for storing the adjusted various threshold information, the acquired information and the log information into a database; the threshold information comprises a mouse displacement speed threshold, a keyboard input speed threshold, an error threshold and a face state degree threshold; the acquisition information comprises mouse displacement speed information, keyboard input speed information, attention transfer time information and face state degree information; the log information includes state change information, operation information in a protection state, and backup file location information.
Through the technical scheme, readjustment of various threshold information required by system judgment is realized, and all information of the current state change is stored and recorded.
A method for storing big data in a computer, the method comprising the steps of:
s1, acquiring input information of a keyboard and a mouse, and judging whether the input information is in a normal condition or not;
s2, acquiring face information, substituting the face information into a formula to calculate a user state value, and judging whether the face information is in a normal condition or not;
s3, judging confirmation or release of the protection state according to the S1 and S2 conditions;
S4, adjusting and judging influence factors, and carrying out log recording and backup on all operation information in the state;
s5, after the state duration time is exceeded, the step S1 is re-entered.
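The overall loop over steps S1 to S5 can be outlined as follows; the four callbacks are hypothetical placeholders for the units described in the system embodiment, and the state duration value is only an example.

```python
import time

def storage_monitor_loop(collect_input, collect_face_state, set_protection,
                         adjust_thresholds, state_duration_s=300):
    """Outline of steps S1 to S5: judge input speeds, judge the face state only
    when needed, confirm or release protection, adjust thresholds, then wait
    out the state duration before sampling again."""
    current = "normal"
    while True:
        input_normal = collect_input()                                       # S1
        if current == "normal":
            if input_normal:
                new = "normal"
            else:
                new = "normal" if collect_face_state() else "abnormal"       # S2
        else:
            if not input_normal:
                new = "abnormal"
            else:
                new = "abnormal" if not collect_face_state() else "normal"   # S2
        current = new
        set_protection(current == "abnormal")                 # S3: confirm or release protection
        adjust_thresholds()                                   # S4: refresh thresholds, write logs/backups
        time.sleep(state_duration_s)                          # S5: no collection during the state duration
```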
In S1, the input information of the keyboard and mouse is acquired through test software installed in the computer; the data comprise mouse displacement speed information and keyboard input speed information. At intervals, the test software is called to collect the mouse displacement speed and keyboard input speed, and each value is compared with the corresponding threshold reserved in the system: if the value is larger than the threshold, the condition is judged normal; if it is smaller than or equal to the threshold, the mouse displacement speed or keyboard input speed is recorded. The formula is as follows:
In the formula, S_result is the mouse displacement speed judgment result, S_n is the mouse displacement speed of the nth actual test, Z_s is the mouse displacement speed threshold, J_result is the keyboard input speed judgment result, J_n is the keyboard input speed of the nth actual test, and Z_J is the keyboard input speed threshold.
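A minimal Python sketch of this two-threshold judgment is given below; the function name and the returned structure are illustrative, not part of the patent, and the example values are taken from embodiment one.

```python
def judge_input_speeds(mouse_speed, key_speed, mouse_threshold, key_threshold):
    """Compare the measured speeds S_n / J_n against the thresholds Z_s / Z_J.
    A speed is recorded (R_s / R_J) only when it is greater than zero but no
    larger than its threshold, i.e. in the abnormal case."""
    recorded = {}
    if 0 < mouse_speed <= mouse_threshold:
        recorded["mouse"] = mouse_speed        # R_s, later used in the face state degree judgment
    if 0 < key_speed <= key_threshold:
        recorded["keyboard"] = key_speed       # R_J
    normal = not recorded                      # normal only if neither speed fell to or below its threshold
    return normal, recorded

# Thresholds from embodiment one: 360 pixels/second and 100 letters/minute
print(judge_input_speeds(350, 121, 360, 100))  # -> (False, {'mouse': 350})
```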
Through the technical scheme, the input information of the user is collected, the comparison and judgment are carried out through the input speed and the corresponding threshold information reserved in the system, and the judgment result is given.
In S2, the calculating and judging of the state value includes the following steps:
S201, starting a camera function, performing image acquisition on a user in front of a computer, loading a Haar cascade classifier by adopting an existing OpenCV image processing library, and performing face detection on an image acquired by the camera.
S202, for the detected face area, performing eye detection by using a Haar cascade classifier again and determining the eye position; and calculating the position coordinates of the eyes in the whole image according to the positions of the eyes and the relative positions of the face areas.
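A sketch of steps S201 and S202 using the OpenCV Haar cascade classifiers is shown below; the cascade files shipped with OpenCV are used here, and parameter values such as scaleFactor and minNeighbors are typical defaults rather than values specified by the patent.

```python
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_eyes(frame_bgr):
    """S201/S202 sketch: detect the face, detect eyes inside the face region,
    and convert the eye boxes to coordinates in the whole image."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    eyes_in_image = []
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        face_roi = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_roi):
            eyes_in_image.append((fx + ex, fy + ey, ew, eh))   # shift to whole-image coordinates
    return eyes_in_image

cap = cv2.VideoCapture(0)            # the external camera
ok, frame = cap.read()
if ok:
    print(detect_eyes(frame))
cap.release()
```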
S203, face information is detected using the 68 facial key points in the Dlib library, and the eye feature point coordinates are acquired from the eye-region position coordinates. Each eye is represented by 6 coordinates: starting from the inner corner and following the edge of the eye region clockwise, the inner eye corner M_1, upper-left eye point M_2, upper-right eye point M_3, outer eye corner M_4, lower-right eye point M_5 and lower-left eye point M_6 are collected. The coordinates are substituted into a formula to calculate the eye aspect ratio, and the eye state is judged from the change of the aspect ratio over a period of time; the formula is as follows:
Where HZB is the eye aspect ratio, n is the lateral weighting coefficient, M_1 is the inner-corner feature point coordinate, M_2 is the upper-left eye feature point coordinate, M_3 is the upper-right eye feature point coordinate, M_4 is the outer-corner feature point coordinate, M_5 is the lower-right eye feature point coordinate, and M_6 is the lower-left eye feature point coordinate.
When the eye is open, HZB fluctuates within a certain range of values; when the eye closes, HZB drops rapidly toward 0. The face video recorded by the camera is analyzed frame by frame, and the proportion of frames in which HZB is below the eye-closing threshold over a period of time is calculated; the formula is as follows:
Where EC is the eye-closing frequency, BYZ is the number of frames in the period in which HZB is below the eye-closing threshold, and ZSS is the total number of frames in the period.
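The sketch below illustrates S203 under stated assumptions: it uses the dlib 68-point landmark model (which requires the shape_predictor_68_face_landmarks.dat file), takes the standard six landmark points of one eye (indices 36 to 41, whose ordering differs from the patent's clockwise M_1 to M_6 numbering), and uses the common eye-aspect-ratio form with the lateral weighting coefficient n in the denominator, since the exact formula image is not reproduced above.

```python
import dlib
from scipy.spatial import distance as dist

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")  # dlib 68-point model file

def eye_aspect_ratio(points, n=2.0):
    """Assumed form: HZB = (|M2-M6| + |M3-M5|) / (n * |M1-M4|), n = lateral weighting coefficient."""
    m1, m2, m3, m4, m5, m6 = points
    return (dist.euclidean(m2, m6) + dist.euclidean(m3, m5)) / (n * dist.euclidean(m1, m4))

def eye_closing_frequency(gray_frames, ear_threshold=0.2):
    """EC = (frames with HZB below the eye-closing threshold) / (total frames), as in S203."""
    closed = 0
    for gray in gray_frames:
        faces = detector(gray)
        if not faces:
            continue
        shape = predictor(gray, faces[0])
        eye = [(shape.part(i).x, shape.part(i).y) for i in range(36, 42)]  # six points of one eye
        if eye_aspect_ratio(eye) < ear_threshold:
            closed += 1
    return closed / len(gray_frames) if gray_frames else 0.0
```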
S204, the iris centre is detected in the eye region according to the position of the eye in the image. The eye region is smoothed using a gradient minimization method, a rough estimate of the iris centre is obtained from colour intensity, a distance filter is applied to remove invalid edges that are too close to or too far from the rough iris centre, RANSAC is applied to the iris edge points and the iris radius is then calculated, and the intensity energy and edge intensity information are combined to locate the iris centre; the calculation formula is as follows:
C_1 = ∑(Q × r)
Where C_1 is the intensity energy, C_2 is the edge intensity, Q is the eye area, r is a circular window with the same radius as the iris, and g_x and g_y are the horizontal and vertical gradients of the pixel, respectively. To detect the iris centre, the intensity energy within the circular window is minimized while the edge intensity along the iris edge is maximized; the trade-off is controlled by the parameter τ, and the formula is as follows:
Where (x_a, y_a) are the coordinates of the iris centre, the integration intervals are [-15π, 15π] and [45π, 65π], (x, y) are coordinates, C_1 is the intensity energy, C_2 is the edge intensity, P is the influencing factor, and τ is the parameter. The arc of the iris edge corresponds to the same range of arcs in a circle of radius r, and the integral is calculated as the sum of the edge intensities of each pixel located on the arc.
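The exact intensity-energy and edge-intensity formulation is not reproduced above; as a rough, openly substituted alternative, the sketch below estimates the iris centre in a cropped eye region with an OpenCV Hough circle fit, which serves the same purpose of providing centre coordinates and a radius.

```python
import cv2
import numpy as np

def estimate_iris_center(eye_gray):
    """Rough iris centre and radius for a grayscale eye-region crop, using a
    Hough circle fit as a simplified stand-in for the energy-based search."""
    blurred = cv2.GaussianBlur(eye_gray, (5, 5), 0)            # smooth before the circle fit
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=eye_gray.shape[1],      # expect at most one iris in the crop
                               param1=80, param2=20,
                               minRadius=eye_gray.shape[1] // 8,
                               maxRadius=eye_gray.shape[1] // 2)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)
    return (x, y), r                                           # centre coordinates and radius
```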
S205, the RGB image is converted to HSV, and an HSV range for the colour to be extracted is set according to the pixel colour at the iris centre coordinates. The inRange function of the OpenCV image processing library is called to extract the colour space, noise points are removed and break points are connected after binarization, the findContours function is called for contour detection, and rectangular areas are drawn to locate the contours. All located coordinate points within the iris coordinate range are put into a set X, which is the set of pixel positions of the pupil in the image; X comprises {X_1, X_2, ..., X_z}, where X_1, X_2, ..., X_z are respectively the 1st, 2nd, ..., z-th data items in the pupil coordinate set, and each data item corresponds to the position of a pixel point.
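A sketch of S205 is given below; the HSV range, the morphological kernel and the 30-pixel proximity window around the iris centre are assumptions chosen for illustration, while inRange and findContours are the OpenCV calls named in the text.

```python
import cv2
import numpy as np

def pupil_pixel_set(frame_bgr, iris_center, hsv_low, hsv_high, window=30):
    """S205 sketch: HSV thresholding with inRange, noise removal and gap closing,
    contour detection with findContours, then collection of the pixel positions
    of the bounding rectangles that lie near the iris centre (the set X)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)      # remove noise points
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)     # connect break points
    found = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = found[-2]                                       # works for OpenCV 3.x and 4.x return forms
    cx, cy = iris_center
    pixel_set = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)                       # rectangular area locating the contour
        if abs(x + w // 2 - cx) <= window and abs(y + h // 2 - cy) <= window:
            pixel_set.extend((px, py) for px in range(x, x + w) for py in range(y, y + h))
    return pixel_set
```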
S206, the popup function is automatically called and the popup appearance time T_1 is recorded, while the camera continues to collect face information and the pupil coordinate set is analyzed. When the size of the set changes, it is judged whether the change is larger than the error threshold: if it is smaller than or equal to the error threshold, no processing is performed; if it is larger than the error threshold, the time T_2 at which the change occurs is recorded, and the attention transfer time is calculated by the following formula:
Where T is the attention transfer time, T_1 is the popup appearance time, T_2 is the time at which the set size changes, X is the size of the original set, GB is the size of the changed set, V is the error threshold, and "next" denotes that the current judgment ends and the next data item is judged.
The popup content is a prompt about the user's current state and carries a verification code input box and a confirmation button. The user enters the verification code as prompted and clicks the confirmation button, the popup disappears, and the test software and camera are closed at the same time; the displacement speed of the mouse moving to the verification code input box and the keyboard input speed while entering the verification code are automatically recorded.
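The attention transfer time T = T_2 - T_1 can be computed as in the following sketch; the sample format (timestamp, pupil-set size) and the example numbers are illustrative, with the comparison of the set-size change against the error threshold V read from the variable definitions above.

```python
def attention_transfer_time(popup_time, pupil_set_samples, baseline_size, error_threshold):
    """T = T_2 - T_1: time from the popup appearing (T_1) until the size of the
    pupil coordinate set changes by more than the error threshold V (at T_2)."""
    for t2, size in pupil_set_samples:            # (timestamp, set size) samples taken after the popup
        if abs(size - baseline_size) > error_threshold:
            return t2 - popup_time
    return None                                   # no qualifying change yet ("next": keep judging)

# Popup at t = 10.0 s, baseline set size 400 pixels, error threshold 50 pixels
samples = [(10.2, 410), (10.8, 520)]
print(attention_transfer_time(10.0, samples, 400, 50))   # -> about 0.8 seconds
```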
S207, the eye-closing frequency and the attention transfer time are substituted into a formula to comprehensively calculate the face state degree. If the calculated result is smaller than or equal to the face state degree threshold, the condition is normal; if it is greater than the threshold, the user is in a fatigued state and the condition is abnormal. The formula is as follows:
Where ZTD is the face state degree, α is the eye-closing influence factor, EC is the eye-closing frequency, β is the attention influence factor, T is the attention transfer time, B is the standard attention transfer time, γ is the mouse displacement speed influence factor, R_s is the recorded mouse displacement speed (greater than zero), Z_s is the mouse displacement speed threshold, δ is the keyboard input speed influence factor, R_J is the recorded keyboard input speed (greater than zero), Z_J is the keyboard input speed threshold, Z_G is the face state degree threshold, and result is the judgment result.
Through the technical scheme, the face information of the user is collected, the attention transfer time of the user is collected through a popup window, the face attitude of the user is comprehensively calculated, the face attitude is compared with the corresponding threshold information reserved in the system, and a judgment result is given.
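The patent's exact face state degree formula is not reproduced above, so the sketch below only illustrates one plausible weighted combination of the factors named in S207 (eye-closing frequency, normalised attention transfer time, and the recorded input-speed shortfalls); the weighting form is an assumption and is not claimed to reproduce the numbers of embodiment one.

```python
def face_state_degree(ec, t, b=1.2, recorded_mouse=None, recorded_key=None,
                      z_s=360.0, z_j=100.0,
                      alpha=0.3, beta=0.4, gamma=0.15, delta=0.15):
    """Illustrative weighted combination of the S207 factors: eye-closing frequency,
    attention transfer time normalised by the standard time, and the shortfall of
    any recorded mouse/keyboard speed relative to its threshold.
    This is an assumed form, not the patent's published formula."""
    ztd = alpha * ec + beta * (t / b)
    if recorded_mouse:                            # R_s is recorded only when 0 < R_s <= Z_s
        ztd += gamma * (z_s - recorded_mouse) / z_s
    if recorded_key:                              # R_J is recorded only when 0 < R_J <= Z_J
        ztd += delta * (z_j - recorded_key) / z_j
    return ztd

def judge_face_state(ztd, z_g=0.8):
    """Normal when ZTD <= Z_G (the face state degree threshold), abnormal otherwise."""
    return "normal" if ztd <= z_g else "abnormal"
```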
In S3, the protection state is confirmed or released according to the results of S1 and S2 together with the current state; two cases are distinguished:
If the current state is normal: when the S1 result is normal, the system is set to the normal state; when the S1 result is abnormal, the S2 result is analyzed next; if S2 is normal the system is set to the normal state, and if S2 is abnormal the system is set to the abnormal state and automatically enters the protection state.
If the current state is abnormal: when the S1 result is abnormal, the system stays in the abnormal state; when the S1 result is normal, the S2 result is analyzed next; if S2 is abnormal the system stays in the abnormal state, and if S2 is normal the system is set to the normal state and automatically exits the protection state.
Through the technical scheme, the judgment results of the S1 and the S2 are combined with the current state of the system to carry out comprehensive judgment, and the setting information of the state is given.
In S4, the influence factors to be adjusted comprise the mouse displacement speed threshold and the keyboard input speed threshold: the displacement speed of the mouse moving to the verification code input box and the keyboard input speed while entering the verification code, collected in step S2, are each averaged with the corresponding threshold to obtain the adjusted threshold values.
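Since the adjustment is described as averaging the measured speed with the stored threshold, it reduces to the following one-line helper; the example numbers are illustrative.

```python
def adjust_threshold(stored_threshold, measured_speed):
    """New threshold = average of the stored threshold and the speed measured
    while the user answered the verification-code popup."""
    return (stored_threshold + measured_speed) / 2.0

# Example: mouse displacement threshold 360 pixels/second, measured 480 pixels/second
print(adjust_threshold(360, 480))   # -> 420.0
```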
Through the technical scheme, readjustment of various threshold information needed by system judgment is realized, so that the threshold value is more in line with the input habit of a user.
Compared with the prior art, the invention has the following beneficial effects:
1. In the judging stage, the invention monitors the user's input speed through the test software and uses the input speed information as the trigger condition for anomaly detection; only in the triggered state does the camera start collecting face information, which reduces the load of face data analysis on the system and improves monitoring efficiency.
2. In the state value calculation stage, the user's attention transfer time is measured through a popup reminder and used to judge the user's state. When the user notices the popup, their state has essentially returned to normal, and the popup can be closed only by entering the verification code; the displacement speed of the mouse moving to the verification code input box and the keyboard input speed while entering the verification code are collected and averaged with the corresponding thresholds reserved in the system, so the system better fits the user's input habits.
3. The invention sets the state duration at the same time as the state; no data acquisition is performed during the state duration, which reduces the system workload and avoids frequent state switching that would disturb the user's normal work.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a schematic diagram of a computer big data storage system and method of the present invention;
FIG. 2 is a flow chart of a system and method for storing big data of a computer according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to FIG. 1, the present invention provides the following technical solution: a computer big data storage system comprising a data acquisition module, a data processing module, a function calling module and a data storage module;
The data acquisition module can collect face information, mouse input information and keyboard input information and send the information to the data processing module; the data processing module analyzes and calculates the information transmitted by the data acquisition module, gradually judges whether the triggering condition of state change is met or not, and confirms or releases the protection state; the function calling module is used for calling functions of external equipment, calling a prompt window, setting state time and protecting sensitive operation of important files; the data storage module is used for adjusting the threshold value which needs to be referred in system judgment and carrying out backup storage on all information.
The data acquisition module comprises an input information acquisition unit and an image information acquisition unit. The input information acquisition unit acquires data through test software installed in the computer, the data comprise mouse displacement speed information and keyboard input speed information, and the system sends the acquired information to the data processing module. The image information acquisition unit acquires face information of a user through the external camera, and the system sends acquired information to the data processing module. Wherein:
When the mouse displacement speed and keyboard input speed acquired by the test software are both zero, or the camera cannot acquire face information, the user is not in front of the computer at that moment; the system then enters the freezing time. The freezing time does not count toward the state time, and once user input or face information is detected again, the freezing time ends and the state time starts being timed.
Because a user does not move the mouse or tap the keyboard continuously, data acquisition generally averages the mouse displacement speed and keyboard tapping speed over a period of time, which reduces the influence of instantaneous speed fluctuations.
The data processing module comprises a primary trigger judging unit, a secondary trigger judging unit and a state setting unit.
The primary trigger judging unit is used for analyzing the input information transmitted by the data acquisition module. It sends instruction information to the external equipment unit of the function calling module and starts the test software to acquire the mouse displacement speed and keyboard input speed. Each value is then compared with the corresponding threshold reserved in the system: when the value is larger than the threshold the condition is normal; when the value is larger than zero but smaller than or equal to the threshold the condition is abnormal, and the mouse displacement speed or keyboard input speed is recorded.
The secondary trigger judging unit is used for analyzing the face information transmitted by the data acquisition module. First, it sends instruction information to the external equipment unit of the function calling module and starts the camera to collect face information, records the eye-closing frequency over a period of time, and analyzes and calculates the eye positions and the pupil coordinate set. Second, it sends instruction information to the prompt window unit of the function calling module to open a popup window; based on the eye data collected by the camera, it records the time at which the size of the pupil coordinate set changes by more than the error threshold after the popup appears, and takes the difference between this time and the popup prompt time as the attention transfer time. Finally, the eye-closing frequency and the attention transfer time are substituted into a formula to calculate the face state degree, which is compared with the face state degree threshold reserved in the system: if the face state degree is smaller than or equal to the threshold, the condition is normal; if it is greater than the threshold, the condition is abnormal.
The state setting unit is used for collecting the results of the primary and secondary trigger judging units and setting the state in combination with the current state of the system. The state setting information includes state change information and state duration information. The state change information comprises a normal state and an abnormal state: in the normal state the system performs no operation, while in the abnormal state the system automatically enters the protection state, in which sensitive operations on important files are automatically delayed and forcibly backed up. The state duration information is set according to the fixed time reserved in the system. After the setting is completed, the state change information and the state duration information are sent, in the form of instructions, to the state protection unit and the state time unit of the function calling module. Wherein:
The minimum state duration is not lower than the time needed for the test software and the camera to collect data. The administrator sets the state duration according to how long the user's fatigue state typically lasts and so as not to disturb the user's normal work. After the state is set, the state does not change and no data is collected during the state duration; data collection resumes once the state duration has elapsed.
The judgment of the system state first obtains the system's own state information, and two cases are distinguished:
If the current state is normal: when the primary trigger judging unit reports a normal condition, the system is set to the normal state; when it reports an abnormal condition, the result of the secondary trigger judging unit is analyzed next; if the secondary result is normal the system is set to the normal state, and if it is abnormal the system is set to the abnormal state and automatically enters the protection state.
If the current state is abnormal: when the primary trigger judging unit reports an abnormal condition, the system stays in the abnormal state; when it reports a normal condition, the result of the secondary trigger judging unit is analyzed next; if the secondary result is abnormal the system stays in the abnormal state, and if it is normal the system is set to the normal state and automatically exits the protection state.
In addition, to prevent the state from changing too frequently, the system suspends detection for a period of time after each state change; during this period the primary and secondary trigger judging units make no judgments and the system state is not changed. This period is the state duration set by the state time unit in the function calling module; once it has elapsed, the primary and secondary trigger judging units judge again and the system state may change. Finally, the state change information and the state duration information are sent, in the form of instructions, to the state protection unit and the state time unit of the function calling module.
The function calling module comprises an external device unit, a prompt window unit, a state protection unit and a state time unit.
The external equipment unit is used for receiving the instruction information transmitted by the data processing module; starting or closing test software, wherein the test software is used for collecting mouse displacement speed information and keyboard input speed information; and starting or closing a camera, wherein the camera is used for collecting face information.
The prompt window unit is used for receiving instruction information transmitted by the data processing module and calling the popup function. When the camera is started, the system automatically calls the popup; the popup position is set at a certain distance from the current mouse position or keyboard input position, and the popup page should be easily recognizable, contrasting clearly with its surroundings so that it readily draws the user's attention. The popup content is a prompt about the user's current state and carries a verification code picture, a verification code input box and a confirmation button. After the user enters the verification code shown in the picture and clicks the confirmation button, the popup disappears; at the same time, instruction information is sent to the external equipment unit of the function calling module to close the test software and the camera. The displacement speed of the mouse moving to the verification code input box and the keyboard input speed while entering the verification code are automatically recorded, and this speed information is sent to the threshold adjusting unit of the data storage module.
The state protection unit is used for receiving the state change information transmitted by the state setting unit of the data processing module and adjusting the protection state accordingly: the protection state is confirmed under an abnormal condition and released under a normal condition. In the protection state, sensitive operations on important files are delayed, the operated object is automatically backed up during the delay period, and all operations performed in the protection state are recorded in a log.
The state time unit is used for receiving the state duration information transmitted by the state setting unit of the data processing module and setting the state time; within the set time the data acquisition module does not acquire data, and it resumes data acquisition once the set time has elapsed.
The data storage module comprises a threshold adjustment unit and a data storage unit.
The threshold adjusting unit is used for receiving the speed information transmitted by the prompt window unit of the function calling module, namely the displacement speed of the mouse moving to the verification code input box and the keyboard input speed while the verification code is entered; each speed is averaged with the corresponding speed threshold reserved in the system to obtain the adjusted threshold.
The data storage unit is used for storing the adjusted various threshold information, the acquired information and the log information into a database; the threshold information comprises a mouse displacement speed threshold, a keyboard input speed threshold, an error threshold and a face state degree threshold; the acquisition information comprises mouse displacement speed information, keyboard input speed information, attention transfer time information and face state degree information; the log information includes state change information, operation information in a protection state, and backup file location information.
Referring to fig. 2, the present invention provides the following technical solutions: a method for storing big data in a computer, the method comprising the steps of:
s1, acquiring input information of a keyboard and a mouse, and judging whether the input information is in a normal condition or not;
s2, acquiring face information, substituting the face information into a formula to calculate a user state value, and judging whether the face information is in a normal condition or not;
s3, judging confirmation or release of the protection state according to the S1 and S2 conditions;
S4, adjusting and judging influence factors, and carrying out log recording and backup on all operation information in the state;
s5, after the state duration time is exceeded, the step S1 is re-entered.
In S1, the input information of the keyboard and mouse is acquired through test software installed in the computer; the data comprise mouse displacement speed information and keyboard input speed information. At intervals, the test software is called to collect the mouse displacement speed and keyboard input speed, and each value is compared with the corresponding threshold reserved in the system: if the value is larger than the threshold, the condition is judged normal; if it is smaller than or equal to the threshold, the mouse displacement speed or keyboard input speed is recorded. The formula is as follows:
In the formula, S_result is the mouse displacement speed judgment result, S_n is the mouse displacement speed of the nth actual test, Z_s is the mouse displacement speed threshold, J_result is the keyboard input speed judgment result, J_n is the keyboard input speed of the nth actual test, and Z_J is the keyboard input speed threshold.
In S2, the calculating and judging of the state value includes the following steps:
S201, starting a camera function, performing image acquisition on a user in front of a computer, loading a Haar cascade classifier by adopting an existing OpenCV image processing library, and performing face detection on an image acquired by the camera.
S202, for the detected face area, performing eye detection by using a Haar cascade classifier again and determining the eye position; and calculating the position coordinates of the eyes in the whole image according to the positions of the eyes and the relative positions of the face areas.
S203, face information is detected using the 68 facial key points in the Dlib library, and the eye feature point coordinates are acquired from the eye-region position coordinates. Each eye is represented by 6 coordinates: starting from the inner corner and following the edge of the eye region clockwise, the inner eye corner M_1, upper-left eye point M_2, upper-right eye point M_3, outer eye corner M_4, lower-right eye point M_5 and lower-left eye point M_6 are collected. The coordinates are substituted into a formula to calculate the eye aspect ratio, and the eye state is judged from the change of the aspect ratio over a period of time; the formula is as follows:
Where HZB is the eye aspect ratio, n is the lateral weighting coefficient, M_1 is the inner-corner feature point coordinate, M_2 is the upper-left eye feature point coordinate, M_3 is the upper-right eye feature point coordinate, M_4 is the outer-corner feature point coordinate, M_5 is the lower-right eye feature point coordinate, and M_6 is the lower-left eye feature point coordinate.
When the eye is open, HZB fluctuates within a certain range of values; when the eye closes, HZB drops rapidly toward 0. The face video recorded by the camera is analyzed frame by frame, and the proportion of frames in which HZB is below the eye-closing threshold over a period of time is calculated; the formula is as follows:
Where EC is the eye-closing frequency, BYZ is the number of frames in the period in which HZB is below the eye-closing threshold, and ZSS is the total number of frames in the period.
S204, the iris centre is detected in the eye region according to the position of the eye in the image. The eye region is smoothed using a gradient minimization method, a rough estimate of the iris centre is obtained from colour intensity, a distance filter is applied to remove invalid edges that are too close to or too far from the rough iris centre, RANSAC is applied to the iris edge points and the iris radius is then calculated, and the intensity energy and edge intensity information are combined to locate the iris centre; the calculation formula is as follows:
Where C_1 is the intensity energy, C_2 is the edge intensity, Q is the eye area, r is a circular window with the same radius as the iris, and g_x and g_y are the horizontal and vertical gradients of the pixel, respectively. To detect the iris centre, the intensity energy within the circular window is minimized while the edge intensity along the iris edge is maximized; the trade-off is controlled by the parameter τ, and the formula is as follows:
Where (x_a, y_a) are the coordinates of the iris centre, the integration intervals are [-15π, 15π] and [45π, 65π], (x, y) are coordinates, C_1 is the intensity energy, C_2 is the edge intensity, P is the influencing factor, and τ is the parameter. The arc of the iris edge corresponds to the same range of arcs in a circle of radius r, and the integral is calculated as the sum of the edge intensities of each pixel located on the arc.
S205, the RGB image is converted to HSV, and an HSV range for the colour to be extracted is set according to the pixel colour at the iris centre coordinates. The inRange function of the OpenCV image processing library is called to extract the colour space, noise points are removed and break points are connected after binarization, the findContours function is called for contour detection, and rectangular areas are drawn to locate the contours. All located coordinate points within the iris coordinate range are put into a set X, which is the set of pixel positions of the pupil in the image; X comprises {X_1, X_2, ..., X_z}, where X_1, X_2, ..., X_z are respectively the 1st, 2nd, ..., z-th data items in the pupil coordinate set, and each data item corresponds to the position of a pixel point.
S206, the popup function is automatically called and the popup appearance time T_1 is recorded, while the camera continues to collect face information and the pupil coordinate set is analyzed. When the size of the set changes, it is judged whether the change is larger than the error threshold: if it is smaller than or equal to the error threshold, no processing is performed; if it is larger than the error threshold, the time T_2 at which the change occurs is recorded, and the attention transfer time is calculated by the following formula:
Where T is the attention transfer time, T_1 is the popup appearance time, T_2 is the time at which the set size changes, X is the size of the original set, GB is the size of the changed set, V is the error threshold, and "next" denotes that the current judgment ends and the next data item is judged.
The popup content is a prompt about the user's current state and carries a verification code input box and a confirmation button. The user enters the verification code as prompted and clicks the confirmation button, the popup disappears, and the test software and camera are closed at the same time; the displacement speed of the mouse moving to the verification code input box and the keyboard input speed while entering the verification code are automatically recorded.
S207, the eye-closing frequency and the attention transfer time are substituted into a formula to comprehensively calculate the face state degree. If the calculated result is smaller than or equal to the face state degree threshold, the condition is normal; if it is greater than the threshold, the user is in a fatigued state and the condition is abnormal. The formula is as follows:
Where ZTD is the face state degree, α is the eye-closing influence factor, EC is the eye-closing frequency, β is the attention influence factor, T is the attention transfer time, B is the standard attention transfer time, γ is the mouse displacement speed influence factor, R_s is the recorded mouse displacement speed (greater than zero), Z_s is the mouse displacement speed threshold, δ is the keyboard input speed influence factor, R_J is the recorded keyboard input speed (greater than zero), Z_J is the keyboard input speed threshold, Z_G is the face state degree threshold, and result is the judgment result.
In S3, the protection state is confirmed or released according to the results of S1 and S2 together with the current state; two cases are distinguished:
If the current state is normal: when the S1 result is normal, the system is set to the normal state; when the S1 result is abnormal, the S2 result is analyzed next; if S2 is normal the system is set to the normal state, and if S2 is abnormal the system is set to the abnormal state and automatically enters the protection state.
If the current state is abnormal: when the S1 result is abnormal, the system stays in the abnormal state; when the S1 result is normal, the S2 result is analyzed next; if S2 is abnormal the system stays in the abnormal state, and if S2 is normal the system is set to the normal state and automatically exits the protection state.
In S4, the influence factors to be adjusted comprise the mouse displacement speed threshold and the keyboard input speed threshold: the displacement speed of the mouse moving to the verification code input box and the keyboard input speed while entering the verification code, collected in step S2, are each averaged with the corresponding threshold to obtain the adjusted threshold values.
Embodiment one:
Assuming that there are 3 users in total, A, B and C, tested on 3 computers respectively, with a mouse displacement speed threshold of 360 pixels/second and a keyboard input speed threshold of 100 letters/minute, and under the same mouse sensitivity:
The average mouse displacement speed of user A is 480 pixels/second and the average keyboard input speed is 156 letters/minute;
The average mouse displacement speed of user B is 350 pixels/second and the average keyboard input speed is 121 letters/minute;
The average mouse displacement speed of user C is 480 pixels/second and the average keyboard input speed is 89 letters/minute;
Judging against the corresponding thresholds gives:
User A: condition normal;
User B: condition abnormal (average mouse displacement speed too low);
User C: condition abnormal (average keyboard input speed too low);
Assuming that, within 300 frames, the number of frames in which user B's eye aspect ratio is below the eye-closing threshold is 180, with an attention transfer time of 0.8 seconds, and the number of frames in which user C's eye aspect ratio is below the eye-closing threshold is 120, with an attention transfer time of 1.5 seconds:
Substituting into the formula, the eye closing frequency of user B is EC = 180/300 = 0.6;
Substituting into the formula, the eye closing frequency of user C is EC = 120/300 = 0.4;
Assuming that the eye closing influence factor is 0.3, the attention influence factor is 0.4, the standard attention transfer time is 1.2, the mouse displacement speed influence factor is 0.15, the keyboard input speed influence factor is 0.15, and the face state degree threshold is 80%;
Substituting into the formula gives the face state degree of user B:
Substituting into the formula gives the face state degree of user C:
Judging against the face state degree threshold of 80% gives:
User B: the face state degree is smaller than or equal to the threshold, the condition is normal, and no processing is performed;
User C: the face state degree is larger than the threshold, the condition is abnormal, and the protection state is entered.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Finally, it should be noted that the foregoing description is only a preferred embodiment of the present invention and is not intended to limit the present invention; although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or substitute equivalents for some of the technical features therein. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall be included in the protection scope of the present invention.

Claims (2)

1. A computer big data storage system, the system comprising: the system comprises a data acquisition module, a data processing module, a function calling module and a data storage module;
The data acquisition module can collect face information, mouse input information and keyboard input information and send the information to the data processing module; the data processing module analyzes and calculates the information transmitted by the data acquisition module, gradually judges whether the triggering condition of state change is met or not, and confirms or releases the protection state; the function calling module is used for calling functions of external equipment, calling a prompt window, setting state time and protecting sensitive operation of important files; the data storage module is used for adjusting a threshold value which needs to be referred in system judgment and carrying out backup storage on all information;
The data acquisition module comprises an input information acquisition unit and an image information acquisition unit; the input information acquisition unit acquires data through test software installed in the computer, the data comprise mouse displacement speed information and keyboard input speed information, and the system sends the acquired information to the data processing module; the image information acquisition unit acquires face information of a user through an external camera, and the system sends acquired information to the data processing module;
The data processing module comprises a primary trigger judging unit, a secondary trigger judging unit and a state setting unit;
The primary trigger judging unit is used for analyzing the input information transmitted by the data acquisition module; it sends instruction information to the external equipment unit of the function calling module and starts the test software to begin collecting mouse displacement speed information and keyboard input speed information; each item of information is judged against the corresponding threshold information reserved in the system: when the information is larger than the threshold it is normal; when the information is smaller than or equal to the threshold and larger than zero it is abnormal, and the mouse displacement speed information or keyboard input speed information is recorded;
The secondary trigger judging unit is used for analyzing the face information transmitted by the data acquisition module; firstly, sending instruction information to an external equipment unit of a function calling module, starting a camera to start collecting face information, recording eye closing frequency in a period of time, and analyzing and calculating the position of eyes and a pupil coordinate set; secondly, sending instruction information to a prompt window unit of a function call module, starting a popup window function, recording the time when the number of pupil coordinate sets is changed and is larger than an error threshold value when the popup window appears according to human eye data acquired by a camera, and calculating the difference between the time and the popup window prompt time, wherein the obtained time difference is the attention transfer time; finally, substituting the eye closing frequency and the attention transfer time into a formula to calculate the face state degree, comparing and judging with a face state degree threshold reserved in the system, and if the face state degree threshold is smaller than or equal to the threshold, determining that the face state degree is normal; if the threshold value is larger than the threshold value, the abnormal situation is generated;
The detailed steps are as follows:
S1, starting a camera function, performing image acquisition on a user in front of a computer, loading a Haar cascade classifier by adopting an existing OpenCV image processing library, and performing face detection on an image acquired by the camera;
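A minimal OpenCV sketch of this step is shown below for illustration; the camera index, the bundled cascade file and the detection parameters are assumptions:

```python
import cv2

# Load the Haar cascade shipped with OpenCV and detect faces in one camera frame.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)          # assumed camera index
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # each entry of `faces` is an (x, y, w, h) face region
cap.release()
```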
S2, for the detected face area, performing eye detection by using the Haar cascade classifier again and determining the eye position; calculating the position coordinates of the eyes in the whole image according to the relative positions of the eyes and the face area;
S3, detecting face information by using the 68 facial key points in the Dlib library, and acquiring the eye feature point coordinates according to the eye region position coordinate information; each eye is represented by 6 coordinates; starting from the inner corner and taking the edge line of the eye corner area as the path, the inner eye corner M1, upper-left eye point M2, upper-right eye point M3, outer eye corner M4, lower-right eye point M5 and lower-left eye point M6 are collected in clockwise order; the coordinates are substituted into the formula to calculate the eye aspect ratio, and the eye state is judged from the change of the aspect ratio over a period of time; the formula is as follows:
Wherein HZB is the eye aspect ratio, n is the lateral weighting coefficient, M1 is the inner eye corner feature point coordinate, M2 is the upper-left eye feature point coordinate, M3 is the upper-right eye feature point coordinate, M4 is the outer eye corner feature point coordinate, M5 is the lower-right eye feature point coordinate, and M6 is the lower-left eye feature point coordinate;
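The sketch below illustrates the aspect-ratio computation with dlib's 68-point predictor; it uses dlib's own face detector for brevity (the steps above use Haar cascades for the face and eye regions), and the predictor file path, the landmark indices 36-41 for one eye and the value n = 2 are assumptions:

```python
import math
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")  # assumed path

def eye_aspect_ratio(pts, n=2.0):
    """pts: the six landmarks of one eye in dlib's order (corner, two upper-lid
    points, opposite corner, two lower-lid points), corresponding to M1..M6."""
    d = math.dist
    m1, m2, m3, m4, m5, m6 = pts
    return (d(m2, m6) + d(m3, m5)) / (n * d(m1, m4))

def one_eye_points(gray_image):
    """Returns the six landmarks of one eye from the first detected face, or None."""
    faces = detector(gray_image)
    if not faces:
        return None
    shape = predictor(gray_image, faces[0])
    return [(shape.part(i).x, shape.part(i).y) for i in range(36, 42)]
```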
When the human eyes are open, HZB fluctuates within a certain range of values; when the eyes close, HZB drops rapidly and approaches 0; the face video recorded by the camera is analyzed frame by frame, and the proportion of frames in which HZB is below the eye-closing threshold over a period of time is calculated; the formula is as follows:
EC = BYZ / ZSS
Wherein EC is the eye closing frequency, BYZ is the number of frames in the period in which HZB is below the eye-closing threshold, and ZSS is the total number of frames in the period;
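A short sketch of the eye-closing-frequency calculation follows; the threshold value 0.2 and the helper name are illustrative assumptions:

```python
# EC = BYZ / ZSS: fraction of frames whose aspect ratio is below the eye-closing threshold.
def eye_closing_frequency(hzb_per_frame, close_threshold=0.2):
    total = len(hzb_per_frame)                                    # ZSS
    if total == 0:
        return 0.0
    below = sum(1 for v in hzb_per_frame if v < close_threshold)  # BYZ
    return below / total

# Embodiment figures: 180 of 300 frames below the threshold gives EC = 0.6.
```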
S4, detecting the iris center in the eye area according to the position information of the eyes in the image; smoothing the eye region using a gradient minimization method, obtaining a rough estimate of the iris center by color intensity, applying a distance filter to remove invalid edges that are too close to or too far from the rough center of the iris, applying RANSAC to the edge points of the iris and then calculating the radius of the iris, and combining the intensity energy and the edge intensity information to locate the iris center;
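Only the rough intensity-based estimate of the iris centre is sketched below; a Gaussian blur stands in for the gradient-minimization smoothing, and the distance filter and RANSAC refinement described above are omitted, so the whole snippet is an illustrative assumption:

```python
import cv2
import numpy as np

def rough_iris_center(eye_gray):
    """eye_gray: single-channel image of the eye region; returns (x, y) or None."""
    smoothed = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    # the iris/pupil is assumed to be the darkest part of the smoothed eye patch
    thresh = np.percentile(smoothed, 5)
    ys, xs = np.where(smoothed <= thresh)
    if xs.size == 0:
        return None
    return int(xs.mean()), int(ys.mean())
```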
S5, converting the RGB image into HSV, setting the HSV range values of the color to be extracted according to the pixel color information corresponding to the iris center coordinates, calling the inRange function of the OpenCV image processing library to extract the color space, removing noise points after binarization, connecting break points, calling the findContours function to perform contour detection, and drawing a rectangular area to position the contour; all located coordinate points within the iris coordinate range are put into a set X, where X is the set of pixel positions of the pupil of the human eye in the image; the set X comprises {X1, X2, ..., Xz}, where X1, X2, ..., Xz respectively represent the 1st, 2nd, ..., z-th data in the pupil coordinate set, and each piece of data comprises the abscissa of the pixel point;
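A hedged OpenCV sketch of this extraction step is given below; the HSV bounds for a dark pupil, the morphology kernel and the OpenCV 4.x findContours signature are assumptions:

```python
import cv2
import numpy as np

def pupil_pixel_set(eye_bgr, lower=(0, 0, 0), upper=(180, 255, 60)):
    hsv = cv2.cvtColor(eye_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove noise points
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # connect break points
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    pixel_set = set()
    for cnt in contours:
        x, y, w, h = cv2.boundingRect(cnt)   # rectangular positioning of the contour
        pixel_set.update((px, py)
                         for px in range(x, x + w)
                         for py in range(y, y + h)
                         if mask[py, px] > 0)
    return pixel_set   # set X: pixel positions of the pupil in the image
```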
S6, automatically calling the popup window function and recording the popup appearance time T1, while the camera continues to collect and analyze the pupil coordinate set in the face information; when the number of elements in the set changes, judging whether the change is larger than the error threshold: if the change is smaller than or equal to the error threshold, no processing is performed; if the change is larger than the error threshold, the time T2 at which the change occurred is recorded and the attention transfer time is obtained as the difference between the two times:
T = T2 - T1 (calculated when the change between X and GB exceeds the error threshold V; otherwise next)
Wherein T is the attention transfer time, T1 is the popup appearance time, T2 is the time at which the number changes, X is the number of elements in the original set, GB is the number of elements in the changed set, V is the error threshold, and next means the current judgment ends and the next data judgment continues;
The popup content prompts the user about the current state and provides a verification code input box and a confirmation button; the user enters the verification code as prompted and clicks the confirmation button, the popup disappears, and the test software and the camera are closed at the same time; the displacement speed of the mouse as it moves to the verification code input box and the keyboard input speed while the verification code is entered are recorded automatically;
S7, substituting the eye closing frequency and the attention transfer time into a formula to comprehensively calculate the face state degree; if the calculated result is smaller than or equal to the face state degree threshold, the state is normal; if it is larger than the face state degree threshold, the user is in a fatigued state and the state is abnormal; the formula is as follows:
Wherein ZTD is the face state degree, alpha is the eye closing influence factor, EC is the eye closing frequency, beta is the attention influence factor, T is the attention transfer time, B is the standard attention transfer time, gamma is the mouse displacement speed influence factor, Rs is the recorded mouse displacement speed (its value is greater than zero), Zs is the mouse displacement speed threshold, delta is the keyboard input speed influence factor, RJ is the recorded keyboard input speed (its value is greater than zero), ZJ is the keyboard input speed threshold, ZG is the face state degree threshold, and result is the judgment result;
The state setting unit is used for collecting the conditions of the primary trigger judging unit and the secondary trigger judging unit and setting the state in combination with the current state of the system; the state setting information includes state change information and state duration information: the state change information comprises a normal state and an abnormal state; in the normal state the system performs no operation, while in the abnormal state the system automatically enters the protection state, in which sensitive operations by the user on important files are automatically delayed and forcibly backed up; the state duration information is set according to the fixed time reserved in the system; after the setting is completed, the state change information and the state duration information are sent in the form of instructions to the state protection unit and the state time unit of the function calling module;
the function calling module comprises an external device unit, a prompt window unit, a state protection unit and a state time unit;
The external equipment unit is used for receiving instruction information transmitted by the data processing module; starting or closing test software, wherein the test software is used for collecting mouse displacement speed information and keyboard input speed information; starting or closing a camera, wherein the camera is used for collecting face information;
The prompt window unit is used for receiving instruction information transmitted by the data processing module, calling a popup function, when the camera is started, the system automatically calls the popup, the popup content is prompting the current state information of a user, and is attached with a verification code picture, a verification code input frame and a confirmation button, the user clicks the confirmation button after inputting the verification code according to the picture, the popup disappears, and meanwhile, the instruction information is transmitted to an external equipment unit of the function calling module, and the test software and the camera are closed; automatically recording the displacement speed of the mouse moving to the verification code input box and the keyboard input speed when the verification code is input, and sending the speed information to a threshold value adjusting unit of the data storage module;
The state protection unit is used for receiving the state change information transmitted by the state setting unit of the data processing module, debugging the protection state, confirming the protection state under abnormal conditions, and releasing the protection state under normal conditions; in the protection state, the sensitive operation of the important file is delayed, the operated object is automatically backed up in the delay period, and all the operations in the protection state are recorded in a log;
The state time unit is used for receiving the state duration time information transmitted by the state setting unit of the data processing module, setting the state time, and in the set time, the data acquisition module does not acquire data and starts to acquire data when the set time is exceeded;
The data storage module comprises a threshold adjustment unit and a data storage unit;
The threshold value adjusting unit is used for receiving speed information transmitted by the prompt window unit of the function calling module, wherein the speed information comprises the displacement speed of a mouse moving to the verification code input frame and the keyboard input speed when the verification code is input, and the speed information and the corresponding speed threshold value information reserved in the system are averaged to obtain the adjusting information of the corresponding threshold value;
The data storage unit is used for storing the adjusted various threshold information, the acquired information and the log information into a database; the threshold information comprises a mouse displacement speed threshold, a keyboard input speed threshold, an error threshold and a face state degree threshold; the acquisition information comprises mouse displacement speed information, keyboard input speed information, attention transfer time information and face state degree information; the log information includes state change information, operation information in a protection state, and backup file location information.
2. A method for storing big data in a computer, the method comprising the steps of:
S1, acquiring input information of a keyboard and a mouse, and judging whether the input information is in a normal condition;
S2, acquiring face information, substituting the face information into a formula to calculate a user state value, and judging whether the face information is in a normal condition;
S3, judging confirmation or release of the protection state according to the S1 and S2 conditions;
S4, adjusting and judging influence factors, and carrying out log recording and backup on all operation information in the state;
S5, after the state duration time is exceeded, re-entering the step S1;
In S1, acquiring the input information of the keyboard and the mouse means collecting data through test software installed in the computer, the data comprising mouse displacement speed information and keyboard input speed information; at intervals, the test software is called to collect the mouse displacement speed information and keyboard input speed information, and each item of information is judged against the corresponding threshold information reserved in the system: if the information is larger than the threshold, it is judged normal; if it is smaller than or equal to the threshold and larger than zero, it is judged abnormal and the mouse displacement speed information or keyboard input speed information is recorded; the formula is as follows:
Wherein Sresult is the judgment result of the mouse displacement speed, Sn is the mouse displacement speed of the nth actual test, Zs is the mouse displacement speed threshold, Jresult is the judgment result of the keyboard input speed, Jn is the keyboard input speed of the nth actual test, and ZJ is the keyboard input speed threshold;
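A small sketch of this judgment is shown below; the function name and the handling of a zero speed (treated as no captured input) are assumptions:

```python
# Hypothetical S1 speed judgment against a threshold.
def judge_speed(measured, threshold):
    """Returns (is_normal, recorded_value); recorded_value is None when nothing is kept."""
    if measured > threshold:
        return True, None         # normal
    if measured > 0:
        return False, measured    # abnormal: record the measured speed
    return True, None             # zero speed: no input captured (assumption)

# Embodiment values: judge_speed(350, 360) -> (False, 350)  (user B's mouse speed)
#                    judge_speed(89, 100)  -> (False, 89)   (user C's typing speed)
```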
In S2, the calculating and judging of the state value includes the following steps:
S201, starting a camera function, performing image acquisition on a user in front of a computer, loading a Haar cascade classifier by adopting an existing OpenCV image processing library, and performing face detection on an image acquired by the camera;
S202, for the detected face area, performing eye detection by using a Haar cascade classifier again and determining the eye position; calculating the position coordinates of the eyes in the whole image according to the relative positions of the eyes and the face area;
S203, face information is detected by using the 68 facial key points in the Dlib library, and the eye feature point coordinates are acquired according to the eye area position coordinate information; each eye is represented by 6 coordinates; starting from the inner corner and taking the edge line of the eye corner area as the path, the inner eye corner M1, upper-left eye point M2, upper-right eye point M3, outer eye corner M4, lower-right eye point M5 and lower-left eye point M6 are collected in clockwise order; the coordinates are substituted into the formula to calculate the eye aspect ratio, and the eye state is judged from the change of the aspect ratio over a period of time; the formula is as follows:
Wherein HZB is the eye aspect ratio, n is the lateral weighting coefficient, M1 is the inner eye corner feature point coordinate, M2 is the upper-left eye feature point coordinate, M3 is the upper-right eye feature point coordinate, M4 is the outer eye corner feature point coordinate, M5 is the lower-right eye feature point coordinate, and M6 is the lower-left eye feature point coordinate;
When the human eyes are open, HZB fluctuates within a certain range of values; when the eyes close, HZB drops rapidly and approaches 0; the face video recorded by the camera is analyzed frame by frame, and the proportion of frames in which HZB is below the eye-closing threshold over a period of time is calculated; the formula is as follows:
EC = BYZ / ZSS
Wherein EC is the eye closing frequency, BYZ is the number of frames in the period in which HZB is below the eye-closing threshold, and ZSS is the total number of frames in the period;
S204, detecting the center of the iris in the eye area according to the position information of the eye in the image; smoothing the eye region using a gradient minimization method, obtaining a rough estimate of the iris center by color intensity, applying a distance filter to remove invalid edges that are too close to or too far from the rough center of the iris, applying RANSAC to the edge points of the iris and then calculating the radius of the iris, and combining the intensity energy and the edge intensity information to locate the iris center;
S205, converting the RGB image into HSV, setting the HSV range values of the color to be extracted according to the pixel color information corresponding to the iris center coordinates, calling the inRange function of the OpenCV image processing library to extract the color space, removing noise points after binarization, connecting break points, calling the findContours function to perform contour detection, and drawing a rectangular area to position the contour; all located coordinate points within the iris coordinate range are put into a set X, where X is the set of pixel positions of the pupil of the human eye in the image; the set X comprises {X1, X2, ..., Xz}, where X1, X2, ..., Xz respectively represent the 1st, 2nd, ..., z-th data in the pupil coordinate set, and each piece of data comprises the abscissa of the pixel point;
S206, automatically calling the popup window function and simultaneously recording the popup appearance time T1, while the camera continues to collect and analyze the pupil coordinate set in the face information; when the number of elements in the set changes, judging whether the change is larger than the error threshold: if the change is smaller than or equal to the error threshold, no processing is performed; if the change is larger than the error threshold, the time T2 at which the change occurred is recorded and the attention transfer time is obtained as the difference between the two times:
T = T2 - T1 (calculated when the change between X and GB exceeds the error threshold V; otherwise next)
Wherein T is the attention transfer time, T1 is the popup appearance time, T2 is the time at which the number changes, X is the number of elements in the original set, GB is the number of elements in the changed set, V is the error threshold, and next means the current judgment ends and the next data judgment continues;
The popup content prompts the user about the current state and provides a verification code input box and a confirmation button; the user enters the verification code as prompted and clicks the confirmation button, the popup disappears, and the test software and the camera are closed at the same time; the displacement speed of the mouse as it moves to the verification code input box and the keyboard input speed while the verification code is entered are recorded automatically;
S207, substituting the eye closing frequency and the attention transfer time into a formula to comprehensively calculate the face state degree; if the calculated result is smaller than or equal to the face state degree threshold, the state is normal; if it is larger than the face state degree threshold, the user is in a fatigued state and the state is abnormal; the formula is as follows:
Wherein ZTD is the face state degree, alpha is the eye closing influence factor, EC is the eye closing frequency, beta is the attention influence factor, T is the attention transfer time, B is the standard attention transfer time, gamma is the mouse displacement speed influence factor, Rs is the recorded mouse displacement speed (its value is greater than zero), Zs is the mouse displacement speed threshold, delta is the keyboard input speed influence factor, RJ is the recorded keyboard input speed (its value is greater than zero), ZJ is the keyboard input speed threshold, ZG is the face state degree threshold, and result is the judgment result;
In S3, the judgment of the protection state needs to be performed according to the states of S1 and S2 and the current state; two situations are distinguished:
The current state is normal: if the S1 state is normal, the system stays in the normal state; if the S1 state is abnormal, the S2 state information is further analyzed; if the S2 state is normal, the system stays in the normal state; if the S2 state is abnormal, the system changes to the abnormal state and automatically enters the protection state;
The current state is abnormal: if the S1 state is abnormal, the system stays in the abnormal state; if the S1 state is normal, the S2 state information is further analyzed; if the S2 state is abnormal, the system stays in the abnormal state; if the S2 state is normal, the system changes to the normal state and automatically exits the protection state;
In S4, the influencing factors to be adjusted are the mouse displacement speed threshold and the keyboard input speed threshold; the displacement speed of the mouse moving to the verification code input box and the keyboard input speed during verification code entry collected in step S2 are each averaged with the corresponding threshold, and the result is the adjustment information for that threshold.
CN202310328714.4A 2023-03-30 2023-03-30 Big data storage system and method for computer Active CN116467739B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310328714.4A CN116467739B (en) 2023-03-30 2023-03-30 Big data storage system and method for computer


Publications (2)

Publication Number Publication Date
CN116467739A CN116467739A (en) 2023-07-21
CN116467739B true CN116467739B (en) 2024-09-27

Family

ID=87181629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310328714.4A Active CN116467739B (en) 2023-03-30 2023-03-30 Big data storage system and method for computer

Country Status (1)

Country Link
CN (1) CN116467739B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108108684A (en) * 2017-12-15 2018-06-01 杭州电子科技大学 A kind of attention detection method for merging line-of-sight detection
CN108742656A (en) * 2018-03-09 2018-11-06 华南理工大学 Fatigue state detection method based on face feature point location

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2696042C2 (en) * 2017-12-11 2019-07-30 Федеральное государственное бюджетное образовательное учреждение высшего образования "Московский государственный университет имени М.В. Ломоносова" (МГУ) Method and system for recording eye movement
US10740634B1 (en) * 2019-05-31 2020-08-11 International Business Machines Corporation Detection of decline in concentration based on anomaly detection
CN110197169B (en) * 2019-06-05 2022-08-26 南京邮电大学 Non-contact learning state monitoring system and learning state detection method
CN112183238B (en) * 2020-09-10 2024-01-05 广州大学 Remote education attention detection method and system
CN115437901A (en) * 2022-10-13 2022-12-06 袁超 Computer user identification management system and method based on big data




Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20240906
Address after: 247100 Building 25, Chizhou Hui Mall, Zhanqian Road, Zhanqian District, Qingxi Street, Guichi District, Chizhou City, Anhui Province, China
Applicant after: Chizhou Guihong Information Technology Co.,Ltd.
Country or region after: China
Address before: 2/F, Building 3, Zhongke Innovation Plaza, No. 150 Pubin Road, Jiangbei New District, Nanjing, Jiangsu Province, 210001
Applicant before: Jiangsu Tutu Network Technology Co.,Ltd.
Country or region before: China
GR01 Patent grant