CN111027506A - Method and device for determining sight direction, electronic equipment and storage medium - Google Patents
- Publication number
- CN111027506A (application number CN201911317746.4A)
- Authority
- CN
- China
- Prior art keywords
- determining
- sight line
- face
- coordinates
- eye
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
Abstract
The present disclosure provides a method and device for determining a sight line direction, an electronic device, and a storage medium. A facial image is collected; the coordinates of the facial feature points are determined in the facial image, and the head pose is determined according to those coordinates; the eye image corresponding to the eyes is determined from the facial image, and a reference sight line direction relative to the acquisition point is determined from the head pose and the eye image; the reference sight line direction is then processed according to the set direction of the acquisition point to obtain the real sight line direction. Because the eye position, the head pose, and the set position of the acquisition point are all taken into account in obtaining the real sight line direction, the result is more accurate than in the prior art.
Description
Technical Field
The present disclosure relates to computer technologies, and in particular, to a method and an apparatus for determining a gaze direction, an electronic device, and a storage medium.
Background
With economic development, driving has become one of people's daily modes of travel.
In the prior art, a driver's attention can be monitored with a detection device: a head image is acquired, and whether the driver's attention is focused, that is, whether the driver's sight line points straight ahead, is judged from the pose of the driver's head in the image.
However, such approaches determine the driver's sight line from the head pose alone, and in general, even with the head pose fixed, the driver's sight line can still shift left or right by a certain angle. Judging the sight line from the head pose alone therefore yields inaccurate results.
Disclosure of Invention
In view of the above-mentioned problems, the present disclosure provides a method and an apparatus for determining a gaze direction, an electronic device, and a storage medium.
In a first aspect, the present disclosure provides a method for determining a gaze direction, including:
collecting a facial image;
determining the coordinates of the facial feature points in the facial image, and determining the head pose according to the coordinates of the facial feature points;
determining the eye image corresponding to the eyes according to the facial image, and determining a reference sight line direction of the sight line relative to the acquisition point according to the head pose and the eye image;
and processing the reference sight line direction according to the set direction of the acquisition point to obtain a real sight line direction.
In a second aspect, the present disclosure provides an apparatus for determining a gaze direction, comprising:
an acquisition module, configured to collect a facial image;
a processing module, configured to determine the coordinates of the facial feature points in the facial image and determine the head pose according to those coordinates; determine the eye image corresponding to the eyes according to the facial image, and determine a reference sight line direction of the sight line relative to the acquisition point according to the head pose and the eye image; and process the reference sight line direction according to the set direction of the acquisition point to obtain a real sight line direction.
In a third aspect, the present disclosure provides an electronic device, comprising:
a processor and a memory;
wherein the memory is configured to store executable instructions of the processor;
the processor, when executing the executable instructions, may perform any of the methods described above.
In a fourth aspect, the present disclosure provides a storage medium comprising instructions which, when executed on a computer, the computer may perform the method of any of the above.
The device for determining the sight line direction provided by the present disclosure collects a facial image; determines the coordinates of the facial feature points in the facial image, and determines the head pose according to those coordinates; determines the eye image corresponding to the eyes from the facial image, and determines a reference sight line direction relative to the acquisition point from the head pose and the eye image; and processes the reference sight line direction according to the set direction of the acquisition point to obtain the real sight line direction. Because the eye position, the head pose, and the set position of the acquisition point are all taken into account in obtaining the real sight line direction, the result is more accurate than in the prior art.
Drawings
Specific examples of the present disclosure are illustrated by the accompanying drawings and are described in more detail below. The drawings and the written description are not intended to limit the scope of the disclosed concepts in any way, but to explain the concepts of the disclosure to those skilled in the art by reference to specific examples.
FIG. 1 is a schematic diagram of a network architecture upon which the present disclosure is based;
fig. 2 is a schematic flow chart of a method for determining a gaze direction according to the present disclosure;
fig. 3 is a schematic flow chart of another method for determining a gaze direction according to the present disclosure;
fig. 4 is a schematic structural diagram of a device for determining a gaze direction according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate examples consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Detailed Description
To make the purpose, technical solutions, and advantages of the examples of the present disclosure clearer, the technical solutions in those examples are described below clearly and completely with reference to the accompanying drawings.
With economic development, driving has become one of people's daily modes of travel.
In the prior art, a driver's attention can be monitored with a detection device: a head image is acquired, and whether the driver's attention is focused, that is, whether the driver's sight line points straight ahead, is judged from the pose of the driver's head in the image.
However, such approaches determine the driver's sight line from the head pose alone, and in general, even with the head pose fixed, the driver's sight line can still shift left or right by a certain angle. Judging the sight line from the head pose alone therefore yields inaccurate results.
To solve this technical problem, the present disclosure provides a method and apparatus for determining a gaze direction, an electronic device, and a storage medium, which analyze the collected facial image and derive the real gaze direction from the eye position, the head pose, and the preset position of the acquisition point, thereby improving the accuracy of the resulting gaze direction.
Fig. 1 is a schematic diagram of the network architecture on which the present disclosure is based. As shown in fig. 1, the architecture includes a sight-line-direction determining device 1 mounted in a cockpit 2. The determining device includes at least a collecting device, which may be a camera or a terminal device with a camera function, such as a smartphone or a driving recorder. The collecting device in the determining device 1 captures facial images of the driver seated in the cockpit 2, and the sight line direction is determined by the method described below.
In a first aspect, the present disclosure provides a method for determining a gaze direction; fig. 2 is a schematic flowchart of this method.
As shown in fig. 2, the method for determining the gaze direction includes:
Step 101, collecting a facial image.
Step 102, determining the coordinates of the facial feature points in the facial image, and determining the head pose according to the coordinates of the facial feature points.
Step 103, determining the eye image corresponding to the eyes according to the facial image, and determining a reference sight line direction of the sight line relative to the acquisition point according to the head pose and the eye image.
Step 104, processing the reference sight line direction according to the set direction of the acquisition point to obtain a real sight line direction.
Specifically, the execution subject of the method for determining the gaze direction provided by the example of the present disclosure is the foregoing device for determining the gaze direction.
The determining device is installed in the cockpit and collects facial images of the driver while the vehicle is being driven. The determining device may be started together with the vehicle; alternatively, it may be connected to the on-board computer and be activated by a touch or voice operation of the driver to carry out the sight line determination.
After the determining device is started, its collecting equipment collects facial images of the driver. To ensure the accuracy of the subsequent sight line detection, the image proportion of each of the driver's eyes in the collected facial image should exceed a proportion threshold; for example, the image of either eye in the facial image should be more than 30 pixels wide.
Subsequently, the determining device determines the head pose in the facial image using a head pose determination algorithm. The head pose can be expressed as three rotation angles of the head: the pitch angle (Pitch), the yaw angle (Yaw), and the roll angle (Roll). Together, these angles describe the current pose of the head.
To obtain these angles, the head pose determination algorithm uses the coordinates of the facial feature points. Facial feature points are points on the face, such as the eyes, nose, mouth, and eyebrows, that can be used to identify personal features. They can be identified and labeled in the facial image by an image recognition algorithm, and the coordinates of each identified feature point are then determined, yielding the facial feature point coordinates. The head pose is then determined from these coordinates by the head pose determination algorithm.
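The disclosure does not fix a particular head pose determination algorithm. As a minimal illustrative sketch only, the pose can be recovered from the 2D feature point coordinates with a perspective-n-point (PnP) solver against a generic 3D face model; the model coordinates, the pinhole camera approximation, and the use of OpenCV are assumptions of this sketch, not part of the disclosure.

```python
# Hypothetical sketch: head pose from 2D facial landmarks via PnP.
import cv2
import numpy as np

# Generic 3D reference positions (in mm, an assumed model) for six landmarks:
# nose tip, chin, left/right eye outer corners, left/right mouth corners.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),           # nose tip
    (0.0, -330.0, -65.0),      # chin
    (-225.0, 170.0, -135.0),   # left eye outer corner
    (225.0, 170.0, -135.0),    # right eye outer corner
    (-150.0, -150.0, -125.0),  # left mouth corner
    (150.0, -150.0, -125.0),   # right mouth corner
], dtype=np.float64)

def head_pose(image_points, frame_size):
    """Return (pitch, yaw, roll) in degrees from six (x, y) landmarks."""
    h, w = frame_size
    focal = w  # rough pinhole approximation for an uncalibrated camera
    camera_matrix = np.array([[focal, 0, w / 2],
                              [0, focal, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    dist_coeffs = np.zeros((4, 1))  # assume negligible lens distortion
    ok, rvec, _ = cv2.solvePnP(MODEL_POINTS, image_points,
                               camera_matrix, dist_coeffs)
    rot, _ = cv2.Rodrigues(rvec)
    # Decompose the rotation matrix into Euler angles (x=pitch, y=yaw, z=roll).
    sy = np.hypot(rot[0, 0], rot[1, 0])
    pitch = np.degrees(np.arctan2(rot[2, 1], rot[2, 2]))
    yaw = np.degrees(np.arctan2(-rot[2, 0], sy))
    roll = np.degrees(np.arctan2(rot[1, 0], rot[0, 0]))
    return pitch, yaw, roll
```

The six landmarks assumed here (nose tip, chin, eye outer corners, mouth corners) match the kinds of feature points named later in the description.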
Next, the determining device determines the eye image corresponding to the eyes from the facial image, and determines a reference sight line direction of the sight line relative to the acquisition point from the head pose and the eye image. Specifically, the eye image is the minimal region of the facial image that contains the entire eye. In this step, factors such as the pupil position and the opening angle of the eyelids are analyzed in the eye image and combined with the obtained head pose to determine the reference sight line direction. The reference sight line direction is expressed relative to the direction of the acquisition point, i.e., it is the sight line direction when the direction of the acquisition point is taken as the forward direction.
Finally, the reference sight line direction is corrected according to the set direction of the acquisition point to obtain the real sight line direction. The set direction of the acquisition point is the angle between the acquisition point's direction and the true forward direction of the cockpit.
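The disclosure states only that the reference direction is processed according to the set direction of the acquisition point; a simple additive model over yaw and pitch, sketched below, is one assumed form such a correction could take.

```python
# Hypothetical sketch of the correction step; the additive model is an
# assumption, as the disclosure does not give an explicit formula.
def true_gaze(ref_gaze, camera_offset):
    """ref_gaze and camera_offset are (yaw_deg, pitch_deg) pairs.

    ref_gaze is measured relative to the camera's optical axis;
    camera_offset is the camera's mounting angle relative to the
    cockpit's true forward direction.
    """
    ref_yaw, ref_pitch = ref_gaze
    off_yaw, off_pitch = camera_offset
    return (ref_yaw + off_yaw, ref_pitch + off_pitch)

# Example: camera mounted 20 deg to the driver's right, level. A gaze
# 5 deg left of the camera axis is then 15 deg right of straight ahead.
print(true_gaze((-5.0, 0.0), (20.0, 0.0)))  # -> (15.0, 0.0)
```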
In an optional example, whether to initiate alarm information may further be determined according to the obtained real sight line direction.
When determining whether to initiate alarm information, a threshold T is set for the left-right deviation of the sight line direction from the forward direction; when the obtained real sight line direction deviates from the forward direction by more than the threshold T, an alarm is initiated to prompt the driver to pay more attention.
In addition, the current vehicle speed may be acquired, and the number of facial image frames to extract may be determined according to the current vehicle speed. Accordingly, when deciding whether to initiate alarm information based on the obtained real sight line direction, the proportion of frames in which the sight line deviates can be determined from the real sight line direction shown in each frame of facial image, and alarm information can be initiated according to that frame proportion.
Further, when determining whether to initiate alarm information for the real sight line direction, the number of facial image frames to store may depend on the current vehicle speed. In general, higher speeds demand more attention, so the number of stored frames can be set lower at high speed and higher at low speed. For each acquired frame, the facial image is stored and processed: if the driver's real sight line direction is determined to deviate beyond the threshold, a 1 is written into the buffer for that frame; otherwise, a 0 is written. The proportion of 1s to 0s in the buffer is then counted in real time to decide whether the driver is distracted, so that an alarm, such as a voice prompt, can be issued.
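A minimal sketch of this buffering logic is given below; the buffer lengths, the threshold T, and the alarm ratio are illustrative assumptions, since the disclosure fixes none of these values.

```python
# Hypothetical sketch of the frame-buffer alarm logic described above.
from collections import deque

THRESHOLD_T = 25.0   # max tolerated yaw deviation in degrees (assumed)
ALARM_RATIO = 0.6    # fraction of deviated frames that triggers an alarm

def buffer_len(speed_kmh):
    """Fewer stored frames at higher speed, so the alarm reacts sooner."""
    return 30 if speed_kmh < 60 else 15

class DistractionMonitor:
    def __init__(self, speed_kmh):
        self.flags = deque(maxlen=buffer_len(speed_kmh))

    def update(self, true_yaw_deg):
        # 1 = gaze deviates beyond T in this frame, 0 = gaze roughly ahead.
        self.flags.append(1 if abs(true_yaw_deg) > THRESHOLD_T else 0)
        ratio = sum(self.flags) / len(self.flags)
        return ratio >= ALARM_RATIO  # True -> issue (e.g. voice) alarm
```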
According to the method for determining the sight line direction provided by the present disclosure, a facial image is collected; the coordinates of the facial feature points are determined in the facial image, and the head pose is determined according to those coordinates; the eye image corresponding to the eyes is determined from the facial image, and a reference sight line direction relative to the acquisition point is determined from the head pose and the eye image; and the reference sight line direction is processed according to the set direction of the acquisition point to obtain the real sight line direction. Because the eye position, the head pose, and the set position of the acquisition point are all taken into account in obtaining the real sight line direction, the result is more accurate than in the prior art.
On the basis of the above examples, fig. 3 is a schematic flowchart of another method for determining a gaze direction provided by the present disclosure. As shown in fig. 3, the method includes:
Step 201, collecting a facial image.
Step 202, determining a face frame in the facial image, the face frame indicating the position coordinates of the face in the facial image.
Step 203, aligning the face according to the position coordinates of the face indicated by the face frame, and outputting the coordinates of the facial feature points.
Step 204, inputting the coordinates of the facial feature points into a head pose detection model, and outputting the head pose.
Step 205, determining the eye image corresponding to the eyes in the facial image according to the eye coordinates among the facial feature point coordinates, and determining the sight line direction of the eyes in the eye image.
Step 206, determining a reference sight line direction of the sight line relative to the acquisition point according to the head pose and the sight line direction of the eyes in the eye image.
Step 207, processing the reference sight line direction according to the set direction of the acquisition point to obtain a real sight line direction.
Step 208, determining whether to initiate alarm information according to the obtained real sight line direction.
Specifically, as in the foregoing examples, the execution subject of the method is the aforementioned device for determining the gaze direction.
The determining device is installed in the cockpit and collects facial images of the driver while the vehicle is being driven. The determining device may be started together with the vehicle; alternatively, it may be connected to the on-board computer and be activated by a touch or voice operation of the driver to carry out the sight line determination.
After the determining device is started, its collecting equipment collects facial images of the driver. To ensure the accuracy of the subsequent sight line detection, the image proportion of each of the driver's eyes in the collected facial image should exceed a proportion threshold; for example, the image of either eye in the facial image should be more than 30 pixels wide.
Unlike the foregoing example, the determining device may determine the head pose in the facial image as follows. First, a face frame indicating the position coordinates of the face is determined in the facial image; this determination may be based on a face detection model. The determining device then aligns the face according to the position coordinates indicated by the face frame and outputs the facial feature point coordinates; the alignment may be based on a face alignment model, and the feature point coordinates include the inner and outer corner points of both eyes, nose points, the left and right mouth corner points, and the like.
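The disclosure leaves the concrete face detection and face alignment models open. As a stand-in sketch, dlib's frontal face detector and its 68-point shape predictor can play the roles of the two models; the predictor model file is an external download, and its path here is an assumption.

```python
# Hypothetical sketch of steps 202-203 using off-the-shelf models.
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def face_landmarks(gray_image):
    """Return a list of (x, y) landmark coordinates, or None if no face."""
    faces = detector(gray_image, 1)  # step 202: the face frame (bounding box)
    if not faces:
        return None
    shape = predictor(gray_image, faces[0])  # step 203: align within the box
    # Includes inner/outer eye corners, nose, and mouth corner points.
    return [(shape.part(i).x, shape.part(i).y) for i in range(shape.num_parts)]
```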
Thereafter, as in the foregoing example, the head pose may be output by the head pose detection model. The head pose can be expressed as three rotation angles of the head: the pitch angle (Pitch), the yaw angle (Yaw), and the roll angle (Roll), which together describe the current pose of the head.
Further, unlike the foregoing example, the determining device may determine the eye image corresponding to the eyes from the facial image, and the reference sight line direction of the sight line relative to the acquisition point from the head pose and the eye image, as follows:
First, the eye image corresponding to the eyes is determined in the facial image according to the eye coordinates among the facial feature point coordinates, and the sight line direction of the eyes in the eye image is determined. The eye image is cropped out according to the coordinates of the left and right eye corner points; the eye used may be the one closer to the acquisition point.
Then, the sight line direction of the eyes in the eye image is determined. Specifically, convolution processing is performed on the eye image to determine, in turn, the eyelid points, the left and right eye corner points, and the position coordinates of the pupil; a mapping function then maps these position coordinates to the sight line direction of the eyes in the eye image. Concretely, the determining device convolves the eye image to compute the gradient at each point and selects points with large gradients as eyelid points (the upper and lower eyelids can be distinguished by the sign of the gradient in the Y direction). Curves are then fitted to the upper and lower eyelids with a polynomial fitting algorithm, and the two intersections of the upper and lower eyelid curves are taken as the left and right corner points of the eye. Next, the pupil center is located in the eye image (in daytime or strong light the pupil is the darkest part of the eye; at night or in weak light it is the brightest). Finally, a preset mapping function maps the eye corner points and the pupil center coordinates to the sight line direction of the eyes in the eye image.
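The following sketch illustrates this eye analysis under stated assumptions: Sobel filtering stands in for the convolution step, a quadratic fit for the polynomial eyelid curves, the pupil is taken as the darkest region (the daytime case), and the preset mapping function, which the disclosure does not specify, is replaced by an illustrative linear placeholder that would need per-rig calibration.

```python
# Hypothetical sketch of the eye-image analysis described above.
import cv2
import numpy as np

def pupil_center(eye_gray):
    """Locate the pupil as the darkest region of a blurred eye crop."""
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    _, _, min_loc, _ = cv2.minMaxLoc(blurred)
    return min_loc  # (x, y); at night / under IR, use max_loc instead

def eyelid_curves(eye_gray):
    """Fit quadratic curves to the upper and lower eyelid edges."""
    gy = cv2.Sobel(eye_gray, cv2.CV_32F, 0, 1, ksize=3)  # vertical gradient
    ys, xs = np.where(np.abs(gy) > np.abs(gy).max() * 0.5)
    upper = gy[ys, xs] > 0   # sign of the Y gradient separates the eyelids
    up_fit = np.polyfit(xs[upper], ys[upper], 2)
    low_fit = np.polyfit(xs[~upper], ys[~upper], 2)
    return up_fit, low_fit   # eye corners = real roots of (up_fit - low_fit)

def gaze_in_eye(corner_l, corner_r, pupil):
    """Placeholder linear mapping from pupil offset to a yaw angle."""
    cx = (corner_l[0] + corner_r[0]) / 2.0
    half_w = max((corner_r[0] - corner_l[0]) / 2.0, 1.0)
    K = 30.0  # assumed yaw (deg) at the eye corner; needs calibration
    return K * (pupil[0] - cx) / half_w
```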
Then, the reference sight line direction of the sight line relative to the acquisition point is determined from the head pose and the sight line direction of the eyes in the eye image. Specifically, the determining device may add, as vectors, the head pose and the aforementioned sight line direction of the eyes in the eye image to obtain the reference sight line direction of the sight line relative to the acquisition point.
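Since the composition is described as a vector addition, it reduces to adding the corresponding angle components; a minimal sketch:

```python
# Minimal sketch of the composition step, per the vector-addition
# description above; the (pitch, yaw) parameterization is an assumption.
import numpy as np

def reference_gaze(head_pose_deg, eye_gaze_deg):
    """Both arguments are (pitch, yaw) pairs in degrees, camera-relative."""
    return tuple(np.add(head_pose_deg, eye_gaze_deg))

# Head turned 10 deg left plus eyes 8 deg further left -> 18 deg left total.
print(reference_gaze((0.0, -10.0), (0.0, -8.0)))  # -> (0.0, -18.0)
```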
Finally, the reference sight line direction is corrected according to the set direction of the acquisition point to obtain the real sight line direction. The set direction of the acquisition point is the angle between the acquisition point's direction and the true forward direction of the cockpit.
In an optional example, whether to initiate alarm information may further be determined according to the obtained real sight line direction.
When determining whether to initiate alarm information, a threshold T is set for the left-right deviation of the sight line direction from the forward direction; when the obtained real sight line direction deviates from the forward direction by more than the threshold T, an alarm is initiated to prompt the driver to pay more attention.
In addition, the current vehicle speed may be acquired, and the number of facial image frames to extract may be determined according to the current vehicle speed. Accordingly, when deciding whether to initiate alarm information based on the obtained real sight line direction, the proportion of frames in which the sight line deviates can be determined from the real sight line direction shown in each frame of facial image, and alarm information can be initiated according to that frame proportion.
Further, when determining whether to initiate alarm information for the real sight line direction, the number of facial image frames to store may depend on the current vehicle speed. In general, higher speeds demand more attention, so the number of stored frames can be set lower at high speed and higher at low speed. For each acquired frame, the facial image is stored and processed: if the driver's real sight line direction is determined to deviate beyond the threshold, a 1 is written into the buffer for that frame; otherwise, a 0 is written. The proportion of 1s to 0s in the buffer is then counted in real time to decide whether the driver is distracted, so that an alarm, such as a voice prompt, can be issued.
According to the method for determining the sight line direction provided by the present disclosure, a facial image is collected; the coordinates of the facial feature points are determined in the facial image, and the head pose is determined according to those coordinates; the eye image corresponding to the eyes is determined from the facial image, and a reference sight line direction relative to the acquisition point is determined from the head pose and the eye image; and the reference sight line direction is processed according to the set direction of the acquisition point to obtain the real sight line direction. Because the eye position, the head pose, and the set position of the acquisition point are all taken into account in obtaining the real sight line direction, the result is more accurate than in the prior art.
Fig. 4 is a schematic structural diagram of a device for determining a gaze direction according to the present disclosure. As shown in fig. 4, the device includes:
an acquisition module 10, configured to collect a facial image;
a processing module 20, configured to determine the coordinates of the facial feature points in the facial image and determine the head pose according to those coordinates; determine the eye image corresponding to the eyes according to the facial image, and determine a reference sight line direction of the sight line relative to the acquisition point according to the head pose and the eye image; and process the reference sight line direction according to the set direction of the acquisition point to obtain a real sight line direction.
Illustratively, the processing module 20 is specifically configured to determine a face frame in the facial image, where the face frame indicates the position coordinates of the face in the facial image; and to align the face according to the position coordinates indicated by the face frame and output the coordinates of the facial feature points.
Illustratively, the processing module 20 is specifically configured to input the coordinates of the facial feature points into a head pose detection model and output the head pose.
Illustratively, the processing module 20 is specifically configured to determine, according to the eye coordinates among the facial feature point coordinates, the eye image corresponding to the eyes in the facial image, and to determine the sight line direction of the eyes in the eye image; and to determine a reference sight line direction of the sight line relative to the acquisition point according to the head pose and the sight line direction of the eyes in the eye image.
Illustratively, the processing module 20 is specifically configured to perform convolution processing on the eye image to determine, in turn, the eyelid points, the left and right eye corner points, and the position coordinates of the pupil in the eye image; and to perform coordinate mapping on each position coordinate with a mapping function to obtain the sight line direction of the eyes in the eye image.
Illustratively, the device further includes an alarm module;
the alarm module is configured to determine whether to initiate alarm information according to the obtained real sight line direction.
Illustratively, the acquisition module 10 is further configured to acquire the current vehicle speed and determine, according to the current vehicle speed, the number of frames of collected facial images to extract;
the alarm module is configured to determine the proportion of frames in which the sight line deviates according to the real sight line direction shown in each frame of facial image, and to initiate alarm information according to that frame proportion.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working process and corresponding beneficial effects of the device described above may be found in the corresponding process in the foregoing method examples, and are not repeated here.
The device for determining the sight line direction provided by the present disclosure collects a facial image, determines the facial feature point coordinates and the head pose, determines the eye image and the reference sight line direction relative to the acquisition point, and processes the reference sight line direction according to the set direction of the acquisition point to obtain the real sight line direction, so that the obtained real sight line direction is more accurate than in the prior art.
In another aspect, this embodiment further provides an electronic device that can be used to implement the technical solutions of the foregoing method examples; its implementation principle and technical effect are similar and are not repeated here.
Referring to fig. 5, a schematic structural diagram of an electronic device 900 suitable for implementing the embodiments of the present disclosure is shown; the electronic device 900 may be a terminal device or a server. The terminal device may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, personal digital assistants (PDAs), tablet computers (PADs), portable multimedia players (PMPs), and in-vehicle terminals (e.g., in-vehicle navigation terminals), and fixed terminals such as digital TVs and desktop computers. The electronic device shown in fig. 5 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the electronic device 900 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 901, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 902 or a program loaded from a storage means 908 into a Random Access Memory (RAM) 903. In the RAM 903, various programs and data necessary for the operation of the electronic apparatus 900 are also stored. The processing apparatus 901, the ROM 902, and the RAM 903 are connected to each other through a bus 904. An input/output (I/O) interface 905 is also connected to bus 904.
Generally, the following devices may be connected to the I/O interface 905: input devices 906 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 907 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 908 including, for example, magnetic tape, hard disk, etc.; and a communication device 909. The communication device 909 may allow the electronic apparatus 900 to perform wireless or wired communication with other apparatuses to exchange data. While fig. 5 illustrates an electronic device 900 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication device 909, or installed from the storage device 908, or installed from the ROM 902. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing apparatus 901.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods shown in the above embodiments.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of a unit does not in some cases constitute a limitation of the unit itself, for example, the first retrieving unit may also be described as a "unit for retrieving at least two internet protocol addresses".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description presents only the preferred examples of the present disclosure and an explanation of the technical principles employed. Those skilled in the art should understand that the scope of the disclosure is not limited to technical solutions formed by the particular combination of features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, solutions in which the above features are interchanged with features of similar function disclosed in (but not limited to) this disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Claims (10)
1. A method for determining a sight line direction, comprising:
collecting a facial image;
determining the coordinates of the facial feature points in the facial image, and determining the head pose according to the coordinates of the facial feature points;
determining the eye image corresponding to the eyes according to the facial image, and determining a reference sight line direction of the sight line relative to the acquisition point according to the head pose and the eye image;
and processing the reference sight line direction according to the set direction of the acquisition point to obtain a real sight line direction.
2. The method for determining a sight line direction according to claim 1, wherein the determining the coordinates of the facial feature points in the facial image comprises:
determining a face frame in the facial image, wherein the face frame is used for indicating the position coordinates of the face in the facial image;
and aligning the face according to the position coordinates of the face indicated by the face frame, and outputting the coordinates of the facial feature points.
3. The method for determining a sight line direction according to claim 1, wherein the determining the head pose according to the coordinates of the facial feature points comprises:
inputting the coordinates of the facial feature points into a head pose detection model, and outputting the head pose.
4. The method for determining a sight line direction according to claim 1, wherein the determining the eye image corresponding to the eyes according to the facial image, and determining a reference sight line direction of the sight line relative to the acquisition point according to the head pose and the eye image, comprises:
determining the eye image corresponding to the eyes in the facial image according to the eye coordinates among the facial feature point coordinates, and determining the sight line direction of the eyes in the eye image;
and determining a reference sight line direction of the sight line relative to the acquisition point according to the head pose and the sight line direction of the eyes in the eye image.
5. The method for determining a sight line direction according to claim 4, wherein the determining the sight line direction of the eyes in the eye image comprises:
performing convolution processing on the eye image to determine, in turn, the eyelid points, the left and right eye corner points, and the position coordinates of the pupil in the eye image;
and performing coordinate mapping on each position coordinate by using a mapping function to obtain the sight line direction of the eyes in the eye image.
6. The method for determining a sight line direction according to any one of claims 1-5, further comprising:
and determining whether to initiate alarm information according to the obtained real sight line direction.
7. The method for determining a sight line direction according to claim 6, further comprising:
acquiring the current vehicle speed, and determining, according to the current vehicle speed, the number of frames of collected facial images to extract;
wherein the determining whether to initiate alarm information according to the obtained real sight line direction comprises:
determining the proportion of frames in which the sight line deviates according to the real sight line direction shown in each frame of facial image, and initiating alarm information according to the frame proportion.
8. A device for determining a sight line direction, comprising:
an acquisition module, configured to collect a facial image;
a processing module, configured to determine the coordinates of the facial feature points in the facial image and determine the head pose according to those coordinates; determine the eye image corresponding to the eyes according to the facial image, and determine a reference sight line direction of the sight line relative to the acquisition point according to the head pose and the eye image; and process the reference sight line direction according to the set direction of the acquisition point to obtain a real sight line direction.
9. An electronic device, comprising:
a processor and a memory;
wherein the memory is configured to store executable instructions of the processor;
the processor, when executing the executable instructions, may perform the method of any of claims 1-7 above.
10. A storage medium comprising instructions which, when executed on a computer, cause the computer to perform the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911317746.4A CN111027506B (en) | 2019-12-19 | 2019-12-19 | Method and device for determining sight direction, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111027506A true CN111027506A (en) | 2020-04-17 |
CN111027506B CN111027506B (en) | 2024-04-12 |
Family
ID=70210070
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911317746.4A Active CN111027506B (en) | 2019-12-19 | 2019-12-19 | Method and device for determining sight direction, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111027506B (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012216180A (en) * | 2011-03-30 | 2012-11-08 | Advanced Telecommunication Research Institute International | Estimation device of visual line direction, method for estimating visual line direction, and program for causing computer to execute method for estimating visual line direction |
DE102014100352A1 (en) * | 2013-01-18 | 2014-07-24 | Carnegie Mellon University | Method for detecting condition of viewing direction of rider of vehicle, involves estimating driver's line of sight on basis of detected location for each of eye characteristic of eyeball of rider and estimated position of head |
JP2014194617A (en) * | 2013-03-28 | 2014-10-09 | Advanced Telecommunication Research Institute International | Visual line direction estimating device, visual line direction estimating method, and visual line direction estimating program |
CN106598221A (en) * | 2016-11-17 | 2017-04-26 | 电子科技大学 | Eye key point detection-based 3D sight line direction estimation method |
CN110363133A (en) * | 2019-07-10 | 2019-10-22 | 广州市百果园信息技术有限公司 | A kind of method, apparatus, equipment and the storage medium of line-of-sight detection and video processing |
CN110458122A (en) * | 2019-08-15 | 2019-11-15 | 京东方科技集团股份有限公司 | The playback method and sight Calibration System of a kind of sight Calibration Method, display device |
Non-Patent Citations (2)
Title |
---|
Chi Jiannan; Zhang Chuang; Hu Tao; Yan Yantao; Liu Yang: "Research on eye feature detection and gaze direction calculation methods for gaze tracking systems" |
Gao Di; Yin Guisheng; Ma Chunguang: "Non-intrusive gaze tracking technology based on corneal reflection" |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111680546A (en) * | 2020-04-26 | 2020-09-18 | 北京三快在线科技有限公司 | Attention detection method, attention detection device, electronic equipment and storage medium |
CN111873911A (en) * | 2020-08-04 | 2020-11-03 | 深圳地平线机器人科技有限公司 | Method, device, medium, and electronic apparatus for adjusting rearview mirror |
CN111873911B (en) * | 2020-08-04 | 2022-06-07 | 深圳地平线机器人科技有限公司 | Method, device, medium, and electronic apparatus for adjusting rearview mirror |
CN115147816A (en) * | 2021-03-29 | 2022-10-04 | 东风汽车集团股份有限公司 | Sight line area detection method, system, device, medium, and vehicle |
CN114913155A (en) * | 2022-05-11 | 2022-08-16 | 北京宾理信息科技有限公司 | Method, apparatus, computer device, vehicle, and medium for determining a gaze direction of a vehicle user |
CN115984950A (en) * | 2022-12-28 | 2023-04-18 | 北京字跳网络技术有限公司 | Sight line detection method and device, electronic equipment and storage medium |
CN115984950B (en) * | 2022-12-28 | 2024-03-12 | 北京字跳网络技术有限公司 | Sight line detection method, device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN111027506B (en) | 2024-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111027506B (en) | Method and device for determining sight direction, electronic equipment and storage medium | |
US11367307B2 (en) | Method for processing images and electronic device | |
CN111602140B (en) | Method of analyzing objects in images recorded by a camera of a head-mounted device | |
EP3985990A1 (en) | Video clip positioning method and apparatus, computer device, and storage medium | |
CN110544272B (en) | Face tracking method, device, computer equipment and storage medium | |
US11120707B2 (en) | Cognitive snapshots for visually-impaired users | |
KR20150003591A (en) | Smart glass | |
CN109835260B (en) | Vehicle information display method, device, terminal and storage medium | |
CN111783626B (en) | Image recognition method, device, electronic equipment and storage medium | |
CN111163303B (en) | Image display method, device, terminal and storage medium | |
CN110555426A (en) | Sight line detection method, device, equipment and storage medium | |
KR20180109217A (en) | Method for enhancing face image and electronic device for the same | |
CN110647881A (en) | Method, device, equipment and storage medium for determining card type corresponding to image | |
CN103869977A (en) | Image display method, device and electronic equipment | |
WO2024140154A1 (en) | Gaze detection method and apparatus, and electronic device and storage medium | |
WO2024149078A1 (en) | Sensing method supporting dynamic input of multiple cameras, system and vehicle | |
US11810336B2 (en) | Object display method and apparatus, electronic device, and computer readable storage medium | |
US10893388B2 (en) | Map generation device, map generation system, map generation method, and non-transitory storage medium including instructions for map generation | |
CN111340813B (en) | Image instance segmentation method and device, electronic equipment and storage medium | |
CN112749414B (en) | Data storage method, device, equipment and storage medium | |
CN109472855B (en) | Volume rendering method and device and intelligent device | |
WO2023000838A1 (en) | Information detection method and apparatus, medium, and electronic device | |
US20240329730A1 (en) | Picture rendering method and apparatus | |
CN114516270B (en) | Vehicle control method, apparatus, electronic device, and computer-readable medium | |
EP4086102B1 (en) | Navigation method and apparatus, electronic device, readable storage medium and computer program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |