CN111537075A - Temperature extraction method, device, machine readable medium and equipment - Google Patents
- Publication number
- CN111537075A (Application CN202010291602.2A)
- Authority
- CN
- China
- Prior art keywords
- temperature
- target
- image
- region
- detection object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/0022—Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
- G01J5/0025—Living bodies
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/02—Constructional details
- G01J5/08—Optical arrangements
- G01J5/0887—Integrating cavities mimicking black bodies, wherein the heat propagation between the black body and the measuring element does not occur within a solid; Use of bodies placed inside the fluid stream for measurement of the temperature of gases; Use of the reemission from a surface, e.g. reflective surface; Emissivity enhancement by multiple reflections
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/48—Thermography; Techniques using wholly visual means
- G01J5/485—Temperature profile
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01K—MEASURING TEMPERATURE; MEASURING QUANTITY OF HEAT; THERMALLY-SENSITIVE ELEMENTS NOT OTHERWISE PROVIDED FOR
- G01K13/00—Thermometers specially adapted for specific purposes
- G01K13/20—Clinical contact thermometers for use with humans or animals
- G01K13/223—Infrared clinical thermometers, e.g. tympanic
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/07—Target detection
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Image Analysis (AREA)
Abstract
The invention provides a temperature extraction method comprising the following steps: acquiring multiple types of images of a detection object; determining a target detection region of the detection object; acquiring the temperature of a target mapping region; and obtaining a target temperature based on region growing. Because the human body temperature is similar at adjacent points within a region, the method uses region growing to accurately find a region of similar temperature and automatically locates the area where the body temperature is most concentrated, thereby achieving accurate temperature measurement.
Description
Technical Field
The invention belongs to the field of temperature detection, and particularly relates to a temperature extraction method, a temperature extraction device, a machine readable medium and equipment.
Background
With repeated outbreaks of infectious diseases such as SARS, avian influenza, Ebola and the novel coronavirus, improving the ability to screen for these diseases has become increasingly important. Fever is an obvious, quantifiable detection indicator among the symptoms, so screening body temperature with an infrared thermal imager has become an important means of predicting and controlling an epidemic. Current infrared thermal imaging body-temperature screening systems, however, cannot accurately identify the human body, have a high false-detection rate and extract temperature with low precision, among other problems, as follows:
1. In a complex environment, different objects have different temperatures, which leads to false detections and false alarms. For example, if a person is holding breakfast, a cup of hot water or another object at a higher temperature, the infrared thermal imager may output the highest temperature in the thermal image as the person's body temperature;
2. The temperature differs between parts of the human body, so the temperature output for different parts of the heat map also differs. Most current infrared thermal imager products select the maximum temperature on the human body as the body temperature without detecting a specific body part, so accuracy and precision deviate;
3. Distance has a great influence on temperature-measurement accuracy: the farther the distance, the fewer pixels the human body occupies and the greater the influence of atmospheric absorption and scattering, so measurement accuracy worsens and temperatures measured at different distances differ noticeably. Each product specifies a required measurement range within which the measurement data are calibrated, but in actual use a person is not necessarily measured within that reasonable range, so the displayed temperature value cannot be guaranteed to be accurate and deviations or false detections occur.
Infrared thermal imagers on the market meet the national standard when they leave the factory, and the precision of their output temperature values is verified against a blackbody, which is an accurate and stable heat source with a clearly defined measurement target. In actual use, however, it becomes very difficult to pick the correct position on the human body to measure, and this is often neglected; blackbody calibration in the field can improve temperature-measurement precision, but if the wrong measurement target is chosen the effort still falls short. An algorithm that can accurately extract the human-body temperature from the thermal imager therefore becomes very important.
Disclosure of Invention
In view of the above-mentioned shortcomings of the prior art, it is an object of the present invention to provide a temperature extraction method, apparatus, machine-readable medium and device, which solve the problems of the prior art.
To achieve the above and other related objects, the present invention provides a temperature extraction method, comprising:
acquiring multiple types of images of a detection object;
determining a target detection area of the detection object;
acquiring the temperature of a target mapping area;
a target temperature is obtained based on the region growing.
Optionally, the multiple types of images include visible light images, infrared images, laser images.
Optionally, if the image of the detection object includes a visible light image and an infrared image; detecting a target part of the visible light image to obtain a target part position; and determining a target detection area of the target part position in the infrared image of the detection object.
Optionally, if the image of the detection object includes a visible light image and a laser image; detecting a target part of the visible light image to obtain a target part position; and determining a target detection area of the target part position in the laser image of the detection object.
Optionally, the pixel points in the target detection area are sequentially used as seed points for area growth until a growth stop condition is met.
Optionally, the growth stop conditions are:
the number of the similar points searched in the neighborhood of the seed point is larger than a first threshold value; wherein a difference between the pixel value of the similar point and the pixel value of the start point is smaller than a second threshold.
Optionally, the pixel points in the target detection area are sorted based on a temperature condition, and the sorted pixel points are used as seed points for area growth.
Optionally, the temperature value that occurs most often in the neighborhood of the seed point for which the growth stop condition is satisfied is used as the target temperature.
Optionally, the method further comprises: and compensating the temperature of the target detection area based on the distance between the detection object and the image acquisition device.
To achieve the above and other related objects, the present invention provides a temperature extraction system, comprising:
the image acquisition module is used for acquiring various types of images of the detection object;
a target detection area determination module for determining a target detection area of the detection object;
the region temperature acquisition module is used for acquiring the temperature of the target mapping region;
and the target temperature acquisition module is used for acquiring the target temperature based on the region growing.
Optionally, the multiple types of images include visible light images, infrared images, laser images.
Optionally, if the image of the detection object includes a visible light image and an infrared image; detecting a target part of the visible light image to obtain a target part position; and determining a target detection area of the target part position in the infrared image of the detection object.
Optionally, if the image of the detection object includes a visible light image and a laser image; detecting a target part of the visible light image to obtain a target part position; and determining a target detection area of the target part position in the laser image of the detection object.
Optionally, the pixel points in the target detection area are sequentially used as seed points for area growth until a growth stop condition is met.
Optionally, the growth stop conditions are:
the number of the similar points searched in the neighborhood of the seed point is larger than a first threshold value; wherein a difference between the pixel value of the similar point and the pixel value of the start point is smaller than a second threshold.
Optionally, the pixel points in the target detection area are sorted based on a temperature condition, and the sorted pixel points are used as seed points for area growth.
Optionally, the temperature value that occurs most often in the neighborhood of the seed point for which the growth stop condition is satisfied is used as the target temperature.
Optionally, the method further comprises: and the temperature compensation module is used for compensating the temperature of the target detection area based on the distance between the detection object and the image acquisition device.
To achieve the above and other related objects, the present invention provides an apparatus comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform one or more of the methods described previously.
To achieve the foregoing and other related objectives, the present invention provides one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform one or more of the methods described above.
As described above, the temperature extraction method, apparatus, machine-readable medium and device provided by the present invention have the following beneficial effects:
according to the invention, based on the characteristic of the similarity of the human body temperature and the temperature of adjacent points in the region, a region growing method is adopted to accurately find a region with similar temperature, and finally, accurate temperature measurement is realized by automatically positioning the region with more concentrated human body temperature.
Drawings
FIG. 1 is a schematic illustration of a heat map of a facial region provided in accordance with an embodiment of the present invention;
FIG. 2 is a flow chart of a temperature extraction method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of measuring the distance between a detection object and an image acquisition device according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a temperature extraction system according to an embodiment of the present invention;
fig. 5 is a schematic hardware structure diagram of a terminal device according to an embodiment;
fig. 6 is a schematic diagram of a hardware structure of a terminal device according to another embodiment.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments only illustrate the basic idea of the invention; they show only the components related to the invention rather than the number, shape and size of components in an actual implementation, where the type, quantity and proportion of components may vary freely and the layout may be more complicated.
Individual regions of the human body show temperature similarity: in the facial-region heat map of FIG. 1, for example, the forehead and chin clearly appear as two separate higher-temperature regions. The invention processes the heat-map image and, based on the similarity between the human body temperature and the temperatures of adjacent points within a region, selects suitable seed points and starts region growing, which can accurately find a region of similar temperature; accurate temperature measurement is finally achieved by automatically locating the region where the body temperature is most concentrated.
Specifically, as shown in fig. 2, the present invention provides a temperature extraction method, comprising:
s11, acquiring multiple types of images of the detection object;
s12 determining a target detection region of the detection object;
s13, acquiring the temperature of the target mapping area;
s14 obtains a target temperature based on the region growing.
The invention acquires multiple types of images of a detection object, determines a target detection region of the detection object, acquires the temperature of a target mapping region, and obtains a target temperature based on region growing. Based on the similarity between the human body temperature and the temperatures of adjacent points within a region, a region growing method is adopted to accurately find a region of similar temperature, and accurate temperature measurement is finally achieved by automatically locating the region where the body temperature is most concentrated.
Region growing refers to the process of developing groups of pixels or small regions into larger regions. Starting from a set of seed points, the region grows by merging in neighboring pixels whose properties, such as intensity, grey level, texture or color, are similar to those of the seed point.
In this embodiment, the multiple types of images may include a visible light image, an infrared image and a laser image. The visible light image may be collected by a visible light image sensor and the infrared image by an infrared image sensor; alternatively, after an RGB-IR image sensor (which receives RGB components and IR components simultaneously) collects the image, an RGB-IR processing unit separates the received RGB-IR image data into a synchronized RGB image (visible light image) and an IR image (infrared image). The laser image may be acquired by a laser image acquisition sensor. Of course, in another embodiment, at least two of these images may be collected by one device, for example an infrared temperature probe that simultaneously collects a visible light image and an infrared image, a laser temperature probe that simultaneously collects a laser image and an infrared image, another image acquisition device with the same function, or an image acquisition device that simultaneously acquires visible light, infrared and laser images.
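As an illustration of the acquisition step, the sketch below grabs a visible light frame and an infrared frame captured at roughly the same moment. It is a minimal sketch only: the device indices and the assumption that the infrared sensor is exposed as a standard video stream are hypothetical, and the RGB-IR separation described above is not shown.

```python
import cv2

# Hypothetical device indices; a real deployment would enumerate its sensors.
visible_cap = cv2.VideoCapture(0)   # visible light camera (assumed)
infrared_cap = cv2.VideoCapture(1)  # infrared / thermal camera (assumed)

def grab_synchronized_frames():
    """Return a (visible, infrared) frame pair captured at the current moment."""
    ok_vis, visible = visible_cap.read()
    ok_ir, infrared = infrared_cap.read()
    if not (ok_vis and ok_ir):
        raise RuntimeError("failed to read a frame from one of the sensors")
    return visible, infrared
```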
In one embodiment, if the image of the detection object includes a visible light image and an infrared image; detecting a target part of the visible light image to obtain a target part position; and determining a target detection area of the target part position in the infrared image of the detection object.
The target part includes a face, the back of a hand, a neck, a shoulder and the like, and the target detection region includes a face region, a back-of-hand region, a neck region, a shoulder region and the like. The face may include the forehead, the mouth, etc., and the face region may include a forehead region, a mouth region, etc.
Because the temperature differs somewhat between parts of the human body, the forehead temperature is comparatively stable, whereas the temperature around the mouth is unstable and may read high for reasons such as breathing or wearing a mask. Most thermal imaging equipment performs temperature compensation in order to obtain the forehead temperature, and for close-range measurement the temperature region obtained should be the forehead, where the temperature values are similar and the range of variation is small. Thus, for a heat map with fewer pixels, the forehead may be selected as the target part.
In the following description the target part is taken to be the forehead and the target detection region the forehead region.
It will be appreciated that, because infrared detection is an ongoing process, detection may be performed continuously over a period of time to obtain a plurality of infrared images. Therefore, when the temperature of the forehead region is detected, images captured at the same moment are needed, that is, an infrared image at the current moment and a visible light image at the current moment. To determine the forehead region, face detection is first performed on the visible light image at the current moment to obtain the forehead position, and the forehead position in the visible light image is then mapped into the infrared image of the detection object at the current moment, giving the forehead region of the detection object in the infrared image.
Once the forehead region is determined, it can be measured in the infrared image at the current moment to obtain the temperature of the forehead region, after which the target temperature is obtained by the region growing method described below.
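A minimal sketch of this detect-and-map step is shown below, assuming an OpenCV Haar cascade for face detection, the upper part of the face box as the forehead, and a simple proportional scaling between the visible and infrared images. The forehead heuristic and the shared field of view are simplifying assumptions; the description does not prescribe a particular detector or registration method.

```python
import cv2

def forehead_region_in_ir(visible_bgr, ir_image):
    """Detect a face in the visible light frame, take the upper part of the face
    box as the forehead, and map that box into the infrared frame by scaling.
    Real devices may require a calibrated homography instead of pure scaling."""
    gray = cv2.cvtColor(visible_bgr, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])    # largest detected face
    fx, fy, fw, fh = x + w // 4, y, w // 2, h // 4         # rough forehead box
    sx = ir_image.shape[1] / visible_bgr.shape[1]           # x scale, visible -> IR
    sy = ir_image.shape[0] / visible_bgr.shape[0]           # y scale, visible -> IR
    ix, iy = int(fx * sx), int(fy * sy)
    iw, ih = max(1, int(fw * sx)), max(1, int(fh * sy))
    return ir_image[iy:iy + ih, ix:ix + iw]                 # forehead area in IR
```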
This embodiment adopts face detection technology and can detect multiple faces in a video frame simultaneously to obtain the face data of the detection object. Further, an optimal face can be selected from the face data and its temperature measured, with the face region of the optimal face used as the temperature-measurement object. The optimal face can be selected comprehensively across multiple dimensions such as face quality score, face size, face angle and face occlusion rate.
In another embodiment, if the image of the detection object comprises a visible light image and a laser image; detecting a target part of the visible light image to obtain a target part position; and determining a target detection area of the target part position in the laser image of the detection object.
The target part includes a face, the back of a hand, a neck, a shoulder and the like, and the target detection region includes a face region, a back-of-hand region, a neck region, a shoulder region and the like. The face may include the forehead, the mouth, etc., and the face region may include a forehead region, a mouth region, etc.
In the following description the target part is taken to be the forehead and the target detection region the forehead region.
It can be understood that, when the temperature of the forehead region is detected, images captured at the same moment are needed, that is, the visible light image at the current moment and the laser image at the current moment. To determine the forehead region, forehead detection is performed on the visible light image at the current moment to obtain the forehead position, and the forehead position in the visible light image is then mapped into the laser image of the detection object at the current moment, giving the forehead region of the detection object in the laser image.
Once the forehead region is determined, it can be measured in the laser image at the current moment to obtain the temperature of the forehead region, after which the target temperature is obtained by the region growing method.
In one embodiment, the pixel points in the target detection area are sequentially used as seed points for area growth until a growth stop condition is met.
In one embodiment, the pixel points in the target detection region are sorted by temperature and the sorted pixel points are used as seed points for region growing; for example, region growing starts from the pixel point with the highest temperature and proceeds until the growth stop condition is met. The growth stop condition is:
the number of similar points found in the neighborhood of the seed point is greater than a first threshold N, which may be set to 5; a similar point is a point whose pixel value differs from the pixel value of the starting point by less than a second threshold T, which may be set to 2.
An eight-neighborhood search is used when searching for similar points.
In an embodiment, the temperature value that occurs most often in the neighborhood of the seed point for which the growth stop condition is satisfied may be used as the target temperature.
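The sketch below is one reading of this region-growing step over a two-dimensional array of forehead temperatures, using the thresholds suggested above (first threshold N = 5, second threshold T = 2) and an eight-neighborhood search. The hottest-first seed ordering and the breadth-first growth are illustrative choices, not the only possible implementation.

```python
import numpy as np
from collections import Counter, deque

def extract_target_temperature(temps, n_threshold=5, t_threshold=2.0):
    """Grow a region of similar temperatures and return its most frequent value.

    temps: 2-D array of temperature readings for the target detection region.
    Seeds are tried from the hottest pixel downwards; growth absorbs
    eight-neighborhood points whose value differs from the starting point by
    less than t_threshold and stops once more than n_threshold such points
    have been collected."""
    h, w = temps.shape
    flat_order = np.argsort(temps, axis=None)[::-1]            # hottest first
    seeds = np.column_stack(np.unravel_index(flat_order, (h, w)))
    for sy, sx in seeds:
        seed_val = temps[sy, sx]
        region, queue, seen = [], deque([(sy, sx)]), {(sy, sx)}
        while queue:
            y, x = queue.popleft()
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if (dy or dx) and 0 <= ny < h and 0 <= nx < w and (ny, nx) not in seen:
                        seen.add((ny, nx))
                        if abs(temps[ny, nx] - seed_val) < t_threshold:
                            region.append(round(float(temps[ny, nx]), 1))
                            queue.append((ny, nx))
            if len(region) > n_threshold:                       # growth stop condition
                return Counter(region).most_common(1)[0][0]     # most frequent value
    return None                                                 # no region satisfied the condition
```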
Since different distances have a certain influence on the temperature measurement accuracy, the temperature of the target detection area is compensated based on the distance between the detection object and the image acquisition device in this embodiment.
Specifically, the camera may be calibrated in advance according to the perspective projection principle. In this embodiment a human face is used as an example: the interpupillary distance is obtained from the face detection information and used to calculate the distance between the image acquisition device (e.g., a camera) and the person.
The distance from an object to the camera is calculated using the similar triangles of pinhole imaging: the camera is calibrated with an object of known width, and the target distance is then estimated from the size of the target in the image. As shown in FIG. 3, assume there is an object X of height W placed at a distance Z from the camera; the camera photographs the object and the pixel height P of the object is measured. This yields the formula for the focal length of the camera:
F=(P×Z)/W
To calculate the distance between a person and the camera by this principle, the pixel size of a known object in the current image must be obtained; the pixel width of the interpupillary distance is chosen for the distance measurement because interpupillary distance varies little between people.
Statistically, the interpupillary distance of adult males is about 60 mm to 73 mm and that of adult females about 53 mm to 68 mm, so the intermediate value of 63 mm is used in the calculation. Using 63 mm introduces a deviation for people with a smaller or larger interpupillary distance: when the actual interpupillary distance is 53 mm the error is about 18%, and when it is 73 mm the error is about 14%, i.e., at 0.5 m the deviations are 0.09 m and 0.07 m respectively.
Calibrating with an A4 sheet of paper at 0.5 m gives F = 1642 pixels. Images are then captured at different distances, the pupil distance of each image is measured in pixels, and the distance is calculated as Z = W × F / P = 63 × 1642 / P. The calculated distances are shown in the following table:
Distance (mm) | First measurement (mm) | Error rate | Second measurement (mm) | Error rate |
---|---|---|---|---|
400 | 382 | 4.50% | 409 | 2.25% |
500 | 509 | 1.80% | 512 | 2.40% |
600 | 594 | 1% | 612 | 2% |
900 | 848 | 5.80% | 834 | 7.30% |
It can be seen that the measured data are basically accurate and the precision meets the requirement for estimating distance, so the distance between the image acquisition device and the detection object can be obtained by measuring the pupil distance.
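The estimate reduces to one formula, sketched below with the calibration value reported above (F = 1642 pixels, assumed interpupillary distance 63 mm). How the resulting distance feeds the temperature compensation is device-specific and is not specified here, so it is only noted in a comment.

```python
ASSUMED_IPD_MM = 63.0          # assumed adult interpupillary distance
CALIBRATED_FOCAL_PX = 1642.0   # focal length from the 0.5 m A4 calibration

def estimate_distance_mm(pupil_distance_px: float) -> float:
    """Z = W * F / P: distance from the camera, in millimetres."""
    return ASSUMED_IPD_MM * CALIBRATED_FOCAL_PX / pupil_distance_px

# A measured pupil distance of about 207 pixels corresponds to roughly 0.5 m.
# The distance would then index a device-specific compensation curve
# (not defined in this description) to correct the measured temperature.
print(round(estimate_distance_mm(207)))  # -> 500
```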
In order to acquire the temperature of the human body accurately and eliminate interference from other heat sources in the image, the temperature range is limited according to normal human body temperature, and the position is limited to a certain extent according to the measured position information. For example, when obtaining the thermal-image pixel values of the face region, the human body temperature range is set to 35 °C to 40 °C and temperature values outside this range are filtered out.
The position can also be limited according to the measured position information: for example, the distance between the detection object and the image acquisition device is acquired during measurement, and when the distance exceeds the set range a prompt is issued asking the detection object to move closer to the image acquisition device.
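Both limits amount to simple range checks; one possible form is sketched below. The 35 °C to 40 °C window comes from the text, while the distance bounds are placeholders, since no numeric working range is specified above.

```python
import numpy as np

MIN_BODY_TEMP_C, MAX_BODY_TEMP_C = 35.0, 40.0   # human body-temperature window
MAX_DIST_MM = 1000.0                             # placeholder working-range limit

def filter_body_temperatures(temps):
    """Keep only thermal-image values inside the normal body-temperature range,
    discarding other heat sources (cups, lamps, ...) in the image."""
    temps = np.asarray(temps)
    return temps[(temps >= MIN_BODY_TEMP_C) & (temps <= MAX_BODY_TEMP_C)]

def distance_prompt(distance_mm):
    """Prompt the subject to approach when they stand beyond the set range."""
    if distance_mm > MAX_DIST_MM:
        return "Please move closer to the camera."
    return None
```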
As shown in fig. 4, the present invention provides a temperature extraction system, comprising:
an image acquisition module 11, configured to acquire multiple types of images of a detection object;
a target detection area determination module 12, configured to determine a target detection area of the detection object;
a region temperature obtaining module 13, configured to obtain a temperature of the target mapping region;
and a target temperature obtaining module 14 for obtaining the target temperature based on the region growing.
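One way these four modules could be wired together is sketched below; the class and method names are purely illustrative and do not appear in this description.

```python
class TemperatureExtractionSystem:
    """Illustrative composition of the modules shown in FIG. 4."""

    def __init__(self, image_acquisition, area_detector, area_temperature, target_temperature):
        self.image_acquisition = image_acquisition      # image acquisition module 11
        self.area_detector = area_detector              # target detection area module 12
        self.area_temperature = area_temperature        # region temperature module 13
        self.target_temperature = target_temperature    # target temperature module 14

    def measure_once(self):
        visible, infrared = self.image_acquisition.capture()
        detection_area = self.area_detector.locate(visible, infrared)
        temperatures = self.area_temperature.read(detection_area)
        return self.target_temperature.extract(temperatures)   # region-growing step
```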
The invention acquires multiple types of images of a detection object, determines a target detection region of the detection object, acquires the temperature of a target mapping region, and obtains a target temperature based on region growing. Based on the similarity between the human body temperature and the temperatures of adjacent points within a region, a region growing method is adopted to accurately find a region of similar temperature, and accurate temperature measurement is finally achieved by automatically locating the region where the body temperature is most concentrated.
Region growing refers to the process of developing groups of pixels or small regions into larger regions. Starting from a set of seed points, the region grows by merging in neighboring pixels whose properties, such as intensity, grey level, texture or color, are similar to those of the seed point.
In this embodiment, the multiple types of images may include a visible light image, an infrared image and a laser image. The visible light image may be collected by a visible light image sensor and the infrared image by an infrared image sensor; alternatively, after an RGB-IR image sensor (which receives RGB components and IR components simultaneously) collects the image, an RGB-IR processing unit separates the received RGB-IR image data into a synchronized RGB image (visible light image) and an IR image (infrared image). The laser image may be acquired by a laser image acquisition sensor. Of course, in another embodiment, at least two of these images may be collected by one device, for example an infrared temperature probe that simultaneously collects a visible light image and an infrared image, a laser temperature probe that simultaneously collects a laser image and an infrared image, another image acquisition device with the same function, or an image acquisition device that simultaneously acquires visible light, infrared and laser images.
In one embodiment, if the image of the detection object includes a visible light image and an infrared image; detecting a target part of the visible light image to obtain a target part position; and determining a target detection area of the target part position in the infrared image of the detection object.
The target part includes a face, the back of a hand, a neck, a shoulder and the like, and the target detection region includes a face region, a back-of-hand region, a neck region, a shoulder region and the like. The face may include the forehead, the mouth, etc., and the face region may include a forehead region, a mouth region, etc.
Because the temperature differs somewhat between parts of the human body, the forehead temperature is comparatively stable, whereas the temperature around the mouth is unstable and may read high for reasons such as breathing or wearing a mask. Most thermal imaging equipment performs temperature compensation in order to obtain the forehead temperature, and for close-range measurement the temperature region obtained should be the forehead, where the temperature values are similar and the range of variation is small. Thus, for a heat map with fewer pixels, the forehead may be selected as the target part.
In the following description the target part is taken to be the forehead and the target detection region the forehead region.
It will be appreciated that, because infrared detection is an ongoing process, detection may be performed continuously over a period of time to obtain a plurality of infrared images. Therefore, when the temperature of the forehead region is detected, images captured at the same moment are needed, that is, an infrared image at the current moment and a visible light image at the current moment. To determine the forehead region, face detection is first performed on the visible light image at the current moment to obtain the forehead position, and the forehead position in the visible light image is then mapped into the infrared image of the detection object at the current moment, giving the forehead region of the detection object in the infrared image.
Once the forehead region is determined, it can be measured in the infrared image at the current moment to obtain the temperature of the forehead region, after which the target temperature is obtained by the region growing method.
This embodiment adopts face detection technology and can detect multiple faces in a video frame simultaneously to obtain the face data of the detection object. Further, an optimal face can be selected from the face data and its temperature measured, with the face region of the optimal face used as the temperature-measurement object. The optimal face can be selected comprehensively across multiple dimensions such as face quality score, face size, face angle and face occlusion rate.
In another embodiment, if the image of the detection object comprises a visible light image and a laser image; detecting a target part of the visible light image to obtain a target part position; and determining a target detection area of the target part position in the laser image of the detection object.
The target part includes a face, the back of a hand, a neck, a shoulder and the like, and the target detection region includes a face region, a back-of-hand region, a neck region, a shoulder region and the like. The face may include the forehead, the mouth, etc., and the face region may include a forehead region, a mouth region, etc.
In the following description the target part is taken to be the forehead and the target detection region the forehead region.
It can be understood that, when the temperature of the forehead region is detected, images captured at the same moment are needed, that is, the visible light image at the current moment and the laser image at the current moment. To determine the forehead region, forehead detection is performed on the visible light image at the current moment to obtain the forehead position, and the forehead position in the visible light image is then mapped into the laser image of the detection object at the current moment, giving the forehead region of the detection object in the laser image.
Once the forehead region is determined, it can be measured in the laser image at the current moment to obtain the temperature of the forehead region, after which the target temperature is obtained by the region growing method.
In one embodiment, the pixel points in the target detection area are sequentially used as seed points for area growth until a growth stop condition is met.
In one embodiment, the pixel points in the target detection region are sorted by temperature and the sorted pixel points are used as seed points for region growing; for example, region growing starts from the pixel point with the highest temperature and proceeds until the growth stop condition is met. The growth stop condition is:
the number of similar points found in the neighborhood of the seed point is greater than a first threshold N, which may be set to 5; a similar point is a point whose pixel value differs from the pixel value of the starting point by less than a second threshold T, which may be set to 2.
An eight-neighborhood search is used when searching for similar points.
In an embodiment, the temperature value that occurs most often in the neighborhood of the seed point for which the growth stop condition is satisfied may be used as the target temperature.
Because different distances have a certain influence on temperature-measurement precision, this embodiment further includes a temperature compensation module configured to compensate the temperature of the target detection region based on the distance between the detection object and the image acquisition device.
Specifically, the camera may be calibrated in advance according to the perspective projection principle. In this embodiment a human face is used as an example: the interpupillary distance is obtained from the face detection information and used to calculate the distance between the image acquisition device (e.g., a camera) and the person.
The distance from an object to the camera is calculated using the similar triangles of pinhole imaging: the camera is calibrated with an object of known width, and the target distance is then estimated from the size of the target in the image. As shown in FIG. 3, assume there is an object X of height W placed at a distance Z from the camera; the camera photographs the object and the pixel height P of the object is measured. This yields the formula for the focal length of the camera:
F=(P×Z)/W
To calculate the distance between a person and the camera by this principle, the pixel size of a known object in the current image must be obtained; the pixel width of the interpupillary distance is chosen for the distance measurement because interpupillary distance varies little between people.
Statistically, the interpupillary distance of adult males is about 60 mm to 73 mm and that of adult females about 53 mm to 68 mm, so the intermediate value of 63 mm is used in the calculation. Using 63 mm introduces a deviation for people with a smaller or larger interpupillary distance: when the actual interpupillary distance is 53 mm the error is about 18%, and when it is 73 mm the error is about 14%, i.e., at 0.5 m the deviations are 0.09 m and 0.07 m respectively.
Calibrating with an A4 sheet of paper at 0.5 m gives F = 1642 pixels. Images are then captured at different distances and the distance between the image acquisition device and the detection object is calculated from the pupil distance by the formula Z = W × F / P = 63 × 1642 / P; the results are shown in the following table:
Distance (mm) | First measurement (mm) | Error rate | Second measurement (mm) | Error rate |
---|---|---|---|---|
400 | 382 | 4.50% | 409 | 2.25% |
500 | 509 | 1.80% | 512 | 2.40% |
600 | 594 | 1% | 612 | 2% |
900 | 848 | 5.80% | 834 | 7.30% |
It can be seen that the measured data are basically accurate and the precision meets the requirement for estimating distance, so the distance between the image acquisition device and the detection object can be obtained by measuring the pupil distance.
In order to acquire the temperature of the human body accurately and eliminate interference from other heat sources in the image, the temperature range is limited according to normal human body temperature, and the position is limited to a certain extent according to the measured position information. For example, when obtaining the thermal-image pixel values of the face region, the human body temperature range is set to 35 °C to 40 °C and temperature values outside this range are filtered out.
The position can also be limited according to the measured position information: for example, the distance between the detection object and the image acquisition device is acquired during measurement, and when the distance exceeds the set range a prompt is issued asking the detection object to move closer to the image acquisition device.
An embodiment of the present application further provides an apparatus, which may include: one or more processors; and one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the method of FIG. 2. In practical applications the apparatus may serve as a terminal device or as a server; examples of terminal devices include a smart phone, a tablet computer, an electronic book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop, a vehicle-mounted computer, a desktop computer, a set-top box, a smart television, a wearable device, and the like.
The present application further provides a non-transitory readable storage medium in which one or more modules (programs) are stored; when the one or more modules are applied to a device, the device may be caused to execute the instructions of the steps included in the method of FIG. 2 according to the present application.
Fig. 5 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application. As shown, the terminal device may include: an input device 1100, a first processor 1101, an output device 1102, a first memory 1103, and at least one communication bus 1104. The communication bus 1104 is used to implement communication connections between the elements. The first memory 1103 may include a high-speed RAM memory, and may also include a non-volatile storage NVM, such as at least one disk memory, and the first memory 1103 may store various programs for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the first processor 1101 may be, for example, a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the first processor 1101 is coupled to the input device 1100 and the output device 1102 through a wired or wireless connection.
Optionally, the input device 1100 may include a variety of input devices, such as at least one of a user-oriented user interface, a device-oriented device interface, a software programmable interface, a camera, and a sensor. Optionally, the device interface facing the device may be a wired interface for data transmission between devices, or may be a hardware plug-in interface (e.g., a USB interface, a serial port, etc.) for data transmission between devices; optionally, the user-facing user interface may be, for example, a user-facing control key, a voice input device for receiving voice input, and a touch sensing device (e.g., a touch screen with a touch sensing function, a touch pad, etc.) for receiving user touch input; optionally, the programmable interface of the software may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or an input interface of a chip; the output devices 1102 may include output devices such as a display, audio, and the like.
In this embodiment, the processor of the terminal device includes a module for executing functions of each module in each device, and specific functions and technical effects may refer to the foregoing embodiments, which are not described herein again.
Fig. 6 is a schematic hardware structure diagram of a terminal device according to an embodiment of the present application. FIG. 6 is a specific embodiment of the implementation of FIG. 5. As shown, the terminal device of the present embodiment may include a second processor 1201 and a second memory 1202.
The second processor 1201 executes the computer program code stored in the second memory 1202 to implement the method of FIG. 2 in the above embodiment.
The second memory 1202 is configured to store various types of data to support operations at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, videos, and so forth. The second memory 1202 may include a Random Access Memory (RAM) and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory.
Optionally, a second processor 1201 is provided in the processing assembly 1200. The terminal device may further include: communication component 1203, power component 1204, multimedia component 1205, speech component 1206, input/output interfaces 1207, and/or sensor component 1208. The specific components included in the terminal device are set according to actual requirements, which is not limited in this embodiment.
The processing component 1200 generally controls the overall operation of the terminal device. The processing assembly 1200 may include one or more second processors 1201 to execute instructions to perform all or part of the steps of the data processing method described above. Further, the processing component 1200 can include one or more modules that facilitate interaction between the processing component 1200 and other components. For example, the processing component 1200 can include a multimedia module to facilitate interaction between the multimedia component 1205 and the processing component 1200.
The power supply component 1204 provides power to the various components of the terminal device. The power components 1204 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia components 1205 include a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The voice component 1206 is configured to output and/or input voice signals. For example, the voice component 1206 includes a Microphone (MIC) configured to receive external voice signals when the terminal device is in an operational mode, such as a voice recognition mode. The received speech signal may further be stored in the second memory 1202 or transmitted via the communication component 1203. In some embodiments, the speech component 1206 further comprises a speaker for outputting speech signals.
The input/output interface 1207 provides an interface between the processing component 1200 and peripheral interface modules, which may be click wheels, buttons, etc. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor component 1208 includes one or more sensors for providing various aspects of status assessment for the terminal device. For example, the sensor component 1208 may detect an open/closed state of the terminal device, relative positioning of the components, presence or absence of user contact with the terminal device. The sensor assembly 1208 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 1208 may also include a camera or the like.
The communication component 1203 is configured to facilitate communications between the terminal device and other devices in a wired or wireless manner. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot therein for inserting a SIM card therein, so that the terminal device may log onto a GPRS network to establish communication with the server via the internet.
As can be seen from the above, the communication component 1203, the voice component 1206, the input/output interface 1207 and the sensor component 1208 referred to in the embodiment of fig. 6 can be implemented as the input device in the embodiment of fig. 5.
The foregoing embodiments merely illustrate the principles and utilities of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those skilled in the art without departing from the spirit and technical concept of the present invention shall be covered by the claims of the present invention.
Claims (20)
1. A method of temperature extraction, comprising:
acquiring multiple types of images of a detection object;
determining a target detection area of the detection object;
acquiring the temperature of a target mapping area;
a target temperature is obtained based on the region growing.
2. The temperature extraction method according to claim 1, wherein the plurality of types of images include a visible light image, an infrared image, and a laser image.
3. The temperature extraction method according to claim 2, wherein if the image of the detection object includes a visible light image and an infrared image; detecting a target part of the visible light image to obtain a target part position; and determining a target detection area of the target part position in the infrared image of the detection object.
4. The temperature extraction method according to claim 2, wherein if the image of the detection object includes a visible light image and a laser image; detecting a target part of the visible light image to obtain a target part position; and determining a target detection area of the target part position in the laser image of the detection object.
5. The temperature extraction method according to claim 1, wherein pixel points in the target detection region are sequentially used as seed points for region growth until a growth stop condition is satisfied.
6. The temperature extraction method according to claim 5, wherein the growth stop condition is:
the number of the similar points searched in the neighborhood of the seed point is larger than a first threshold value; wherein a difference between the pixel value of the similar point and the pixel value of the start point is smaller than a second threshold.
7. The temperature extraction method according to claim 6, wherein the pixel points in the target detection region are sorted based on a temperature condition, and the sorted pixel points are used as seed points for region growth.
8. The temperature extraction method according to claim 6, wherein a temperature value that has a maximum number of occurrences in a neighborhood of the seed point corresponding to the condition that satisfies the growth stop is used as the target temperature.
9. The temperature extraction method according to claim 1, further comprising:
and compensating the temperature of the target detection area based on the distance between the detection object and the image acquisition device.
10. A temperature extraction system, comprising:
the image acquisition module is used for acquiring various types of images of the detection object;
a target detection area determination module for determining a target detection area of the detection object;
the region temperature acquisition module is used for acquiring the temperature of the target mapping region;
and the target temperature acquisition module is used for acquiring the target temperature based on the region growing.
11. The temperature extraction system of claim 10, wherein the plurality of types of images comprise visible light images, infrared images, laser images.
12. The temperature extraction system according to claim 11, wherein, if the images of the detection object include a visible light image and an infrared image, target part detection is performed on the visible light image to obtain a target part position, and the target detection area corresponding to the target part position is determined in the infrared image of the detection object.
13. The temperature extraction system according to claim 11, wherein, if the images of the detection object include a visible light image and a laser image, target part detection is performed on the visible light image to obtain a target part position, and the target detection area corresponding to the target part position is determined in the laser image of the detection object.
14. The temperature extraction system according to claim 10, wherein pixel points in the target detection region are sequentially used as seed points for region growing until a growth stop condition is satisfied.
15. The temperature extraction system of claim 14, wherein the growth stop condition is:
the number of similar points found in the neighborhood of the seed point is greater than a first threshold, wherein a similar point is a point whose pixel value differs from the pixel value of the seed point by less than a second threshold.
16. The temperature extraction system according to claim 15, wherein the pixel points in the target detection region are sorted according to a temperature condition, and the sorted pixel points are used in turn as seed points for region growing.
17. The temperature extraction system according to claim 15, wherein the temperature value that occurs most frequently in the neighborhood of the seed point satisfying the growth stop condition is used as the target temperature.
18. The temperature extraction system of claim 10, further comprising:
a temperature compensation module, configured to compensate the temperature of the target detection area based on the distance between the detection object and the image acquisition device.
19. An apparatus, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the method of one or more of claims 1-9.
20. One or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform one or more of the methods recited in claims 1-9.
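Claims 2-4 and 11-13 rely on locating a target part in the visible-light image and carrying its position over into the co-registered infrared (or laser) image to obtain the target detection area. The snippet below is only a minimal illustration of that mapping step, under the assumption that the two cameras have already been calibrated to a single 3x3 homography (here called `H_vis_to_ir`, a made-up name) and that some detector has supplied a bounding box in visible-image coordinates; it is a sketch, not the patented implementation.

```python
import numpy as np

def map_box_to_infrared(box_vis, H_vis_to_ir):
    """Map an axis-aligned box from visible-image to infrared-image coordinates.

    box_vis: (x0, y0, x1, y1) returned by a target-part detector (assumed given).
    H_vis_to_ir: 3x3 homography aligning the visible camera to the infrared
                 camera (assumed to have been calibrated beforehand).
    """
    x0, y0, x1, y1 = box_vis
    corners = np.array([[x0, y0, 1.0],
                        [x1, y0, 1.0],
                        [x1, y1, 1.0],
                        [x0, y1, 1.0]]).T          # 3 x 4 homogeneous corners
    mapped = H_vis_to_ir @ corners
    mapped = mapped[:2] / mapped[2]                 # back to Cartesian coordinates
    # Keep the axis-aligned bounding box of the warped corners as the target region.
    return (mapped[0].min(), mapped[1].min(), mapped[0].max(), mapped[1].max())


# Example: an identity-plus-shift homography standing in for a real calibration.
H = np.array([[1.0, 0.0, -12.0],
              [0.0, 1.0,   4.0],
              [0.0, 0.0,   1.0]])
print(map_box_to_infrared((120, 80, 200, 130), H))
```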
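Claims 5-8 and 14-17 describe taking pixel points of the target detection area in turn as seed points, stopping once a seed's neighborhood contains more than a first threshold of similar points (points whose value differs from the seed by less than a second threshold), and reporting the most frequent temperature value in that neighborhood as the target temperature. The sketch below captures that idea in a simplified single-window form; the window radius, both thresholds, and the hottest-first seed ordering are illustrative assumptions rather than values taken from the patent.

```python
import numpy as np
from collections import Counter

def extract_target_temperature(temp_map, region, first_thresh=40,
                               second_thresh=0.3, radius=3):
    """Pick a target temperature from a thermal image with a simplified
    seed-and-neighbourhood rule.

    temp_map : 2-D array of per-pixel temperatures (degrees Celsius).
    region   : (y0, y1, x0, x1) target detection region in temp_map coordinates.
    Returns the most frequent temperature around the first seed whose
    neighbourhood contains more than `first_thresh` similar points, else None.
    """
    y0, y1, x0, x1 = region
    ys, xs = np.mgrid[y0:y1, x0:x1]
    coords = np.stack([ys.ravel(), xs.ravel()], axis=1)
    # Try the hottest pixels first (one possible "temperature condition" ordering).
    order = np.argsort(temp_map[coords[:, 0], coords[:, 1]])[::-1]

    h, w = temp_map.shape
    for y, x in coords[order]:
        seed_val = temp_map[y, x]
        window = temp_map[max(0, y - radius):min(h, y + radius + 1),
                          max(0, x - radius):min(w, x + radius + 1)]
        similar = window[np.abs(window - seed_val) < second_thresh]
        if similar.size > first_thresh:
            # Most frequent value (rounded to 0.1 deg C) becomes the target temperature.
            rounded = np.round(similar, 1)
            return Counter(rounded.tolist()).most_common(1)[0][0]
    return None


# Toy example: a 36.5 deg C forehead patch on a cooler 25 deg C background.
demo = np.full((60, 60), 25.0)
demo[20:35, 20:40] = 36.5 + np.random.normal(0, 0.05, (15, 20))
print(extract_target_temperature(demo, (18, 38, 18, 42)))
```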
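Claims 9 and 18 add a compensation of the measured temperature according to the distance between the detection object and the image acquisition device, since an infrared reading tends to drop as the subject moves farther from the sensor. The following is a minimal linear-correction sketch; the reference distance and per-metre coefficient are invented placeholder values, not calibration data from the patent.

```python
def compensate_temperature(measured_temp_c, distance_m,
                           ref_distance_m=0.5, coeff_c_per_m=0.35):
    """Add back the temperature drop expected beyond the calibration distance.

    measured_temp_c : temperature read from the target region (deg C).
    distance_m      : estimated subject-to-camera distance (e.g. from the laser image).
    ref_distance_m  : distance at which the sensor was calibrated (assumed).
    coeff_c_per_m   : assumed linear attenuation of the reading per metre.
    """
    extra_distance = max(0.0, distance_m - ref_distance_m)
    return measured_temp_c + coeff_c_per_m * extra_distance


# Example: a reading of 35.9 deg C taken at 1.5 m is corrected upwards to 36.25 deg C.
print(compensate_temperature(35.9, 1.5))
```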
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010291602.2A CN111537075A (en) | 2020-04-14 | 2020-04-14 | Temperature extraction method, device, machine readable medium and equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010291602.2A CN111537075A (en) | 2020-04-14 | 2020-04-14 | Temperature extraction method, device, machine readable medium and equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111537075A (en) | 2020-08-14 |
Family
ID=71952302
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010291602.2A Pending CN111537075A (en) | 2020-04-14 | 2020-04-14 | Temperature extraction method, device, machine readable medium and equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111537075A (en) |
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103487729A (en) * | 2013-09-06 | 2014-01-01 | 广东电网公司电力科学研究院 | Electrical equipment defect detection method based on fusion of ultraviolet video and infrared video |
US20160347005A1 (en) * | 2014-03-24 | 2016-12-01 | Empire Technology Development Llc | Methods and systems for monitoring melt zones in polymer three dimensional printing |
US10719727B2 (en) * | 2014-10-01 | 2020-07-21 | Apple Inc. | Method and system for determining at least one property related to at least part of a real environment |
CN105425123A (en) * | 2015-11-20 | 2016-03-23 | 国网福建省电力有限公司泉州供电公司 | Method and system for collaboratively detecting power equipment failure through ultraviolet imaging and infrared imaging |
CN105405244A (en) * | 2015-12-22 | 2016-03-16 | 山东神戎电子股份有限公司 | Interference source shielding method used for forest water prevention |
CN106127170A (en) * | 2016-07-01 | 2016-11-16 | 重庆中科云丛科技有限公司 | A kind of merge the training method of key feature points, recognition methods and system |
CN106989825A (en) * | 2017-05-04 | 2017-07-28 | 北京许继电气有限公司 | Online full filed current conversion station infrared temperature measurement system and method |
CN107870181A (en) * | 2017-06-20 | 2018-04-03 | 成都飞机工业(集团)有限责任公司 | A kind of later stage recognition methods of composite debonding defect |
CN207263308U (en) * | 2017-09-04 | 2018-04-20 | 北京盈想东方科技股份有限公司 | A kind of non-refrigeration type Multifunctional hand-held infrared viewer |
CN109766888A (en) * | 2017-11-09 | 2019-05-17 | 天津理工大学 | A kind of infrared image target extraction method with controllable immune growth domain |
CN108765401A (en) * | 2018-05-29 | 2018-11-06 | 电子科技大学 | A kind of thermal imaging testing method based on ranks variable step segmentation and region-growing method |
CN109199320A (en) * | 2018-07-27 | 2019-01-15 | 上海贝高医疗科技有限公司 | A kind of portable visual acuity screening instrument and its light channel structure |
CN110246150A (en) * | 2019-06-14 | 2019-09-17 | 上海联影医疗科技有限公司 | Metal detection method, apparatus, equipment and storage medium |
CN110987189A (en) * | 2019-11-21 | 2020-04-10 | 北京都是科技有限公司 | Method, system and device for detecting temperature of target object |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112241702A (en) * | 2020-10-16 | 2021-01-19 | 沈阳天眼智云信息科技有限公司 | Body temperature detection method based on infrared double-light camera |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111310692B | | Detection object management method, system, machine readable medium and equipment |
CN103955272B | | A kind of terminal user's attitude detection system |
US10269153B2 | | Information processing apparatus providing information related to skin state of user |
WO2021159682A1 | | Abnormal object management method and system, machine-readable medium, and device |
CN110916620A | | Body temperature measuring method and terminal |
CN111325127A | | Abnormal object judgment method, system, machine readable medium and equipment |
CN110196103A | | Thermometry and relevant device |
CN109101873A | | For providing the electronic equipment for being directed to the characteristic information of external light source of object of interest |
US20170188938A1 | | System and method for monitoring sleep of a subject |
US10146306B2 | | Gaze position detection apparatus and gaze position detection method |
US20150227789A1 | | Information processing apparatus, information processing method, and program |
CN107016697A | | A kind of height measurement method and device |
CN110568930B | | Method for calibrating fixation point and related equipment |
US9924865B2 | | Apparatus and method for estimating gaze from un-calibrated eye measurement points |
CN111297337A | | Detection object judgment method, system, machine readable medium and equipment |
CN113010126A | | Display control method, display control device, electronic device, and medium |
CN107091704A | | Pressure detection method and device |
CN104376323B | | A kind of method and device for determining target range |
CN107092852A | | Pressure detection method and device |
CN112132056A | | Living body detection method, system, equipment and medium |
CN111537075A | | Temperature extraction method, device, machine readable medium and equipment |
WO2021082636A1 | | Region of interest detection method and apparatus, readable storage medium and terminal device |
TWI603225B | | Viewing angle adjusting method and apparatus of liquid crystal display |
US10942575B2 | | 2D pointing indicator analysis |
CN112613357B | | Face measurement method, device, electronic equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200814 |