
CN112966721A - Blue light detection method and device - Google Patents

Blue light detection method and device

Info

Publication number
CN112966721A
Authority
CN
China
Prior art keywords
image
pixel
pixel value
target
target pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110165474.1A
Other languages
Chinese (zh)
Other versions
CN112966721B (en)
Inventor
郝英俊 (Hao Yingjun)
宋力帆 (Song Lifan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Best Color Technology Co ltd
Original Assignee
Beijing Best Color Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Best Color Technology Co., Ltd.
Priority to CN202110165474.1A
Publication of CN112966721A
Application granted
Publication of CN112966721B
Active legal status (current)
Anticipated expiration

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/001 Industrial image inspection using an image reference approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)

Abstract

The application relates to a blue light detection method. The method comprises the following steps: acquiring a first image, wherein the first image is obtained according to sensing data of a first image sensor, and the detection waveband of the first image sensor is a specific blue light waveband; acquiring a second image, wherein the second image is obtained according to sensing data of a second image sensor, and the detection waveband of the second image sensor is a visible light waveband; adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relationship between the first image and the second image and the pixel value of the pixel in the first image to obtain a third image; and transmitting the third image to the display unit for display. With this method, the blue light image and the visible light image of the environment are acquired separately and displayed in a fused manner, so that a user can intuitively observe whether blue light exists in the environment, as well as its position and intensity.

Description

Blue light detection method and device
Technical Field
The present disclosure relates to the field of blue light detection, and more particularly, to a method and an apparatus for blue light detection.
Background
According to existing research, irradiation by blue light can damage retinal cells, resulting in decreased or even lost vision. Short-wave blue light with a wavelength of 400-460 nanometers is the most harmful to the retina. Existing electronic display devices, LED lamps and the like all emit relatively strong blue light and can harm the human eye. However, in the prior art, measuring blue light only yields a numerical intensity reading; the position and strength of the blue light source cannot be observed intuitively.
Disclosure of Invention
In view of the above, it is necessary to provide a blue light detection method and apparatus.
A method for blue light detection, the method comprising: acquiring a first image, wherein the first image is obtained according to sensing data of a first image sensor, and a detection waveband of the first image sensor is a specific blue light waveband; acquiring a second image, wherein the second image is an image obtained according to sensing data of a second image sensor, and a detection waveband of the second image sensor is a visible light waveband; adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relation between the first image and the second image and the pixel value of the pixel in the first image to obtain a third image; the third image is transmitted to the display unit for display.
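To make the overall flow concrete, the following Python sketch strings the four steps together. It is only an illustration of the claimed sequence; every function it calls (read_blue_sensor, read_visible_sensor, fuse_images, show_image) is an assumed helper rather than anything named in this disclosure.

    def detect_blue_light(read_blue_sensor, read_visible_sensor, fuse_images, show_image):
        """Minimal sketch of the claimed sequence; all callables are assumed helpers."""
        # Acquire the first image: a grayscale intensity map from the blue-band sensor
        first_image = read_blue_sensor()
        # Acquire the second image: a color environment image from the visible-light sensor
        second_image = read_visible_sensor()
        # Adjust pixels of the second image using the first image to obtain the third image
        third_image = fuse_images(first_image, second_image)
        # Transmit the fused result to the display unit
        show_image(third_image)
        return third_image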
In one embodiment, adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relationship between the first image and the second image and the pixel value of the pixel in the first image includes: screening first target pixels from the first image according to a preset pixel threshold value; determining second target pixels corresponding to the first target pixels from the second image according to the pixel coordinate mapping relationship, wherein there are a plurality of first target pixels and the second target pixels correspond to the first target pixels one to one according to the coordinate mapping relationship; and adjusting the pixel value of each second target pixel to the pixel value corresponding to the first target pixel.
In one embodiment, the second image is a color image, and after adjusting the pixel value of each second target pixel to the pixel value corresponding to the first target pixel, the method further includes: and converting the pixel values of other pixels except the second target pixel in the second image into gray pixel values.
In one embodiment, the first image is a grayscale image, and adjusting the pixel value of each second target pixel to the pixel value corresponding to the first target pixel includes: and setting the R pixel value in the second target pixel as a specific pixel value smaller than the first preset pixel value, setting the B pixel value in the second target pixel as a specific pixel value larger than the second preset pixel value, and determining the G pixel value in the second target pixel according to the pixel value corresponding to the first target pixel.
In one embodiment, the number of gray levels of the first image and the second image is the same, and determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel comprises: setting the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel; or setting the difference between the highest pixel value among the gray levels of the second image and the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel.
In one embodiment, the number of gray levels of the first image is different from that of the second image, and determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel includes: converting the pixel value of the first target pixel into a first converted pixel value corresponding to the number of gray levels of the second image according to the gray-level ratio; and setting the first converted pixel value as the G pixel value of the second target pixel, or setting the difference between the highest pixel value among the gray levels of the second image and the first converted pixel value as the G pixel value of the second target pixel.
In one embodiment, the first image is a color image, and adjusting the pixel value of each second target pixel to the pixel value corresponding to the first target pixel includes: setting the pixel value of the second target pixel to the pixel value of the first target pixel when the number of gray levels of the first image and the second image is the same; or converting the pixel value of the first target pixel into a second converted pixel value according to the gray-level ratio and setting the pixel value of the second target pixel to the second converted pixel value when the numbers of gray levels of the first image and the second image are different.
A blue light detecting device, the device comprising: the first acquisition unit is used for acquiring a first image, the first image is an image obtained according to sensing data of the first image sensor, and a detection waveband of the first image sensor is a short-wave blue light waveband; the second acquisition unit is used for acquiring a second image, the second image is an image obtained according to the sensing data of the second image sensor, and the detection waveband of the second image sensor is a visible light waveband; the adjusting unit is used for adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relation between the first image and the second image and the pixel value of the pixel in the first image to obtain a third image; and the transmission unit is used for transmitting the third image to the display unit for displaying.
A computer device comprising a memory storing a computer program and a processor implementing the steps of any of the above blue light detection methods when the processor executes the computer program.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the blue light detection method of any one of the preceding claims.
According to the blue light detection method, the blue light image and the visible light image of the environment are acquired separately and displayed in a fused manner, so that a user can intuitively observe whether blue light exists in the environment, as well as its position and intensity.
Drawings
FIG. 1 is a flow diagram of a blue light detection method in one embodiment;
FIG. 2 is a block diagram of a blue light detection device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It is to be understood that the described embodiments are merely a few embodiments of the invention, and not all embodiments. The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
The relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description. Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate. In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
In one embodiment, as shown in fig. 1, the blue light detection method provided by the present application includes the following steps:
step S102, a first image is obtained according to sensing data of a first image sensor, and a detection waveband of the first image sensor is a specific blue light waveband.
It should be understood that the first image sensor is a CMOS or CCD image sensor covered with a filter layer so that only light in the specific blue band is sensed by the first image sensor. Optionally, the specific blue light band refers to one or more sub-bands with a wavelength in a band of 380-. References to "blue light" hereinafter in this document all refer to light in the above-specified blue wavelength band. Acquiring images in this band ensures that the first image sensor captures only the blue light band that is harmful to human eyes, avoiding interference from light in other bands with the blue light detection result. It should be understood that if a technical solution aims to achieve the above effects but, owing to limitations of the filter-layer process, some light outside the specified blue band is also sensed by the image sensor, that solution is substantially the same as the one described in this embodiment and falls within the scope of the present invention.
And step S104, acquiring a second image, wherein the second image is an image obtained according to the sensing data of the second image sensor, and the detection waveband of the second image sensor is a visible light waveband.
It should be understood that the second image sensor is a CMOS or CCD image sensor covered with a filter layer so that only light in the visible wavelength band is sensed by the second image sensor. Optionally, the visible light band refers to a band with a wavelength of 380-780 nm, or one or more sub-bands within 380-780 nm. Acquiring images in this band means that the second image sensor captures what the human eye can see in the current environment; that is, the image acquired by the second image sensor shows the current environment and serves as an environment reference image when the blue light detection result is subsequently displayed. It should be understood that if a technical solution aims to achieve the above effects but, owing to limitations of the filter-layer process, some light outside the visible band is also sensed by the image sensor, that solution is substantially the same as the one described in this embodiment and falls within the scope of the present invention.
And step S106, adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relation between the first image and the second image and the pixel value of the pixel in the first image to obtain a third image.
It should be understood that the above-mentioned pixel coordinate mapping relationship is determined according to the viewing angle difference between the first image sensor and the second image sensor, and the coordinate corresponding relationship between each pixel in the first image and the second image can be obtained through a calibration procedure.
Optionally, adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relationship between the first image and the second image and the pixel value of the pixel in the first image includes: screening first target pixels from the first image according to a preset pixel threshold value; determining second target pixels corresponding to the first target pixels from the second image according to the pixel coordinate mapping relationship, wherein there are a plurality of first target pixels and the second target pixels correspond to the first target pixels one to one according to the coordinate mapping relationship; and adjusting the pixel value of each second target pixel to the pixel value corresponding to the first target pixel.
It should be appreciated that, because the first image sensor detects only blue light, pixels in the first image that correspond to blue light have higher pixel values, while pixels that do not correspond to blue light may still have low, non-zero pixel values caused by interference. Screening the pixels of the first image with the preset pixel threshold therefore isolates the first target pixels that correspond to blue light.
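The following Python/NumPy sketch illustrates this screening-and-mapping step. It assumes the calibration procedure yields two integer arrays, map_rows and map_cols, giving for each pixel (r, c) of the first image the coordinates of the corresponding pixel in the second image; the threshold of 50 and the placeholder write of the blue-band value into all three channels are illustrative assumptions (the later embodiments refine the channel assignment).

    import numpy as np

    def adjust_second_image(first_image, second_image, map_rows, map_cols, pixel_threshold=50):
        """Sketch of step S106: threshold the blue-band image, map the hits, recolor them."""
        third_image = second_image.copy()
        # Screen the first target pixels: blue-band pixels above the preset threshold
        target_mask = first_image > pixel_threshold
        rows, cols = np.nonzero(target_mask)
        # Second target pixels: coordinates mapped one-to-one into the second image
        dst_rows, dst_cols = map_rows[rows, cols], map_cols[rows, cols]
        # Adjust each second target pixel according to the first target pixel's value
        third_image[dst_rows, dst_cols] = first_image[rows, cols][:, None]
        return third_image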
In the above embodiment, the content shown in the first image is the detection target of this scheme, namely the blue light image, while the second image is an image of the detected environment. Combining the information of the first image and the second image in the above manner yields the third image; because the pixel values of the first image represent the intensity of the detected blue light, the color of the pixels corresponding to blue light in the third image likewise represents that intensity. As a result, the user can intuitively observe from the displayed third image whether blue light harmful to human eyes exists in the detected environment, along with its intensity and emission position.
Further, since the first image is subject to noise during imaging, screening pixels by the pixel threshold alone may misclassify noise as first target pixels, because noise pixels can also have high values. To address this, the neighboring pixels of each first target pixel are examined: because blue light in real scenes is generally emitted by a lamp or a display device, pixels sensing blue light are not isolated, i.e., at least one neighboring pixel should also sense blue light. If a first target pixel is isolated, meaning the pixel values of all its neighboring pixels are below the preset pixel threshold, it can be discarded, thereby filtering noise pixels out of the first target pixels. Alternatively, noise can be identified using the frames preceding and following the first image: if the pixel value of a given first target pixel also exceeds the preset pixel threshold in a preset number of preceding and following frames, it is judged to be a genuine blue-light pixel; otherwise it is a noise pixel and can be filtered out.
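A sketch of these two noise filters follows, assuming the same boolean screening mask as above; the 8-neighbour kernel, the SciPy convolution, and the frame-count parameter are implementation assumptions, not details given in the disclosure.

    import numpy as np
    from scipy.ndimage import convolve

    def filter_noise_pixels(first_image, neighbor_frames, pixel_threshold=50, min_consistent_frames=2):
        """Drop isolated or temporally inconsistent first target pixels."""
        mask = first_image > pixel_threshold
        # (a) Spatial filter: discard target pixels whose eight neighbours are all below
        # the threshold, since real blue-light sources occupy contiguous regions.
        kernel = np.array([[1, 1, 1],
                           [1, 0, 1],
                           [1, 1, 1]])
        neighbour_hits = convolve(mask.astype(np.uint8), kernel, mode="constant")
        mask &= neighbour_hits > 0
        # (b) Temporal filter: keep a target pixel only if it also exceeds the threshold
        # in a preset number of preceding/following frames.
        if neighbor_frames:
            hits = sum((frame > pixel_threshold).astype(np.uint8) for frame in neighbor_frames)
            mask &= hits >= min_consistent_frames
        return mask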
Optionally, the second image is a color image, and after the pixel value of each second target pixel is adjusted to the pixel value corresponding to the first target pixel, the method further includes: and converting the pixel values of other pixels except the second target pixel in the second image into gray pixel values.
It should be understood that, since the final purpose of the third image is to highlight the region corresponding to blue light, leaving the rest of the image in color would make the displayed result confusing. Therefore, in the above scheme, the pixel values of the pixels other than the second target pixels in the color second image are converted into gray pixel values, so that everything in the finally generated third image except the region corresponding to blue light appears gray, avoiding a confusing display.
Optionally, the color space of the second image is an RGB color space, that is, the color displayed by the pixel is determined by three pixel values of red (R), green (G) and blue (B), and the conversion of the pixel values of the other pixels in the second image except the second target pixel into the gray pixel values may be implemented by:
(1) conversion equations, such as:
Gray=R*0.299+G*0.587+B*0.114
in the above formula, Gray is a Gray pixel value, R is an R pixel value, G is a G pixel value, and B is a B pixel value;
(2) average value method: the R, G and B pixel values are averaged to obtain the corresponding gray pixel value;
(3) the green pixel method, i.e., using the pixel value of the G pixel as the Gray pixel value.
After the gray pixel value is obtained in one of the above ways, converting a pixel of the second image (other than a second target pixel) to gray means setting all three of its R, G and B pixel values to that gray value; as a result, every pixel in the third image except those corresponding to blue light appears gray.
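The three conversion options can be sketched as follows; the channel order (R, G, B) and the method labels are assumptions of this example, not terms used by the disclosure.

    import numpy as np

    def gray_out_background(third_image, second_target_mask, method="weighted"):
        """Convert every pixel except the second target pixels to gray."""
        img = third_image.astype(np.float32)
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        if method == "weighted":      # (1) conversion equation
            gray = 0.299 * r + 0.587 * g + 0.114 * b
        elif method == "average":     # (2) average of the three channel values
            gray = (r + g + b) / 3.0
        else:                         # (3) green-pixel method
            gray = g
        background = ~second_target_mask
        gray_u8 = np.clip(gray, 0, 255).astype(np.uint8)
        # Write the gray value into all three channels of the background pixels
        third_image[background] = gray_u8[background][:, None]
        return third_image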
Optionally, when the first image is a grayscale image, adjusting the pixel value of each second target pixel to the pixel value corresponding to the first target pixel includes: setting the R pixel value of the second target pixel to a specific pixel value smaller than a first preset pixel value, setting the B pixel value of the second target pixel to a specific pixel value larger than a second preset pixel value, and determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel. The pixel value corresponding to the first target pixel refers to the gray value of the first target pixel.
It should be understood that, in the above scheme, the B pixel value of the second target pixel is set to a larger fixed value and the R pixel value to a smaller fixed value; when the G pixel value is then determined from the pixel value corresponding to the first target pixel, the color of the second target pixel varies from light blue to dark blue with the G pixel value. The region corresponding to blue light in the third image is therefore rendered in different shades of blue according to the blue light intensity, allowing the user to observe the blue light more intuitively. For example, in an 8-bit image, the R pixel value is set to 0 and the B pixel value to 255. The first preset pixel value may be set to one fifth of the maximum pixel value and the second preset pixel value to four fifths of the maximum pixel value, which ensures that the R pixel value is a relatively small value and the B pixel value a relatively large value.
Optionally, when the first image and the second image have the same number of gray levels, determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel includes: setting the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel; or setting the difference between the highest pixel value among the gray levels of the second image and the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel.
It should be understood that the number of gray levels refers to the number of intensity levels of a grayscale image, or to the number of levels per channel of an RGB image. For example, an 8-bit grayscale image has 256 gray levels, with values from 0 to 255; an 8-bit RGB image likewise has 256 levels, with R, G and B each taking values from 0 to 255.
It should be understood that when the first image and the second image have the same number of gray levels, the G pixel value of the second target pixel can be set directly to the pixel value of the first target pixel. Since the first target pixel is a gray pixel, a larger value means stronger blue light was detected at that pixel; setting the G pixel value of the second target pixel to that value therefore produces different shades of blue depending on the G value, so the shade of blue reflects the intensity of the detected blue light. Alternatively, because a larger G value makes the blue lighter, the difference between the highest pixel value among the gray levels of the second image and the pixel value corresponding to the first target pixel can be set as the G pixel value of the second target pixel. For example, if both images have 256 gray levels (8-bit), the R pixel value of the second target pixel may be set to 0 and the B pixel value to 255; the pixel value of the second target pixel is then (0,155,255) when the gray value of the first target pixel is 100, and (0,55,255) when the gray value is 200. Both (0,155,255) and (0,55,255) appear blue, but (0,55,255) is a darker blue, indicating a higher intensity of detected blue light.
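A sketch of this 8-bit embodiment, using the complement form (255 minus the blue-band gray value) for the G channel; dst_rows/dst_cols index the second target pixels and blue_values holds the corresponding first-target-pixel gray values, all names being assumptions of this example.

    def colorize_blue_pixels(third_image, dst_rows, dst_cols, blue_values):
        """R fixed low, B fixed high, G set so that stronger blue light reads as darker blue."""
        third_image[dst_rows, dst_cols, 0] = 0                  # R: small fixed value
        third_image[dst_rows, dst_cols, 1] = 255 - blue_values  # G: complement of the gray value
        third_image[dst_rows, dst_cols, 2] = 255                # B: large fixed value
        return third_image

    # A blue-band gray value of 100 yields (0, 155, 255); a value of 200 yields (0, 55, 255),
    # i.e. a darker blue marking stronger detected blue light, matching the example above.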
Optionally, when the number of gray levels of the first image differs from that of the second image, determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel includes: converting the pixel value of the first target pixel into a first converted pixel value corresponding to the number of gray levels of the second image according to the gray-level ratio; and setting the first converted pixel value as the G pixel value of the second target pixel, or setting the difference between the highest pixel value among the gray levels of the second image and the first converted pixel value as the G pixel value of the second target pixel.
It should be understood that if the number of gray levels of the first image differs from that of the second image, for example 256 levels (8-bit) for the first image and 512 levels (9-bit) for the second image, the gray-level ratio is 2; the gray value of the first target pixel is multiplied by 2 and the result is set as the G pixel value of the second target pixel, or the difference obtained by subtracting that result from 511 is set as the G pixel value of the second target pixel.
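The gray-level-ratio conversion in this example can be written as the following sketch; the level counts are the ones used in the text, and both G-value alternatives are returned.

    import numpy as np

    def convert_gray_levels(first_target_values, src_levels=256, dst_levels=512):
        """Scale first-image gray values to the second image's gray-level range."""
        ratio = dst_levels // src_levels                  # 512 / 256 = 2
        converted = first_target_values.astype(np.int32) * ratio
        g_direct = converted                              # set directly as the G pixel value
        g_inverted = (dst_levels - 1) - converted         # or its complement, e.g. 511 - value
        return g_direct, g_inverted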
Optionally, the first image is a color image, and adjusting the pixel value of each second target pixel to the pixel value corresponding to the first target pixel includes: setting the pixel value of the second target pixel to the pixel value of the first target pixel when the number of gray levels of the first image and the second image is the same; or converting the pixel value of the first target pixel into a second converted pixel value according to the gray-level ratio and setting the pixel value of the second target pixel to the second converted pixel value when the numbers of gray levels of the first image and the second image are different.
It should be understood that when the first image is a color image, the pixel value of the first target pixel is already the RGB pixel value corresponding to blue, and this value may be set directly as the pixel value of the second target pixel; when the first image and the second image have different numbers of gray levels, the pixel value of the second target pixel is set in combination with the gray-level ratio.
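A sketch of this color-first-image case; applying the gray-level ratio to all three channels is an assumption about how the conversion would be carried out.

    import numpy as np

    def copy_color_blue_pixels(third_image, dst_rows, dst_cols, first_target_rgb,
                               src_levels=256, dst_levels=256):
        """Copy (or rescale) the color first-target-pixel values into the second target pixels."""
        if src_levels == dst_levels:
            third_image[dst_rows, dst_cols] = first_target_rgb
        else:
            ratio = dst_levels / src_levels
            scaled = np.clip(first_target_rgb.astype(np.float32) * ratio, 0, dst_levels - 1)
            third_image[dst_rows, dst_cols] = scaled.astype(third_image.dtype)
        return third_image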
Step S108, the third image is transmitted to the display unit for display.
Alternatively, the display unit may be a display built into the same device in which the first image sensor and the second image sensor are disposed, or it may be an external display device connected to the device in which the first image sensor and the second image sensor are disposed.
In one embodiment, as shown in fig. 2, a blue light detection device includes: a first acquisition unit for acquiring a first image, wherein the first image is an image obtained according to sensing data of a first image sensor and the detection waveband of the first image sensor is a short-wave blue light waveband; a second acquisition unit for acquiring a second image, wherein the second image is an image obtained according to sensing data of a second image sensor and the detection waveband of the second image sensor is a visible light waveband; an adjusting unit for adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relationship between the first image and the second image and the pixel value of the pixel in the first image to obtain a third image; and a transmission unit for transmitting the third image to the display unit for display.
Optionally, the adjusting unit includes: a screening module for screening first target pixels from the first image according to a preset pixel threshold value; a first determining module for determining, from the second image according to the pixel coordinate mapping relationship, a plurality of second target pixels corresponding to the first target pixels, the second target pixels corresponding to the first target pixels one to one according to the coordinate mapping relationship; and an adjusting module for adjusting the pixel value of the second target pixel to the pixel value corresponding to the first target pixel.
Optionally, the adjusting unit further includes: and the conversion module is used for converting the pixel values of other pixels except the second target pixel in the second image into gray pixel values, wherein the second image is a color image.
Optionally, the adjusting unit further includes: and the second determining module is used for setting the R pixel value in the second target pixel to be a specific pixel value smaller than the first preset pixel value, setting the B pixel value in the second target pixel to be a specific pixel value larger than the second preset pixel value, and determining the G pixel value in the second target pixel according to the pixel value corresponding to the first target pixel.
Optionally, the second determining module includes: a first setting sub-module for setting the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel; or a second setting sub-module for setting the difference between the highest pixel value among the gray levels of the second image and the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel, wherein the number of gray levels of the first image is the same as that of the second image.
Optionally, the second determining module includes: a first conversion sub-module for converting the pixel value of the first target pixel into a first converted pixel value corresponding to the number of gray levels of the second image according to the gray-level ratio; and a third setting sub-module for setting the first converted pixel value as the G pixel value of the second target pixel, or a fourth setting sub-module for setting the difference between the highest pixel value among the gray levels of the second image and the first converted pixel value as the G pixel value of the second target pixel, wherein the number of gray levels of the first image is different from that of the second image.
Optionally, the adjusting unit further includes: a first setting module for setting the pixel value of the second target pixel to the pixel value of the first target pixel, wherein the number of gray levels of the first image is the same as that of the second image; or a second setting module for converting the pixel value of the first target pixel into a second converted pixel value according to the gray-level ratio and setting the pixel value of the second target pixel to the second converted pixel value, wherein the number of gray levels of the first image is different from that of the second image.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
acquiring a first image, wherein the first image is obtained according to sensing data of a first image sensor, and a detection waveband of the first image sensor is a specific blue light waveband; acquiring a second image, wherein the second image is an image obtained according to sensing data of a second image sensor, and a detection waveband of the second image sensor is a visible light waveband; adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relation between the first image and the second image and the pixel value of the pixel in the first image to obtain a third image; the third image is transmitted to the display unit for display.
In one embodiment, the processor, when executing the computer program, further performs the steps of: adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relationship between the first image and the second image and the pixel value of the pixel in the first image, comprising: screening first target pixels from the first image according to a preset pixel threshold value; determining second target pixels corresponding to the first target pixels from the second image according to the pixel coordinate mapping relationship, wherein there are a plurality of first target pixels and the second target pixels correspond to the first target pixels one to one according to the coordinate mapping relationship; and adjusting the pixel value of each second target pixel to the pixel value corresponding to the first target pixel.
In one embodiment, the processor, when executing the computer program, further performs the steps of: the second image is a color image, and after the pixel value of each second target pixel is adjusted to the pixel value corresponding to the first target pixel, the method further includes: and converting the pixel values of other pixels except the second target pixel in the second image into gray pixel values.
In one embodiment, the processor, when executing the computer program, further performs the steps of: the first image is a gray image, and the adjusting of the pixel value of each second target pixel to the pixel value corresponding to the first target pixel comprises: and setting the R pixel value in the second target pixel as a specific pixel value smaller than the first preset pixel value, setting the B pixel value in the second target pixel as a specific pixel value larger than the second preset pixel value, and determining the G pixel value in the second target pixel according to the pixel value corresponding to the first target pixel.
In one embodiment, the processor, when executing the computer program, further performs the steps of: the number of gray levels of the first image is the same as that of the second image, and determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel includes: setting the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel; or setting the difference between the highest pixel value among the gray levels of the second image and the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel.
In one embodiment, the processor, when executing the computer program, further performs the steps of: the number of gray levels of the first image is different from that of the second image, and determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel includes: converting the pixel value of the first target pixel into a first converted pixel value corresponding to the number of gray levels of the second image according to the gray-level ratio; and setting the first converted pixel value as the G pixel value of the second target pixel, or setting the difference between the highest pixel value among the gray levels of the second image and the first converted pixel value as the G pixel value of the second target pixel.
In one embodiment, the processor, when executing the computer program, further performs the steps of: the first image is a color image, and adjusting the pixel value of each second target pixel to the pixel value corresponding to the first target pixel includes: setting the pixel value of the second target pixel to the pixel value of the first target pixel when the number of gray levels of the first image and the second image is the same; or converting the pixel value of the first target pixel into a second converted pixel value according to the gray-level ratio and setting the pixel value of the second target pixel to the second converted pixel value when the numbers of gray levels of the first image and the second image are different.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring a first image, wherein the first image is obtained according to sensing data of a first image sensor, and a detection waveband of the first image sensor is a specific blue light waveband; acquiring a second image, wherein the second image is an image obtained according to sensing data of a second image sensor, and a detection waveband of the second image sensor is a visible light waveband; adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relation between the first image and the second image and the pixel value of the pixel in the first image to obtain a third image; the third image is transmitted to the display unit for display.
In one embodiment, the computer program when executed by the processor further performs the steps of: adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relationship between the first image and the second image and the pixel value of the pixel in the first image, comprising: screening first target pixels from the first image according to a preset pixel threshold value; determining second target pixels corresponding to the first target pixels from the second image according to the pixel coordinate mapping relationship, wherein there are a plurality of first target pixels and the second target pixels correspond to the first target pixels one to one according to the coordinate mapping relationship; and adjusting the pixel value of each second target pixel to the pixel value corresponding to the first target pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of: the second image is a color image, and after the pixel value of each second target pixel is adjusted to the pixel value corresponding to the first target pixel, the method further includes: and converting the pixel values of other pixels except the second target pixel in the second image into gray pixel values.
In one embodiment, the computer program when executed by the processor further performs the steps of: the first image is a gray image, and the adjusting of the pixel value of each second target pixel to the pixel value corresponding to the first target pixel comprises: and setting the R pixel value in the second target pixel as a specific pixel value smaller than the first preset pixel value, setting the B pixel value in the second target pixel as a specific pixel value larger than the second preset pixel value, and determining the G pixel value in the second target pixel according to the pixel value corresponding to the first target pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of: the number of gray levels of the first image is the same as that of the second image, and determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel includes: setting the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel; or setting the difference between the highest pixel value among the gray levels of the second image and the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of: the number of gray levels of the first image is different from that of the second image, and determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel includes: converting the pixel value of the first target pixel into a first converted pixel value corresponding to the number of gray levels of the second image according to the gray-level ratio; and setting the first converted pixel value as the G pixel value of the second target pixel, or setting the difference between the highest pixel value among the gray levels of the second image and the first converted pixel value as the G pixel value of the second target pixel.
In one embodiment, the computer program when executed by the processor further performs the steps of: the first image is a color image, and adjusting the pixel value of each second target pixel to the pixel value corresponding to the first target pixel includes: setting the pixel value of the second target pixel to the pixel value of the first target pixel when the number of gray levels of the first image and the second image is the same; or converting the pixel value of the first target pixel into a second converted pixel value according to the gray-level ratio and setting the pixel value of the second target pixel to the second converted pixel value when the numbers of gray levels of the first image and the second image are different.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the above method embodiments. Any reference to memory, storage, database or other medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM), among others.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A method of blue light detection, the method comprising:
acquiring a first image, wherein the first image is obtained according to sensing data of a first image sensor, and a detection waveband of the first image sensor is a specific blue light waveband;
acquiring a second image, wherein the second image is an image obtained according to sensing data of a second image sensor, and a detection waveband of the second image sensor is a visible light waveband;
adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relation between the first image and the second image and the pixel value of the pixel in the first image to obtain a third image;
transmitting the third image to a display unit for display.
2. The method of claim 1, wherein the adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relationship between the first image and the second image and the pixel value of the pixel in the first image comprises:
screening a first target pixel from the first image according to a preset pixel threshold value;
determining a plurality of second target pixels corresponding to the first target pixels from the second image according to the pixel coordinate mapping relationship, wherein the second target pixels correspond to the first target pixels one to one according to the coordinate mapping relationship;
adjusting a pixel value of the second target pixel to a pixel value corresponding to the first target pixel.
3. The method of claim 2, wherein the second image is a color image, and wherein after adjusting the pixel value of each of the second target pixels to the pixel value corresponding to the first target pixel, the method further comprises: and converting pixel values of other pixels except the second target pixel in the second image into gray pixel values.
4. The method of claim 3, wherein the first image is a grayscale image, and wherein the adjusting the pixel value of each of the second target pixels to the pixel value corresponding to the first target pixel comprises:
setting the R pixel value of the second target pixel to be a specific pixel value smaller than a first preset pixel value, setting the B pixel value of the second target pixel to be a specific pixel value larger than a second preset pixel value, and determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel.
5. The method of claim 4, wherein the first image and the second image have the same number of gray levels, and wherein determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel comprises:
setting the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel;
or,
setting a difference between a highest pixel value in the number of gray levels of the second image and the pixel value corresponding to the first target pixel as the G pixel value of the second target pixel.
6. The method of claim 4, wherein the first image and the second image have different gray scale levels, and wherein determining the G pixel value of the second target pixel according to the pixel value corresponding to the first target pixel comprises:
converting the pixel value of the first target pixel into a first converted pixel value corresponding to the number of gray levels of the second image according to a gray-level ratio;
setting the first converted pixel value as the G pixel value of the second target pixel,
or,
setting a difference between the highest pixel value in the number of gray levels of the second image and the first converted pixel value as the G pixel value of the second target pixel.
7. The method of claim 3, wherein the first image is a color image, and wherein the adjusting the pixel value of each of the second target pixels to the pixel value corresponding to the first target pixel comprises:
setting a pixel value of the second target pixel to a pixel value of the first target pixel, wherein the number of gray levels of the first image and the second image is the same,
or,
converting the pixel value of the first target pixel into a second converted pixel value according to the gray-level ratio, and setting the pixel value of the second target pixel as the second converted pixel value, wherein the numbers of gray levels of the first image and the second image are different.
8. A blue light detection device, the device comprising:
the device comprises a first acquisition unit, a second acquisition unit and a third acquisition unit, wherein the first acquisition unit is used for acquiring a first image, the first image is obtained according to sensing data of a first image sensor, and a detection waveband of the first image sensor is a short-wave blue light waveband;
the second acquisition unit is used for acquiring a second image, wherein the second image is an image obtained according to sensing data of a second image sensor, and a detection waveband of the second image sensor is a visible light waveband;
the adjusting unit is used for adjusting the pixel value of the pixel in the second image according to the pixel coordinate mapping relation between the first image and the second image and the pixel value of the pixel in the first image to obtain a third image;
and the transmission unit is used for transmitting the third image to a display unit for displaying.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202110165474.1A 2021-02-06 2021-02-06 Blue light detection method and device Active CN112966721B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110165474.1A CN112966721B (en) 2021-02-06 2021-02-06 Blue light detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110165474.1A CN112966721B (en) 2021-02-06 2021-02-06 Blue light detection method and device

Publications (2)

Publication Number Publication Date
CN112966721A true CN112966721A (en) 2021-06-15
CN112966721B CN112966721B (en) 2021-11-09

Family

ID=76274883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110165474.1A Active CN112966721B (en) 2021-02-06 2021-02-06 Blue light detection method and device

Country Status (1)

Country Link
CN (1) CN112966721B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108701227A (en) * 2016-03-07 2018-10-23 奇跃公司 Blue light for bio-safety is adjusted
CN206670794U (en) * 2016-09-28 2017-11-24 普天智能照明研究院有限公司 Illumination instrument with blue light detection
CN110163804A (en) * 2018-06-05 2019-08-23 腾讯科技(深圳)有限公司 Image defogging method, device, computer equipment and storage medium
CN110677597A (en) * 2018-07-03 2020-01-10 杭州海康威视数字技术股份有限公司 Image processing method and device
CN110361165A (en) * 2019-06-22 2019-10-22 四川奥希特电子材料有限公司 A kind of equipment for the manufacturing method and its manufacture testing and analyzing equipment for liquid crystal display blue light components

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CLAUDIO PIEMONTE: "A new Silicon Photomultiplier structure for blue light detection", 《NUCLEAR INSTRUMENTS AND METHODS IN PHYSICS RESEARCH》 *
ZHANG Yanxian et al.: "Research on the application of a computer vision system in radiotherapy setup", 《CHINA MEDICAL EQUIPMENT》 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023142916A1 (en) * 2022-01-29 2023-08-03 京东方科技集团股份有限公司 Image processing method, apparatus and device, and storage medium

Also Published As

Publication number Publication date
CN112966721B (en) 2021-11-09

Similar Documents

Publication Publication Date Title
US10231600B2 (en) Image processing apparatus
CN104717432B (en) Handle method, image processing equipment and the digital camera of one group of input picture
US7747071B2 (en) Detecting and correcting peteye
JP6553624B2 (en) Measurement equipment and system
US20130089256A1 (en) Color image processing method, color image processing device, and color image processing program
JP6290392B2 (en) Conversion of images from dual-band sensors into visible color images
EP3306915A1 (en) Method and apparatus for controlling image data
US12118753B2 (en) System and method for color mapping for improved viewing by a color vision deficient observer
JP4421438B2 (en) Image color balance correction system and method
JP2003520515A (en) Color signal processing
CN110458787B (en) Image fusion method and device and computer storage medium
JP5740147B2 (en) Light source estimation apparatus and light source estimation method
CA3153067A1 (en) Picture-detecting method and apparatus
KR101504564B1 (en) Method of processing a relative illumination phenomenon on a digital image and associated processing system
CN112966721B (en) Blue light detection method and device
AU2018293302A1 (en) Method for filter selection
JP2005167984A (en) System and method for calculating presence of spontaneous light-emitting element in image
CN113358217B (en) Ultraviolet intensity detection method and device
JP7507554B2 (en) IMAGE PROCESSING APPARATUS, IMAGING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM
Wang et al. Retinex-based color correction for displaying high dynamic range images
JP2017033356A (en) Image processor, image processing method, and image processing program
JP5351438B2 (en) Display control device
CN109788217B (en) Method and apparatus for compensating defective pixels
CN113615148B (en) Camera color image processing
JP5520135B2 (en) Image processing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant