CN113974546A - Pterygium detection method and mobile terminal
- Publication number
- CN113974546A (application CN202010731055.5A)
- Authority
- CN
- China
- Prior art keywords
- region
- pterygium
- eye
- contour
- iris
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0008—Apparatus for testing the eyes provided with illuminating means
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/11—Objective types for measuring interpupillary distance or diameter of pupils
- A61B3/111—Objective types for measuring interpupillary distance
- A61B3/14—Arrangements specially adapted for eye photography
Abstract
The invention discloses a pterygium detection method suitable for execution on a mobile terminal, comprising the following steps: acquiring an eye image sequence of a user to be examined, and selecting a qualified target image from the sequence, wherein the target image comprises, from inside to outside, a pupil region, an iris region, a sclera region, and eyelid and eye-corner regions; detecting the pupil contour, the iris contour and the eyelid contour of the target image, and marking the invasion regions of the pterygium according to the detection results; acquiring a measured or fitted contour line for each invasion region, and determining one or more seed points within the invasion region; starting from the seed points, obtaining the pterygium of each invasion region with a region-growing algorithm that stops growing when it reaches the contour line of the region; and calculating a probability value for the presence of pterygium in the target image from the region-growing result of each region. The invention also discloses a mobile terminal for executing the method.
Description
Technical Field
The present invention relates to the field of AI (artificial intelligence) medicine, and in particular to a pterygium detection method and a mobile terminal.
Background
Pterygium is a common and frequently occurring ophthalmic disease. It usually appears on the nasal side as a triangular, neoplastic growth of tissue on the bulbar conjunctiva and cornea within the palpebral fissure. A pterygium covering the cornea can cause astigmatism that worsens over time as it invades the cornea; it may even cover the pupillary region and severely impair vision.
A pterygium may be in a resting stage or a progressive stage. A resting pterygium does not grow for many years: its head is flat, its body is not congested, and its surface is smooth and membranous; a pterygium in this stage can simply be observed without treatment. A progressive pterygium grows and enlarges: its head is raised, its body is thick, its surface is uneven, and it contains large, dilated blood vessels that make it look red due to hyperemia; a progressive pterygium may require surgical treatment.
The cause of pterygium is not yet known, but it is considered to be closely related to long-term exposure to ultraviolet light, and may also be related to long-term chronic irritation such as wind-blown dust, chemical irritants, and smoke. Measures such as extensive regular screening, early warning, and disease-condition evaluation in susceptible populations therefore facilitate early detection and early treatment. The traditional approach requires an in-person diagnosis by a doctor on site, which requires the patient to travel and is inconvenient. Even when eye images are taken by the patient, most prior-art methods identify pterygium only within the iris region based on a fitted iris circle, whereas a pterygium can spread across several regions of the eye, so identifying it from the iris alone is inaccurate.
Disclosure of Invention
To this end, the present invention provides a pterygium detection method and a mobile terminal in an attempt to solve, or at least alleviate, the problems identified above.
According to one aspect of the present invention, there is provided a pterygium detection method adapted to be executed in a mobile terminal, the method comprising the steps of: acquiring an eye image sequence of a user to be examined, and selecting a qualified target image from the eye image sequence, wherein the target image comprises, from inside to outside, a pupil region, an iris region, a sclera region, and eyelid and eye-corner regions; detecting the pupil contour, the iris contour and the eyelid contour of the target image, and determining the invasion regions of the pterygium according to the detection results; acquiring a measured contour line or a fitted contour line of each invasion region, and determining one or more seed points within the invasion region; starting from the seed points, obtaining the pterygium of each invasion region with a region-growing algorithm that stops growing when it reaches the contour line of the region; and calculating the probability value of a pterygium appearing in the target image according to the region-growing result of each region.
Optionally, the pterygium detection method according to the present invention further comprises the steps of: starting from a seed point of an invasion region, identifying the complete pterygium containing that seed point with a region-growing algorithm; and merging the complete pterygia identified from all invasion regions into the total pterygium region of the target image.
Optionally, in the pterygium detection method according to the present invention, a seed point is any one of: a user-specified pterygium point, a random point within each region, or a feature point identified from prior knowledge of pterygium.
Optionally, in the pterygium detection method according to the present invention, the region-growing algorithm performs recognition based on pixel color and pixel brightness, and the region-growing result includes the area ratio of the pterygium within each invasion region.
Optionally, in the pterygium detection method according to the present invention, the probability value is calculated from the brightness and color differences between the pterygium and the pupil, iris and sclera regions, together with prior knowledge of the brightness and color of pterygium.
Optionally, the pterygium detection method according to the present invention further comprises the step of: uploading the target image, the area ratios, the probability value and the user information to a server for storage.
Optionally, in the pterygium detection method according to the present invention, the detection result includes any one of: pupil contour integrity, left-right pupil difference, iris contour integrity, left-right iris difference, left-right sclera difference, and color and brightness differences from a normal eye.
Optionally, in the pterygium detection method according to the present invention, the mobile terminal has a pterygium detection application with an eye-position prompt box and a photographing button, and the step of acquiring the eye image sequence of the user to be examined comprises: under the guidance of the eye-position prompt box, invoking the camera lens assembly of the mobile terminal through the application's photographing button to acquire a plurality of eye images of the user, wherein the eye images contain no scene-reflection light spots.
Optionally, in the pterygium detection method according to the present invention, the eye-position prompt box has an eye-contour prompt line and/or eye-feature prompt points; the eye-contour prompt line comprises an eyelid contour line and/or an iris contour line; and the eye-feature prompt points comprise a left eye-corner anchor point and/or a right eye-corner anchor point.
Optionally, in the pterygium detection method according to the present invention, the mobile terminal has an LED fill light and/or a macro lens for acquiring the eye image sequence.
Optionally, in the pterygium detection method according to the present invention, the step of selecting a qualified target image comprises: selecting an image whose focus, motion blur and eye angle are all qualified as the target image, wherein focus, motion blur and eye angle are calculated from the camera parameters of the mobile terminal, the physiological parameters of the eye, and the parameters of the eye image sequence.
Optionally, in the pterygium detection method according to the present invention, the camera parameters include focal length, optical center and distortion parameters; the physiological parameters of the eye include interpupillary distance, iris diameter, and the relative positions of the pupil and the iris; and the eye-image-sequence parameters include, for each image in the sequence, its frequency spectrum, interpupillary distance, iris diameter, relative positions of the pupil and iris, and the eccentricities of the pupil and iris contours.
Optionally, in the pterygium detection method according to the present invention, the pupil contour and the iris contour are detected using a Hough circle algorithm, and the eyelid contour and the eye corners are detected using a facial feature point algorithm.
According to another aspect of the present invention, there is provided a mobile terminal including: one or more processors; a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for performing the pterygium detection method as described above.
According to still another aspect of the present invention, there is provided a readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a mobile terminal, cause the mobile terminal to perform the pterygium detection method as described above.
According to the technical solution of the present invention, the mobile terminal acquires eye images of the user, and an on-device program identifies pterygium in those images, enabling fast screening, detection and evaluation of pterygium and thus remote pterygium detection. The invention can be deployed as a mobile application, is easy to popularize and apply, and helps detect the condition early so that treatment can begin early. Further, the invention can capture eye images using the terminal's rear camera together with a light-attenuated fill light, detect pterygium in both the progressive and resting stages, and compute a severity result. The application then uploads the detection result to a server to establish a patient file, assisting doctors with remote diagnosis, prevention and treatment.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which are indicative of various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description read in conjunction with the accompanying drawings. Throughout this disclosure, like reference numerals generally refer to like parts or elements.
FIG. 1 illustrates a schematic diagram of a mobile terminal 100 according to one embodiment of the present invention;
FIG. 2 illustrates a flow diagram of a pterygium detection method 200 according to one embodiment of the invention;
FIG. 3 illustrates a schematic diagram of various complex situations in capturing an eye image, according to one embodiment of the invention;
fig. 4 and 5 are schematic diagrams respectively showing a target image and a pterygium segmentation result in the target image according to an embodiment of the present invention;
fig. 6 and 7 respectively show an interface diagram of an application detecting pterygium according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Fig. 1 illustrates a block diagram of a mobile terminal 100 according to an embodiment of the present invention. The mobile terminal 100 may include a memory interface 102, one or more data processors, image processors and/or central processing units 104, and a peripheral interface 106.
The memory interface 102, the one or more processors 104, and/or the peripherals interface 106 can be discrete components or can be integrated in one or more integrated circuits. In the mobile terminal 100, the various elements may be coupled by one or more communication buses or signal lines. Sensors, devices, and subsystems can be coupled to peripheral interface 106 to facilitate a variety of functions.
For example, a motion sensor 110, a light sensor 112, and a distance sensor 114 may be coupled to the peripheral interface 106 to facilitate directional, lighting, and ranging functions. Other sensors 116 may also be coupled to the peripheral interface 106, such as a positioning system (e.g., a GPS receiver), an acceleration sensor, a temperature sensor, a biometric sensor, or other sensing device, to facilitate related functions.
The camera subsystem 120 and an optical sensor 122, which may be, for example, a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) optical sensor, may be used to facilitate camera functions such as capturing photographs and video clips. Communication functions may be facilitated by one or more wireless communication subsystems 124, which may include radio-frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The particular design and implementation of the wireless communication subsystem 124 may depend on the communication network(s) supported by the mobile terminal 100. For example, the mobile terminal 100 may include a communication subsystem 124 designed to support LTE, 3G, GSM, GPRS, EDGE, Wi-Fi or WiMax, and Bluetooth™ networks.
The audio subsystem 126 may be coupled to a speaker 128 and a microphone 130 to facilitate voice-enabled functions such as voice recognition, voice reproduction, digital recording, and telephony. The I/O subsystem 140 may include a touch screen controller 142 and/or one or more other input controllers 144. The touch screen controller 142 may be coupled to a touch screen 146. For example, the touch screen 146 and touch screen controller 142 may detect contact and movement or pauses made therewith using any of a variety of touch-sensing technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies.
One or more other input controllers 144 may be coupled to other input/control devices 148 such as one or more buttons, rocker switches, thumbwheels, infrared ports, USB ports, and/or pointing devices such as styluses. The one or more buttons (not shown) may include up/down buttons for controlling the volume of the speaker 128 and/or microphone 130.
The memory interface 102 may be coupled with a memory 150. The memory 150 may include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). The memory 150 may store an operating system 152, such as Android, iOS or Windows Phone. The operating system 152 may include instructions for handling basic system services and performing hardware-dependent tasks. The memory 150 may also store applications 154 (i.e., application programs, hereinafter referred to as applications). While the mobile terminal is running, the operating system 152 is loaded from the memory 150 and executed by the processor 104. An application 154 is likewise loaded from the memory 150 and executed by the processor 104 at runtime. Applications run on top of the operating system and use the interfaces provided by the operating system and the underlying hardware to implement various functions desired by the user, such as instant messaging, web browsing, picture management and video playback. An application 154 may be provided independently of the operating system or bundled with it; examples include social software such as QQ, WeChat and Weibo, video-streaming and game software, and system applications such as a photo album, a calculator, and a voice recorder. In addition, a driver module may be added to the operating system when an application 154 is installed in the mobile terminal 100.
The program for performing the pterygium detection method 200 provided by the embodiment of the present invention is one of the applications 154. In some embodiments, the mobile terminal 100, which may be an iOS terminal, is configured to perform the pterygium detection method 200 according to the present invention.
Fig. 2 illustrates a flow diagram of a pterygium detection method 200 according to one embodiment of the present invention, suitable for execution in a mobile terminal, such as the mobile terminal 100, for detecting a pterygium in an eye image of a user.
As shown in fig. 2, the method begins at step S210. In step S210, an eye image sequence of the user to be examined is acquired, and a qualified target image is selected from the eye image sequence, wherein the target image comprises, from inside to outside, a pupil region, an iris region, a sclera region, and eyelid and eye-corner regions.
According to one embodiment of the present invention, the mobile terminal has a pterygium detection application with an eye-position prompt box and a photographing button. The eye-position prompt box has an eye-contour prompt line and/or eye-feature prompt points; the eye-contour prompt line comprises an eyelid contour line and/or an iris contour line; the eye-feature prompt points comprise a left eye-corner anchor point and/or a right eye-corner anchor point. The photographing button can invoke a camera lens assembly, such as the rear camera. In addition, the mobile terminal is provided with an LED fill light and/or a macro lens for collecting the eye image sequence. A self-powered, LED-illuminated macro lens can be clipped onto the rear camera of the terminal to obtain eye images at higher resolution. The photographing button can then turn on the fill light and the rear camera or macro lens to acquire eye images of the person under examination, which are displayed as a preview on the terminal screen.
Based on this, the step of acquiring the eye image sequence of the user comprises: under the guidance of the eye-position prompt box, invoking the camera lens assembly of the mobile terminal through the application's photographing button to acquire a plurality of eye images of the user, wherein the eye images contain no scene-reflection light spots. The camera captures images continuously at a fixed frame rate, and all images captured within a period of time are stored in memory or on a memory card to obtain the eye image sequence.
Eye images can be acquired separately for the left eye and the right eye, with the person under examination photographing the corresponding eye following the left/right text prompts on the terminal screen. At each shot, the eye prompt lines or prompt points are displayed inside a rectangular frame on the screen, so the user can take the photograph once the eye is aligned.
Moreover, a user may face complex situations when capturing an eye image: as shown in fig. 3, the six cases a-f from top to bottom are a frontal view, an oblique view, one iris and pupil partially occluded, both irises partially occluded, eyelids held open, and reflection of the external scene. The invention therefore provides several eye-positioning methods, specifically automatic eye positioning, semi-automatic eye positioning, and semi-automatic eye feature-point positioning. Automatic eye positioning is based on the traditional Adaboost algorithm or a deep-learning-based SSD algorithm. The eye feature points for automatic eye localization are LBP features or features obtained by training a convolutional neural network. In semi-automatic eye positioning, easily understood cues such as an eye positioning frame are provided with manual assistance, and the eye corners, several points on the eyelids and the like are marked by hand. During acquisition, the eyes should fall inside the on-screen prompt box as far as possible. When automatic eye positioning fails, the target image cropped from the prompt box can be used in the manual marking process of subsequent semi-automatic detection.
Considering that a diseased eye may interfere with automatic detection, eye positioning and eyelid-boundary detection can be aided by having the user blink, and the eye position can be marked and tracked manually where necessary. For complex cases in which elderly users have lax eyelids or a small palpebral aperture, so that the eyelids must be held open for examination, the target area can be located and tracked with manually marked feature points.
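By way of illustration, the automatic eye-positioning step could be sketched as follows using OpenCV's stock Adaboost (Haar) cascade; the cascade file and the detection thresholds are illustrative assumptions, since the disclosure only names the algorithm family:

```python
import cv2

def locate_eyes(image_bgr):
    """Automatic eye positioning; an empty result triggers the
    semi-automatic (manual marking) fallback described above."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # OpenCV ships a pre-trained Adaboost (Haar) cascade for eyes
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")
    # Returns (x, y, w, h) boxes; parameters here are assumed values
    eyes = cascade.detectMultiScale(gray, scaleFactor=1.1,
                                    minNeighbors=5, minSize=(40, 40))
    return list(eyes)
```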
In addition, when acquiring eye images, interference from external light should be avoided: work indoors and avoid direct sunlight. During acquisition, the mobile terminal and the eyes should be kept as steady as possible relative to each other so that acquisition and detection finish quickly. The eye area should be illuminated uniformly with diffuse visible light; the light source then appears as a single reflection spot on the eye image. Reflected images of indoor scenes such as windows, screens and lamps forming on the cornea should also be avoided. An ideal eye image contains, at most, only the light spot of the fill light.
Furthermore, a frosted diffusion filter or a similar substitute can be added to light sources such as the terminal's LED fill light or a slit lamp to reduce irritation to the eyes. The white-light LED spectrum of a mobile-terminal camera flash generally consists of blue light in the 400-460 nm range and an excited phosphor band in the 460-610 nm range. For visible light in the 305-700 nm band, the weighted retinal irradiance should be less than 220 μW/cm². When a slit lamp is used for auxiliary lighting, it should be operated from its dimmest setting.
It should be understood that when photographing the eyes with a terminal there may be a back-and-forth focusing process as well as hand shake, causing defocus blur and motion blur. Moreover, to observe pterygium at the eye corner, the person under examination needs to rotate the eyeball. Therefore, the acquired image sequence must be screened to obtain the target image.
According to another embodiment of the present invention, the step of selecting a qualified target image comprises: selecting an image whose focus, motion blur and eye angle are all qualified as the target image, wherein focus, motion blur and eye angle are calculated from the camera parameters of the mobile terminal, the physiological parameters of the eye, and the parameters of the eye image sequence.
Specifically, the camera parameters include focal length, optical center and distortion parameters; the physiological parameters of the eye include interpupillary distance, iris diameter, and the relative positions of the pupil and the iris; the eye-image-sequence parameters include, for each image in the sequence, its frequency spectrum, interpupillary distance, iris diameter, relative positions of the pupil and iris, and the eccentricities of the pupil and iris contours. Indicators for selecting a qualified image include that the spectrum of a sharp image contains more high-frequency components and that, at a suitable viewing angle, the pupil and iris contours are close to concentric circles. Once the selected eye image is confirmed as qualified, pterygium detection continues.
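A minimal sketch of this screening step is given below, assuming grayscale frames and already-detected pupil and iris centers; the spectral cutoff and the tolerances are assumed values, not taken from the disclosure:

```python
import numpy as np

def high_freq_ratio(gray, cutoff=0.25):
    """Fraction of spectral energy outside a central low-frequency disc;
    sharp, unblurred frames score higher."""
    f = np.fft.fftshift(np.fft.fft2(gray.astype(np.float32)))
    power = np.abs(f) ** 2
    h, w = gray.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2.0, xx - w / 2.0)
    low = power[r < cutoff * min(h, w) / 2.0].sum()
    return 1.0 - low / power.sum()

def is_qualified(gray, pupil_center, iris_center, hf_min=0.05, center_tol=5.0):
    """Keep frames that are sharp and whose pupil and iris contours are
    near-concentric, i.e. a roughly frontal viewing angle."""
    concentric = np.hypot(pupil_center[0] - iris_center[0],
                          pupil_center[1] - iris_center[1]) < center_tol
    return high_freq_ratio(gray) > hf_min and concentric
```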
Subsequently, in step S220, the pupil contour, the iris contour and the eyelid contour of the target image are detected, and the invasion regions of the pterygium are determined according to the detection results.
Here, the detection result includes at least one of the following: pupil contour integrity, left-right pupil difference, iris contour integrity, left-right iris difference, left-right sclera difference, and color and brightness differences from a normal eye. Since the actual pupil contour may not be a complete circle, pupil contour integrity can be defined as the ratio of the actual pupil contour length to the perimeter of the fitted circle (length_contour_pupil_left / perimeter_circle_pupil_left). The left-right iris difference may include the relative difference in iris diameter (abs(d_iris_left - d_iris_right) / d_iris_right), differences between iris-region luminance histograms, and so on. The invention can determine the invasion regions of the pterygium from these combined results.
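Written out as code, the two metrics defined in the preceding paragraph are, for example:

```python
import math

def pupil_contour_integrity(contour_length_px, fitted_radius_px):
    # length_contour_pupil_left / perimeter_circle_pupil_left
    return contour_length_px / (2.0 * math.pi * fitted_radius_px)

def iris_diameter_difference(d_iris_left, d_iris_right):
    # abs(d_iris_left - d_iris_right) / d_iris_right
    return abs(d_iris_left - d_iris_right) / d_iris_right
```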
According to one embodiment, in step S220 the invention detects whether the pupil contour, the iris contour and the eyelid contour of the target image are complete, and marks the one or more regions outside the complete contours as invasion regions; accordingly, the one or more regions within the complete contours are each marked as non-invaded regions.
In an eye image, a pterygium usually appears as an irregular shape without a clear outline. Depending on its degree of progression, it may cover, in order, the sclera, the iris and the pupil, starting from the eye corner. The segmentation algorithm attempts to detect objects such as the pupil contour, iris contour, eyelid contour, eye corners and sclera region, and then analyzes the likelihood that a pterygium is present in the sclera, iris and pupil, and its severity. Once the pupil contour, iris contour and eyelid contour are known, the sclera region can be determined automatically. The detection order is pupil contour, iris contour, eyelid contour, eye corners, sclera region; or eyelid contour, eye corners, pupil contour, iris contour, sclera region.
When pupil contour detection fails, the pterygium may have invaded much of the pupil region; in this case the invasion regions include the pupil region, the iris region, the sclera region, and the eyelid and eye-corner regions. If the detected pupil contour is complete, the pterygium has not invaded the pupil region.
When the pupil contour is complete but iris contour detection fails, the pterygium has invaded the iris region, and the invasion regions include the iris region, the sclera region, and the eyelid and eye-corner regions. When the detected iris contour is complete, the pterygium has invaded neither the iris region nor the pupil region.
When the iris contour is intact but eyelid and eye-corner contour detection fails, the pterygium has invaded the sclera region, and the invasion regions include the sclera region and the eyelid and eye-corner regions. If eyelid and eye-corner contour detection is complete, typically no pterygium is present.
The invention is not limited to a specific algorithm; any algorithm capable of detecting the pupil contour, iris contour and eyelid contour of an eye image falls within the protection scope of the invention. According to one embodiment, the pupil contour and the iris contour are detected using a Hough circle algorithm, and the eyelid contour and the eye corners are detected using a facial feature point algorithm.
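The Hough-circle step might look as follows with OpenCV; the radius bounds would come from the camera and physiological parameters discussed above, while the blur kernel and accumulator thresholds are assumed values:

```python
import cv2
import numpy as np

def detect_circle(gray, r_min, r_max):
    """Detect one pupil or iris circle; None signals a failed
    detection, i.e. a candidate invasion region."""
    blurred = cv2.medianBlur(gray, 5)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.5,
                               minDist=gray.shape[0] // 2,
                               param1=100, param2=30,
                               minRadius=r_min, maxRadius=r_max)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)
    return x, y, r
```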
Subsequently, in step S230, a measured contour line or a fitted contour line of each invasion region is obtained, and one or more seed points are determined within the invasion region.
The contour line of an invasion region may also be a contour line drawn by the user on the eye image. Preferably, the measured contour line (i.e. the contour detected in step S220) is used; if no measured contour exists, a fitted contour line is obtained from the available contour points. If the fitted contour is also insufficiently accurate, a user-drawn contour can be used. Obtaining a measured or fitted contour line for each invasion region effectively segments the invasion regions.
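The fitted-contour fallback can be sketched as an ellipse fit over whatever contour points were detected; the five-point minimum is a requirement of cv2.fitEllipse, and returning None below it mirrors the hand-drawn fallback described above:

```python
import cv2
import numpy as np

def fit_contour(points):
    """Fit an ellipse to the detected contour points; returns None
    when too few points exist and a user-drawn contour is needed."""
    pts = np.asarray(points, dtype=np.float32).reshape(-1, 1, 2)
    if len(pts) < 5:  # cv2.fitEllipse needs at least 5 points
        return None
    center, axes, angle = cv2.fitEllipse(pts)
    return center, axes, angle
```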
According to one embodiment, a seed point is any one of: a user-specified pterygium point, a random point within each region, or a feature point identified from prior knowledge of pterygium. One or more seed points can be determined within each invasion region.
Subsequently, in step S240, the pterygium of each invasion region is obtained with a region-growing algorithm starting from the seed points, and the region-growing algorithm stops growing when it reaches the contour line of the region.
That is, when pupil contour detection fails, the user can specify the estimated pupil contour points and the approximate pterygium region within the pupil. The algorithm segments the pterygium in the pupil and uses this information to detect pterygium in the iris and sclera by region growing or a similar method. When iris contour detection fails, the user can specify the estimated iris contour points and the approximate pterygium region within the iris; the algorithm segments the pterygium in the iris and uses this information to detect pterygium in the sclera by region growing or a similar method. When eyelid contour and eye-corner detection fail, so that sclera-region detection also fails, the user can specify the sclera region and the approximate pterygium region within it. A pterygium present only in the sclera is usually early-stage and less conspicuous; in this case the user can additionally specify several sclera sub-regions, and the algorithm detects the pterygium by comparative analysis.
Here, the region-growing algorithm performs recognition based on characteristics such as pixel color and pixel brightness. The pupil, iris, sclera, eyelid and eye-corner regions each have characteristic image colors and brightness, and a pterygium growing in these regions likewise has its own pixel color and brightness; the pterygium is identified step by step from these differences.
In the implementations above, the region-growing algorithm automatically stops growing when it reaches a contour line, thereby determining the pterygium within each region. In another implementation, the region-growing algorithm does not stop at the contour line but continues to grow until the entire pterygium is identified. Specifically, starting from a seed point of an invasion region, a region-growing algorithm identifies the complete pterygium containing that seed point; the complete pterygia identified from all invasion regions are then merged into the total pterygium region of the target image. The area ratio of the pterygium in the target image can then be calculated from the pixel area of this total region and the area within the eyelid contour.
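A minimal region-growing pass over pixel color and brightness, started from a seed point and clipped by a mask built from the region's measured or fitted contour, could be sketched as follows; the color-distance tolerance is an assumption, as the disclosure leaves the growing criterion open:

```python
import numpy as np
from collections import deque

def grow_region(image_lab, seed, region_mask, tol=12.0):
    """4-connected region growing from `seed` (y, x) in a Lab image;
    growth stops at the contour line encoded by `region_mask`.
    Returns the grown mask and its area ratio within the region."""
    h, w = region_mask.shape
    if not region_mask[seed]:
        raise ValueError("seed must lie inside the invasion region")
    grown = np.zeros((h, w), dtype=bool)
    seed_val = image_lab[seed].astype(np.float32)
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        if grown[y, x]:
            continue
        grown[y, x] = True
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not grown[ny, nx]
                    and region_mask[ny, nx]  # stop at the contour line
                    and np.linalg.norm(image_lab[ny, nx] - seed_val) < tol):
                queue.append((ny, nx))
    return grown, grown.sum() / max(int(region_mask.sum()), 1)
```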
Subsequently, in step S250, the probability value of a pterygium appearing in the target image is calculated according to the region-growing result of each region.
According to one embodiment, the region-growing result includes the area ratio of the pterygium within each invasion region. Given the contour lines of the regions and the pterygium pixels detected in each region, the area ratio of the pterygium in a region can be calculated with plane geometry. The application marks, pixel by pixel, the likelihood that a pterygium is present in the eye image and counts its proportion within the pupil, iris, sclera and other regions.
As described above, the invention calculates the pterygium ratio in each invasion region as well as in the whole target image. From these ratios, the probability value of a pterygium appearing in the target image can be calculated; the mapping between the ratios and the probability value can be set by those skilled in the art as needed, and the invention does not limit it.
In another implementation, the probability value is calculated from the brightness and color differences between the pterygium and the pupil, iris and sclera regions, together with prior knowledge of the brightness and color of pterygium. The probability value may be a weighted combination of these differences and the prior knowledge, with the weighting coefficients obtained through experimentation or training.
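As a sketch of that weighting, with placeholder coefficients standing in for the experimentally or trained values:

```python
def pterygium_probability(diffs, prior, weights=(0.3, 0.3, 0.2), w_prior=0.2):
    """diffs: normalized brightness/color differences of the grown
    region against the pupil, iris and sclera regions; prior: a
    normalized prior-knowledge match score for pterygium appearance.
    The weights here are placeholders, not values from the patent."""
    score = sum(w * d for w, d in zip(weights, diffs)) + w_prior * prior
    return min(max(score, 0.0), 1.0)
```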
According to another embodiment, after the probability value is calculated, the method 200 may further comprise the step of uploading the target image, the area ratios, the probability value and the user information to a server for storage. Fig. 4 is a schematic diagram of a target image of the present invention, and fig. 5 shows the result of processing that target image, in which the eyelid region, iris region and pupil region have been segmented.
Fig. 6 and 7 are schematic diagrams of an application interface of a mobile terminal according to an embodiment of the present invention, showing the pterygium detection result; fig. 6 is the portrait mode and fig. 7 the landscape mode, the latter allowing the eye image to be further enlarged. Figs. 6 and 7 show the pterygium area ratio and probability value computed from fig. 5. After the detection results are uploaded to the server, a patient file can be established, providing a reference for remote evaluation by doctors.
In operation, the user first taps the acquisition function button, then enters an information page (name and so on, which can be skipped), and after input enters the acquisition page. The acquisition interface contains a preview screen, a guiding rectangular box, an acquisition button, an image/result viewing button, and a simple graphic indicating whether disease was found. Tapping an image toggles between the result view and the original image, and swiping left or right switches between images. The original image and the corresponding detection results can then be selected and uploaded to the server to establish a patient file. The invention also provides a parameter-modification function: tapping it brings up slider controls with which some parameters can be modified, giving control over the algorithm.
According to the technical solution of the present invention, the terminal application captures eye images using the rear camera and a light-attenuated fill light, detects pterygium in both the progressive and resting stages, and evaluates its severity. The application then uploads the detection results to the server as a reference for the diagnostic process.
A9. The method of A8, wherein the eye-position prompt box has an eye-contour prompt line and/or eye-feature prompt points; the eye-contour prompt line comprises an eyelid contour line and/or an iris contour line; and the eye-feature prompt points comprise a left eye-corner anchor point and/or a right eye-corner anchor point. A10. The method of any one of A1-A9, wherein the mobile terminal has an LED fill light and/or a macro lens for acquiring the eye image sequence. A11. The method of any one of A1-A10, wherein the step of selecting a qualified target image comprises: selecting an image whose focus, motion blur and eye angle are all qualified as the target image, wherein focus, motion blur and eye angle are calculated from the camera parameters of the mobile terminal, the physiological parameters of the eye, and the parameters of the eye image sequence.
A12. The method of A11, wherein the camera parameters include focal length, optical center and distortion parameters; the physiological parameters of the eye include interpupillary distance, iris diameter, and the relative positions of the pupil and the iris; and the eye-image-sequence parameters include, for each image in the sequence, its frequency spectrum, interpupillary distance, iris diameter, relative positions of the pupil and iris, and the eccentricities of the pupil and iris contours. A13. The method of A1, wherein the pupil contour and the iris contour are detected using a Hough circle algorithm, and the eyelid contour and the eye corners are detected using a facial feature point algorithm.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the method of the invention according to instructions in said program code stored in the memory.
By way of example, and not limitation, computer-readable media comprise computer storage media and communication media. Computer storage media store information such as computer-readable instructions, data structures, program modules or other data. Communication media typically embody computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and include any information delivery media. Combinations of any of the above are also included within the scope of computer-readable media.
In the description provided herein, algorithms and displays are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with examples of this invention. The required structure for constructing such a system will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.
Claims (10)
1. A pterygium detection method adapted to be performed in a mobile terminal, the method comprising the steps of:
acquiring an eye image sequence of a user to be examined, and selecting a qualified target image from the eye image sequence, wherein the target image comprises, from inside to outside, a pupil region, an iris region, a sclera region, and eyelid and eye-corner regions;
detecting the pupil contour, the iris contour and the eyelid contour of the target image, and determining the invasion regions of the pterygium according to the detection results;
acquiring a measured contour line or a fitted contour line of each invasion region, and determining one or more seed points within the invasion region;
starting from the seed points, obtaining the pterygium of each invasion region with a region-growing algorithm, the region-growing algorithm stopping when it grows to the contour line of the region; and
calculating the probability value of a pterygium appearing in the target image according to the region-growing result of each region.
2. The method of claim 1, further comprising the steps of:
starting from a seed point of an invasion region, identifying the complete pterygium containing the seed point with a region-growing algorithm; and
merging the complete pterygia identified from all invasion regions into the total pterygium region of the target image.
3. The method of claim 1, wherein the seed point is any one of: a user-specified pterygium point, a random point within each region, or a feature point identified from prior knowledge of pterygium.
4. The method of any one of claims 1-3, wherein the region-growing algorithm performs recognition based on pixel color and pixel brightness, and the region-growing result includes the area ratio of the pterygium within each invasion region.
5. The method of any one of claims 1-4, wherein the probability value is calculated from the brightness and color differences between the pterygium and the pupil, iris and sclera regions, together with prior knowledge of the brightness and color of pterygium.
6. The method of claim 4, further comprising the step of: uploading the target image, the area ratios, the probability value and the user information to a server for storage.
7. The method of any one of claims 1-6, wherein the detection result comprises any one of: pupil contour integrity, left-right pupil difference, iris contour integrity, left-right iris difference, left-right sclera difference, and color and brightness differences from a normal eye.
8. The method of any one of claims 1-7, wherein the mobile terminal has a pterygium detection application with an eye-position prompt box and a photographing button, and the step of acquiring the eye image sequence of the user to be examined comprises:
under the guidance of the eye-position prompt box, invoking the camera lens assembly of the mobile terminal through the application's photographing button to acquire a plurality of eye images of the user, wherein the eye images contain no scene-reflection light spots.
9. A mobile terminal, comprising:
a memory;
one or more processors;
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing any of the methods of claims 1-8.
10. A computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a computing device, cause the computing device to perform any of the methods of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010731055.5A | 2020-07-27 | 2020-07-27 | Pterygium detection method and mobile terminal (published as CN113974546A)
Publications (1)
Publication Number | Publication Date |
---|---|
CN113974546A (en) | 2022-01-28
Family
ID=79731453
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010731055.5A | Pterygium detection method and mobile terminal (CN113974546A, pending) | 2020-07-27 | 2020-07-27
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113974546A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015177878A (en) * | 2014-03-19 | 2015-10-08 | キヤノン株式会社 | Ophthalmologic apparatus and control method |
CN106960199A * | 2017-03-30 | 2017-07-18 | 博奥生物集团有限公司 | Complete extraction method for the white-of-the-eye region of an RGB eye image |
WO2019190194A1 * | 2018-03-27 | 2019-10-03 | 중앙대학교 산학협력단 | Device for diagnosing pterygium for optional surgery on the basis of a decision-making model, method therefor, and POV level determination method using a smartphone |
CN109344808A * | 2018-07-24 | 2019-02-15 | 中山大学中山眼科中心 | Eye image processing system based on deep learning |
CN109636796A * | 2018-12-19 | 2019-04-16 | 中山大学中山眼科中心 | Artificial-intelligence eye image analysis method, server and system |
CN111128382A * | 2019-12-30 | 2020-05-08 | 清华大学 | Artificial-intelligence multimodal imaging analysis apparatus |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023150229A1 (en) * | 2022-02-04 | 2023-08-10 | Genentech, Inc. | Guided self-capture of diagnostic images |
CN116999017A (en) * | 2023-09-04 | 2023-11-07 | 指南星视光(武汉)科技有限公司 | Auxiliary eye care intelligent control system based on data analysis |
CN116999017B (en) * | 2023-09-04 | 2024-06-04 | 指南星视光(武汉)科技有限公司 | Auxiliary eye care intelligent control system based on data analysis |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4196714B2 (en) | Digital camera | |
JP6577454B2 (en) | On-axis gaze tracking system and method | |
EP3776583B1 (en) | Guidance method and system for teledentistry imaging | |
US20230263385A1 (en) | Methods and apparatus for making a determination about an eye | |
WO2018161289A1 (en) | Depth-based control method, depth-based control device and electronic device | |
JP2004317699A (en) | Digital camera | |
WO2015100294A1 (en) | Wide field retinal image capture system and method | |
WO2016195066A1 (en) | Method of detecting motion of eyeball, program for same, storage medium for program, and device for detecting motion of eyeball | |
CN107077596A | System for producing a face image complying with a selected identification document | |
JP2004320286A (en) | Digital camera | |
CN110916608B (en) | Diopter detection device | |
WO2021204211A1 (en) | Method and apparatus for acquiring facial image and iris image, readable storage medium, and device | |
CN113974546A (en) | Pterygium detection method and mobile terminal | |
US11969210B2 (en) | Methods and apparatus for making a determination about an eye using color temperature adjusted lighting | |
JP2004320285A (en) | Digital camera | |
CN112069986A (en) | Machine vision tracking method and device for eye movements of old people | |
CN110363782B (en) | Region identification method and device based on edge identification algorithm and electronic equipment | |
CN108852280A | Image acquisition and analysis method, system and device for vision testing | |
CN102655565B (en) | Anti-red-eye portrait shooting system and method | |
CN107273847B (en) | Iris acquisition method and apparatus, electronic device, and computer-readable storage medium | |
US11872050B1 (en) | Image integrity and repeatability system | |
CN112528713B (en) | Gaze point estimation method, gaze point estimation system, gaze point estimation processor and gaze point estimation equipment | |
US11969212B2 (en) | Methods and apparatus for detecting a presence and severity of a cataract in ambient lighting | |
US20230156320A1 (en) | Apparatus and method for imaging fundus of eye | |
EP4081098A1 (en) | Methods and apparatus for making a determination about an eye using color temperature adjusted ambient lighting |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
TA01 | Transfer of patent application right | |
Effective date of registration: 2023-08-26
Address after: No. 8, Row 9, Fatou Dongli Middle Yard, Chaoyang District, Beijing, 100020
Applicant after: Wang Xiaopeng
Address before: 541000 Building D2, HUTANG Headquarters Economic Park, Guimo Avenue, Qixing District, Guilin City, Guangxi Zhuang Autonomous Region
Applicant before: Guangxi Code Interpretation Intelligent Information Technology Co., Ltd.