CN117392535A - Fruit tree flower bud target detection and white point rate estimation method oriented to complex environment
- Publication number: CN117392535A
- Application number: CN202311226910.7A (CN202311226910A)
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06V20/188—Vegetation (terrestrial scenes)
- G06V20/17—Terrestrial scenes taken from planes or by drones
- G06N3/045—Combinations of networks
- G06N3/0464—Convolutional networks [CNN, ConvNet]
- G06N3/084—Backpropagation, e.g. using gradient descent
- G06N3/09—Supervised learning
- G06N3/096—Transfer learning
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
- G06V10/82—Image or video recognition using neural networks
- G06V2201/07—Target detection
Abstract
The invention provides a method for fruit tree flower bud target detection and white point rate estimation in complex environments, relating to the technical field of image processing. The method combines unmanned aerial vehicle (UAV) remote sensing with an artificial intelligence algorithm: a flower bud target detection network is trained on UAV remote sensing images taken during the fruit tree's flower bud period, yielding a white point terminal bud detection model and a non-white point terminal bud detection model, and the white point terminal bud detection model is used to detect fruit tree bud targets in complex environments. In addition, the white point rate is proposed as a data index quantifying flower bud growth: the non-white point terminal bud detection model produces target detection results for non-white point terminal buds, and the white point rate of a fruit tree is calculated from the detection results of both the white point and non-white point terminal buds. The method thus realizes flower bud target recognition in complex environments while also providing a quantitative index of flower bud growth.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a complex environment-oriented fruit tree flower bud target detection and white point rate estimation method.
Background
In recent years, with the rapid development of China's digital economy, agricultural digitization has become an important component of the country's digital transformation, especially through the application of precision agriculture technology. Precision agriculture uses satellite remote sensing, laser ranging and other technologies, underpinned by artificial intelligence, to achieve accurate automated crop monitoring and management, so that farmland resources are better exploited, material inputs are used scientifically and economically, crop yield and quality are improved, production costs are reduced, and pollution from agricultural activity is lessened. In addition, artificial intelligence can classify and identify the flower buds, flower spikes, fruits and so on of fruit trees, to better support cultivation, management and sales.
In recent years, the rapid growth of big data and the spread of high-performance computing have driven the flourishing of deep learning, which has achieved remarkable results in many fields. Deep learning, combined with various mechanisms and modules, has developed rapidly in the field of object detection: a neural network can learn the features of objects, improving both the accuracy and the efficiency of detection, and has performed well in fruit tree bud detection tasks.
A "white point" denotes the spikelet primordium that appears at the top of a flower bud, and also indicates that floral induction is complete. Work on the staging of litchi flower bud differentiation introduced the concept of white points as the boundary mark between the "floral induction period" and the "panicle initiation and development period", making research on litchi flower bud differentiation and flowering mechanisms more purposeful and targeted. Flower bud differentiation is the physiological process by which leaf buds are converted into flower buds, i.e. the transition of a fruit tree from vegetative to reproductive growth, so accurate detection of flower bud targets and quantification of flower bud differentiation have guiding value for practical production.
The prior art provides a method for identifying fruit tree buds: image data of fruit tree buds are acquired and annotated with bud coordinates and bud categories; a bud target detection model is built from the image data and bud coordinates, and a bud category recognition model is built from the cropped image data and bud categories; fruit tree bud images are then recognized with the two models. This scheme achieves automatic classification and recognition of fruit tree buds, but it adapts poorly to detection in complex environments.
Unmanned aerial vehicles can collect image data stably and efficiently, and analyzing UAV imagery with artificial intelligence makes it easier to obtain image data of fruit trees in different environments. How to combine AI-based remote sensing image processing with UAV technology, and deepen it further to achieve more accurate fruit tree bud target detection and white point rate calculation, is therefore an important subject in advancing the digital transformation of agricultural production.
Disclosure of Invention
To address the low accuracy of fruit tree bud target detection in today's complex environments, the invention provides a complex-environment-oriented fruit tree flower bud target detection and white point rate estimation method: UAV remote sensing images are collected as datasets, a white point terminal bud detection model and a non-white point terminal bud detection model are obtained through training, and target detection and localization of white point and non-white point terminal buds, together with white point rate estimation, are realized.
In order to achieve the technical effects, the technical scheme of the invention is as follows:
A fruit tree flower bud target detection and white point rate estimation method oriented to complex environments comprises the following steps:
s1, acquiring unmanned aerial vehicle remote sensing images of a fruit tree in a flower bud period;
s2, marking rectangular target frames of the white-spot terminal buds and rectangular target frames of the non-white-spot terminal buds in the remote sensing image, obtaining position parameters of the rectangular target frames, converting the position parameters of the rectangular target frames of the white-spot terminal buds and the non-white-spot terminal buds into parameter files and respectively pairing the parameter files with the remote sensing image to obtain a white-spot terminal bud remote sensing image dataset and a non-white-spot terminal bud remote sensing image dataset;
s3, constructing a flower bud target detection network, and training the flower bud target detection network by using a white spot terminal bud remote sensing image data set to obtain a white spot terminal bud detection model; training the flower bud target detection network architecture by using the non-white spot terminal bud remote sensing image data set to obtain a non-white spot terminal bud detection model;
s4, inputting the unmanned aerial vehicle remote sensing image of the area to be detected into a trained white spot terminal bud detection model to obtain a target detection result of the white spot terminal buds of the fruit trees;
s5, separating the white point terminal bud target detection result, inputting the white point terminal bud target detection result after the separation treatment into a non-white point terminal bud detection model to obtain a target detection result of the non-white point terminal bud, and calculating the white point rate of the fruit tree by utilizing the white point terminal bud of the fruit tree and the target detection result of the non-white point terminal bud.
In this technical scheme, UAV remote sensing technology is combined with an artificial intelligence algorithm: UAV remote sensing images of the fruit tree's flower bud period are used to train a flower bud target detection network, yielding a white point terminal bud detection model and a non-white point terminal bud detection model, so that fruit tree bud target detection and white point rate calculation in complex environments are achieved efficiently and accurately.
Preferably, in step S1, a UAV is used to capture the remote sensing images of the fruit tree's flower bud period, with shooting covering multiple conditions including sunny days, cloudy days, front lighting and back lighting; this helps the subsequent fruit tree flower bud target detection adapt to a variety of environments under natural light.
Preferably, step S2 comprises the steps of:
s21, respectively marking a rectangular target frame of a white spot terminal bud and a rectangular target frame of a non-white spot terminal bud in the flower bud period of the fruit tree on the unmanned aerial vehicle remote sensing image;
s22, respectively obtaining data parameters of the rectangular target frames of the white dot terminal buds and the non-white dot terminal buds after marking is finished, wherein the data parameters comprise the upper left corner coordinates and the lower right corner coordinates of the rectangular target frames in the images and the pixel length and width of the rectangular frames;
s23, converting data parameters of rectangular target frames of the white-spot terminal buds and the non-white-spot terminal buds into parameter files, and pairing the parameter files of the white-spot terminal buds and the non-white-spot terminal buds with the corresponding remote sensing images to obtain a white-spot terminal bud remote sensing data set and a non-white-spot terminal bud remote sensing data set.
Preferably, the flower bud target detection network comprises a feature extraction module, a multi-head attention mechanism module, an atrous convolution spatial pyramid pooling layer, a knowledge distillation-based domain adaptive module, and a detection head;
The feature extraction module extracts multi-level features from the input remote sensing image, comprising low-level spatial features and high-level semantic features;
The atrous convolution spatial pyramid pooling layer performs preliminary localization and classification on the high-level semantic features;
The multi-head attention mechanism module captures global information of the low-level spatial features to obtain a new feature map;
The domain adaptive module performs knowledge distillation between the feature map and a pre-trained teacher model, and then cascades the distilled features with the preliminarily classified high-level semantic features so that the two adapt to each other;
The detection head outputs, through a fully connected layer, the classification category, locating frame position and confidence score for the input remote sensing image.
Preferably, the feature extraction module adopts a ResNet101 backbone; the input of the multi-head attention mechanism module is connected to the second layer of ResNet101, and the input of the atrous convolution spatial pyramid pooling layer is connected to the fourth layer of ResNet101.
Preferably, the multi-head attention mechanism module applies positional encoding to the input low-level spatial features, then performs self-attention computation, takes the self-attention result as the output feature map, and filters out irrelevant or noisy regions in the skip connections.
The multi-head attention mechanism module extracts long-range structural information from the image features and establishes associations between the elements of the feature map, thereby enlarging the receptive field.
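As a rough illustration of the mechanism described above, the following NumPy sketch computes scaled dot-product self-attention split across heads over flattened feature-map positions. Positional encoding and the learned output projection are omitted, and all weight matrices are random stand-ins, not the patent's trained parameters:

```python
import numpy as np

def multi_head_self_attention(x, wq, wk, wv, num_heads):
    """Toy multi-head self-attention over (seq_len, d_model) features."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    q, k, v = x @ wq, x @ wk, x @ wv

    def split(t):  # (seq_len, d_model) -> (heads, seq_len, d_head)
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(q), split(k), split(v)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    # softmax over the last axis: every position attends to all others,
    # which is what gives the module its global receptive field
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    out = weights @ v                                    # (heads, seq, d_head)
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))                 # 16 positions, d_model = 8
w = [rng.standard_normal((8, 8)) for _ in range(3)]
y = multi_head_self_attention(x, *w, num_heads=2)
```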
Preferably, the data parameters are converted into a parameter file in a YOLO data format, and the detection head is a YOLO detection head.
Preferably, the knowledge distillation-based domain adaptive module comprises a teacher-student model pair, in which the pre-trained teacher model supervises the training of the student model;
In the training process, the difference between the output predictions of the teacher model and the student model is constructed as the knowledge distillation loss function; the knowledge distillation loss is added to the student model's own loss function to obtain the total loss function; gradients are updated with the total loss function until convergence, finally yielding a student model with higher performance and precision.
Preferably, in step S3, the training parameters of the flower bud target detection network are tuned with the white point terminal bud remote sensing image dataset to obtain the trained white point terminal bud detection model, and with the non-white point terminal bud remote sensing image dataset to obtain the trained non-white point terminal bud detection model;
the training parameters include: iteration times, batch size, optimizer choice, initial learning rate setting, weight decay rate setting, and loss function choice.
Preferably, step S5 comprises the steps of:
s51, performing mask operation on a white point positioning frame in a target detection result of the white point terminal bud, setting all pixel values in the white point positioning frame to 0, namely identifying non-white point terminal buds outside the white point target frame, and obtaining a remote sensing image after masking, so that the identification of the non-white point terminal buds is not influenced;
s52, inputting the remote sensing image after masking into a trained non-white spot terminal bud detection model to obtain a target detection result of the non-white spot terminal bud;
s53, counting the number of white point locating frames and non-white point locating frames, and obtaining the white point rate of the fruit tree by using the sum of the number of white point locating frames and the number of upper white point locating frames and the number of non-white point locating frames.
Compared with the prior art, the technical scheme of the invention has the beneficial effects that:
the invention provides a complex environment-oriented flower bud target detection and white spot rate estimation method for a fruit tree, which combines an unmanned aerial vehicle remote sensing technology with an artificial intelligent algorithm, utilizes a flower bud target detection network constructed by unmanned aerial vehicle remote sensing image training in a flower bud period of the fruit tree to obtain a white spot terminal bud detection model and a non-white spot terminal bud detection model, and utilizes the white spot terminal bud detection model to realize fruit tree bud target detection in the complex environment; in addition, the white spot rate is provided as a data index for quantifying the growth vigor of the flower buds, a target detection result of the non-white spot terminal buds is obtained by using a non-white spot terminal bud detection model, the white spot rate of the fruit tree is obtained by calculation by using the target detection results of the white spot terminal buds and the non-white spot terminal buds, and the flower bud growth quantification index is obtained while the flower bud target identification under the complex environment is realized.
Drawings
Fig. 1 shows a flow chart of a method for detecting flower bud targets and estimating white point rate of a fruit tree facing a complex environment, which is proposed in embodiment 1 of the invention;
fig. 2 shows a schematic diagram of a flower bud target detection network according to embodiment 3 of the present invention;
fig. 3 shows a schematic diagram of a domain-adaptive module based on knowledge distillation as proposed in example 3 of the present invention.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the present patent;
for better illustration of the present embodiment, some parts of the drawings may be omitted, enlarged or reduced, and do not represent actual dimensions;
it will be appreciated by those skilled in the art that some well known descriptions in the figures may be omitted.
The technical scheme of the invention is further described below with reference to the accompanying drawings and examples.
The positional relationship depicted in the drawings is for illustrative purposes only and is not to be construed as limiting the present patent;
example 1
As shown in fig. 1, this embodiment provides a complex-environment-oriented fruit tree flower bud target detection and white point rate estimation method, taking litchi flower bud target detection and white point rate estimation as an example, comprising the following steps:
S1, acquiring unmanned aerial vehicle remote sensing images of the litchi flower bud period;
s2, marking rectangular target frames of the white-spot terminal buds and rectangular target frames of the non-white-spot terminal buds in the remote sensing image, obtaining position parameters of the rectangular target frames, converting the position parameters of the rectangular target frames of the white-spot terminal buds and the non-white-spot terminal buds into parameter files and respectively pairing the parameter files with the remote sensing image to obtain a white-spot terminal bud remote sensing image dataset and a non-white-spot terminal bud remote sensing image dataset;
s3, constructing a flower bud target detection network, and training the flower bud target detection network by using a white spot terminal bud remote sensing image data set to obtain a white spot terminal bud detection model; training the flower bud target detection network architecture by using the non-white spot terminal bud remote sensing image data set to obtain a non-white spot terminal bud detection model;
s4, inputting the unmanned aerial vehicle remote sensing image of the area to be detected into a trained white spot terminal bud detection model to obtain a target detection result of the white spot terminal buds of the litchi;
s51, performing mask operation on a white point positioning frame in a target detection result of a white point terminal bud, and setting all pixel values in the white point positioning frame to 0 to obtain a masked remote sensing image;
s52, inputting the remote sensing image after masking into a trained non-white spot terminal bud detection model to obtain a target detection result of the non-white spot terminal bud;
s53, counting the number of white point locating frames and non-white point locating frames, and obtaining the white point rate of the fruit tree by using the sum of the number of white point locating frames and the number of upper white point locating frames and the number of non-white point locating frames.
Specifically, in step S1 the UAV captures remote sensing images of the litchi flower bud period along a planned route. Route planning is performed after obtaining the longitude and latitude of each litchi tree's center point from an orthophoto of the study area, so that the UAV can automatically and efficiently acquire data along a fixed route every day. For example, an orthophoto of the study area is first acquired by UAV, the longitude and latitude of each litchi center point are obtained manually or automatically, a fixed route is planned from those coordinates, and the UAV then shoots and collects data along this fixed route every day. Flower bud targets at this stage are small and easily affected by complex environments, so the UAV's flight parameters must be set according to actual conditions. In this embodiment, the acquisition platform is a DJI M30, which is simple to operate, stable in flight and strong in aerial photography performance, making it easy to popularize in actual production. The DJI M30 camera has 48 megapixels effective, an 84° field of view, and a maximum effective photo resolution of 8000 x 6000. The flight altitude is set to 25 m with 20x zoom.
When shooting, the UAV photographs each waypoint from two angles covering all directions, and remote sensing images are collected under a variety of lighting conditions such as sunny days, cloudy days, front lighting and back lighting, so as to mitigate the influence of strong and weak light on flower bud target recognition.
Example 2
In the present embodiment, based on embodiment 1, step S2 includes the steps of:
s21, combining priori knowledge by mapping professionals, performing visual interpretation on the unmanned aerial vehicle remote sensing image, and respectively marking a rectangular target frame of a white dot terminal bud and a rectangular target frame of a non-white dot terminal bud of the litchi flower bud period on the unmanned aerial vehicle remote sensing image by using LabelImg software; wherein, the white-spot terminal bud target presents the small-grain-shaped and full-white-fluff-coated buds, and the non-white-spot terminal buds do not grow out of the buds, and only the branches are grown;
s22, respectively obtaining data parameters of the rectangular target frames of the white dot terminal buds and the non-white dot terminal buds after marking is finished, wherein the data parameters comprise the upper left corner coordinates and the lower right corner coordinates of the rectangular target frames in the images and the pixel length and width of the rectangular frames;
s23, converting data parameters of rectangular target frames of the white-spot terminal buds and the non-white-spot terminal buds into parameter files, and pairing the parameter files of the white-spot terminal buds and the non-white-spot terminal buds with the corresponding remote sensing images to obtain a white-spot terminal bud remote sensing data set and a non-white-spot terminal bud remote sensing data set.
Further, by random sampling, the white point terminal bud and non-white point terminal bud remote sensing datasets are each divided 7:1:2 into a training set, a validation set and a test set. The training set is used to train the white point and non-white point terminal bud detection models, the validation set is used to evaluate them during training, and the test set is used to verify their effectiveness, yielding the trained white point terminal bud detection model and non-white point terminal bud detection model.
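The 7:1:2 random split can be sketched as below; the helper name and seed are illustrative, and the sample count matches the white point dataset's total of 5151 pairs reported in this embodiment:

```python
import random

def split_dataset(samples, ratios=(7, 1, 2), seed=42):
    """Randomly split sample pairs into train/val/test at the given ratio."""
    samples = list(samples)
    random.Random(seed).shuffle(samples)      # reproducible random sampling
    total = sum(ratios)
    n_train = len(samples) * ratios[0] // total
    n_val = len(samples) * ratios[1] // total
    return (samples[:n_train],
            samples[n_train:n_train + n_val],
            samples[n_train + n_val:])

train, val, test = split_dataset(range(5151))  # white point set: 5151 pairs
```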
In practical application, white point targets are small, and the complex environment includes conditions such as flower bud targets occluded by leaves or shadows, which affect target recognition. In order to improve the generalization performance of the white point terminal bud detection model and the non-white point terminal bud detection model and better adapt to the influence of complex environments, data enhancement is applied to the two data sets to expand the samples. Specifically, a random data enhancement strategy is applied to the training set data; the enhancement methods include rotation, contrast enhancement, color enhancement, shading operation and the like, expanding the training set 8 times.

In this embodiment, the white point terminal bud dataset yields 3606 pairs of training set samples, 515 pairs of verification set samples and 1030 pairs of test set samples; the non-white point terminal bud dataset yields 2285 pairs of training set samples, 326 pairs of verification set samples and 652 pairs of test set samples. Each sample pair comprises an unmanned aerial vehicle image of size 3000 x 4000 and the corresponding flower bud target rectangular frame parameter file. In the parameter file, the first number represents the classification category, the second represents the normalized coordinates of the upper left corner of the rectangular frame, the third represents the normalized coordinates of the lower right corner, the fourth represents the pixel length of the rectangular frame, and the fifth represents the pixel width.
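The random enhancement strategy can be sketched as below. This is a numpy-only sketch under stated assumptions: the embodiment does not specify the exact transforms, so the rotation angles, contrast/color factors and shading range here are illustrative choices.

```python
import numpy as np

def augment_image(img, rng):
    """Apply one randomly chosen enhancement: rotation, contrast, color,
    or shading. `img` is an H x W x 3 uint8 array (square, for rot90)."""
    op = rng.integers(0, 4)
    img = img.astype(np.float32)
    if op == 0:                       # rotation by a multiple of 90 degrees
        img = np.rot90(img, k=int(rng.integers(1, 4)))
    elif op == 1:                     # contrast: scale deviation from the mean
        factor = 1.0 + 0.4 * (rng.random() - 0.5)
        img = (img - img.mean()) * factor + img.mean()
    elif op == 2:                     # color: rescale each channel independently
        gains = 1.0 + 0.2 * (rng.random(3) - 0.5)
        img = img * gains
    else:                             # shading: darken to mimic shadow occlusion
        img = img * (0.6 + 0.4 * rng.random())
    return np.clip(img, 0, 255).astype(np.uint8)

def expand_training_set(images, times=8, seed=0):
    """Expand the training set `times`-fold (8x in this embodiment)."""
    rng = np.random.default_rng(seed)
    return [augment_image(im, rng) for im in images for _ in range(times)]
```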
Example 3
In this embodiment, as shown in fig. 2, the flower bud target detection network includes a feature extraction module, a multi-head attention mechanism module MSHA, a hole convolution pyramid pooling layer ASPP, a domain self-adaptive module based on knowledge distillation, and a detection head;
the feature extraction module adopts a ResNet101 structure as the backbone network and is used for extracting multi-level features of the input remote sensing image, wherein the multi-level features comprise bottom space features and high-level semantic features;
the input end of the hole convolution pyramid pooling layer ASPP is connected with the fourth layer (layer 4 in fig. 2) of the ResNet101 and is used for preliminarily positioning and classifying the high-level semantic features;
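A minimal single-channel sketch of the atrous (hole) convolution underlying an ASPP layer follows; the 3x3 averaging kernel and the rate set (1, 6, 12, 18) are illustrative assumptions, not the patent's trained parameters.

```python
import numpy as np

def atrous_conv2d(x, kernel, rate):
    """Single-channel atrous (hole) convolution: the 3x3 kernel samples the
    input with gaps of size `rate`, enlarging the receptive field without
    extra parameters. Zero padding keeps the output the same size as x."""
    kh, kw = kernel.shape
    pad = rate * (kh // 2)
    xp = np.pad(x, pad)
    out = np.zeros_like(x, dtype=np.float64)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * xp[i * rate: i * rate + x.shape[0],
                                     j * rate: j * rate + x.shape[1]]
    return out

def aspp(x, rates=(1, 6, 12, 18)):
    """ASPP sketch: parallel atrous convolutions at several rates, here with
    a fixed averaging kernel, stacked along a new channel axis."""
    k = np.full((3, 3), 1.0 / 9.0)
    return np.stack([atrous_conv2d(x, k, r) for r in rates])
```

Running the same kernel at several rates gathers context at multiple scales from the same high-level feature map, which is what lets the ASPP branch preliminarily position and classify the semantic features.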
the input end of the multi-head attention mechanism module MSHA is connected with the second layer (layer 2 in fig. 2) of the ResNet101 and is used for performing position coding on the input bottom space feature F_l2 and then performing self-attention calculation: the three feature matrices Q, K and V are obtained, the self-attention result is computed, irrelevant or noisy regions in the skip connection are filtered out, and the obtained self-attention result serves as the output feature map F'_l2. The self-attention calculation formula is as follows:

Attention(Q, K, V) = softmax(QK^T / √d_k) V

where d_k is the dimension of the key vectors.
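The self-attention computation of the MSHA module can be sketched as follows. This is a single-head numpy sketch under stated assumptions: the projection matrices Wq, Wk, Wv stand in for the module's learned linear layers, and positional encoding and the multi-head split are omitted for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(F, Wq, Wk, Wv):
    """Scaled dot-product self-attention over an (N, d) feature map F:
    Q, K, V are linear projections of F, and the output is the
    attention-weighted sum softmax(QK^T / sqrt(d_k)) V."""
    Q, K, V = F @ Wq, F @ Wk, F @ Wv
    d_k = K.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k))  # (N, N) attention weights
    return weights @ V                          # weighted sum = output features
```

Low-weight positions contribute little to the output, which is the mechanism by which irrelevant or noisy regions in the skip connection are suppressed.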
the domain adaptive module performs knowledge distillation between the feature map and the pre-trained teacher model, and then cascades the distilled features with the preliminarily classified high-level semantic features so that the two adapt to each other; in this embodiment, as shown in fig. 3, the knowledge distillation-based domain adaptive module comprises a teacher-student model, and the student model is trained under the supervision of the pre-trained teacher model;
in the training process, the difference between the output predicted values of the teacher model and the student model is constructed as the knowledge distillation loss function; the knowledge distillation loss function and the loss function of the student model are added to obtain the total loss function; and gradient updating is performed using the total loss function until convergence, obtaining the trained student model.
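The total loss construction can be sketched as below. The patent only specifies that the distillation loss is built from the difference of the output predicted values; the mean squared difference and the weighting factor `alpha` used here are illustrative assumptions.

```python
import numpy as np

def distillation_total_loss(student_pred, teacher_pred, student_task_loss, alpha=1.0):
    """Total loss = student task loss + alpha * distillation loss, where the
    distillation loss is the mean squared difference between teacher and
    student predictions (one possible realization of 'difference of output
    predicted values')."""
    kd_loss = float(np.mean((student_pred - teacher_pred) ** 2))
    return student_task_loss + alpha * kd_loss
```

Gradient updates are then driven by this single scalar until convergence, so the student is pulled both toward the labels and toward the teacher's outputs.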
The detection head outputs, through a fully connected layer, the classification category, locating frame position and confidence score of the input remote sensing image; in this embodiment, the detection head is a YOLO detection head, and the parameter file in the data set is in YOLO data format.
Adjusting training parameters of the flower bud target detection network by using the white spot terminal bud remote sensing image data set to obtain a trained white spot terminal bud detection model; adjusting training parameters of the flower bud target detection network by using the non-white spot terminal bud remote sensing image data set to obtain a trained non-white spot terminal bud detection model;
the training parameters include: number of iterations, batch size, optimizer choice, initial learning rate, weight decay rate and loss function choice. In this embodiment, the number of iterations is set to 1000, the optimizer is stochastic gradient descent (SGD), the initial learning rate is 0.01, the training batch size is 4, the momentum is set to 0.9, and the weight decay is set to 0.0001.
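The embodiment's hyperparameters and one SGD update step can be sketched as follows. This is a minimal numpy sketch: the standard momentum-with-weight-decay update rule shown is an assumption about the exact implementation, which the patent does not detail.

```python
import numpy as np

# Training hyperparameters from this embodiment
CONFIG = {
    "iterations": 1000,
    "optimizer": "SGD",
    "learning_rate": 0.01,
    "batch_size": 4,
    "momentum": 0.9,
    "weight_decay": 0.0001,
}

def sgd_momentum_step(w, grad, velocity, cfg=CONFIG):
    """One SGD update with momentum and L2 weight decay, using the
    hyperparameters above. Returns the updated weights and velocity."""
    grad = grad + cfg["weight_decay"] * w                          # weight decay
    velocity = cfg["momentum"] * velocity - cfg["learning_rate"] * grad
    return w + velocity, velocity
```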
It is to be understood that the above examples of the present invention are provided by way of illustration only and are not intended to limit the scope of the invention. Other variations or modifications of the above teachings will be apparent to those of ordinary skill in the art. It is not necessary here nor is it exhaustive of all embodiments. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the invention are desired to be protected by the following claims.
Claims (10)
1. The fruit tree flower bud target detection and white point rate estimation method for the complex environment is characterized by comprising the following steps of:
s1, acquiring unmanned aerial vehicle remote sensing images of a fruit tree in a flower bud period;
s2, marking rectangular target frames of the white-spot terminal buds and rectangular target frames of the non-white-spot terminal buds in the remote sensing image, obtaining position parameters of the rectangular target frames, converting the position parameters of the rectangular target frames of the white-spot terminal buds and the non-white-spot terminal buds into parameter files and respectively pairing the parameter files with the remote sensing image to obtain a white-spot terminal bud remote sensing image dataset and a non-white-spot terminal bud remote sensing image dataset;
s3, constructing a flower bud target detection network, and training the flower bud target detection network by using a white spot terminal bud remote sensing image data set to obtain a white spot terminal bud detection model; training the flower bud target detection network architecture by using the non-white spot terminal bud remote sensing image data set to obtain a non-white spot terminal bud detection model;
s4, inputting the unmanned aerial vehicle remote sensing image of the area to be detected into a trained white spot terminal bud detection model to obtain a target detection result of the white spot terminal buds of the fruit trees;
s5, separating the white point terminal bud target detection result, inputting the white point terminal bud target detection result after the separation treatment into a non-white point terminal bud detection model to obtain a target detection result of the non-white point terminal bud, and calculating the white point rate of the fruit tree by utilizing the white point terminal bud of the fruit tree and the target detection result of the non-white point terminal bud.
2. The method for detecting flower bud targets and estimating white point rate of fruit trees in complex environment according to claim 1, wherein in step S1, an unmanned aerial vehicle is used to shoot remote sensing images of the flower bud period of the fruit tree under a plurality of shooting conditions, including sunny days, cloudy days, front lighting and back lighting.
3. The method for detecting flower bud targets and estimating white point rates of fruit trees facing complex environments according to claim 1, wherein step S2 comprises the steps of:
s21, respectively marking a rectangular target frame of a white spot terminal bud and a rectangular target frame of a non-white spot terminal bud in the flower bud period of the fruit tree on the unmanned aerial vehicle remote sensing image;
s22, respectively obtaining data parameters of the rectangular target frames of the white dot terminal buds and the non-white dot terminal buds after marking is finished, wherein the data parameters comprise the upper left corner coordinates and the lower right corner coordinates of the rectangular target frames in the images and the pixel length and width of the rectangular frames;
s23, converting data parameters of rectangular target frames of the white-spot terminal buds and the non-white-spot terminal buds into parameter files, and pairing the parameter files of the white-spot terminal buds and the non-white-spot terminal buds with the corresponding remote sensing images to obtain a white-spot terminal bud remote sensing data set and a non-white-spot terminal bud remote sensing data set.
4. The complex environment-oriented fruit tree flower bud target detection and white point rate estimation method according to claim 3, wherein the flower bud target detection network comprises a feature extraction module, a multi-head attention mechanism module, a hole convolution pyramid pooling layer, a knowledge distillation-based domain adaptive module and a detection head;
the characteristic extraction module is used for extracting multi-level characteristics of the input remote sensing image, wherein the multi-level characteristics comprise bottom space characteristics and high-level semantic characteristics;
the cavity convolution pyramid pooling layer is used for primarily positioning and classifying high-level semantic features;
the multi-head attention mechanism module is used for capturing global information of the bottom space features to obtain a new feature map;
the domain adaptive module performs knowledge distillation between the feature map and the pre-trained teacher model, and then cascades the distilled features with the preliminarily classified high-level semantic features so that the two adapt to each other;
the detection head is used for outputting the full-connection layer to obtain the classification category, the positioning frame position and the confidence score of the input remote sensing image.
5. The complex environment-oriented fruit tree flower bud target detection and white point rate estimation method according to claim 4, wherein the feature extraction module adopts a ResNet101 structure as a backbone network, the input end of the multi-head attention mechanism module is connected with a second layer of the ResNet101, and the input end of the cavity convolution pyramid pooling layer is connected with a fourth layer of the ResNet 101.
6. The method for detecting flower bud targets and estimating white point rate of fruit trees facing complex environments according to claim 5, wherein the multi-head attention mechanism module performs position coding on the input bottom space features, then performs self-attention mechanism calculation, filters out irrelevant or noisy areas in the skip connection, and takes the obtained self-attention calculation result as the output feature map.
7. The complex environment-oriented fruit tree flower bud target detection and white point rate estimation method according to claim 5, wherein the knowledge distillation-based domain adaptive module comprises a teacher-student model, and the student model is trained under the supervision of the pre-trained teacher model;
in the training process, constructing a difference value of output predicted values between a teacher model and a student model as a loss function of knowledge distillation; adding the loss function of knowledge distillation and the loss function of the student model to obtain a total loss function; and gradient updating is carried out by using the total loss function until convergence, so that a trained student model is obtained.
8. The method for detecting flower bud targets and estimating white point rate of fruit trees facing to complex environments according to claim 4, wherein the data parameters are converted into a parameter file in YOLO data format, and the detection head is a YOLO detection head.
9. The method for detecting flower bud targets and estimating white point rate of fruit trees facing to complex environments according to any one of claims 1 to 8, wherein in step S3, training parameters of a flower bud target detection network are adjusted by using a white point terminal bud remote sensing image dataset to obtain a trained white point terminal bud detection model; adjusting training parameters of the flower bud target detection network by using the non-white spot terminal bud remote sensing image data set to obtain a trained non-white spot terminal bud detection model;
the training parameters include: iteration times, batch size, optimizer choice, initial learning rate setting, weight decay rate setting, and loss function choice.
10. The method for detecting flower bud targets and estimating white point rates of fruit trees for complex environments according to claim 9, wherein step S5 comprises the steps of:
s51, performing mask operation on a white point positioning frame in a target detection result of a white point terminal bud, and setting all pixel values in the white point positioning frame to 0 to obtain a masked remote sensing image;
s52, inputting the remote sensing image after masking into a trained non-white spot terminal bud detection model to obtain a target detection result of the non-white spot terminal bud;
s53, counting the number of white point locating frames and the number of non-white point locating frames, and obtaining the white point rate of the fruit tree as the ratio of the number of white point locating frames to the sum of the numbers of white point locating frames and non-white point locating frames.
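The masking and white point rate computation of steps S51 to S53 can be sketched as follows (a numpy sketch; representing locating frames as pixel-coordinate tuples (x1, y1, x2, y2) is an assumption):

```python
import numpy as np

def mask_white_point_boxes(image, white_boxes):
    """S51: set every pixel inside each white point locating frame to 0,
    so the non-white point detector sees only the remaining buds."""
    masked = image.copy()
    for x1, y1, x2, y2 in white_boxes:
        masked[y1:y2, x1:x2] = 0
    return masked

def white_point_rate(n_white, n_non_white):
    """S53: white point rate = white frames / (white + non-white frames)."""
    total = n_white + n_non_white
    return n_white / total if total else 0.0
```

Feeding the masked image to the non-white point terminal bud detection model (S52) prevents already-detected white point buds from being counted twice.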
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311226910.7A CN117392535A (en) | 2023-09-21 | 2023-09-21 | Fruit tree flower bud target detection and white point rate estimation method oriented to complex environment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311226910.7A CN117392535A (en) | 2023-09-21 | 2023-09-21 | Fruit tree flower bud target detection and white point rate estimation method oriented to complex environment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117392535A true CN117392535A (en) | 2024-01-12 |
Family
ID=89435219
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311226910.7A Pending CN117392535A (en) | 2023-09-21 | 2023-09-21 | Fruit tree flower bud target detection and white point rate estimation method oriented to complex environment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117392535A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117649608A (en) * | 2024-01-29 | 2024-03-05 | 阿坝州林业和草原科学技术研究所 | Pine wood nematode disease identification system and method based on remote sensing monitoring |
CN117649608B (en) * | 2024-01-29 | 2024-03-29 | 阿坝州林业和草原科学技术研究所 | Pine wood nematode disease identification system and method based on remote sensing monitoring |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
RU2735151C2 (en) | Weeds identification in natural environment | |
CN112560623B (en) | Unmanned aerial vehicle-based rapid mangrove plant species identification method | |
CN110826556A (en) | Broad-spectrum crop weed identification and positioning method for improving deep learning | |
CN110765977A (en) | Method for extracting wheat lodging information based on multi-temporal remote sensing data of unmanned aerial vehicle | |
CN117593766B (en) | Investigation method for wild animal population number based on unmanned aerial vehicle shooting image processing | |
CN117392382A (en) | Single tree fruit tree segmentation method and system based on multi-scale dense instance detection | |
CN117409339A (en) | Unmanned aerial vehicle crop state visual identification method for air-ground coordination | |
CN117392535A (en) | Fruit tree flower bud target detection and white point rate estimation method oriented to complex environment | |
CN108073947A (en) | A kind of method for identifying blueberry kind | |
Zhang et al. | Automatic counting of lettuce using an improved YOLOv5s with multiple lightweight strategies | |
CN114462485A (en) | Red date jujube witches broom initial-stage control method | |
CN119091380A (en) | A tea garden pest and disease identification method and system based on deep learning | |
CN111582035B (en) | Fruit tree age identification method, device, equipment and storage medium | |
CN118396422A (en) | Intelligent peach blossom recognition and fruit yield prediction method based on fusion model | |
CN117079130B (en) | An intelligent information management method and system based on mangrove habitat | |
US12148207B1 (en) | Method and system for intelligent identification of rice growth potential based on UAV monitoring | |
CN113553897A (en) | Crop identification method based on unmanned aerial vehicle and YOLOv3 model | |
CN118691989A (en) | A method for detecting aquatic plant coverage based on airborne hyperspectral | |
CN116052141B (en) | Crop growth period identification method, device, equipment and medium | |
CN118397483A (en) | Fruit tree target positioning and navigation line region segmentation method based on YOLO network | |
CN115294472A (en) | Fruit yield estimation method, model training method, equipment and storage medium | |
CN114663761A (en) | Crop growth condition determining method, device, equipment and storage medium | |
Stanski et al. | Flower detection using object analysis: new ways to quantify plant phenology in a warming tundra biome | |
CN113989253A (en) | Farmland target object information acquisition method and device | |
Bonaria | Grapevine yield estimation using image analysis for the variety Arinto |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||