
CN114967763B - Plant protection unmanned aerial vehicle sowing control method based on image positioning - Google Patents


Info

Publication number
CN114967763B
CN114967763B (application CN202210914951.4A)
Authority
CN
China
Prior art keywords
image
aerial vehicle
unmanned aerial
farmland
network
Prior art date
Legal status
Expired - Fee Related
Application number
CN202210914951.4A
Other languages
Chinese (zh)
Other versions
CN114967763A (en)
Inventor
蒋一民
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN202210914951.4A priority Critical patent/CN114967763B/en
Publication of CN114967763A publication Critical patent/CN114967763A/en
Application granted granted Critical
Publication of CN114967763B publication Critical patent/CN114967763B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • G05D 1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 40/00: Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A 40/10: Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a plant protection unmanned aerial vehicle sowing control method based on image positioning, belonging to the field of image processing. The control method comprises the following steps: acquiring a first image, which is an image of the area to be sown; acquiring a job request instruction; extracting feature data from the first image based on the job request instruction, the feature data being the pixel data marked as an already-sown area; and, based on the feature data, controlling the plant protection unmanned aerial vehicle to sow the area to be sown along a job path, the job path being the path along which the sowing operation is performed. By recognizing the image to decide whether the current area should be sown, locating the position of the unmanned aerial vehicle, and sowing accurately along the planned route, the area to be sown is covered exactly once without repetition, saving seed.

Description

Plant protection unmanned aerial vehicle sowing control method based on image positioning
Technical Field
The invention relates to the field of image processing and matching, in particular to a sowing control method based on a plant protection unmanned aerial vehicle.
Background
A plant protection unmanned aerial vehicle is, as the name implies, an unmanned aircraft used for agricultural and forestry plant protection operations. It consists of three parts: a flight platform (fixed wing, helicopter or multi-rotor aircraft), a navigation flight controller and a spraying mechanism. Controlled by ground remote control or by autonomous navigation, it performs spraying operations and can apply chemicals, seeds, powders and the like in agricultural production, saving labour and reducing labour intensity.
However, when a prior-art unmanned aerial vehicle performs a sowing operation, the area to be sown is usually selected by manual control. With manual control, when the sowing area is very large or the terrain is complex, missed sowing or repeated sowing often occurs, wasting seed.
In view of this, the present application is specifically proposed.
Disclosure of Invention
The technical problem the invention aims to solve is that manually controlling which areas an unmanned aerial vehicle sows easily causes missed or repeated sowing; the invention therefore provides a sowing control method based on a plant protection unmanned aerial vehicle that achieves automatic, non-repeating sowing of the area to be sown.
The invention is realized by the following technical scheme:
a plant protection unmanned aerial vehicle broadcast control method based on image positioning comprises the following steps:
step 1: acquiring a first image, wherein the first image is a ground image comprising an area to be broadcast;
Step 2: recognize the farmland region in the first image with a neural network, and partition the farmland according to its actual divisions in the image. The partitioning method is as follows: recognize the boundaries formed by roads, trees, ridges, ditches or man-made marks in the image; fit each recognized boundary to a straight line, which is regarded as a farmland boundary; partition the farmland with these straight lines; and store the network graph formed by all the straight lines, which is called the graph library;
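The boundary-fitting part of this step can be sketched as a least-squares line fit over detected boundary pixels. This is a minimal illustration under assumed inputs; the function name and data are hypothetical, and a real system might instead use a Hough transform or RANSAC to reject outliers.

```python
import numpy as np

def fit_boundary_line(points):
    """Fit detected boundary pixels (row, col) to a straight line.

    Returns (slope, intercept) of col = slope * row + intercept,
    computed by least squares. A minimal sketch, not the patent's
    exact fitting procedure.
    """
    pts = np.asarray(points, dtype=float)
    rows, cols = pts[:, 0], pts[:, 1]
    # np.polyfit with deg=1 returns [slope, intercept]
    slope, intercept = np.polyfit(rows, cols, 1)
    return slope, intercept

# Example: pixels lying exactly on col = 2*row + 1
pts = [(0, 1), (1, 3), (2, 5), (3, 7)]
slope, intercept = fit_boundary_line(pts)
```

Each fitted line then contributes one edge of the stored network graph.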
Step 3: manually select the farmland blocks to be sown and virtually divide each individual block into a number of small squares; each small square is called a second image, and its width is the sowing width of the unmanned aerial vehicle, which is adjustable;
Step 4: plan the shortest, least-repetitive unmanned aerial vehicle sowing path over the virtual small squares divided in step 3, and transmit the image of the farmland blocks to be sown and the planned path to the unmanned aerial vehicle;
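Over a complete rectangular grid of second images, the shortest non-repeating coverage path reduces to a back-and-forth (boustrophedon) sweep. The sketch below illustrates that idea under the assumption of a full rectangular grid; the patent's planner must also handle irregular field shapes, which this toy function does not.

```python
def serpentine_path(n_rows, n_cols):
    """Visit every virtual square exactly once with a back-and-forth
    sweep, returning (row, col) square indices in visiting order."""
    path = []
    for r in range(n_rows):
        # Reverse direction on every other row to avoid dead-heading
        cols = range(n_cols) if r % 2 == 0 else range(n_cols - 1, -1, -1)
        for c in cols:
            path.append((r, c))
    return path

route = serpentine_path(3, 4)
```

Every square appears once, so no leg of the path repeats an already-sown square.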
Step 5: after receiving the image of the farmland blocks to be sown and the planned path, the unmanned aerial vehicle takes off; its flight direction is preliminarily set so that it flies towards the area to be sown while acquiring the forward image in real time and matching the farmland-block image against it; once the farmland-block image is matched in the real-time image, the vehicle flies to the initial sowing position along the set sowing path and prepares to sow;
the method for matching the image of the farmland block to be sown in the image acquired in real time comprises the following steps:
Step 5.1: identify the farmland area in the real-time image, then identify the farmland boundary lines with the same method as in step 2; size-normalize the network graph formed by all the boundary lines so that each pixel represents the same physical size as a pixel of the network graph of step 2;
Step 5.2: capture the farmland boundary network in the real-time image by sliding a window over it; the captured network is called network 1, its size is C x D, and the capture step is 5-10 pixels. Count the number of nodes a in network 1; for each node in the graph library, count the number of nodes b within a surrounding C x D region; select the library nodes satisfying a = b ± 3 and cut out the C x D library region centred on each such node, which is called network 2;
Step 5.3: count the straight lines in network 1 and in each network 2, and keep only the networks 2 whose line count matches;
Step 5.4: match the shape of each network 2 kept in step 5.3 against the shape of network 1; if the shape similarity exceeds a set threshold the match is considered successful, otherwise proceed to the next candidate;
Step 5.5: after a successful match, register the real-time image to the first image according to the positions of networks 1 and 2 in their respective images, and locate the farmland blocks to be sown in the real-time image;
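The candidate filtering of steps 5.2-5.4 can be sketched as below. The network representation (node count, line count, shape feature vector) and the use of cosine similarity as the shape metric are illustrative assumptions; the patent does not specify how shape similarity is computed.

```python
import math

def match_network(net1, candidates, threshold=0.9):
    """net1 is the live-captured boundary network; candidates are
    same-size windows cut from the graph library. Each network is a
    dict with 'nodes' (int), 'lines' (int) and a 'shape' vector."""
    for net2 in candidates:
        if abs(net1['nodes'] - net2['nodes']) > 3:   # step 5.2: a = b +/- 3
            continue
        if net1['lines'] != net2['lines']:           # step 5.3: same line count
            continue
        # step 5.4: shape similarity (cosine similarity stands in here)
        dot = sum(u * v for u, v in zip(net1['shape'], net2['shape']))
        norm = (math.sqrt(sum(u * u for u in net1['shape'])) *
                math.sqrt(sum(v * v for v in net2['shape'])))
        sim = dot / norm if norm else 0.0
        if sim > threshold:
            return net2
    return None

net1 = {'nodes': 10, 'lines': 4, 'shape': [1.0, 0.0, 2.0]}
candidates = [
    {'id': 'a', 'nodes': 20, 'lines': 4, 'shape': [1.0, 0.0, 2.0]},  # fails node count
    {'id': 'b', 'nodes': 11, 'lines': 5, 'shape': [1.0, 0.0, 2.0]},  # fails line count
    {'id': 'c', 'nodes': 12, 'lines': 4, 'shape': [1.0, 0.0, 2.0]},  # passes all filters
]
best = match_network(net1, candidates)
```

The cheap node-count and line-count filters run first, so the more expensive shape comparison is only applied to the few surviving windows.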
Step 6: identify the edges of the obtained image of the farmland blocks to be sown, measure the distances from the unmanned aerial vehicle to all edges of that image, select the two nearest edges as references, and locate the vehicle's position. Once the sowing path is determined in step 4, the distance from each point on the path to its two nearest farmland-block edges is computed in advance; during actual sowing, the current distances to the two nearest edges are computed and the vehicle's in-flight deviation is corrected against the predetermined distances;
Step 7: record the farmland area already sown to prevent repeated sowing; after sowing is finished, the unmanned aerial vehicle flies back to its take-off point.
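The deviation correction of step 6 compares distances measured in flight with distances precomputed along the path. A minimal sketch, computing the point-to-edge distance as point-to-segment distance (an assumption; the patent does not state the distance formula):

```python
import math

def point_to_segment(p, a, b):
    """Distance from point p to the edge segment a-b, clamped to the
    segment ends."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def course_error(p, edge1, edge2, planned):
    """Difference between measured and precomputed distances to the two
    nearest edges; nonzero components indicate drift to correct."""
    measured = (point_to_segment(p, *edge1), point_to_segment(p, *edge2))
    return tuple(m - q for m, q in zip(measured, planned))

# A path point 5 units above the bottom edge, touching the left edge
err = course_error((0.0, 5.0),
                   ((0.0, 0.0), (10.0, 0.0)),   # bottom edge
                   ((0.0, 0.0), (0.0, 10.0)),   # left edge
                   (5.0, 0.0))                  # precomputed distances
```

A flight controller would feed the error components back as lateral corrections.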
Further, the flight control of the unmanned aerial vehicle along the sowing path specified in step 4 is segmented: each leg flies from the centre of one second image to the centre of the next. Specifically:
the calculated flight direction is:
θ = arctan((y2 - y1) / (x2 - x1))
where θ is the flight angle, x1 and y1 are the x- and y-coordinates of the plant protection unmanned aerial vehicle at the actual centre point of the n-th second image, and x2 and y2 are the x- and y-coordinates at the actual centre point of the (n+1)-th second image;
the calculated flight distance is:
L = √((x2 - x1)² + (y2 - y1)²)
where L is the actual flight distance.
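The flight direction θ and flight distance L defined above can be computed directly; atan2 is used rather than a plain arctangent so the angle is unambiguous in all quadrants (an implementation choice, not something the patent specifies):

```python
import math

def flight_command(p1, p2):
    """Heading and distance from the centre (x1, y1) of the n-th second
    image to the centre (x2, y2) of the (n+1)-th."""
    (x1, y1), (x2, y2) = p1, p2
    theta = math.atan2(y2 - y1, x2 - x1)   # flight angle, radians
    L = math.hypot(x2 - x1, y2 - y1)       # flight distance
    return theta, L

theta, L = flight_command((0.0, 0.0), (3.0, 4.0))
```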
According to this sowing control method based on a plant protection unmanned aerial vehicle, a sowing path is planned for the vehicle, its position is accurately located, and sowing follows the path; whether the current area has already been sown is determined by checking whether the pixels corresponding to the area to be sown are marked, so that the area is sown exactly once without repetition and seed is saved.
Drawings
FIG. 1 is a schematic diagram of a control method according to the present invention;
FIG. 2 is a flow chart of a control method of the present invention;
FIG. 3 is a positioning flowchart of the present invention.
Detailed Description
This embodiment provides a sowing control method based on a plant protection unmanned aerial vehicle; as shown in fig. 1, the control method includes the following steps:
s1: acquiring a first image, wherein the first image is an image of an area to be broadcast;
in step S1, before seeding the area to be seeded, it is first required to know the area of the area to be seeded and the size of the edge area, and whether the area to be seeded has been seeded before, and mark the pixel points of the seeded area in the first image to distinguish the seeded area from the unsown area.
S2: acquire a job request instruction, i.e. an instruction to perform the sowing operation on the area to be sown, which triggers the plant protection unmanned aerial vehicle to carry out the related operations.
S3: extracting feature data from the first image based on the job request instruction, the feature data being the pixel data marked as an already-sown area;
in step S3, the marked feature data specifically refers to pixels marked as having colors, and before the area to be broadcast is broadcast, it is first determined whether the area has been broadcast, in this embodiment, it is determined whether the area has been broadcast by determining the color of the pixel corresponding to the area.
The marked feature data are obtained as follows: acquire the pixels of the first image corresponding to areas where the sowing operation has been completed, and colour-mark those pixels.
S4: based on the feature data, controlling the plant protection unmanned aerial vehicle to sow the area to be sown along a job path, the job path being the path along which the sowing operation on the area is performed.
In step S4, the path of the plant protection unmanned aerial vehicle through the area to be sown is planned from the extracted feature data; during sowing, whether the pixels corresponding to each area are marked determines whether that area still needs to be sown.
The substep of S4 comprises:
acquiring the pixels of the first image corresponding to the area to be sown;
judging whether each pixel belongs to the feature data: if so, the current area is not sown; otherwise it is sown.
The specific substep of judging whether the pixel point is the characteristic data is as follows:
acquiring the LAB parameter values of the pixel;
and judging whether the LAB parameter values lie within the parameter range of the marked colour; if so, the pixel is taken as feature data.
In this embodiment an LAB colour model is used to detect whether a pixel is marked. The LAB parameter values of the pixels corresponding to the area to be sown are first acquired and compared with the parameter range of the marker colour; if a value falls within that range, the area has already been sown, otherwise it has not and the sowing operation must be performed on it. Determining whether the current area has been sown by checking whether its pixels are marked avoids repeated or missed sowing of the same area within the area to be sown and reduces seed waste.
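The LAB check described above amounts to a per-channel range test. A minimal sketch; the marker colour bounds below are illustrative assumptions, not values from the patent:

```python
def is_marked(lab_pixel, marked_range):
    """Return True when the pixel's (L, a, b) value falls inside the
    parameter range reserved for the marker colour; such a pixel means
    the corresponding area was already sown."""
    return all(lo <= v <= hi for v, (lo, hi) in zip(lab_pixel, marked_range))

# Hypothetical bounds for a saturated red marker: ((L_lo, L_hi),
# (a_lo, a_hi), (b_lo, b_hi))
MARKED = ((40, 60), (60, 90), (-10, 10))
```

In LAB the a and b channels carry the chromatic information, so a marker colour well separated from vegetation and soil tones keeps this test robust to lighting changes.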
In this embodiment, the control method further includes accurately positioning the flight path of the plant protection unmanned aerial vehicle while it performs the sowing operation; as shown in fig. 3, the specific steps are:
dividing the first image into n second images;
acquiring first information, i.e. the preset distance and direction from the centre point of the n-th second image to the centre point of the (n+1)-th second image;
calculating second information, i.e. the actual flight distance and direction between the n-th and (n+1)-th second image centre points;
the positioning method of the unmanned aerial vehicle in the flight process comprises the following steps:
the main innovation point in full-field positioning is partition positioning, the traditional visual positioning can be carried out only under the condition that the whole map can be seen in the visual field of the unmanned aerial vehicle, the operation area is divided into four areas, different boundaries are recognized in different areas, and the area capable of being positioned is enlarged by four times.
The calculation is as follows. Denote the coordinates of the leftmost, rightmost, uppermost and lowermost boundaries by G1, G2, G3 and G4, and the field-of-view centre coordinate of the unmanned aerial vehicle by [equation image not reproduced in the source]. The correction factor from camera 2D to real 2D is corr, the longitudinal widths of the upper and lower half regions are [equation images not reproduced in the source] respectively, and the transverse width of the operation area is X.
The transformed positioning coordinates (x, y) when the drone is located in the upper-left region are: [equation images not reproduced in the source].
The transformed positioning coordinates (x, y) when the drone is located in the upper-right region are: [equation images not reproduced in the source].
The transformed positioning coordinates (x, y) when the drone is located in the lower-left region are: [equation images not reproduced in the source].
The transformed positioning coordinates (x, y) when the drone is located in the lower-right region are: [equation images not reproduced in the source].
The second information is then matched against the first: if they agree, the plant protection unmanned aerial vehicle sows the (n+1)-th second image; otherwise its position is adjusted.
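Matching the second information against the first can be sketched with an explicit tolerance. The text says the two must be "the same", but exact equality of measured distances and angles is unrealistic in flight, so the tolerances below are assumed values for illustration:

```python
def needs_adjustment(preset, actual, dist_tol=0.5, ang_tol=0.05):
    """Compare the preset (distance, direction) between square centres
    with the actually flown values; outside tolerance the drone
    position is adjusted before sowing the next square.
    dist_tol in metres and ang_tol in radians are illustrative."""
    d0, a0 = preset
    d1, a1 = actual
    return abs(d1 - d0) > dist_tol or abs(a1 - a0) > ang_tol
```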
Dividing the first image into several second images divides the area to be sown into corresponding sub-areas, one per second image, as shown in fig. 2, so the area can be sown block by block. With this partitioned path operation, environmental factors such as wind direction may still cause some position deviation along the unmanned aerial vehicle's flight path.
The specific expression of the flight direction is:
θ = arctan((y2 - y1) / (x2 - x1))
where θ is the flight angle, x1 and y1 are the x- and y-coordinates of the plant protection unmanned aerial vehicle at the actual centre point of the n-th second image, and x2 and y2 are the x- and y-coordinates at the actual centre point of the (n+1)-th second image;
the specific expression of the actual flight distance is:
L = √((x2 - x1)² + (y2 - y1)²)
where L is the actual flight distance.
The control method further comprises controlling the plant protection unmanned aerial vehicle to identify the edge of the area to be sown; the specific identification method is:
acquiring a regional edge image data set;
constructing a first model and training it on the region edge image data set with a deep learning network to obtain an optimal model;
and using the optimal model for identifying the edge of the area to be broadcast.
If the colours on the two sides of the edge of the area to be sown differ obviously, the edge is identified by recognizing colour blocks; otherwise, images of the region edge are collected in advance, a corresponding data set is established, a deep learning network is built and trained on it, and the network is deployed on the unmanned aerial vehicle to identify the region edge.
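The colour-block case can be sketched as a largest-adjacent-jump search along an image row. A toy illustration under assumed single-channel input; the threshold is a hypothetical parameter, and a real system would aggregate evidence across many rows:

```python
def color_block_edge(row, diff_threshold=60):
    """Find the edge column in one image row as the position of the
    largest adjacent-pixel colour jump, provided the jump is clear
    enough; returns None when no jump exceeds the threshold."""
    best_col, best_jump = None, 0
    for c in range(len(row) - 1):
        jump = abs(row[c + 1] - row[c])
        if jump > best_jump:
            best_col, best_jump = c, jump
    return best_col if best_jump >= diff_threshold else None

# Dark field (left) meets bright road (right) between columns 2 and 3
edge = color_block_edge([30, 32, 31, 200, 198, 201])
```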
According to this sowing control method based on a plant protection unmanned aerial vehicle, whether the image pixels corresponding to the area to be sown are marked is judged: if they are marked, the corresponding area is not sown; otherwise it is sown. This realizes accurate sowing of the area to be sown and avoids missed or repeated sowing of individual areas.

Claims (2)

1. A plant protection unmanned aerial vehicle broadcast control method based on image positioning is characterized by comprising the following steps:
step 1: acquiring a first image, wherein the first image is a ground image comprising an area to be broadcast;
Step 2: recognizing the farmland region in the first image with a neural network, and partitioning the farmland according to its actual divisions in the image; the partitioning method comprises: recognizing the boundaries formed by roads, trees, ridges, ditches or man-made marks in the image; fitting each recognized boundary to a straight line, regarded as a farmland boundary; partitioning the farmland with these straight lines; and storing the network graph formed by all the straight lines, called the graph library;
Step 3: manually selecting the farmland blocks to be sown and virtually dividing each individual block into a number of small squares, each called a second image, whose width is the adjustable sowing width of the unmanned aerial vehicle;
Step 4: planning the shortest, least-repetitive unmanned aerial vehicle sowing path over the virtual small squares divided in step 3, and transmitting the image of the farmland blocks to be sown and the planned path to the unmanned aerial vehicle;
Step 5: after acquiring the image of the farmland blocks to be sown and the planned path, the unmanned aerial vehicle takes off; its flight direction is preliminarily set so that it flies towards the area to be sown while acquiring the forward image in real time and matching the farmland-block image against it; once the farmland-block image is matched in the real-time image, the vehicle flies to the initial sowing position along the set sowing path and prepares to sow;
the method for matching the image of the farmland block to be sown in the image acquired in real time comprises the following steps:
Step 5.1: identifying the farmland area in the real-time image, then identifying the farmland boundary lines with the same method as in step 2; size-normalizing the network graph formed by all the boundary lines so that each pixel represents the same physical size as a pixel of the network graph of step 2;
Step 5.2: capturing the farmland boundary network in the real-time image by sliding a window over it; the captured network is called network 1, its size is C x D, and the capture step is 5-10 pixels; counting the number of nodes a in network 1; for each node in the graph library, counting the number of nodes b within a surrounding C x D region; selecting the library nodes satisfying a = b ± 3 and cutting out the C x D library region centred on each such node, called network 2;
Step 5.3: counting the straight lines in network 1 and in each network 2, and keeping only the networks 2 whose line count matches;
Step 5.4: matching the shape of each network 2 kept in step 5.3 against the shape of network 1; if the shape similarity exceeds a set threshold the match is considered successful, otherwise the next candidate is tried;
Step 5.5: after a successful match, registering the real-time image to the first image according to the positions of networks 1 and 2 in their respective images, and determining the farmland blocks to be sown in the real-time image;
Step 6: identifying the edges of the obtained image of the farmland blocks to be sown, measuring the distances from the unmanned aerial vehicle to all edges of that image, selecting the two nearest edges as references, and locating the vehicle's position; once the sowing path is determined in step 4, the distance from each point on the path to its two nearest farmland-block edges is computed in advance; during actual sowing, the current distances to the two nearest edges are computed and the vehicle's in-flight deviation is corrected according to the predetermined distances;
Step 7: recording the farmland area already sown to prevent repeated sowing; after sowing is finished, the unmanned aerial vehicle flies back to its take-off point;
the method for recording the farmland area after sowing and preventing repeated sowing comprises the following steps: firstly, color marking is carried out on pixel points of a first image corresponding to an area after scattering operation; and during subsequent scattering, firstly, detecting whether the pixel point is marked by adopting an LAB color model, acquiring an LAB parameter value of the pixel point corresponding to a region to be scattered, judging whether the acquired LAB parameter value falls in the parameter region range of the marked color pixel point according to the parameter region range of the marked color pixel point, if so, scattering the region, otherwise, not scattering the region, and needing to perform scattering operation on the region.
2. The plant protection unmanned aerial vehicle sowing control method based on image positioning according to claim 1, wherein the flight control of the unmanned aerial vehicle along the sowing path specified in step 4 is segmented, each leg flying from the centre of one second image to the centre of the next, and specifically:
the calculated flight direction is:
θ = arctan((y2 - y1) / (x2 - x1))
where θ is the flight angle, x1 and y1 are the x- and y-coordinates of the plant protection unmanned aerial vehicle at the actual centre point of the n-th second image, and x2 and y2 are the x- and y-coordinates at the actual centre point of the (n+1)-th second image;
the calculated flight distance is:
L = √((x2 - x1)² + (y2 - y1)²)
where L is the actual flight distance.
CN202210914951.4A 2022-08-01 2022-08-01 Plant protection unmanned aerial vehicle sowing control method based on image positioning Expired - Fee Related CN114967763B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210914951.4A CN114967763B (en) 2022-08-01 2022-08-01 Plant protection unmanned aerial vehicle sowing control method based on image positioning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210914951.4A CN114967763B (en) 2022-08-01 2022-08-01 Plant protection unmanned aerial vehicle sowing control method based on image positioning

Publications (2)

Publication Number Publication Date
CN114967763A (en) 2022-08-30
CN114967763B (en) 2022-11-08

Family

ID=82969098

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210914951.4A Expired - Fee Related CN114967763B (en) 2022-08-01 2022-08-01 Plant protection unmanned aerial vehicle sowing control method based on image positioning

Country Status (1)

Country Link
CN (1) CN114967763B (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017100579A1 (en) * 2015-12-09 2017-06-15 Dronesense Llc Drone flight operations
CN109859158A (en) * 2018-11-27 2019-06-07 邦鼓思电子科技(上海)有限公司 Vision-based detection system, method, and machine equipment for working-region boundaries
CN110728745B (en) * 2019-09-17 2023-09-15 上海大学 Underwater binocular stereoscopic vision three-dimensional reconstruction method based on multilayer refraction image model

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004310525A (en) * 2003-04-08 2004-11-04 Toyota Motor Corp Vehicular image processor
JP2006288467A (en) * 2005-04-06 2006-10-26 Fuji Photo Film Co Ltd Device and method for judging irradiation field and its program
CN103581501A (en) * 2012-07-31 2014-02-12 天津书生软件技术有限公司 Color correction method
CN104615146A (en) * 2015-02-05 2015-05-13 广州快飞计算机科技有限公司 Unmanned aerial vehicle spraying operation automatic navigation method without need of external navigation signal
US9429953B1 (en) * 2015-08-25 2016-08-30 Skycatch, Inc. Autonomously landing an unmanned aerial vehicle
CN107633202A (en) * 2017-08-11 2018-01-26 合肥嘉浓航空科技有限公司 Plant protection UAV flight control method and system based on farmland image feature recognition
CN109358643A (en) * 2018-10-31 2019-02-19 阮镇荣 Multi-mode UAV pesticide spraying system and method based on image processing
CN110140704A (en) * 2019-05-17 2019-08-20 安徽舒州农业科技有限责任公司 Intelligent pesticide spraying method and system for a plant protection drone
CN110929598A (en) * 2019-11-07 2020-03-27 西安电子科技大学 Unmanned aerial vehicle-mounted SAR image matching method based on contour features
CN112434880A (en) * 2020-12-10 2021-03-02 清研灵智信息咨询(北京)有限公司 Patrol route planning and patrol personnel management system based on deep learning
CN112816939A (en) * 2020-12-31 2021-05-18 广东电网有限责任公司 Substation unmanned aerial vehicle positioning method based on Internet of things
CN113409338A (en) * 2021-06-24 2021-09-17 西安交通大学 Super-pixel method based on probability distribution

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Boundary Tracking of Continuous Objects Based on Feasible Region Search in Underwater Acoustic Sensor Networks; Li Liu; IEEE Transactions on Mobile Computing (Early Access); 20220726; full text *
Research on a UAV precision spraying control system based on image recognition; Wang Linhui et al.; Journal of South China Agricultural University; 20161024 (No. 06); full text *
Research on key technologies of vision-based UAV navigation and positioning systems; Wu Jie; China Master's Theses Full-text Database, Engineering Science & Technology II; 20200815; full text *
Research on moving-object detection algorithms based on Bayesian theory; Liu Xiaochen; China Master's Theses Full-text Database, Information Science & Technology; 20120715; full text *
Research on UAV image positioning optimization technology; Xue Wu; China Doctoral and Master's Theses Full-text Database, Basic Sciences; 20190115; full text *
Research on key technologies of UAV remote-sensing image mosaicking; Jia Yinjiang; China Master's Theses Full-text Database, Engineering Science & Technology II; 20160815; full text *

Also Published As

Publication number Publication date
CN114967763A (en) 2022-08-30

Similar Documents

Publication Publication Date Title
CN107148633B (en) Method for agronomic and agricultural monitoring using unmanned aerial vehicle system
WO2019179270A1 (en) Plant planting data measuring method, working route planning method, device and system
EP3503025B1 (en) Utilizing artificial intelligence with captured images to detect agricultural failure
CN104615146B (en) Unmanned aerial vehicle spraying operation automatic navigation method without need of external navigation signal
EP3009794B1 (en) Redundant determining of positional information for an automatic landing system
CN110282135B (en) Accurate pesticide spraying system and method for plant protection unmanned aerial vehicle
US10417753B2 (en) Location identifying device, location identifying method, and program therefor
CN110852282B (en) Farmland disease monitoring system based on machine vision
US11087132B2 (en) Systems and methods for mapping emerged plants
EP3449286B1 (en) Method for detecting agricultural field work performed by a vehicle
US11908074B2 (en) Method of identifying and displaying areas of lodged crops
CN113325872A (en) Plant inspection method, device and system and aircraft
CN113610040B (en) Paddy field weed density real-time statistical method based on improved BiSeNetV2 segmentation network
CN115761535B (en) Soil quality data analysis method and system
CN112612291A (en) Air route planning method and device for unmanned aerial vehicle for oil field surveying and mapping
CN114967763B (en) Plant protection unmanned aerial vehicle sowing control method based on image positioning
CN108007437B (en) Method for measuring farmland boundary and internal obstacles based on multi-rotor aircraft
CN115689795A (en) Hillside orchard crop growth analysis method and system based on unmanned aerial vehicle remote sensing
CN117389310B (en) Agricultural unmanned aerial vehicle spraying operation control system
CN115619286B (en) Method and system for evaluating sample plot quality in breeding field districts
CN115861851A (en) Corn emergence rate calculation method based on unmanned aerial vehicle remote sensing image deep learning
CN115407799B (en) Flight control system for vertical take-off and landing aircraft
CN113807128B (en) Seedling shortage marking method and device, computer equipment and storage medium
JP2019046401A (en) Land leveling work object area identification system
CN114485612B (en) Route generation method and device, unmanned operation vehicle, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20221108