
CN114967763B - Plant protection unmanned aerial vehicle sowing control method based on image positioning - Google Patents

Plant protection unmanned aerial vehicle sowing control method based on image positioning

Info

Publication number
CN114967763B
CN114967763B (application CN202210914951.4A)
Authority
CN
China
Prior art keywords
image
farmland
unmanned aerial vehicle
sowing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN202210914951.4A
Other languages
Chinese (zh)
Other versions
CN114967763A (en)
Inventor
蒋一民 (Jiang Yimin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN202210914951.4A priority Critical patent/CN114967763B/en
Publication of CN114967763A publication Critical patent/CN114967763A/en
Application granted granted Critical
Publication of CN114967763B publication Critical patent/CN114967763B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a sowing control method for a plant protection unmanned aerial vehicle (UAV) based on image positioning, belonging to the field of image processing. The control method comprises: acquiring a first image, the first image being an image of the area to be sown; acquiring an operation request instruction; based on the operation request instruction, extracting feature data from the first image, the feature data being the pixel data marked as an already-sown operation area; and, based on the feature data, controlling the plant protection UAV to perform the sowing operation over the area to be sown along an operation path, the operation path being the path along which the sowing operation is carried out. The beneficial effect of the invention is that, by recognizing images, locating the UAV, sowing precisely along the planned route and deciding whether to sow the current area, the area to be sown is covered once without repetition, saving seed.

Description

A Sowing Control Method for a Plant Protection UAV Based on Image Positioning

Technical Field

The present invention relates to the field of image processing and matching, and in particular to a sowing control method based on a plant protection UAV.

Background Art

A plant protection UAV is, as the name implies, an unmanned aircraft used for agricultural and forestry plant protection operations. It consists of three parts: a flight platform (fixed-wing, helicopter or multi-rotor), a navigation flight controller, and a spraying mechanism. Spraying operations are carried out via ground remote control or the navigation flight controller, and chemicals, seeds, powders and the like can be dispersed, saving labour and reducing labour intensity in agricultural production.

However, when a prior-art UAV performs a sowing operation, the sown area is usually controlled manually. When the area is large or the terrain is complex, manual control frequently leads to missed or repeated sowing, wasting seed.

In view of this, the present application is proposed.

Summary of the Invention

The technical problem to be solved by the present invention is that, in the prior art, manually controlling a UAV to sow an area easily causes missed or repeated sowing. The object is to provide a sowing control method based on a plant protection UAV that can automatically sow the target area without repetition.

The present invention is realized through the following technical solution:

A sowing control method for a plant protection UAV based on image positioning, the control method comprising:

Step 1: acquire a first image, the first image being a ground image that includes the area to be sown;

Step 2: use a neural network to recognize the farmland region in the first image, and divide the farmland into blocks according to its actual layout in the image. The blocking method is as follows: recognize the boundaries formed by roads, trees, field ridges, irrigation channels or man-made markers in the image; fit the recognized boundaries into straight lines, which are taken to be the farmland boundaries; use these lines to divide the farmland into blocks; and store the network graph composed of all the lines, which is called the graph library;
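To make the line-fitting part of step 2 concrete, the sketch below fits a least-squares straight line through the boundary pixels of one recognized feature (here a field ridge) and stores it in the graph library. The pixel coordinates, the fitting routine and the `graph_library` container are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the boundary fitting in step 2: boundary pixels of a
# recognised feature (road, ridge, channel...) are fitted to a straight line
# y = m*x + b by least squares; the collection of fitted lines forms the
# "graph library". All numbers below are invented for illustration.

def fit_boundary_line(pixels):
    """Least-squares line y = m*x + b through boundary pixels [(x, y), ...]."""
    n = len(pixels)
    sx = sum(x for x, _ in pixels)
    sy = sum(y for _, y in pixels)
    sxx = sum(x * x for x, _ in pixels)
    sxy = sum(x * y for x, y in pixels)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

graph_library = []                                # the patent's "graph library"
ridge_pixels = [(0, 1), (1, 3), (2, 5), (3, 7)]  # pixels of one detected ridge
graph_library.append(fit_boundary_line(ridge_pixels))
```

In a real pipeline the pixel lists would come from the neural-network segmentation of each boundary feature.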

Step 3: manually select the farmland blocks to be sown, and virtually divide each individual block into multiple small squares. Each small square is called a second image, and its width equals the UAV's sowing width, which is adjustable;

Step 4: plan a UAV sowing path of the shortest length and with the least repetition over the virtual squares divided in step 3, and transmit the image of the farmland block to be sown together with the planned path to the UAV;
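The patent does not name the planning algorithm of step 4; on a rectangular grid of virtual squares, a boustrophedon (serpentine) sweep is one simple path with minimal length and no repetition. The sketch below assumes such a rectangular block.

```python
# Illustrative sketch of step 4 under the assumption of a rectangular block:
# visit the virtual squares (second images) row by row, reversing direction on
# every other row, so each square is covered exactly once.

def boustrophedon_path(rows, cols):
    """Grid cells (row, col) in serpentine order, each visited exactly once."""
    path = []
    for r in range(rows):
        cols_order = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        path.extend((r, c) for c in cols_order)
    return path

path = boustrophedon_path(3, 4)   # a 3x4 block of virtual squares
```

Irregular blocks would need a decomposition into such sweeps, which the patent leaves open.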

Step 5: after receiving the image of the farmland block to be sown and the planned path, the UAV takes off. Its flight direction is set preliminarily so that it flies towards the area to be sown while acquiring images ahead in real time. The image of the farmland block to be sown is matched against the real-time images until a match is found; the UAV then flies to the initial sowing position according to the set sowing path and prepares for the sowing operation;

The method for matching the image of the farmland block to be sown in the real-time images is as follows:

Step 5.1: recognize the farmland region in the real-time image, then recognize the farmland boundaries with the same method as in step 2, and normalize the size of the network graph composed of all boundaries so that each pixel represents the same physical size as a pixel of the network graph in step 2;

Step 5.2: slide a window over the real-time image to extract the farmland boundary network, called network 1, with size C×D and a sliding step of 5 to 10 pixels. Count the number of nodes a in network 1. In the graph library, count for each node the number of nodes b within a C×D neighbourhood, select the library nodes satisfying a = b ± 3, and extract the C×D graph-library region centred on each such node, called network 2;

Step 5.3: count the number of straight lines in network 1 and in each network 2, and keep the networks 2 whose line count equals that of network 1;

Step 5.4: perform shape matching between each network 2 selected in step 5.3 and network 1. If the shape similarity exceeds a set threshold, the match is considered successful; otherwise proceed to the next candidate;

Step 5.5: after a successful match, use the positions of network 1 and network 2 in their respective images to bring the real-time image into correspondence with the first image, and then locate the farmland block to be sown in the real-time image;
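The candidate filtering of steps 5.2 and 5.3 can be sketched as follows; the node and line counts are invented for illustration, and the final shape matching of step 5.4 is not shown.

```python
# Sketch of steps 5.2-5.3: keep only the graph-library windows whose node
# count b satisfies a = b ± 3 (a being the live window's node count), then
# require an equal number of boundary lines. Counts below are invented.

def filter_candidates(a, live_lines, candidates):
    """candidates: list of (node_count_b, line_count) per library window."""
    return [
        i for i, (b, lines) in enumerate(candidates)
        if abs(a - b) <= 3 and lines == live_lines
    ]

candidates = [(10, 4), (6, 4), (8, 5), (9, 4)]
kept = filter_candidates(a=8, live_lines=4, candidates=candidates)
```

Only the windows surviving both cheap tests would be passed to the more expensive shape comparison.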

Step 6: from the acquired image of the farmland block to be sown, recognize the block's edges, measure the UAV's distance to every edge, and select the two nearest as references to locate the UAV. Since the sowing path was fixed in step 4, the distance from each path point to its two nearest block edges is computed in advance; during actual sowing, the UAV's current distances to the two nearest edges are measured, and the deviation of the flight is corrected against the pre-computed distances;

Step 7: record the farmland regions already sown to prevent repeated sowing. After sowing is finished, the UAV flies back to the take-off point.
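The deviation correction of step 6 relies on distances from the drone to the two nearest block edges. A minimal sketch, assuming the edges are given as line segments in image coordinates and that reference distances were pre-computed for the current path point:

```python
import math

# Sketch of step 6: measure the distance from the drone's image position to
# every edge of the field block, take the two nearest, and compare them with
# the pre-computed reference distances for this path point. Edge layout and
# reference values are invented for illustration.

def point_to_segment(p, a, b):
    """Shortest distance from point p to segment a-b (all (x, y) tuples)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def deviation(p, edges, reference):
    """Measured-minus-reference for the two nearest edge distances."""
    nearest = sorted(point_to_segment(p, a, b) for a, b in edges)[:2]
    return [meas - ref for meas, ref in zip(nearest, reference)]

edges = [((0, 0), (10, 0)), ((0, 0), (0, 10)),
         ((10, 0), (10, 10)), ((0, 10), (10, 10))]
err = deviation((2, 3), edges, reference=[2.0, 3.0])  # on-path: no deviation
```

Non-zero entries in `err` would drive a lateral correction of the flight.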

Further, the UAV's flight along the sowing path specified in step 4 is controlled segment by segment, flying each time from the centre of one second image to the centre of the next. The specific method is:

The flight direction is calculated as:

θ = arctan((y2 − y1) / (x2 − x1))

where θ is the flight angle, x1 and y1 are the x- and y-coordinates of the actual centre point of the n-th second image on the plant protection UAV's path, and x2 and y2 are the x- and y-coordinates of the actual centre point of the (n+1)-th second image;

The flight distance is calculated as:

L = √((x2 − x1)² + (y2 − y1)²)

where L is the actual flight distance.
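The heading and distance formulas above can be sketched in executable form; since the source renders them only as images, this is a reconstruction, with atan2 used so the heading is valid in every quadrant:

```python
import math

# Segment control between second-image centres: heading theta and distance L
# from the centre of cell n at (x1, y1) to the centre of cell n+1 at (x2, y2).
# Reconstructed from the surrounding text; the patent's figures are not shown.

def flight_command(x1, y1, x2, y2):
    theta = math.atan2(y2 - y1, x2 - x1)   # flight angle, radians
    L = math.hypot(x2 - x1, y2 - y1)       # actual flight distance
    return theta, L

theta, L = flight_command(0.0, 0.0, 3.0, 4.0)
```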

The plant-protection-UAV sowing control method provided by the embodiments of the present invention plans the UAV's sowing path, locates the UAV precisely, and sows along that path; by checking whether the pixels corresponding to the area about to be sown are marked, it decides whether to sow the current area, achieving one-pass sowing of the target area without repetition and saving seed.

Brief Description of the Drawings

Fig. 1 is a schematic diagram of the control method of the present invention;

Fig. 2 is a flowchart of the control method of the present invention;

Fig. 3 is a positioning flowchart of the present invention.

Detailed Description of Embodiments

This embodiment provides a sowing control method based on a plant protection UAV. As shown in Fig. 1, the steps of the control method include:

S1: acquire a first image, the first image being an image of the area to be sown;

In step S1, before sowing the area, the area's size and the extent of its edge region must be known, as well as whether the area has been sown before; the pixels of already-sown regions are marked in the first image to distinguish sown from unsown regions.

S2: acquire an operation request instruction. The operation request instruction here refers to the instruction to perform the sowing operation over the area to be sown, which triggers the plant protection UAV to carry out the related operations.

S3: based on the operation request instruction, extract feature data from the first image, the feature data being the pixel data marked as an already-sown operation area;

In step S3, the marked feature data specifically means pixels marked with a colour. Before sowing an area, it must first be determined whether the area has already been sown. In this embodiment, this is judged from the colour of the pixels corresponding to the area; therefore, before the judgment, the colour-marked pixels of the first image are extracted, and the extracted data serve as the basis for the plant protection UAV to decide whether to sow the corresponding region.

The marked feature data are obtained as follows: obtain the pixels of the first image corresponding to the region after the sowing operation, and mark those pixels with a colour.

S4: based on the feature data, control the plant protection UAV to sow the area to be sown along an operation path, the operation path being the path along which the sowing operation is performed.

In step S4, based on the extracted feature data, the UAV's path over the area to be sown is further planned; during the sowing operation, it is also checked whether the pixels corresponding to each region are marked, so as to decide whether that region needs to be sown.

The sub-steps of S4 include:

obtaining the pixels of the first image corresponding to the area to be sown;

judging whether each pixel is feature data; if so, the current region is not sown; otherwise, the current region is sown.

The specific sub-steps for judging whether a pixel is feature data are:

obtaining the LAB parameter values of the pixel;

judging whether the LAB parameter values lie within the parameter range of the marked colour; if so, the pixel is feature data.

In this embodiment, the LAB colour model is used to detect whether a pixel is marked. First, the LAB parameter values of the pixel corresponding to the region about to be sown are obtained; then, given the parameter range of the marked colour, it is judged whether the obtained values fall within that range. If so, the region has already been sown; otherwise, the region has not been sown and the sowing operation must be performed on it. Deciding whether to sow by checking whether a pixel is marked avoids repeated or missed sowing of the same region within the area to be sown and reduces seed waste.
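A minimal sketch of the LAB test described above, under the assumption that the marked colour occupies a simple box in LAB space; the numeric ranges are invented for illustration, not the patent's calibration.

```python
# A pixel counts as "already sown" when each of its L, A, B values lies inside
# the marked-colour range; the drone sows only where the pixel is unmarked.
# MARKED_RANGE is a hypothetical example range.

MARKED_RANGE = {"L": (40, 60), "A": (120, 140), "B": (130, 150)}

def is_marked(lab):
    """lab: (L, A, B) values of the pixel at the next spread position."""
    ranges = (MARKED_RANGE["L"], MARKED_RANGE["A"], MARKED_RANGE["B"])
    return all(lo <= v <= hi for v, (lo, hi) in zip(lab, ranges))

def should_sow(lab):
    return not is_marked(lab)

sown_pixel = (50, 130, 140)    # inside the marked range: skip
fresh_pixel = (70, 100, 110)   # outside the range: sow here
```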

In this embodiment, the control method further includes: while controlling the plant protection UAV to perform the sowing operation, precisely positioning the UAV's flight path. As shown in Fig. 3, the specific steps are:

dividing the first image into n second images;

acquiring first information, the first information being the preset distance and direction from the centre point of the n-th second image to the centre point of the (n+1)-th second image;

calculating second information, the second information being the actual flight distance and flight direction from the centre point of the n-th second image to the centre point of the (n+1)-th second image;

The positioning method of the UAV during flight is as follows:

The main innovation in full-field positioning is zoned positioning. Traditional visual positioning works only when the UAV's field of view covers the whole map. The present invention divides the operation area into four regions and recognizes different boundaries in each region, quadrupling the area over which positioning is possible.

The calculation formulas are as follows.

Denote the coordinates of the leftmost, rightmost, uppermost and lowermost boundaries as G1, G2, G3 and G4.

The coordinates of the centre of the UAV's field of view, the correction coefficient corr from camera 2-D to real-world 2-D, the longitudinal widths of the upper and lower half-regions, and the lateral width X of the operation area are used below; the centre coordinates and the two half-region widths appear only as formula images in the source.

When the UAV is located in the upper-left, upper-right, lower-left or lower-right region, the converted positioning coordinates (x, y) are given by a region-specific conversion formula combining the boundary coordinates G1 to G4, the field-of-view centre, the correction coefficient corr, the half-region widths and the lateral width X; the four formulas are rendered only as images in the source and are not recoverable here.

The second information is matched against the first information. If they are the same, the plant protection UAV performs the sowing operation on the (n+1)-th second image; otherwise, the UAV's position is adjusted.

By dividing the first image into several second images, the area to be sown is correspondingly divided into several sub-areas, each sub-area corresponding to one second image, as shown in Fig. 2; the sowing operation can thus be carried out block by block. Because wind and other environmental factors may cause positional deviation from the UAV's planned path, this embodiment divides the area to be sown into sub-areas and, through this block division, positions the UAV's flight path and real-time heading, achieving precise sowing of the target area.

The specific expression of the flight direction is:

θ = arctan((y2 − y1) / (x2 − x1))

where θ is the flight angle, x1 and y1 are the x- and y-coordinates of the actual centre point of the n-th second image, and x2 and y2 are the x- and y-coordinates of the actual centre point of the (n+1)-th second image;

The specific expression of the actual flight distance is:

L = √((x2 − x1)² + (y2 − y1)²)

where L is the actual flight distance.

The control method further includes: controlling the plant protection UAV to recognize the edges of the area to be sown. The specific recognition method is:

acquiring a data set of region-edge images;

building a first model and training it on the region-edge image data set with a deep learning network to obtain an optimal model;

using the optimal model to recognize the edges of the area to be sown.

If the colours on the two sides of the area's edge differ clearly, the edge is recognized from the colour blocks; otherwise, pictures of the region's edges are collected in advance, a corresponding data set is built, a deep learning network is trained on it, and the network is deployed on the UAV to recognize the region's edges.
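For the colour-contrast branch described above, a one-dimensional sketch: along a scan line crossing the edge, the edge position is where neighbouring pixels differ the most. The scan-line values are invented, and the deep-learning branch is not shown.

```python
# When the two sides of the field edge differ clearly in colour, the edge can
# be located as the largest adjacent-pixel jump along a scan line; a real image
# would apply this per row/column of a colour channel.

def edge_index(scanline):
    """Index of the largest adjacent-pixel jump in a 1-D intensity scan line."""
    jumps = [abs(scanline[i + 1] - scanline[i]) for i in range(len(scanline) - 1)]
    return max(range(len(jumps)), key=jumps.__getitem__)

scan = [20, 22, 21, 23, 200, 198, 202]   # dark crop | bright bare soil
idx = edge_index(scan)                    # edge between positions 3 and 4
```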

The sowing control method based on a plant protection UAV disclosed in this embodiment judges whether the image pixels corresponding to the area to be sown are marked: if marked, the corresponding region is not sown; otherwise it is sown. This achieves precise sowing of the target area and avoids missed or repeated sowing of individual regions.

Claims (2)

1. A sowing control method for a plant protection unmanned aerial vehicle based on image positioning, characterized by comprising the following steps:
step 1: acquiring a first image, wherein the first image is a ground image comprising an area to be broadcast;
step 2: recognizing the farmland region in the first image by adopting a neural network, and partitioning the farmland according to the actual division of the farmland in the image; the blocking method comprises the following steps: recognizing the boundaries of roads, trees, ridges, ditches or man-made marks in the image, fitting the recognized boundaries into straight lines which are regarded as the boundaries of the farmland, partitioning the farmland by adopting the straight lines, and storing a network graph formed by all the straight lines, wherein the network graph is called a graph library;
step 3: manually selecting the farmland blocks to be sown, and virtually dividing each individual farmland block to be sown into a plurality of small squares, wherein each small square is called a second image, and the width of each small square is the sowing width of the unmanned aerial vehicle, the sowing width being adjustable;
step 4: planning an unmanned aerial vehicle sowing path with the shortest length and least repetition according to the virtual small grids divided in step 3, and transmitting the image of the farmland block to be sown and the planned path to the unmanned aerial vehicle;
step 5: the unmanned aerial vehicle takes off after acquiring the image of the farmland block to be sown and the planned path, preliminarily sets its flight direction so as to fly towards the area to be sown, acquires the front image in real time, and matches the image of the farmland block to be sown in the image acquired in real time until a match is found; the unmanned aerial vehicle then flies to the initial sowing position according to the set sowing path to prepare for the sowing operation;
the method for matching the image of the farmland block to be sown in the image acquired in real time comprises the following steps:
step 5.1: recognizing the farmland region in the real-time image, then recognizing the boundary lines of the farmland by the same method as in step 2, and normalizing the size of the network graph formed by all the boundary lines so that each pixel represents the same size as each pixel of the network graph in step 2;
step 5.2: sliding a window over the real-time image to extract the farmland boundary network, called network 1, the size of network 1 being C×D and the sliding step being 5 to 10 pixels; counting the number a of nodes in network 1; counting, in the graph library, the number b of nodes around each node within a C×D area; selecting the nodes in the graph library for which a = b ± 3; and extracting the C×D graph-library region centred on each such node, called network 2;
step 5.3: counting the number of straight lines in the network 1 and the network 2, and selecting the network 2 with the same number of straight lines;
step 5.4: matching the shape of the network 2 selected in step 5.3 with the shape of network 1; if the shape similarity is greater than a set threshold value, the matching is considered successful; otherwise, the next matching is performed;
step 5.5: after the matching succeeds, bringing the image acquired in real time into correspondence with the first image according to the positions of network 1 and network 2 in their respective images, and then determining the farmland block to be sown in the image acquired in real time;
step 6: identifying the edge of the image of the farmland block to be sown according to the obtained image of the farmland block to be sown, measuring the distances from the unmanned aerial vehicle to all the edges of the image of the farmland block to be sown, selecting the two closest edges as references, and positioning the position of the unmanned aerial vehicle; in the step 4, the unmanned aerial vehicle sowing path is determined, the distance from each point on the path to two nearest edges of the image of the farmland block to be sown is calculated in advance, in the actual sowing process, the distance from the current unmanned aerial vehicle to the two nearest edges of the image of the farmland block to be sown is calculated, and the deviation of the unmanned aerial vehicle in the flying process is corrected according to the distance determined in advance;
step 7: recording the farmland areas which have been sown to prevent repeated sowing; after the sowing is finished, the unmanned aerial vehicle flies back to the take-off point;
the method for recording the sown farmland areas and preventing repeated sowing comprises: firstly, colour-marking the pixel points of the first image corresponding to the area after the sowing operation; during subsequent sowing, detecting with an LAB colour model whether a pixel point is marked: acquiring the LAB parameter values of the pixel point corresponding to the area to be sown, and judging, according to the parameter range of the marked-colour pixel points, whether the acquired LAB parameter values fall within that range; if so, the area has already been sown and is not sown again; otherwise, the area has not been sown and the sowing operation needs to be performed on it.
2. The sowing control method of the plant protection unmanned aerial vehicle based on image positioning as claimed in claim 1, wherein the flight control of the unmanned aerial vehicle along the sowing path specified in step 4 is segmented control, each segment flying from the centre of one second image to the centre of the next, the method comprising:
the calculated flight direction is:
θ = arctan((y2 − y1) / (x2 − x1))
where θ is the flight angle information, x1 and y1 are the x-axis and y-axis coordinates of the plant protection unmanned aerial vehicle corresponding to the actual center point of the nth second image, and x2 and y2 are the x-axis and y-axis coordinates of the plant protection unmanned aerial vehicle corresponding to the actual center point of the (n+1)th second image;
the calculated flight distance is:
L = √((x2 − x1)² + (y2 − y1)²)
and L is the actual flying distance.
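The heading and distance of claim 2 can be computed as below; a minimal Python sketch, using `atan2` in place of a bare arctangent so that the quadrant of the heading is resolved (the function name and the use of degrees are assumptions):

```python
import math

def flight_command(center_n, center_n1):
    """Heading angle θ (degrees) and distance L between consecutive second-image centers."""
    x1, y1 = center_n
    x2, y2 = center_n1
    theta = math.degrees(math.atan2(y2 - y1, x2 - x1))  # atan2 resolves the quadrant
    L = math.hypot(x2 - x1, y2 - y1)                    # Euclidean distance
    return theta, L
```

For example, flying from center `(0, 0)` to center `(3, 4)` gives a distance L of 5 and a heading of about 53.13 degrees.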
CN202210914951.4A 2022-08-01 2022-08-01 Plant protection unmanned aerial vehicle sowing control method based on image positioning Expired - Fee Related CN114967763B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210914951.4A CN114967763B (en) 2022-08-01 2022-08-01 Plant protection unmanned aerial vehicle sowing control method based on image positioning


Publications (2)

Publication Number Publication Date
CN114967763A CN114967763A (en) 2022-08-30
CN114967763B true CN114967763B (en) 2022-11-08

Family

ID=82969098

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210914951.4A Expired - Fee Related CN114967763B (en) 2022-08-01 2022-08-01 Plant protection unmanned aerial vehicle sowing control method based on image positioning

Country Status (1)

Country Link
CN (1) CN114967763B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004310525A (en) * 2003-04-08 2004-11-04 Toyota Motor Corp Image processing device for vehicles
JP2006288467A (en) * 2005-04-06 2006-10-26 Fuji Photo Film Co Ltd Device and method for judging irradiation field and its program
CN103581501A (en) * 2012-07-31 2014-02-12 天津书生软件技术有限公司 Color correction method
CN104615146A (en) * 2015-02-05 2015-05-13 广州快飞计算机科技有限公司 Unmanned aerial vehicle spraying operation automatic navigation method without need of external navigation signal
US9429953B1 (en) * 2015-08-25 2016-08-30 Skycatch, Inc. Autonomously landing an unmanned aerial vehicle
CN107633202A (en) * 2017-08-11 2018-01-26 合肥嘉浓航空科技有限公司 Plant protection unmanned aerial vehicle flight control method and system based on farmland image feature recognition
CN109358643A (en) * 2018-10-31 2019-02-19 阮镇荣 Multi-mode unmanned aerial vehicle pesticide spraying system and method based on image processing
CN110140704A (en) * 2019-05-17 2019-08-20 安徽舒州农业科技有限责任公司 Intelligent pesticide spraying method and system for a plant protection unmanned aerial vehicle
CN110929598A (en) * 2019-11-07 2020-03-27 西安电子科技大学 Contour feature-based matching method for unmanned aerial vehicle SAR images
CN112434880A (en) * 2020-12-10 2021-03-02 清研灵智信息咨询(北京)有限公司 Patrol route planning and patrol personnel management system based on deep learning
CN112816939A (en) * 2020-12-31 2021-05-18 广东电网有限责任公司 Substation unmanned aerial vehicle positioning method based on Internet of things
CN113409338A (en) * 2021-06-24 2021-09-17 西安交通大学 Super-pixel method based on probability distribution

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017100579A1 (en) * 2015-12-09 2017-06-15 Dronesense Llc Drone flight operations
CN109859158A (en) * 2018-11-27 2019-06-07 邦鼓思电子科技(上海)有限公司 Vision-based detection system and method for working region boundaries, and associated machinery equipment
CN110728745B (en) * 2019-09-17 2023-09-15 上海大学 A three-dimensional reconstruction method for underwater binocular stereo vision based on a multi-layer refraction image model

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Boundary Tracking of Continuous Objects Based on Feasible Region Search in Underwater Acoustic Sensor Networks; Li Liu; IEEE Transactions on Mobile Computing (Early Access); 2022-07-26; full text *
Research on a precision spray control system for unmanned aerial vehicles based on image recognition; Wang Linhui et al.; Journal of South China Agricultural University; 2016-10-24 (No. 06); full text *
Research on key technologies of vision-based unmanned aerial vehicle navigation and positioning systems; Wu Jie; China Master's Theses Full-text Database, Engineering Science and Technology II; 2020-08-15; full text *
Research on moving target detection algorithms based on Bayesian theory; Liu Xiaochen; China Master's Theses Full-text Database, Information Science and Technology; 2012-07-15; full text *
Research on optimization techniques for unmanned aerial vehicle image positioning; Xue Wu; China Doctoral and Master's Theses Full-text Database, Basic Sciences; 2019-01-15; full text *
Research on key technologies for unmanned aerial vehicle remote sensing image stitching; Jia Yinjiang; China Master's Theses Full-text Database, Engineering Science and Technology II; 2016-08-15; full text *


Similar Documents

Publication Publication Date Title
CN109845715B (en) Pesticide spraying control method, device, equipment and storage medium
WO2019179270A1 (en) Plant planting data measuring method, working route planning method, device and system
CN104615146B (en) Unmanned aerial vehicle spraying operation automatic navigation method without need of external navigation signal
CN113597874B (en) A weeding robot and its planning method, device and medium for weeding path
CN110456820B (en) Pesticide spraying system based on ultra-bandwidth wireless positioning and control method
WO2021051278A1 (en) Earth surface feature identification method and device, unmanned aerial vehicle, and computer readable storage medium
CN109631903A (en) Cereal handles automated driving system and its automatic Pilot method and paths planning method
CN109117811A (en) A kind of system and method based on low-altitude remote sensing measuring technique estimation urban vegetation coverage rate
CN110282135B (en) Accurate pesticide spraying system and method for plant protection unmanned aerial vehicle
US11908074B2 (en) Method of identifying and displaying areas of lodged crops
CN106643529A (en) Rapid measuring method for growing height of agricultural crops in mountainous areas based on unmanned aerial vehicle photographed image
CN110852282A (en) A farmland disease monitoring system based on machine vision
CN113325872A (en) Plant inspection method, device and system and aircraft
CN117389310B (en) Agricultural unmanned aerial vehicle sprays operation control system
CN114184175A (en) A method for constructing 3D model of complex terrain based on UAV video streaming route
CN109814551A (en) Grain processing automatic driving system, automatic driving method and automatic identification method
CN114967763B (en) Plant protection unmanned aerial vehicle sowing control method based on image positioning
CN115443845A (en) UAV-based monitoring method for tea tree disease and growth in tea garden
CN117268399B (en) Unmanned aerial vehicle colored drawing plant sowing method and system based on homodromous path planning
CN113870278A (en) Improved Mask R-CNN model-based satellite remote sensing image farmland block segmentation method
CN115619286B (en) A method and system for assessing the quality of breeding field plots
CN113807128A (en) Seedling shortage marking method and device, computer equipment and storage medium
CN117274844B (en) Rapid extraction method of field peanut emergence status using UAV remote sensing images
CN112414373A (en) Automatic interpretation method based on unmanned aerial vehicle and VR technology
CN114485612B (en) Route generation method and device, unmanned operation vehicle, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20221108