
CN108305264B - UAV precision landing method based on image processing - Google Patents

UAV precision landing method based on image processing

Info

Publication number
CN108305264B
CN108305264B
Authority
CN
China
Prior art keywords
UAV
landing marker
coordinate
camera
contour
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810610953.8A
Other languages
Chinese (zh)
Other versions
CN108305264A (en)
Inventor
高含
王伟
杜浩
肖冉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Zhongke Intelligent Science And Technology Application Research Institute
Original Assignee
Intelligence Science Technology Application Study Institute Of Institute Of Jiangsu Wisoft Softuare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intelligence Science Technology Application Study Institute Of Institute Of Jiangsu Wisoft Softuare Co Ltd
Priority to CN201810610953.8A
Publication of CN108305264A
Application granted
Publication of CN108305264B
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/13 - Edge detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 - Stereo camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Processing (AREA)

Abstract

The present invention provides a UAV precision landing method based on image processing, comprising: establishing a positional relationship model between the UAV camera and the landing marker; computing, by linear transformation, the three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker; and converting those relative three-dimensional coordinates and orientation angles into the world coordinate system by rotation and translation. The method overcomes the insufficient accuracy of GPS positioning while costing far less than RTK; it detects the landing marker's position in the image and computes the relative three-dimensional position between marker and UAV in real time, meeting practical requirements for real-time performance and accuracy, with a design that is simple, reliable, and easy to implement; and, compared with existing camera-based UAV marker-detection methods, the marker can be customized within a certain range, increasing the freedom of use.

Description

UAV precision landing method based on image processing
Technical field
The invention belongs to the technical field of UAV landing, and in particular relates to a UAV precision landing method based on image processing.
Background technology
In recent years, UAVs have gradually entered public view, but many technical problems remain, and precision landing is one of the most important.
Early researchers used conventional GPS positioning, but its accuracy only reaches around ten meters, which cannot achieve precision landing. RTK, a newer GPS technique based on carrier-phase differencing, achieves centimeter-level accuracy, but at high cost. To reduce cost, those skilled in the art have detected a ground marker with a camera, using image processing to locate the marker in the video in real time while the UAV carries a barometric altitude-hold meter; the two-dimensional marker position in the detected image corrects the yaw angle in real time, and landing is performed with the barometer and an ultrasonic radar. However, this approach does not compute the relative three-dimensional position between the marker and the UAV; it is limited to the target's two-dimensional position in the image, so the result depends on the aircraft's attitude angle and reliability is low.
In addition, the AprilTag visual positioning system is widely used for robot and UAV localization and guidance. The technique relies on a specific two-dimensional code: the code is first detected, quadrilateral approximation is performed with image processing to detect the rectangle, and the square's interior contains a specific encoding; after detection is complete, the relative three-dimensional position between the tag and the UAV is estimated, thereby realizing localization and navigation. The drawback of this method is that only the specific AprilTag markers can be detected; they cannot be customized.
Summary of the invention
In order to solve the above technical problems, the present invention provides a UAV precision landing method based on image processing. The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key or critical elements or to delineate the scope of protection of these embodiments. Its sole purpose is to present some concepts in a simple form as a prelude to the detailed description that follows.
The present invention adopts the following technical scheme that:
In some optional embodiments, a UAV precision landing method based on image processing is provided, comprising: establishing a positional relationship model between the UAV camera and the landing marker; according to the position of the landing marker in the image and the positional relationship model, computing by linear transformation the three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker; and converting the three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker into the world coordinate system by rotation and translation.
In some optional embodiments, the method further comprises, beforehand: using GPS navigation to bring the UAV near the landing marker and reading the UAV camera; and obtaining the position of the landing marker in the image.
In some optional embodiments, the UAV precision landing method based on image processing further comprises: the flight control center sending adjustment commands to the UAV according to the acquired altitude data and the real-time three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker, and landing using the acquired coordinates.
In some optional embodiments, the process of obtaining the position of the landing marker in the image comprises:
converting the color image captured by the UAV camera to grayscale, binarizing it, and completing edge detection with the Canny edge-detection algorithm;
analyzing the edge map, recording each contour's relationships with its sibling contours and its parent and child contours, and sorting out the contours that meet the condition, a qualifying contour being one whose number of child contours exceeds a threshold T;
computing the bounding rectangles of the qualifying contours and their corresponding child contours, and determining from the contours' own area ratios and the parent-to-child contour ratios whether a contour belongs to the landing marker;
detecting the three points at the edge of the white region in the marker image, and distinguishing the three points by the positional relationships among them;
combining the three points and the positional relationships among them to determine their correspondence, thereby obtaining the position of the landing marker in the image.
In some optional embodiments, the process of establishing the positional relationship model between the UAV camera and the landing marker comprises:
setting the UAV camera coordinate system as XcYcZc, where Oc is the origin, i.e., the camera's optical center, and the landing marker is placed on the ground plane λ;
setting the world coordinate system as XwYwZw, with Ow the world origin and Zw perpendicular to the ground plane λ, the origin being placed at the center of the landing marker; the marker's position is fixed, so the relationship between the UAV camera coordinates and the world coordinates is:
[Xc, Yc, Zc, 1]^T = [R, t; 0, 1] · [Xw, Yw, Zw, 1]^T
where R is a 3×3 rotation matrix and t is a 3×1 translation matrix.
In some optional embodiments, the process of computing the three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker comprises:
transforming the pixel coordinates u-v into image coordinates x-y; the physical size of each pixel along the u and v axes is dx and dy respectively, and (u0, v0) is the center of the pixel plane, which gives:
u = x/dx + u0, v = y/dy + v0
according to the camera imaging (pinhole) model, x = f·Xc/Zc and y = f·Yc/Zc, i.e.:
Zc · [x, y, 1]^T = [f, 0, 0, 0; 0, f, 0, 0; 0, 0, 1, 0] · [Xc, Yc, Zc, 1]^T
where x, y are the coordinates of a point in the image coordinate system, f is the focal length of the camera, and Xc, Yc, Zc are the coordinates of the point in the camera coordinate system;
corresponding the image coordinates u-v to the world coordinates XwYwZw then gives:
Zc · [u, v, 1]^T = [f/dx, 0, u0; 0, f/dy, v0; 0, 0, 1] · [R, t] · [Xw, Yw, Zw, 1]^T
In some optional embodiments, the process of converting the three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker into the world coordinate system by rotation and translation comprises:
selecting four points A, B, C, D on the landing marker, detecting the positions of the four points in the image, and computing the rotation matrix and translation matrix of the UAV camera;
obtaining the coordinates of the world origin in the UAV camera coordinate system;
rotating the UAV camera coordinate system until it is parallel with the world coordinate system, first by θz about the Z axis, then by θy about the Y axis, and finally by θx about the X axis, where θx, θy, θz are obtained from the rotation matrix;
using the translation matrix to make the two origins coincide, finally obtaining the three-dimensional coordinates of the UAV camera in the world coordinate system.
Advantageous effects of the present invention: the method overcomes the insufficient accuracy of GPS positioning while, compared with RTK, greatly reducing cost; it detects the landing marker's position in the image and computes the relative three-dimensional position between marker and UAV in real time, meeting practical requirements for real-time performance and accuracy, with a design that is simple, reliable, and easy to implement; and, compared with existing camera-based UAV marker-detection methods, the marker can be customized within a certain range, increasing the freedom of use.
To the accomplishment of the foregoing and related ends, one or more embodiments comprise the features particularly described below and pointed out in the claims. The following description and the annexed drawings set forth certain illustrative aspects in detail, which indicate only some of the various ways in which the principles of the embodiments may be employed. Other benefits and novel features will become apparent from the following detailed description considered in conjunction with the drawings, and the disclosed embodiments are intended to include all such aspects and their equivalents.
Description of the drawings
Fig. 1 is a schematic flowchart of a UAV precision landing method based on image processing according to the present invention;
Fig. 2 is a schematic diagram of the landing marker of the present invention;
Fig. 3 is a schematic diagram of the positional relationship model of the present invention.
Detailed description of the embodiments
The following description and drawings sufficiently illustrate specific embodiments of the invention to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes; the embodiments represent merely possible variations. Unless explicitly required, individual components and functions are optional, and the order of operations may vary. Portions and features of some embodiments may be included in or substituted for those of other embodiments. The scope of the embodiments of the invention encompasses the entire scope of the claims and all available equivalents of the claims.
As shown in Fig. 2, the landing marker is a circle-in-square shape. The outside is a square portion 1, whose region is set to black; inside the square portion 1 is a circular portion 2, whose color is set to white, mainly to give a higher contrast against the square portion 1, and the square portion 1 and circular portion 2 in Fig. 2 are concentric. Within the circular portion 2 there are three rectangular corner points, respectively points 3, 4, and 5, arranged as an isosceles right triangle, from which the orientation of the marker image is judged. The circular portion 2 also contains an internal custom figure 6; in Fig. 2 the internal custom figure is an "H".
The ground marker in the present invention may be designed as a square ring, a circular ring, or a circle-in-square shape. The image is converted to grayscale and edge-detected; the edges are analyzed to find closed, ring-like regions, which are further qualified against the marker pattern, and qualifying regions are classified as candidates. Inside the marker, the present invention designs rectangular locating points to identify the orientation of the landing marker, so that the UAV can accurately judge the marker's orientation from any position. Arbitrary graphics may be designed inside the rectangular locating points; there is no requirement that the internal custom figure 6 be rotationally asymmetric.
As shown in Fig. 1, in some illustrative embodiments, a UAV precision landing method based on image processing is provided, comprising:
S1: use GPS navigation to bring the UAV near the landing marker.
S2: read the UAV camera.
S3: use image processing to obtain the position of the landing marker in the image, including the marker's corner points and the positions of the small rectangular frames inside the marker.
S4: judge whether the landing marker is recognized; if so, proceed to step S5, otherwise return to step S2.
S5: establish the positional relationship model between the UAV camera and the landing marker, as shown in Fig. 3. The UAV camera coordinate system is XcYcZc with origin Oc at the camera's optical center; the landing marker is placed on the ground plane λ. The world coordinate system is XwYwZw with origin Ow, Zw perpendicular to the ground plane λ and the origin placed at the center of the landing marker; the marker's position is fixed.
S6: according to the position of the landing marker in the image and the positional relationship model, compute by linear transformation the three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker.
S7: convert the three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker into the world coordinate system by rotation and translation.
S8: the flight control center judges from the acquired altitude data whether the flight height exceeds a preset value H, which can be set according to the UAV's actual flight conditions and environment; if it exceeds the preset value H, proceed to step S9, otherwise return to step S2.
S9: the flight control center sends adjustment commands to the UAV according to the acquired altitude data and the real-time three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker, performing operations such as slow descent and hovering, and lands using the acquired coordinates.
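The patent leaves the concrete adjustment commands of step S9 to the flight control center. As a minimal illustration only (the proportional control law, gains, and limits below are assumptions, not part of the disclosure), the estimated world-frame camera position could be mapped to velocity setpoints like this:

```python
def landing_command(pos_w, preset_h=0.3, kp=0.5, max_v=1.0, descent_v=0.3):
    """Map the UAV camera's world-frame position (x, y, z) in metres to
    (vx, vy, vz) velocity setpoints: steer toward the marker origin,
    descend slowly while above preset_h, hover once at or below it."""
    x, y, z = pos_w
    clip = lambda v: max(-max_v, min(max_v, v))
    vx = clip(-kp * x)                         # proportional correction toward x = 0
    vy = clip(-kp * y)                         # proportional correction toward y = 0
    vz = -descent_v if z > preset_h else 0.0   # slow descent / hover
    return vx, vy, vz
```

For example, at 5 m altitude and 2 m lateral offset the sketch commands a clipped lateral correction and a slow descent, while near the ground it hovers.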
In some illustrative embodiments, step S3 specifically comprises:
edge detection is performed on the image captured by the UAV camera: the color image is converted to grayscale, binarized, and edge detection is completed with the Canny edge-detection algorithm;
the edge map is analyzed, each contour's relationships with its sibling contours and its parent and child contours are recorded, and the contours that meet the condition are sorted out, a qualifying contour being one whose number of child contours exceeds a threshold T, where T is set according to the actual computational requirements;
the bounding rectangles of the qualifying contours and their corresponding child contours are computed, and whether a contour belongs to the landing marker is determined from the contours' own area ratios and the parent-to-child contour ratios;
the three points p1, p2, p3 at the edge of the white region in the marker image are detected by image processing and further judged and distinguished by the positional relationships among them, mainly the relationships of p1, p2, p3 to the vertices of the contours' bounding boxes, their own aspect ratios, and the relationships between each point's distance to the marker center and the bounding box's length and width;
after the three points are determined, they are distinguished by their mutual positional relationships: an isosceles-right-triangle model is established among the three points, with M the midpoint of the line joining p1 and p3 and p3 located at the right-angle position; by computing the distances between the points, the point whose distances to the other two are approximately equal can be quickly confirmed as p3; for p1 and p2, the positional relationship between M and p3 is analyzed, combined with the vertical relationship between p1 and p3, and the cases are discussed one by one to determine the correspondence, thereby obtaining the position of the landing marker in the image.
In some illustrative embodiments, step S5 specifically comprises:
setting the UAV camera coordinate system as XcYcZc, where Oc is the origin, i.e., the camera's optical center, and the landing marker is placed on the ground plane λ;
setting the world coordinate system as XwYwZw, with Ow the world origin and Zw perpendicular to the ground plane λ, the origin being placed at the center of the landing marker; the marker's position is fixed, so the relationship between the UAV camera coordinates and the world coordinates is:
[Xc, Yc, Zc, 1]^T = [R, t; 0, 1] · [Xw, Yw, Zw, 1]^T
where R is a 3×3 rotation matrix and t is a 3×1 translation matrix.
In some illustrative embodiments, step S6 specifically comprises:
establishing the correspondence from image coordinates to world coordinates. First, the pixel coordinates u-v are transformed into image coordinates x-y; the physical size of each pixel along the u and v axes is dx and dy respectively, and (u0, v0) is the center of the pixel plane, which gives:
u = x/dx + u0, v = y/dy + v0
According to the camera imaging (pinhole) model, x = f·Xc/Zc and y = f·Yc/Zc, i.e.:
Zc · [x, y, 1]^T = [f, 0, 0, 0; 0, f, 0, 0; 0, 0, 1, 0] · [Xc, Yc, Zc, 1]^T
where x, y are the coordinates of a point in the image coordinate system, f is the focal length of the camera, and Xc, Yc, Zc are the coordinates of the point in the camera coordinate system.
Corresponding the image coordinates u-v to the world coordinates XwYwZw then gives:
Zc · [u, v, 1]^T = [f/dx, 0, u0; 0, f/dy, v0; 0, 0, 1] · [R, t] · [Xw, Yw, Zw, 1]^T
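The chain of transforms in step S6 (world to camera via R and t, perspective projection via the focal length f, image plane to pixels via dx, dy, u0, v0) can be checked numerically. The intrinsic values below are assumed for illustration and do not come from the patent:

```python
import numpy as np

# Assumed intrinsics for illustration (not specified in the patent):
f = 0.004                  # focal length, metres
dx = dy = 2e-6             # physical pixel size, metres
u0, v0 = 320.0, 240.0      # principal point, pixels

def project(Pw, R, t):
    """World point -> pixel coordinates, following the three transforms:
    world -> camera (R, t), perspective projection (f), image -> pixels."""
    Xc, Yc, Zc = R @ Pw + t                # camera-frame coordinates
    x, y = f * Xc / Zc, f * Yc / Zc        # image-plane coordinates
    return np.array([x / dx + u0, y / dy + v0])

R = np.eye(3)                              # camera axes parallel to world axes
t = np.array([0.0, 0.0, 2.0])              # marker plane 2 m in front of the camera
```

With this setup the marker center projects to the principal point, and a 0.1 m lateral offset moves the projection by f/(dx·Zc) · 0.1 = 100 pixels.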
In some illustrative embodiments, step S7 specifically comprises:
First, four points A, B, C, D on the landing marker are selected, whose world coordinates are known. The positions of the four points in the image are detected by the image-processing operations of step S3 and, combined with the coordinate relationships obtained in step S6, the rotation matrix R and translation matrix t of the UAV camera are computed.
The specific image processing is as follows: the marker's bounding rectangle is detected first; from the points p1, p2, p3 obtained in step S3, the correspondence between the bounding-box corners and points A, B, C, D can be further determined; the two-dimensional and three-dimensional coordinates of A, B, C, D are stored in order and, combined with the camera's intrinsic and extrinsic parameters obtained at calibration, the rotation and translation matrices of the camera are computed.
Next, the coordinates of the world origin in the UAV camera coordinate system are obtained. The computed rotation matrix R contains the angle information of the camera relative to the marker, and the translation matrix t represents the distance between the two coordinate origins, from which the coordinates of the world origin in the UAV camera coordinate system can be obtained. Since the UAV camera coordinate system changes continuously as the UAV flies, the world coordinate system is chosen as the reference frame and camera coordinates are converted into world coordinates.
Finally, the UAV camera coordinate system is rotated until it is parallel with the world coordinate system: first by θz about the Z axis, then by θy about the Y axis, and finally by θx about the X axis, where θx, θy, θz are obtained from the rotation matrix. The translation matrix t is then used to make the two origins coincide, finally obtaining the three-dimensional coordinates of the UAV camera in the world coordinate system.
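A numerical sketch of this final step: with extrinsics Pc = R·Pw + t, the camera center in world coordinates is C = -R^T·t, and θz, θy, θx follow from a Z-Y-X factorization of R (valid while cos θy ≠ 0). The function name is illustrative:

```python
import numpy as np

def camera_pose_in_world(R, t):
    """Given extrinsics (Pc = R @ Pw + t), return the camera centre C in
    world coordinates (from 0 = R @ C + t) and the Euler angles
    (theta_z, theta_y, theta_x) with R = Rz(theta_z) @ Ry(theta_y) @ Rx(theta_x)."""
    C = -R.T @ t                                # camera centre in the world frame
    theta_z = np.arctan2(R[1, 0], R[0, 0])      # rotation about Z
    theta_y = -np.arcsin(R[2, 0])               # rotation about Y
    theta_x = np.arctan2(R[2, 1], R[2, 2])      # rotation about X
    return C, (theta_z, theta_y, theta_x)
```

Composing Rz·Ry·Rx from known angles and feeding the result back in recovers those angles, and the returned center satisfies R·C + t = 0.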
Those skilled in the art will further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of protection of the present disclosure.

Claims (3)

1. A UAV precision landing method based on image processing, characterized by comprising:
establishing a positional relationship model between a UAV camera and a landing marker;
according to the position of the landing marker in the image and the positional relationship model, computing by linear transformation the three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker;
converting the three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker into the world coordinate system by rotation and translation;
before the method, further comprising:
using GPS navigation to bring the UAV near the landing marker and reading the UAV camera;
obtaining the position of the landing marker in the image;
further comprising:
a flight control center sending adjustment commands to the UAV according to acquired altitude data and the real-time three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker, and landing using the acquired coordinates;
the process of obtaining the position of the landing marker in the image comprising:
converting the color image captured by the UAV camera to grayscale, binarizing it, and completing edge detection with the Canny edge-detection algorithm;
analyzing the edge map, recording each contour's relationships with its sibling contours and its parent and child contours, and sorting out the contours that meet the condition, a qualifying contour being one whose number of child contours exceeds a threshold T;
computing the bounding rectangles of the qualifying contours and their corresponding child contours, and determining from the contours' own area ratios and the parent-to-child contour ratios whether a contour belongs to the landing marker;
detecting the three points at the edge of the white region in the marker image and distinguishing the three points by the positional relationships among them;
combining the three points and the positional relationships among them to determine their correspondence, thereby obtaining the position of the landing marker in the image;
the process of establishing the positional relationship model between the UAV camera and the landing marker comprising:
setting the UAV camera coordinate system as XcYcZc, where Oc is the origin, i.e., the camera's optical center, and the landing marker is placed on the ground plane λ;
setting the world coordinate system as XwYwZw, with Ow the world origin and Zw perpendicular to the ground plane λ, the origin being placed at the center of the landing marker, and the marker's position being fixed, so that the relationship between the UAV camera coordinates and the world coordinates is:
[Xc, Yc, Zc, 1]^T = [R, t; 0, 1] · [Xw, Yw, Zw, 1]^T
wherein R is a 3×3 rotation matrix and t is a 3×1 translation matrix.
2. The UAV precision landing method based on image processing according to claim 1, characterized in that the process of computing the three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker comprises:
transforming the pixel coordinates u-v into image coordinates x-y, the physical size of each pixel along the u and v axes being dx and dy respectively, with (u0, v0) the center of the pixel plane, which gives:
u = x/dx + u0, v = y/dy + v0
according to the camera imaging (pinhole) model, x = f·Xc/Zc and y = f·Yc/Zc, i.e.:
Zc · [x, y, 1]^T = [f, 0, 0, 0; 0, f, 0, 0; 0, 0, 1, 0] · [Xc, Yc, Zc, 1]^T
wherein x, y are the coordinates of a point in the image coordinate system, f is the focal length of the camera, and Xc, Yc, Zc are the coordinates of the point in the camera coordinate system;
corresponding the image coordinates u-v to the world coordinates XwYwZw then gives:
Zc · [u, v, 1]^T = [f/dx, 0, u0; 0, f/dy, v0; 0, 0, 1] · [R, t] · [Xw, Yw, Zw, 1]^T
3. The UAV precision landing method based on image processing according to claim 2, characterized in that the process of converting the three-dimensional coordinates and orientation angles of the UAV camera relative to the landing marker into the world coordinate system by rotation and translation comprises:
selecting four points A, B, C, D on the landing marker, detecting the positions of the four points in the image, and computing the rotation matrix and translation matrix of the UAV camera;
obtaining the coordinates of the world origin in the UAV camera coordinate system;
rotating the UAV camera coordinate system until it is parallel with the world coordinate system, first by θz about the Z axis, then by θy about the Y axis, and finally by θx about the X axis, wherein θx, θy, θz are obtained from the rotation matrix;
using the translation matrix to make the two origins coincide, finally obtaining the three-dimensional coordinates of the UAV camera in the world coordinate system.
CN201810610953.8A 2018-06-14 2018-06-14 UAV precision landing method based on image processing Active CN108305264B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810610953.8A CN108305264B (en) 2018-06-14 2018-06-14 UAV precision landing method based on image processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810610953.8A CN108305264B (en) 2018-06-14 2018-06-14 UAV precision landing method based on image processing

Publications (2)

Publication Number Publication Date
CN108305264A CN108305264A (en) 2018-07-20
CN108305264B true CN108305264B (en) 2018-10-02

Family

ID=62846527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810610953.8A Active CN108305264B (en) 2018-06-14 2018-06-14 UAV precision landing method based on image processing

Country Status (1)

Country Link
CN (1) CN108305264B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109598758A * 2018-11-21 2019-04-09 三峡大学 UAV landing platform with visual positioning and method for correcting the UAV landing point
CN109523580B (en) * 2018-12-13 2021-01-26 法兰泰克重工股份有限公司 Calculation method of image acquisition module, image acquisition module and sorting system
SE1851620A1 (en) * 2018-12-20 2020-06-21 Epiroc Rock Drills Ab Method and device for determining the position of a mining and/or construction machine
CN109658461B (en) * 2018-12-24 2023-05-26 中国电子科技集团公司第二十研究所 Unmanned aerial vehicle positioning method based on cooperation two-dimensional code of virtual simulation environment
CN111982291B (en) * 2019-05-23 2022-11-04 杭州海康机器人技术有限公司 Fire point positioning method, device and system based on unmanned aerial vehicle
CN111796605A (en) * 2019-05-23 2020-10-20 北京京东尚科信息技术有限公司 Unmanned aerial vehicle landing control method, controller and unmanned aerial vehicle
CN110221625B (en) * 2019-05-27 2021-08-03 北京交通大学 Autonomous landing guiding method for precise position of unmanned aerial vehicle
CN109992006B * 2019-05-31 2019-08-16 江苏方天电力技术有限公司 Accurate recovery method and system for a power-line patrol UAV
JP7006847B2 (en) * 2019-07-04 2022-01-24 三菱電機株式会社 Mobile positioning device and mobile positioning system
CN110345937A * 2019-08-09 2019-10-18 东莞市普灵思智能电子有限公司 QR-code-based navigation pose determination method and system
CN113570668A (en) * 2020-04-28 2021-10-29 富华科精密工业(深圳)有限公司 Camera external parameter calibration method based on three-line calibration and computer device
CN112434120A (en) * 2020-11-18 2021-03-02 北京京东乾石科技有限公司 Position marking method, device and computer readable storage medium
CN112660011A (en) * 2020-12-23 2021-04-16 海南电网有限责任公司琼海供电局 Unmanned aerial vehicle intelligent inspection operation vehicle for power transmission line
CN114115233A (en) * 2021-10-26 2022-03-01 燕山大学 Unmanned aerial vehicle autonomous landing method based on unmanned ship attitude active feedback
CN113989390A (en) * 2021-11-11 2022-01-28 国网天津市电力公司 Unmanned aerial vehicle landing optimization method and device based on image recognition
CN118226870B (en) * 2024-05-20 2024-08-20 南方海洋科学与工程广东省实验室(珠海) Unmanned aerial vehicle autonomous landing method based on three-stage visual positioning

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107063261A (en) * 2017-03-29 2017-08-18 东北大学 Multi-feature information landmark detection method for precise landing of unmanned aerial vehicles

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090306840A1 (en) * 2008-04-08 2009-12-10 Blenkhorn Kevin P Vision-based automated landing system for unmanned aerial vehicles
KR20150019771A (en) * 2013-08-16 2015-02-25 한국항공우주연구원 Method and System for Landing of Unmanned Aerial Vehicle

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Vision-based Autonomous Landing of an Unmanned Aerial Vehicle; Srikanth Saripalli et al.; Proceedings of IEEE International Conference on Robotics and Automation; 2001-11-30; pp. 2250-2256 *
Research on Vision-based Landing Algorithm for Quadrotor UAVs; Luo Zhe et al.; Computer Knowledge and Technology; 2015-09-30; vol. 11, no. 25; pp. 122-124 *
Research on the Application of Computer Vision and Laser Scanning in UAV Automatic Landing; Lei Hao; Engineering and Technology Research; 2016-12-31; no. 5; pp. 16-17 *
UAV Visual Navigation Algorithms; Huang Nannan et al.; Infrared and Laser Engineering; 2016-07-31; vol. 45, no. 7; pp. 1-9 *

Also Published As

Publication number Publication date
CN108305264A (en) 2018-07-20

Similar Documents

Publication Publication Date Title
CN108305264B (en) Unmanned aerial vehicle precision landing method based on image processing
CN107194399B (en) Visual calibration method, system and unmanned aerial vehicle
CN109270953B (en) Multi-rotor unmanned aerial vehicle autonomous landing method based on concentric circle visual identification
EP3407294B1 (en) Information processing method, device, and terminal
CN106054929B (en) Automatic landing guidance method for unmanned aerial vehicle based on optical flow
Romero-Ramirez et al. Fractal markers: A new approach for long-range marker pose estimation under occlusion
CN111968128B (en) Unmanned aerial vehicle visual attitude and position resolving method based on image markers
CN110991207A (en) Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition
CN106529587B (en) Visual heading recognition method based on object detection
CN107063261B (en) Multi-feature information landmark detection method for precise landing of unmanned aerial vehicle
CN103065323B (en) Subsection space aligning method based on homography transformational matrix
Štěpán et al. Vision techniques for on-board detection, following, and mapping of moving targets
CN106017458B (en) Mobile robot combined navigation method and device
CN106197265B (en) Precision visual localization method for a space free-flight simulator
CN106774386A (en) Unmanned aerial vehicle visual navigation landing system based on multi-scale markers
US20210342620A1 (en) Geographic object detection apparatus and geographic object detection method
CN109885086A (en) Unmanned aerial vehicle vertical landing method based on multi-polygon marker guidance
CN109725645A (en) Nested unmanned aerial vehicle landing cooperative marker design and relative pose acquisition method
CN106326892A (en) Visual landing pose estimation method for a rotary-wing unmanned aerial vehicle
CN110068321B (en) Relative pose estimation method for an unmanned aerial vehicle over a fixed-point landing marker
CN110083177A (en) Vision-based landing quadrotor unmanned aerial vehicle and control method
CN109739257A (en) Landing method and system for a patrol unmanned aerial vehicle fusing satellite navigation and visual perception
CN105631852B (en) Indoor human body detection method based on depth image contour
CN105335973A (en) Visual processing method for strip steel processing production line
CN108171715A (en) An image segmentation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 213000 South end of Sany Road, Changzhou Science and Education City, Wujin District, Changzhou City, Jiangsu Province

Patentee after: Jiangsu Zhongke Intelligent Science and Technology Application Research Institute

Address before: 213164 Sany Road, kejiaocheng, Wujin District, Changzhou City, Jiangsu Province

Patentee before: INSTITUTE OF INTELLIGENT SCIENCE AND TECHNOLOGY APPLICATION RESEARCH, JIANGSU AND CHINESE ACADEMY OF SCIENCES