
CN111988524A - Unmanned aerial vehicle and camera collaborative obstacle avoidance method, server and storage medium - Google Patents


Info

Publication number
CN111988524A
CN111988524A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
camera
image
follow
Prior art date
Legal status
Pending
Application number
CN202010851105.3A
Other languages
Chinese (zh)
Inventor
朱泽锋
汪密
胡金磊
林孝斌
黎阳羊
华耀
翁东鹏
张月华
钱同海
温灵锋
Current Assignee
Qingyuan Power Supply Bureau of Guangdong Power Grid Co Ltd
Original Assignee
Qingyuan Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority date
Filing date
Publication date
Application filed by Qingyuan Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority to CN202010851105.3A
Publication of CN111988524A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106 Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/97 Determining parameters from multiple pictures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a collaborative obstacle avoidance method for an unmanned aerial vehicle and cameras, a server and a storage medium. The method comprises the following steps: allocating an original flight path of the unmanned aerial vehicle and a camera follow-up scheme according to received inspection task data, and sending the original flight path to the unmanned aerial vehicle and the camera follow-up scheme to the follow-up cameras respectively; receiving image data shot by the unmanned aerial vehicle and the follow-up cameras; judging, according to the image data, whether an obstacle exists on the original flight path of the unmanned aerial vehicle; if an obstacle exists on the original flight path, acquiring position information of the obstacle; and generating an optimized flight path for the unmanned aerial vehicle according to the position information of the obstacle and sending it to the unmanned aerial vehicle, so that the unmanned aerial vehicle flies along the optimized path. The scheme provided by the invention can obtain complete information about an obstacle by analysis, and thereby plan a safer flight route for the unmanned aerial vehicle.

Description

Unmanned aerial vehicle and camera collaborative obstacle avoidance method, server and storage medium
Technical Field
The embodiment of the invention relates to the technical field of data fusion of the Internet of things, in particular to a collaborative obstacle avoidance method for an unmanned aerial vehicle and a camera, a server and a storage medium.
Background
Using unmanned aerial vehicles for substation inspection can increase inspection density and reduce the workload of inspection personnel.
At present, unmanned aerial vehicle inspection in substations is carried out with an onboard radar or camera; surveillance video data is transmitted to the central control room during the operation, and technicians remotely monitor and control the running equipment through the transmitted video data. Because the substation contains numerous devices in a confined space, an unmanned aerial vehicle relying only on its onboard radar or camera cannot accurately judge the overall situation of an obstacle, making it difficult to guarantee camera coverage and to reasonably optimize the flight path.
Disclosure of Invention
The embodiment of the invention provides an unmanned aerial vehicle and camera collaborative obstacle avoidance method, a server and a storage medium, which can obtain complete information about an obstacle by analysis and thereby plan a safer flight route.
In a first aspect, an embodiment of the present invention provides a method for avoiding an obstacle by using an unmanned aerial vehicle and a camera in cooperation, where the method includes:
distributing an original flight path of the unmanned aerial vehicle and a camera follow-up scheme according to the received routing inspection task data, and respectively sending the original flight path of the unmanned aerial vehicle and the camera follow-up scheme to the follow-up camera;
receiving image data shot by the unmanned aerial vehicle and the follow-up camera;
judging whether an obstacle exists on the original flight path of the unmanned aerial vehicle or not according to the image data;
if the obstacle exists on the original flight path of the unmanned aerial vehicle, acquiring the position information of the obstacle;
according to the position information of the obstacle, generating an unmanned aerial vehicle flight optimization path and sending the unmanned aerial vehicle flight optimization path to the unmanned aerial vehicle, so that the unmanned aerial vehicle flies according to the unmanned aerial vehicle flight optimization path.
Optionally, after sending the original flight path to the unmanned aerial vehicle and sending the camera follow-up scheme to the follow-up cameras, the method further includes:
judging whether the number of the follow-up cameras is larger than or equal to a first preset threshold value or not;
if the number of the follow-up cameras is smaller than a first preset threshold value, the follow-up cameras are added, camera follow-up schemes are sent to the added follow-up cameras, and the total number of the follow-up cameras after the follow-up cameras are added is larger than or equal to the first preset threshold value.
Optionally, for a first image and a second image in the image data, determining whether an obstacle exists on an original flight path of the unmanned aerial vehicle according to the image data, including:
performing fusion splicing on the first image and the second image, and determining the repeated area of the first image and the second image;
and carrying out difference analysis on the repeated area and the standard area corresponding to the repeated area, and judging whether the obstacle exists on the original flight path of the unmanned aerial vehicle.
Optionally, the fusion and stitching the first image and the second image to determine the repetition region of the first image and the second image includes:
dividing the first image and the second image into a plurality of image blocks with equal size;
calculating the difference degree G_diff(l) of each image block, wherein

G_diff(l) = (1 / (H × L)) × Σ_{i=1}^{H} Σ_{j=1}^{L} |A1(i, j) − A2(i, j)| / 255

H and L are the length and width of the image block, l is the serial number of the image block, i and j are the coordinates of a pixel point in the image block, A1 represents the first image, and A2 represents the second image;
judging whether the difference degree G_diff(l) of the image block is smaller than a second preset threshold value;
if the difference degree G_diff(l) of an image block is smaller than the second preset threshold value, the image block is located in the repeated region.
Optionally, the performing of difference analysis on the repeated region and the standard region corresponding to the repeated region, and the judging of whether an obstacle exists on the original flight path of the unmanned aerial vehicle, include:
performing difference analysis on the repeated region and the standard region corresponding to the repeated region, and calculating the standard difference degree G_diff(s) between the repeated region and its corresponding standard region, wherein

G_diff(s) = (1 / (H × L)) × Σ_{i=1}^{H} Σ_{j=1}^{L} |A1(i, j) − A_s(i, j)| / 255

A_s represents the standard region corresponding to the repeated region;
judging whether the standard difference degree G_diff(s) is greater than a third preset threshold value;
if the standard difference degree G_diff(s) is greater than the third preset threshold value, determining that an obstacle exists on the original flight path of the unmanned aerial vehicle.
Optionally, the obtaining of the position information of the obstacle includes:
analyzing the positions of key points of the obstacles by using a binocular recognition algorithm;
and generating the contour line of the obstacle according to the key points of the obstacle.
Optionally, analyzing the positions of the key points of the obstacle by using a binocular recognition algorithm, including:
constructing a heterogeneous binocular camera by using the unmanned aerial vehicle and one follow-up camera, or one follow-up camera and another follow-up camera;
calculating the distance Z_c between the heterogeneous binocular camera and a key point of the obstacle, wherein

Z_c = f × L_b / d

f is the focal length of the heterogeneous binocular camera, L_b is the length of the fixed baseline between the left camera and the right camera of the heterogeneous binocular camera, and d is the parallax between the left camera and the right camera of the heterogeneous binocular camera.
Optionally, the distance between the unmanned aerial vehicle's optimized flight path and the obstacle is greater than or equal to a safety distance m, where m = 3 × v and v is the flight speed of the unmanned aerial vehicle.
In a second aspect, an embodiment of the present invention further provides a server, including: a processor for implementing the method of any of the above embodiments when executing the computer program.
In a third aspect, an embodiment of the present invention further provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the method of any one of the above embodiments.
The invention provides an unmanned aerial vehicle and camera collaborative obstacle avoidance method, a server and a storage medium. The method comprises the following steps: allocating an original flight path of the unmanned aerial vehicle and a camera follow-up scheme according to received inspection task data, and sending them to the unmanned aerial vehicle and the follow-up cameras respectively; receiving image data shot by the unmanned aerial vehicle and the follow-up cameras; judging, according to the image data, whether an obstacle exists on the original flight path; if an obstacle exists, acquiring position information of the obstacle; and generating an optimized flight path according to the position information of the obstacle and sending it to the unmanned aerial vehicle, so that the unmanned aerial vehicle flies along the optimized path. Compared with the existing inspection method in which the unmanned aerial vehicle relies only on its onboard radar, the cooperative obstacle avoidance operation of the unmanned aerial vehicle and the cameras, by acquiring and processing images of the surrounding environment, can analyze the situation of an obstacle blocking the flight path and obtain an optimal flight monitoring route, so that the unmanned aerial vehicle avoids obstacles more efficiently and flexibly, selects its flight route more intelligently, and completes the inspection task safely.
Drawings
Fig. 1 is a schematic flowchart of a method for avoiding an obstacle in cooperation between an unmanned aerial vehicle and a camera according to an embodiment;
fig. 2 is a schematic flowchart of a method for avoiding an obstacle in cooperation between an unmanned aerial vehicle and a camera according to a second embodiment;
Fig. 3 is an overlay schematic of a first image A1 and a second image A2 provided in the second embodiment;
fig. 4 is a schematic diagram of a method for recognizing a distance between key points of an obstacle by using a binocular camera according to the second embodiment;
fig. 5 is a schematic diagram of an obstacle avoidance optimized path of an unmanned aerial vehicle according to the second embodiment;
fig. 6 is a schematic structural diagram of an obstacle avoidance device in cooperation between an unmanned aerial vehicle and a camera provided in the third embodiment;
fig. 7 is a schematic structural diagram of a server according to the fifth embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
Fig. 1 is a schematic flow diagram of a method for avoiding an obstacle in cooperation between an unmanned aerial vehicle and a camera according to an embodiment, which is applicable to a case where an unmanned aerial vehicle inspection apparatus is used in a substation, and the method may be executed by a server, a computer, or the like, and the method may be implemented by hardware and/or software. The method specifically comprises the following steps:
s101, distributing an original flight path of the unmanned aerial vehicle and a camera follow-up scheme according to received routing inspection task data, and respectively sending the original flight path of the unmanned aerial vehicle and the camera follow-up scheme to the follow-up camera;
wherein, patrol and examine the task data and include: patrol and examine equipment, patrol and examine the follow-up camera of equipment, every task data of patrolling and examining have the original route of unmanned aerial vehicle flight and the camera follow-up scheme that corresponds, camera follow-up scheme and the original route of unmanned aerial vehicle flight are the one-to-one.
Because the installation position of a follow-up camera in the substation is fixed, the lens direction of the follow-up camera (namely the follow-up camera of the invention) must be continuously adjusted as the unmanned aerial vehicle moves; the follow-up scheme of each follow-up camera is therefore determined according to the flight path of the unmanned aerial vehicle.
The original flight path of the unmanned aerial vehicle is its flight path under normal inspection conditions, and the camera follow-up scheme specifies how each camera's lens rotates to track the unmanned aerial vehicle under normal inspection conditions. After the unmanned aerial vehicle and the follow-up cameras receive the inspection task data, they cooperate in linkage according to the normal inspection route and the follow-up scheme to carry out the inspection operation on the equipment.
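As a minimal illustration of this allocation step, the following sketch pairs an original flight path with a one-to-one camera follow-up scheme, one pan/tilt entry per camera per waypoint. The names InspectionTask, FlightPlan and dispatch are illustrative assumptions and do not appear in the patent; a real system would compute the pan/tilt angles from the geometry of each waypoint.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

Waypoint = Tuple[float, float, float]  # (x, y, z) position on the flight path


@dataclass
class InspectionTask:
    device_id: str
    camera_ids: List[str]  # follow-up cameras assigned to this device


@dataclass
class FlightPlan:
    waypoints: List[Waypoint]
    # camera id -> one (pan, tilt) setting per waypoint of the original path
    follow_scheme: Dict[str, List[Tuple[float, float]]]


def dispatch(task: InspectionTask, waypoints: List[Waypoint]) -> FlightPlan:
    """Build a one-to-one follow-up scheme: every follow-up camera gets
    exactly one (pan, tilt) entry per waypoint of the original flight path
    (placeholder angles here, for illustration only)."""
    scheme = {cam: [(0.0, 0.0)] * len(waypoints) for cam in task.camera_ids}
    return FlightPlan(waypoints=list(waypoints), follow_scheme=scheme)
```

The server would send `plan.waypoints` to the unmanned aerial vehicle and each camera's entry of `plan.follow_scheme` to that camera, preserving the one-to-one correspondence the text describes.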
S102, receiving image data shot by the unmanned aerial vehicle and the follow-up camera;
unmanned aerial vehicle and follow-up camera carry out linkage complex to equipment and patrol and examine the in-process and can carry out image data acquisition to equipment.
The acquired image data comprises a panoramic image of the device and close-up images of specific parts of the device. The panoramic image may capture the device from a plurality of different orientations. A close-up image of a specific part may capture each feature of that part in different orientations, so the number of captured images is determined by the number of features of the specific part (each image records the information of only one feature of the specific part). By analyzing the panoramic images of each orientation of the device and the image data of each specific part, more detailed information such as the position, shape, size, surrounding environment, working characteristics and working state of the device is obtained, ensuring that the unmanned aerial vehicle acquires complete information about the equipment and its surroundings during the inspection process.
S103, judging whether an obstacle exists on the original flight path of the unmanned aerial vehicle or not according to the image data;
the object which exists on the original flight path of the unmanned aerial vehicle and blocks the unmanned aerial vehicle from flying according to the original normal routing inspection route is defined as an obstacle. The obstacle may be a physical object such as substation equipment (e.g., a detection instrument, a switch, a transformer, a cable, a wave trap, etc.), a repair tool (e.g., a ladder, a glove, a clamp, a multimeter, etc.), a sundry (e.g., a chair, a stool, etc.), and the like.
By judging from the image data whether an obstacle exists on the original flight path, it can be determined in advance whether the flight path is suitable for the unmanned aerial vehicle and whether a crash could occur, which increases flight safety and avoids crash accidents.
If an obstacle exists on the original flight path, the unmanned aerial vehicle switches to the optimal flight path planned for it by the server and bypasses the obstacle to continue the inspection operation.
If no obstacle exists on the original flight path of the unmanned aerial vehicle, the unmanned aerial vehicle can continue to carry out flight operation according to the distributed normal routing inspection path.
S104, if the obstacle exists on the original flight path of the unmanned aerial vehicle, acquiring position information of the obstacle;
the position information of the obstacle includes: the barrier is apart from unmanned aerial vehicle, apart from equipment and barrier position coordinate. The position information of the obstacle is obtained through accurate positioning of the obstacle, accurate data are provided for follow-up path re-planning of the unmanned aerial vehicle, and safe flight of the unmanned aerial vehicle is achieved.
And S105, generating an unmanned aerial vehicle flight optimization path according to the position information of the obstacle, and sending the unmanned aerial vehicle flight optimization path to the unmanned aerial vehicle so that the unmanned aerial vehicle flies according to the unmanned aerial vehicle flight optimization path.
When an obstacle exists on the original flight path, the optimized flight path is an obstacle avoidance path planned on the basis of the original flight path in combination with the position information of the obstacle. Flying along the optimized path, the unmanned aerial vehicle can continue to complete the inspection task while avoiding the obstacle.
In the technical scheme of this embodiment, the unmanned aerial vehicle and the cameras cooperate to avoid obstacles: images of the surrounding environment and the equipment are collected, and when an obstacle blocks the flight path, the exact position of the obstacle on the original route is analyzed and a preferred inspection flight route is planned. This provides a safety guarantee for the inspection task, enabling the unmanned aerial vehicle to avoid obstacles more efficiently and flexibly, select its flight route more intelligently, and complete the inspection task safely.
Example two
Fig. 2 is a schematic flow chart of a cooperative obstacle avoidance method for an unmanned aerial vehicle and a camera provided in the second embodiment of the present invention, and as shown in fig. 2, the method includes the specific steps of:
s201, distributing an original flight path of the unmanned aerial vehicle and a camera follow-up scheme according to received polling task data, and respectively sending the original flight path of the unmanned aerial vehicle and the camera follow-up scheme to the follow-up camera.
S202, judging whether the number of the follow-up cameras is larger than or equal to a first preset threshold value.
Suppose the unmanned aerial vehicle inspection task requires inspecting N devices, each inspected device corresponding to one step s ∈ [1, N] (one device per step). When the unmanned aerial vehicle inspects a device, element targets O(t(i), n(j)) at T levels must be identified in each step, where i ∈ [1, T] and t(i) represents the corresponding level; n(j) corresponds to the number of elements on the equipment that the unmanned aerial vehicle can inspect, and j is a natural number. The target information to be detected on the inspected object is D(t(i), n(j), k(l)), where l ∈ [1, C(x)] and C(x) represents the features of the object to be detected. For the nth device, if the mth camera, where m ∈ [1, M(s)] and M(s) is the number of cameras participating in obstacle avoidance data acquisition in step s, obtains target element information OJ_m(n(j)) corresponding to detectable target situation data G_m(n(j), k(l)), then the target completeness constraint condition for step s is:
∪_{m=1}^{M(s)} G_m(n(j), k(l)) ⊇ D(t(i), n(j), k(l)), for all i ∈ [1, T] and l ∈ [1, C(x)]
if the target completeness constraint condition is met, the inspection can be normally carried out, otherwise, the inspection is not allowed.
Therefore, the first preset threshold is set to 2, so that at least two follow-up cameras are allocated and the unmanned aerial vehicle inspection task can proceed normally.
S203, if the number of the follow-up cameras is smaller than a first preset threshold value, the follow-up cameras are added, camera follow-up schemes are sent to the added follow-up cameras, and the total number of the follow-up cameras after the follow-up cameras are added is larger than or equal to the first preset threshold value.
The first preset threshold is 2, so at least two follow-up cameras are required. When the number of allocated follow-up cameras does not meet the first preset threshold, a problem report is fed back to the server that allocates the tasks, requesting an increase in the number of follow-up cameras and retransmission of the camera follow-up schemes. The unmanned aerial vehicle can start the inspection task only once the total number of follow-up cameras is greater than or equal to the first preset threshold. Of course, if the number of follow-up cameras is already greater than or equal to the first preset threshold, the following step S204 is executed directly.
By checking the number of allocated follow-up cameras, camera coverage of the unmanned aerial vehicle's flight path and the richness of the collected image data are guaranteed, so that complete information about an obstacle can be obtained by analysis and a safer flight route can be planned for the unmanned aerial vehicle.
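The camera-count check of steps S202 and S203 can be sketched as follows, using the first preset threshold of 2 stated above. The function names are illustrative assumptions, not from the patent.

```python
def cameras_sufficient(num_cameras: int, first_threshold: int = 2) -> bool:
    """True when the allocated follow-up cameras meet the first preset
    threshold, so the inspection task may start (step S202)."""
    return num_cameras >= first_threshold


def cameras_to_request(num_cameras: int, first_threshold: int = 2) -> int:
    """How many additional follow-up cameras must be requested from the
    server before the inspection can begin (step S203)."""
    return max(0, first_threshold - num_cameras)
```

With one allocated camera, `cameras_sufficient(1)` is false and `cameras_to_request(1)` reports that one more camera is needed before the task can start.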
S204, receiving image data shot by the unmanned aerial vehicle and the follow-up camera.
And S205, analyzing whether an obstacle exists on the original path by using a difference method.
To judge whether an obstacle exists on the original path, the acquired images are first processed: the first image and the second image are fused and stitched using a stitching method from pixel-level image fusion, and the repeated region of the two images is determined; then difference analysis is performed between the repeated region and the standard region corresponding to it, and whether an obstacle exists on the original flight path of the unmanned aerial vehicle is judged.
Exemplarily, Fig. 3 is an overlay schematic of the first image A1 and the second image A2 provided in the second embodiment. When the number of image acquisition devices is greater than or equal to 2, the first image A1 and the second image A2 can be acquired from the image data collected by any two follow-up cameras, and the portion A1 ∩ A2 is selected as the repeated region of the two images. The image in the repeated region A1 ∩ A2 is compared with the standard picture corresponding to the repeated region. Selecting the picture repetition region increases the accuracy of the image data and provides more accurate actual image data for the subsequent picture comparison processing.
Specifically, determining the repetition region of the first image and the repetition region of the second image, and performing a difference analysis on the repetition region and the standard region corresponding to the repetition region may include the following steps:
a. dividing the first image and the second image into a plurality of image blocks with equal size;
the image block is divided into a plurality of small units with the size H x L by the finger image, and each small unit can be an image block.
b. calculating the difference degree G_diff(l) of each image block, wherein

G_diff(l) = (1 / (H × L)) × Σ_{i=1}^{H} Σ_{j=1}^{L} |A1(i, j) − A2(i, j)| / 255

H and L are the length and width of the image block, l is the serial number of the image block, i and j are the coordinates of a pixel point in the image block, A1 represents the first image, and A2 represents the second image;
c. judging whether the difference degree G_diff(l) of the image block is smaller than a second preset threshold value;
d. if the difference degree G_diff(l) of an image block is smaller than the second preset threshold value, the image block is located in the repeated region;
wherein the second preset threshold is 5%, so the portion with G_diff(l) < 5% is the picture region repeatedly captured by the cameras.
e. performing difference analysis between the repeated region and the obstacle-free standard region corresponding to it, and calculating the standard difference degree G_diff(s) between the repeated region and its corresponding standard region, wherein

G_diff(s) = (1 / (H × L)) × Σ_{i=1}^{H} Σ_{j=1}^{L} |A1(i, j) − A_s(i, j)| / 255

A_s represents the standard region corresponding to the repeated region;
f. judging whether the standard difference degree G_diff(s) is greater than a third preset threshold value;
g. if the standard difference degree G_diff(s) is greater than the third preset threshold value, determining that an obstacle exists on the original flight path of the unmanned aerial vehicle.
The third preset threshold is 10%: when G_diff(s) > 10%, it is determined that an obstacle exists on the original flight path of the unmanned aerial vehicle.
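The block-splitting, repeated-region and difference-analysis steps (a to g) can be sketched as below. The patent's formulas appear only as images in this record, so the exact normalization (mean absolute pixel difference divided by 255, giving a fraction comparable to the 5% and 10% thresholds) is an assumption, as are the function names. Grayscale images are represented as plain lists of pixel rows.

```python
def block_difference(a1, a2, h, w):
    """Difference degree G_diff(l) per h x w image block between two
    equally sized grayscale images A1 and A2, normalized by 255 so the
    result is a fraction comparable to the 5% threshold (assumption)."""
    rows, cols = len(a1), len(a1[0])
    diffs = []
    for r in range(0, rows, h):
        for c in range(0, cols, w):
            total = sum(abs(a1[i][j] - a2[i][j])
                        for i in range(r, r + h) for j in range(c, c + w))
            diffs.append(total / (h * w) / 255.0)
    return diffs


def repeated_region_mask(diffs, second_threshold=0.05):
    """Blocks with G_diff(l) below the second preset threshold (5%)
    belong to the repeated region of the two images."""
    return [d < second_threshold for d in diffs]


def standard_difference(region, standard):
    """Standard difference degree G_diff(s) between the repeated region
    and its obstacle-free standard region, same normalization."""
    rows, cols = len(region), len(region[0])
    total = sum(abs(region[i][j] - standard[i][j])
                for i in range(rows) for j in range(cols))
    return total / (rows * cols) / 255.0


def obstacle_present(gdiff_s, third_threshold=0.10):
    """An obstacle is declared when G_diff(s) exceeds the third preset
    threshold (10%)."""
    return gdiff_s > third_threshold
```

For example, two identical frames yield G_diff(l) = 0 for every block (all repeated), while a bright foreign object in one frame pushes the affected blocks above the 5% threshold and the region's G_diff(s) above the 10% obstacle threshold.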
And S206, if the obstacle exists on the original flight path of the unmanned aerial vehicle, analyzing the position of the key point of the obstacle by using a binocular recognition algorithm.
And S207, generating a contour line of the obstacle according to the key point of the obstacle.
The points at which G_diff(s) exceeds 10% are the key points used to analyze the obstacle. For the contour line of the obstacle, the series of points with the largest 10% of G_diff(s) values serve as the principal points of the obstacle's envelope, and connecting these principal points outlines the general contour of the obstacle.
Specifically, the method for analyzing the position of the key point of the obstacle using the binocular recognition algorithm may include:
constructing a heterogeneous binocular camera by using the unmanned aerial vehicle and one follow-up camera, or one follow-up camera and another follow-up camera; calculating the distance Z_c between the heterogeneous binocular camera and a key point of the obstacle, wherein

Z_c = f × L_b / d

f is the focal length of the heterogeneous binocular camera, L_b is the length of the fixed baseline between the left camera and the right camera of the heterogeneous binocular camera, and d is the parallax between the left camera and the right camera of the heterogeneous binocular camera.
For example, Fig. 4 is a schematic diagram of a method for measuring the distance to key points of an obstacle with a binocular camera according to the second embodiment. The length of the fixed baseline between the left and right cameras is L_b, the focal length of the cameras is f, the abscissas of the projections of a point P on the left and right cameras are u_l and u_r, and the projection parallax of the obstacle on the cameras is d = u_l − u_r. Any pairwise combination of the unmanned aerial vehicle's camera and/or the cameras along the substation inspection route forms the camera pair SC1 and SC2. By binocular recognition with these virtual camera pairs, the distance between the heterogeneous binocular camera and each key point of the obstacle is calculated, giving the distance Z_c of each obstacle point. After the distances are transmitted to the server, the three-dimensional size of the obstacle can be calculated, its position coordinates analyzed, and a preferred obstacle avoidance flight path planned for the unmanned aerial vehicle in advance.
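The depth computation Z_c = f × L_b / d above is the standard pinhole stereo relation; a minimal sketch follows, with the function name an illustrative assumption, f in pixels and L_b in metres so that Z_c comes out in metres.

```python
def binocular_depth(f: float, baseline: float, u_left: float, u_right: float) -> float:
    """Distance Z_c = f * L_b / d to a key point, where the disparity is
    d = u_l - u_r, as in the heterogeneous binocular camera model
    (f: focal length in pixels, baseline: L_b in metres)."""
    d = u_left - u_right
    if d <= 0:
        raise ValueError("disparity must be positive for a point in front of both cameras")
    return f * baseline / d
```

With f = 700 px, L_b = 0.5 m and a disparity of 35 px, the key point lies 10 m from the camera pair; smaller disparities correspond to more distant points.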
S208, generating an unmanned aerial vehicle flight optimization path according to the position information of the obstacle, and sending the unmanned aerial vehicle flight optimization path to the unmanned aerial vehicle so that the unmanned aerial vehicle flies according to the unmanned aerial vehicle flight optimization path.
The distance between the unmanned aerial vehicle flight optimized path and the obstacle is greater than or equal to a safe distance m, where m = 3 × v and v is the flight speed of the unmanned aerial vehicle.
Fig. 5 is a schematic diagram of an obstacle avoidance optimized path of an unmanned aerial vehicle according to the second embodiment. After receiving original flight path 1, the unmanned aerial vehicle starts the inspection operation and flies along original flight path 1. When an obstacle is encountered, the unmanned aerial vehicle receives optimized flight path 2 sent by the server and avoids the obstacle by following optimized flight path 2; the unmanned aerial vehicle switches to optimized path 2 while its distance to the obstacle is still greater than or equal to m, and keeps a safe distance of at least m from the obstacle throughout the detour. After bypassing the obstacle, the unmanned aerial vehicle continues to complete the remaining inspection operation along the remaining original flight route.
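One simple way to realize such a detour, sketched under the assumption that the obstacle envelope can be approximated by a circle in the horizontal plane (the circular-waypoint construction and all values are our illustration, not the patent's planner):

```python
import math

def detour_waypoints(obstacle_xy, radius, v, n=8):
    """Waypoints on a circle around an obstacle, honouring the safe distance m = 3*v.

    obstacle_xy -- (x, y) centre of the obstacle's horizontal envelope
    radius      -- radius of the obstacle's envelope, in metres
    v           -- UAV flight speed (m/s); the safe distance is m = 3*v
    n           -- number of waypoints on the detour arc
    """
    m = 3.0 * v        # safe distance required by the method
    r = radius + m     # detour radius keeps >= m clearance from the surface
    cx, cy = obstacle_xy
    return [(cx + r * math.cos(2 * math.pi * k / n),
             cy + r * math.sin(2 * math.pi * k / n)) for k in range(n)]

# Every generated waypoint clears the obstacle surface by at least m = 12 m.
wps = detour_waypoints((10.0, 0.0), radius=2.0, v=4.0, n=8)
clearances = [math.hypot(x - 10.0, y - 0.0) - 2.0 for x, y in wps]
print(all(c >= 3.0 * 4.0 - 1e-9 for c in clearances))  # True
```

Tying m to the flight speed v means a faster-moving drone is automatically given a wider berth, which matches the m = 3 × v rule above.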
In the technical scheme of this embodiment, the unmanned aerial vehicle and the cameras cooperate in the obstacle avoidance operation to acquire images of the surrounding environment and equipment. When an obstacle blocks the flight path, the accurate position of the obstacle on the original route can be analyzed and a preferred inspection flight route can be planned, providing a safety guarantee for the inspection task of the unmanned aerial vehicle, so that the unmanned aerial vehicle can avoid obstacles more efficiently and flexibly, select its flight route more intelligently, and complete the inspection task safely.
EXAMPLE III
Fig. 6 is a schematic structural diagram of an unmanned aerial vehicle and camera collaborative obstacle avoidance device provided in the third embodiment, as shown in fig. 6, including: the system comprises a patrol task module 601, an image processing module 602, an obstacle judgment module 603, an obstacle positioning module 604 and a path optimization module 605.
The inspection task module 601 is configured to allocate an original flight path of the unmanned aerial vehicle and a camera follow-up scheme according to received inspection task data, and to send the original flight path to the unmanned aerial vehicle and the camera follow-up scheme to the follow-up camera, respectively;
the image processing module 602 is configured to receive image data captured by the unmanned aerial vehicle and the slave camera;
the obstacle judgment module 603 is configured to judge whether an obstacle exists on an original flight path of the unmanned aerial vehicle according to the image data;
the obstacle positioning module 604 is used for acquiring position information of an obstacle if the obstacle exists on the original flight path of the unmanned aerial vehicle;
and a path optimization module 605, configured to generate an unmanned aerial vehicle flight optimization path according to the position information of the obstacle, and send the unmanned aerial vehicle flight optimization path to the unmanned aerial vehicle, so that the unmanned aerial vehicle flies according to the unmanned aerial vehicle flight optimization path.
According to the technical scheme of this embodiment, in the unmanned aerial vehicle and camera collaborative obstacle avoidance device, the image processing module acquires images of the surrounding environment and equipment; the obstacle judgment module accurately judges whether an obstacle exists in the flight path of the unmanned aerial vehicle; the obstacle positioning module rapidly analyzes the accurate position of an obstacle on the original route when an obstacle blocks the flight path; and the path optimization module plans a preferred inspection flight route according to the obstacle position data, providing a safety guarantee for the inspection task of the unmanned aerial vehicle, so that the unmanned aerial vehicle can avoid obstacles more efficiently and flexibly, select its flight route more intelligently, and complete the inspection task safely.
Optionally, the routing inspection task module 601 is further configured to determine whether the number of the follow-up cameras is greater than or equal to a first preset threshold; if the number of the follow-up cameras is smaller than a first preset threshold value, the follow-up cameras are added, camera follow-up schemes are sent to the added follow-up cameras, and the total number of the follow-up cameras after the follow-up cameras are added is larger than or equal to the first preset threshold value.
Optionally, the image processing module 602 is configured to perform fusion splicing on the first image and the second image and determine the repeated region of the first image and the second image; and to perform difference analysis on the repeated region and the standard region corresponding to the repeated region to judge whether an obstacle exists on the original flight path of the unmanned aerial vehicle.
Optionally, the image processing module 602 is specifically configured to divide the first image and the second image into a plurality of image blocks of equal size, and to calculate the difference degree Gdiff(l) of each image block, wherein,
Gdiff(l) = (1 / (H × L)) × Σ(i=1..H) Σ(j=1..L) |A1(i, j) − A2(i, j)|
H and L are the length and width of the image block, l is the serial number of the image block, i and j are the coordinates of a pixel point within the image block, A1 represents the first image, and A2 represents the second image. The module judges whether the difference degree Gdiff(l) of the image block is smaller than a second preset threshold; if Gdiff(l) is smaller than the second preset threshold, the image block is located in the repeated region.
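A sketch of this block comparison; the mean-absolute-difference form of Gdiff(l) is an assumption consistent with the symbols defined above, not necessarily the patent's exact formula:

```python
import numpy as np

def block_difference(a1: np.ndarray, a2: np.ndarray) -> float:
    """Difference degree Gdiff(l) of one H-by-L image block: assumed here to be
    the mean absolute difference between the block in the two images."""
    h, l = a1.shape
    return float(np.abs(a1.astype(float) - a2.astype(float)).sum() / (h * l))

def in_repeated_region(a1, a2, threshold: float) -> bool:
    """A block lies in the repeated region when Gdiff(l) < the second preset threshold."""
    return block_difference(a1, a2) < threshold

b = np.full((8, 8), 100.0)
print(block_difference(b, b))                            # 0.0 (identical blocks)
print(in_repeated_region(b, b + 30.0, threshold=10.0))   # False (mean |diff| is 30)
```

Blocks that pass this test in both images are treated as the overlap of the two views, which is what makes the later standard-region comparison meaningful.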
Optionally, the obstacle judgment module 603 is configured to perform difference analysis on the repeated region and the standard region corresponding to the repeated region, and to calculate the standard difference degree Gdiff(s) between the repeated region and the standard region corresponding to the repeated region, wherein,
Gdiff(s) = (1 / (H × L)) × Σ(i=1..H) Σ(j=1..L) |A(i, j) − As(i, j)|
A represents the repeated region and As represents the standard region corresponding to the repeated region. The module judges whether the standard difference degree Gdiff(s) is greater than a third preset threshold; if Gdiff(s) is greater than the third preset threshold, it is determined that an obstacle exists on the original flight path of the unmanned aerial vehicle.
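A sketch of this obstacle test, assuming a mean-absolute-difference form for Gdiff(s) (our reading of the symbols above, not necessarily the patent's verbatim formula):

```python
import numpy as np

def standard_difference(region: np.ndarray, standard: np.ndarray) -> float:
    """Standard difference degree Gdiff(s) between the repeated region and the
    standard region As recorded for the same location (assumed mean |diff|)."""
    h, l = region.shape
    return float(np.abs(region.astype(float) - standard.astype(float)).sum() / (h * l))

def obstacle_present(region, standard, third_threshold: float) -> bool:
    """An obstacle is reported when Gdiff(s) exceeds the third preset threshold."""
    return standard_difference(region, standard) > third_threshold

standard = np.zeros((16, 16))          # reference view with nothing on the path
scene = standard.copy()
scene[4:12, 4:12] = 200.0              # something new occludes part of the view
print(obstacle_present(scene, standard, third_threshold=20.0))  # True
```

The second and third thresholds pull in opposite directions: a small Gdiff(l) accepts a block as overlap, while a large Gdiff(s) against the stored standard flags an intruding obstacle.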
Optionally, the obstacle positioning module 604 is configured to analyze positions of key points of the obstacle by using a binocular recognition algorithm; and generating the contour line of the obstacle according to the key points of the obstacle.
Optionally, the obstacle positioning module 604 is specifically configured to construct a heterogeneous binocular camera by using the unmanned aerial vehicle and one follow-up camera, or one follow-up camera and another follow-up camera, and to calculate the distance Zc between the heterogeneous binocular camera and a key point of the obstacle, wherein,
Zc = f × Lb / d
f is the focal length of the heterogeneous binocular camera, Lb is the length of the fixed baseline between the left camera and the right camera of the heterogeneous binocular camera, and d is the parallax between the left camera and the right camera of the heterogeneous binocular camera.
Optionally, the path optimization module 605 is further configured to keep the distance between the unmanned aerial vehicle flight optimized path and the obstacle greater than or equal to a safe distance m, where m = 3 × v and v is the flight speed of the unmanned aerial vehicle.
The unmanned aerial vehicle and camera collaborative obstacle avoidance device provided by the embodiment can execute the method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example four
Fig. 7 is a schematic structural diagram of a server according to a fourth embodiment, and as shown in fig. 7, the server includes a processor 703, a memory 701, and a communication interface 702; the number of the processors 703 in the server may be one or more, and one processor 703 is taken as an example in fig. 7; the processor 703, the memory 701 and the communication interface 702 in the server may be connected by a bus or other means, and fig. 7 illustrates the connection by the bus as an example. A bus represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The memory 701, as a computer-readable storage medium, may be configured to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the methods in the embodiments of the present invention. The processor 703 runs the software programs, instructions and modules stored in the memory 701, thereby executing the functional applications and data processing of the server and implementing the above-described method.
The memory 701 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the server, and the like. Further, the memory 701 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory 701 may include memory located remotely from the processor 703, which may be connected to a server over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Communication interface 702 may be configured to receive and transmit data.
EXAMPLE five
Embodiments of the present invention further provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method provided in any embodiment of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer-readable storage medium may be, for example but not limited to: an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. Computer-readable storage media include (a non-exhaustive list): an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an erasable programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, C++, Ruby, and Go, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the internet using an internet service provider).
It will be clear to a person skilled in the art that the term user terminal covers any suitable type of wireless user equipment, such as a mobile phone, a portable data processing device, a portable web browser or a car mounted mobile station.
In general, the various embodiments of the invention may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. For example, some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the invention is not limited thereto.
Embodiments of the invention may be implemented by a data processor of a mobile device executing computer program instructions, for example in a processor entity, or by hardware, or by a combination of software and hardware. The computer program instructions may be assembly instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source code or object code written in any combination of one or more programming languages.
Any logic flow block diagrams in the figures of the present invention may represent program steps, or may represent interconnected logic circuits, modules, and functions, or may represent a combination of program steps and logic circuits, modules, and functions. The computer program may be stored on a memory. The memory may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as, but not limited to, Read Only Memory (ROM), Random Access Memory (RAM), optical storage devices and systems (digital versatile disks, DVDs, or CD discs), etc. The computer readable medium may include a non-transitory storage medium. The data processor may be of any type suitable to the local technical environment, such as but not limited to general purpose computers, special purpose computers, microprocessors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), field programmable gate arrays (FPGAs), and processors based on a multi-core processor architecture.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. An unmanned aerial vehicle and camera collaborative obstacle avoidance method is characterized by comprising the following steps:
according to the received routing inspection task data, distributing an original flight path of the unmanned aerial vehicle and a camera follow-up scheme, and respectively sending the original flight path of the unmanned aerial vehicle and the camera follow-up scheme to the unmanned aerial vehicle and the follow-up camera;
receiving image data shot by the unmanned aerial vehicle and the follow-up camera;
judging whether an obstacle exists on the original flight path of the unmanned aerial vehicle or not according to the image data;
if the obstacle exists on the original flight path of the unmanned aerial vehicle, acquiring the position information of the obstacle;
and generating an unmanned aerial vehicle flight optimized path according to the position information of the barrier, and sending the unmanned aerial vehicle flight optimized path to the unmanned aerial vehicle so that the unmanned aerial vehicle flies according to the unmanned aerial vehicle flight optimized path.
2. The method of claim 1, further comprising, after sending the drone flight raw path to a drone and the camera follow-up scheme to a follow-up camera, respectively:
judging whether the number of the follow-up cameras is larger than or equal to a first preset threshold value or not;
if the number of the follow-up cameras is smaller than a first preset threshold value, the follow-up cameras are added, the camera follow-up schemes are sent to the added follow-up cameras, and the total number of the follow-up cameras after the follow-up cameras are added is larger than or equal to the first preset threshold value.
3. The method of claim 1, wherein for a first image and a second image in the image data, the determining whether an obstacle exists on the original flight path of the unmanned aerial vehicle according to the image data comprises:
performing fusion splicing on the first image and the second image, and determining a repeated region of the first image and the second image;
and carrying out difference analysis on the repeated area and the standard area corresponding to the repeated area, and judging whether the obstacle exists on the original flight path of the unmanned aerial vehicle.
4. The method according to claim 3, wherein the fusion splicing of the first image and the second image, and the determining of the repeated region of the first image and the second image comprises:
dividing the first image and the second image into a number of tiles of equal size;
calculating the difference degree G of the image blocksdiff(l) Wherein, in the step (A),
Gdiff(l) = (1 / (H × L)) × Σ(i=1..H) Σ(j=1..L) |A1(i, j) − A2(i, j)|
H and L are respectively the length and width of the image block, l is the serial number of the image block, i and j are respectively the coordinates of a pixel point in the image block, A1 represents the first image, and A2 represents the second image;
judging whether the difference degree Gdiff(l) of the image block is smaller than a second preset threshold;
if the difference degree Gdiff(l) of the image block is smaller than the second preset threshold, the image block is located in the repeated region.
5. The method according to claim 4, wherein the performing difference analysis on the repeating area and a standard area corresponding to the repeating area to determine whether an obstacle exists on the original flight path of the unmanned aerial vehicle comprises:
performing difference analysis on the repeated region and the standard region corresponding to the repeated region, and calculating the standard difference degree Gdiff(s) between the repeated region and the standard region corresponding to the repeated region, wherein,
Gdiff(s) = (1 / (H × L)) × Σ(i=1..H) Σ(j=1..L) |A(i, j) − As(i, j)|
A represents the repeated region, and As represents the standard region corresponding to the repeated region;
judging whether the standard difference degree Gdiff(s) is greater than a third preset threshold;
if the standard difference degree Gdiff(s) is greater than the third preset threshold, determining that an obstacle exists on the original flight path of the unmanned aerial vehicle.
6. The method of claim 1, wherein the obtaining the location information of the obstacle comprises:
analyzing the positions of key points of the obstacles by using a binocular recognition algorithm;
and generating a contour line of the obstacle according to the key points of the obstacle.
7. The method of claim 6, wherein analyzing the location of the key points of the obstacle using a binocular recognition algorithm comprises:
constructing a heterogeneous binocular camera by using the unmanned aerial vehicle and one follow-up camera or one follow-up camera and the other follow-up camera;
calculating a distance Zc between the heterogeneous binocular camera and the key point of the obstacle, wherein,
Zc = f × Lb / d
f is the focal length of the heterogeneous binocular camera, Lb is the length of the fixed baseline between the left camera and the right camera of the heterogeneous binocular camera, and d is the parallax between the left camera and the right camera of the heterogeneous binocular camera.
8. The method of claim 1, wherein the distance between the unmanned aerial vehicle flight optimized path and the obstacle is greater than or equal to a safe distance m, where m = 3 × v and v is the flight speed of the unmanned aerial vehicle.
9. A server, comprising: a processor for implementing the unmanned aerial vehicle and camera collaborative obstacle avoidance method of any of claims 1-8 when executing a computer program.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the unmanned aerial vehicle and camera collaborative obstacle avoidance method according to any one of claims 1-8.
CN202010851105.3A 2020-08-21 2020-08-21 Unmanned aerial vehicle and camera collaborative obstacle avoidance method, server and storage medium Pending CN111988524A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010851105.3A CN111988524A (en) 2020-08-21 2020-08-21 Unmanned aerial vehicle and camera collaborative obstacle avoidance method, server and storage medium


Publications (1)

Publication Number Publication Date
CN111988524A true CN111988524A (en) 2020-11-24

Family

ID=73442902

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010851105.3A Pending CN111988524A (en) 2020-08-21 2020-08-21 Unmanned aerial vehicle and camera collaborative obstacle avoidance method, server and storage medium

Country Status (1)

Country Link
CN (1) CN111988524A (en)


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070058838A1 (en) * 2005-09-13 2007-03-15 Kabushiki Kaisha Toshiba Object position detecting apparatus, map creating apparatus, autonomous mobile apparatus, object position detecting method, and computer program product for object position detection
CN102707724A (en) * 2012-06-05 2012-10-03 清华大学 Visual localization and obstacle avoidance method and system for unmanned plane
CN103984357A (en) * 2014-05-30 2014-08-13 中国人民解放军理工大学 Unmanned aerial vehicle automatic obstacle avoidance flight system based on panoramic stereo imaging device
CN107203970A (en) * 2017-06-20 2017-09-26 长沙全度影像科技有限公司 A kind of video-splicing method based on dynamic optimal suture
EP3332299A1 (en) * 2015-08-07 2018-06-13 Institut de Recherche Technologique Jules Verne Device and method for detecting obstacles suitable for a mobile robot
CN108184075A (en) * 2018-01-17 2018-06-19 百度在线网络技术(北京)有限公司 For generating the method and apparatus of image
CN110182509A (en) * 2019-05-09 2019-08-30 盐城品迅智能科技服务有限公司 A kind of track guidance van and the barrier-avoiding method of logistic storage intelligent barrier avoiding
CN110231833A (en) * 2019-06-14 2019-09-13 渤海大学 A kind of oil field inspection fixed-point data acquisition system and method based on multiple no-manned plane
CN110362117A (en) * 2019-08-19 2019-10-22 广东电网有限责任公司 A kind of unmanned plane paths planning method, equipment, unmanned aerial vehicle (UAV) control device and storage medium
CN110738088A (en) * 2018-07-18 2020-01-31 丰田自动车株式会社 Image processing apparatus
CN111476818A (en) * 2020-02-28 2020-07-31 江苏理工学院 Multi-camera low-altitude unmanned aerial vehicle target tracking device and tracking method


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112506228A (en) * 2020-12-28 2021-03-16 广东电网有限责任公司中山供电局 Substation unmanned aerial vehicle optimal emergency hedge path selection method
CN112506228B (en) * 2020-12-28 2023-11-07 广东电网有限责任公司中山供电局 Optimal emergency risk avoiding path selection method for unmanned aerial vehicle of transformer substation
CN112816939B (en) * 2020-12-31 2023-08-01 广东电网有限责任公司 Substation unmanned aerial vehicle positioning method based on Internet of things
CN112816939A (en) * 2020-12-31 2021-05-18 广东电网有限责任公司 Substation unmanned aerial vehicle positioning method based on Internet of things
CN113706691A (en) * 2021-08-24 2021-11-26 广东电网有限责任公司 Three-dimensional modeling method and device for transformer substation
CN114740901A (en) * 2022-06-13 2022-07-12 深圳联和智慧科技有限公司 Unmanned aerial vehicle cluster flight method and system and cloud platform
CN114740901B (en) * 2022-06-13 2022-08-19 深圳联和智慧科技有限公司 Unmanned aerial vehicle cluster flight method and system and cloud platform
CN115562348A (en) * 2022-11-03 2023-01-03 国网福建省电力有限公司漳州供电公司 Unmanned aerial vehicle image technology method based on transformer substation
CN115951718A (en) * 2023-03-14 2023-04-11 风脉能源(武汉)股份有限公司 Fan blade inspection local dynamic path planning method and system based on unmanned aerial vehicle
CN115951718B (en) * 2023-03-14 2023-05-09 风脉能源(武汉)股份有限公司 Unmanned aerial vehicle-based fan blade inspection local dynamic path planning method and system
CN116736879A (en) * 2023-08-16 2023-09-12 成都飞航智云科技有限公司 Unmanned aerial vehicle automatic obstacle avoidance method and obstacle avoidance system based on cloud computing
CN117170411A (en) * 2023-11-02 2023-12-05 山东环维游乐设备有限公司 Vision assistance-based auxiliary obstacle avoidance method for racing unmanned aerial vehicle
CN117170411B (en) * 2023-11-02 2024-02-02 山东环维游乐设备有限公司 Vision assistance-based auxiliary obstacle avoidance method for racing unmanned aerial vehicle


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201124