
WO2019144271A1 - Control method and device for unmanned aerial vehicle, and unmanned aerial vehicle - Google Patents

Control method and device for unmanned aerial vehicle, and unmanned aerial vehicle

Info

Publication number
WO2019144271A1
WO2019144271A1 (PCT Application No. PCT/CN2018/073803)
Authority
WO
WIPO (PCT)
Prior art keywords
target object
drone
hand
image
preset
Prior art date
Application number
PCT/CN2018/073803
Other languages
English (en)
French (fr)
Inventor
钱杰
刘政哲
庞磊
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2018/073803 priority Critical patent/WO2019144271A1/zh
Priority to CN201880001655.7A priority patent/CN109074168B/zh
Priority to CN202210589359.1A priority patent/CN114879715A/zh
Publication of WO2019144271A1 publication Critical patent/WO2019144271A1/zh
Priority to US16/934,910 priority patent/US12125229B2/en
Priority to US18/920,216 priority patent/US20250045949A1/en


Classifications

    • G06T 7/70: Image analysis; determining position or orientation of objects or cameras
    • G05D 1/0808: Control of attitude (roll, pitch, or yaw) specially adapted for aircraft
    • G05D 1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D 1/106: Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • B64C 39/024: Aircraft characterised by special use, of the remote controlled vehicle (RPV) type
    • B64D 47/08: Arrangements of cameras
    • B64U 10/14: Flying platforms with four distinct rotor axes, e.g. quadcopters
    • B64U 20/87: Mounting of imaging devices, e.g. mounting of gimbals
    • B64U 50/13: Propulsion using external fans or propellers
    • B64U 50/19: Propulsion using electrically powered motors
    • B64U 2101/30: UAVs specially adapted for imaging, photography or videography
    • B64U 2201/20: UAVs characterised by their flight controls; remote controls
    • G06V 20/13: Satellite images
    • G06V 20/17: Terrestrial scenes taken from planes or by drones
    • G06V 40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G06T 2207/10032: Satellite or aerial image; remote sensing
    • G06T 2207/30196: Subject of image: human being; person

Definitions

  • Embodiments of the invention relate to the technical field of unmanned aerial vehicles (drones), and in particular to a drone control method, a control device, and a drone.
  • Embodiments of the invention provide a drone control method, a control device, and a drone, in order to simplify how the drone is controlled.
  • In a first aspect, an embodiment of the present invention provides a method for controlling a drone, including: acquiring an image captured by a photographing device; determining the position, in the image, of the hand of a target object; determining the position information of the hand according to its position in the image; and controlling the flight of the drone according to the position information of the hand of the target object.
  • In a second aspect, an embodiment of the present invention provides a method for controlling a drone, including: when the gesture of the hand of the target object is a control gesture, controlling the drone to perform the action indicated by the gesture.
  • In a third aspect, an embodiment of the present invention provides a method for controlling a drone, including: controlling the drone to track the target object such that the target object is in the photographing screen of the photographing device.
  • In a fourth aspect, an embodiment of the present invention provides a control device for a drone, including a memory and a processor; the memory is configured to store program code, and the processor is configured to invoke the program code to control the flight of the drone according to the position information of the hand of the target object.
  • In a fifth aspect, an embodiment of the present invention provides a control device for a drone, including a memory and a processor; the memory is configured to store program code, and the processor is configured to invoke the program code to, when the gesture of the hand of the target object is a control gesture, control the drone to perform the action indicated by the gesture.
  • In a sixth aspect, an embodiment of the present invention provides a control device for a drone, including a memory and a processor; the memory is configured to store program code, and the processor is configured to invoke the program code to control the drone to track the target object such that the target object is in the photographing screen of the photographing device.
  • In another aspect, an embodiment of the present invention provides a drone, including: the control device for a drone according to at least one of the fourth aspect, the fifth aspect, and the sixth aspect.
  • In another aspect, an embodiment of the present invention provides a readable storage medium storing a computer program; when the computer program is executed, the drone control method according to at least one of the first aspect, the second aspect, and the third aspect is implemented.
  • The control method, control device, and drone provided by the embodiments of the present invention acquire an image captured by the photographing device, determine the position of the hand of the target object in the image, determine the position information of the hand according to that position, and control the flight of the drone according to the position information of the hand of the target object. The drone can therefore be flown according to the hand of the target object as captured by the photographing device. This avoids requiring the user to operate a control device to control the drone, overcomes the problem that a user unfamiliar with the control device cannot control the drone, simplifies the control mode and operation process of controlling the drone, improves the efficiency of controlling the flight of the drone, and makes human-computer interaction more entertaining.
  • FIG. 1 is a schematic architectural diagram of a drone according to an embodiment of the present invention.
  • FIG. 2 is a flowchart of a method for controlling a drone according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of a target object in an image captured by a camera device according to an embodiment of the present invention.
  • FIG. 4 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 5 is a schematic diagram of controlling the height of a drone according to an embodiment of the present invention.
  • FIG. 6 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 7 is a schematic diagram of controlling the height of a drone according to another embodiment of the present invention.
  • FIG. 8 is a schematic diagram of controlling the height of a drone according to another embodiment of the present invention.
  • FIG. 9 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 10 is a schematic diagram of controlling a drone to fly around a target object according to an embodiment of the present invention.
  • FIG. 11 is a schematic diagram of controlling a drone to fly around a target object according to another embodiment of the present invention.
  • FIG. 12 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 13 is a schematic diagram of controlling a drone to fly away from or near a target object according to an embodiment of the present invention
  • FIG. 14 is a schematic diagram of controlling a drone to fly away from or near a target object according to another embodiment of the present invention.
  • FIG. 15 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 16 is a schematic diagram of controlling take-off of a drone according to an embodiment of the present invention.
  • FIG. 17 is a schematic diagram of controlling a drone to land according to an embodiment of the present invention.
  • FIG. 18 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 19 is a schematic structural diagram of a control device for a drone according to an embodiment of the present invention.
  • FIG. 20 is a schematic structural diagram of a drone according to an embodiment of the present invention.
  • Embodiments of the present invention provide a control method and apparatus for a drone, and a drone.
  • The drone may be a rotorcraft, for example, a multi-rotor aircraft propelled through the air by a plurality of propulsion devices; embodiments of the present invention are not limited thereto.
  • FIG. 1 is a schematic architectural diagram of a drone in accordance with an embodiment of the present invention. This embodiment is described by taking a rotorcraft unmanned aerial vehicle as an example.
  • the drone 100 can include a power system 150, a flight control system 160, and a rack.
  • the rack can include a fuselage and a tripod (also known as a landing gear).
  • the fuselage may include a center frame and one or more arms coupled to the center frame, the one or more arms extending radially from the center frame.
  • the tripod is coupled to the fuselage for supporting the drone 100 when landing.
  • The power system 150 may include one or more electronic speed controllers (referred to as ESCs) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, where the motor 152 is coupled between the electronic speed controller 151 and the propeller 153, and the motor 152 and the propeller 153 are disposed on an arm of the drone 100. The electronic speed controller 151 is configured to receive a driving signal generated by the flight control system 160 and, according to the driving signal, provide a driving current to the motor 152 to control its rotational speed. The motor 152 drives the propeller to rotate, thereby powering the flight of the drone 100 and enabling the drone 100 to achieve one or more degrees of freedom of motion.
  • the drone 100 can be rotated about one or more axes of rotation.
  • The rotation axes may include a roll axis, a yaw axis, and a pitch axis.
  • the motor 152 can be a DC motor or an AC motor.
  • the motor 152 may be a brushless motor or a brushed motor.
  • Flight control system 160 may include flight controller 161 and sensing system 162.
  • the sensing system 162 is used to measure the attitude information of the drone, that is, the position information and state information of the drone 100 in space, for example, three-dimensional position, three-dimensional angle, three-dimensional speed, three-dimensional acceleration, and three-dimensional angular velocity.
  • Sensing system 162 may, for example, include at least one of a gyroscope, an ultrasonic sensor, an electronic compass, an Inertial Measurement Unit (IMU), a vision sensor, a global navigation satellite system, and a barometer.
  • the global navigation satellite system can be a Global Positioning System (GPS).
  • The flight controller 161 is used to control the flight of the drone 100; for example, the flight of the drone 100 can be controlled based on the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 may control the drone 100 according to pre-programmed instructions, or may control the drone 100 based on the images captured by the photographing device.
  • the drone 100 also includes a pan/tilt head 120, which may include a motor 122.
  • the pan/tilt is used to carry the photographing device 123.
  • The flight controller 161 can control the motion of the pan/tilt head 120 via the motor 122. Optionally, the pan/tilt head 120 may further include its own controller for controlling the motion of the pan/tilt head 120 by controlling the motor 122. The pan/tilt head 120 can be independent of the drone 100 or be a part of the drone 100.
  • the motor 122 can be a DC motor or an AC motor.
  • the motor 122 may be a brushless motor or a brushed motor.
  • the gimbal can be located at the top of the drone or at the bottom of the drone.
  • The photographing device 123 may be, for example, a device for capturing images, such as a camera or a video camera. The photographing device 123 may communicate with the flight controller and perform photographing under its control, and the flight controller may also control the drone 100 according to the images captured by the photographing device 123.
  • The photographing device 123 of this embodiment includes at least a photosensitive element, such as a Complementary Metal-Oxide-Semiconductor (CMOS) sensor or a Charge-Coupled Device (CCD) sensor. It can be understood that the photographing device 123 may also be directly fixed to the drone 100, in which case the pan/tilt head 120 can be omitted.
  • FIG. 2 is a flowchart of a method for controlling a drone according to an embodiment of the present invention. As shown in FIG. 2, the method in this embodiment may include:
  • S201: Acquire an image captured by the photographing device. The photographing device is an important component of the drone and can capture images of the drone's surroundings; the drone can be controlled according to these images. After the photographing device captures an image, the drone acquires it. The photographing device may be the photographing device 123 described above, and details are not repeated here.
  • S202: Determine the position of the hand of the target object in the image. In this embodiment the drone is controlled mainly according to the hand in the image, so the user should be located where the photographing device can capture them. The user is referred to as the target object, and the captured image contains the target object. After acquiring the image, the position of the hand of the target object in the image is determined. This position may be represented by pixel coordinates in the image coordinate system UOV: as shown in FIG. 3, a schematic diagram of a target object in a captured image, the upper-left corner of the image may be taken as the origin (0, 0), and the length and width of the image are measured in pixels, on which basis the pixel coordinates of the hand in the image can be determined. The hand of the target object can be identified by a neural network capable of recognizing hands. Specifically, the neural network can return the position of the hand of the target object in the image; in some cases it returns the coordinates of the upper-left and lower-right corners of the image region corresponding to the hand.
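  • As an illustration of this image-coordinate convention, the following minimal Python sketch converts the bounding-box corners returned by a hand detector into the pixel coordinates of the hand's center; the function name and example values are illustrative, not from the patent.

```python
def hand_pixel_position(top_left, bottom_right):
    """Center of the hand's bounding box in image (UOV) pixel coordinates.

    top_left / bottom_right: (u, v) corners returned by the hand detector,
    with the origin (0, 0) at the upper-left corner of the image.
    """
    u = (top_left[0] + bottom_right[0]) / 2.0
    v = (top_left[1] + bottom_right[1]) / 2.0
    return u, v

# Example: a detector reports the hand between pixels (620, 340) and (700, 420).
print(hand_pixel_position((620, 340), (700, 420)))  # -> (660.0, 380.0)
```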
  • When several people appear in the image, this embodiment can determine the person whose position is closest to the center of the image as the target object.
  • S203: Determine the position information of the hand of the target object according to the position of the hand in the image.
  • The position information may be represented by three-dimensional coordinates (x, y, z), for example coordinates in the navigation coordinate system of the drone, whose origin O is the take-off point of the drone.
  • The positive X-axis of the navigation coordinate system points north;
  • the positive Y-axis of the navigation coordinate system points east;
  • the Z-axis of the navigation coordinate system is perpendicular to the XOY plane and points upward, away from the ground.
  • the three-dimensional coordinates may also be coordinates under other coordinate systems, and are not specifically limited herein.
  • S204: Control the flight of the drone according to the position information of the hand of the target object. For example, the drone can be controlled to perform various flight operations: controlling the flying height of the drone, controlling the drone to fly around the target object, or controlling the drone to fly away from or close to the target object, where the flight trajectory of the drone and the function achieved can differ between flight operations.
  • Different position information of the hand of the target object can correspond to different flight operations; the target object can therefore control the drone to perform different flight operations by placing the hand in different positions. This embodiment controls the drone to execute the flight operation corresponding to the position information of the hand of the target object.
  • The control method of this embodiment thus acquires the image captured by the photographing device, determines the position of the hand of the target object in the image, determines the position information of the hand according to that position, and controls the flight of the drone according to the position information of the hand of the target object. The drone can therefore be flown according to the hand of the target object as captured by the photographing device. This avoids requiring the user to operate a control device to control the drone, overcomes the problem that a user unfamiliar with the control device cannot control the drone, simplifies the control mode and operation process of controlling the drone, improves the efficiency of controlling the flight of the drone, and makes human-computer interaction more entertaining.
  • In some embodiments, one possible implementation of the foregoing S203 is: determining the position information of the hand of the target object according to the position of the hand in the image, the attitude of the pan/tilt carrying the photographing device, the horizontal distance between the target object and the drone, and the position information of the drone.
  • The attitude of the pan/tilt carrying the photographing device determines the attitude of the photographing device, that is, its attitude angles such as the pitch angle and the yaw angle; a different pan/tilt attitude therefore yields different position information for the hand of the target object. Likewise, the horizontal distance between the target object and the drone determines the horizontal distance between the hand and the drone, which affects the position information of the hand; a different horizontal distance also yields different position information for the hand.
  • The position information of the hand is determined with reference to the position information of the drone, which can be obtained from a positioning sensor configured on the drone; the positioning sensor can be a GPS receiver or a BeiDou receiver, and in some cases an inertial measurement unit, a vision sensor, and the like may also be used.
  • This embodiment thus determines accurate position information of the hand of the target object according to the position of the hand in the image, the attitude of the pan/tilt carrying the photographing device, the horizontal distance between the target object and the drone, and the position information of the drone.
  • Specifically, the orientation of the hand relative to the drone is determined according to the position of the hand in the image and the attitude of the pan/tilt carrying the photographing device; the position information of the hand of the target object is then determined according to this orientation, the horizontal distance between the target object and the drone, and the position information of the drone.
  • The field of view (FOV) of the photographing device is known, and the angle of the hand relative to the optical axis of the photographing device can be determined according to the position of the hand in the image: if the hand is at the center of the image, its angle relative to the optical axis is 0; at other positions the horizontal angle of the hand relative to the optical axis may be, for example, 10 degrees, and the vertical direction is treated similarly. The attitude of the pan/tilt carrying the photographing device also determines the orientation of the optical axis, and the orientation of the hand relative to the drone is obtained by combining the angle of the hand relative to the optical axis with the orientation of the optical axis.
  • The horizontal distance between the hand and the drone can be obtained from the horizontal distance between the target object and the drone, for example by subtracting an empirical value (for example, 0.65 m) from the latter. Based on the orientation obtained above, the horizontal distance between the hand and the drone, and the position information of the drone, the position information of the hand can be determined.
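  • The following Python sketch ties these steps together. It is an illustration only: the linear pixel-to-angle mapping, the function name, and all parameters are assumptions, not the patent's implementation.

```python
import math

def hand_position(u, v, width, height, fov_h, fov_v,
                  gimbal_yaw, gimbal_pitch, horiz_dist, drone_pos):
    """Hedged sketch of the S203 computation described above.

    Assumes a simple linear pixel-to-angle mapping (a real system would use
    the camera intrinsics). Angles in radians; drone_pos = (x, y, z) in the
    navigation frame (X north, Y east, Z up); horiz_dist is the horizontal
    distance between the hand and the drone.
    """
    # Angle of the hand relative to the optical axis (0 at image center).
    ang_h = (u - width / 2.0) / (width / 2.0) * (fov_h / 2.0)
    ang_v = (v - height / 2.0) / (height / 2.0) * (fov_v / 2.0)
    # Combine with the gimbal attitude to get the hand's bearing and
    # elevation relative to the drone.
    bearing = gimbal_yaw + ang_h
    elevation = gimbal_pitch - ang_v   # image v grows downward
    x = drone_pos[0] + horiz_dist * math.cos(bearing)
    y = drone_pos[1] + horiz_dist * math.sin(bearing)
    z = drone_pos[2] + horiz_dist * math.tan(elevation)
    return x, y, z
```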
  • Determining the horizontal distance between the target object and the drone can be achieved as follows:
  • In one feasible way, the position of the foot of the target object in the image is determined; according to this position and the attitude of the pan/tilt carrying the photographing device, the orientation of the foot relative to the drone, and thus the angle of the foot relative to the drone in the pitch direction, is determined. The height value measured by a distance sensor configured on the drone is then obtained, and the horizontal distance between the target object and the drone is determined from this pitch angle and the measured height value.
  • In another feasible way, the positions of the foot and the head of the target object in the image are determined; according to these positions and the attitude of the pan/tilt carrying the photographing device, the orientations, and thus the pitch angles, of the foot and the head relative to the drone are determined. The height of the target object is set to an empirical value, for example 1.75 m, and the horizontal distance between the target object and the drone is determined from the two pitch angles and this set height.
  • In some embodiments, the horizontal distance between the target object and the drone is the distance obtained by fusing the horizontal distances determined by the above two feasible ways.
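  • A minimal sketch of the two distance estimates and their fusion, under the assumption that downward pitch angles are positive and that the fusion rule (which the patent does not specify) is a weighted average:

```python
import math

def dist_from_foot_angle(foot_pitch, sensor_height):
    """First way: downward pitch angle (radians) of the foot relative to the
    drone, plus the height measured by the drone's distance sensor."""
    return sensor_height / math.tan(foot_pitch)

def dist_from_body_angles(foot_pitch, head_pitch, person_height=1.75):
    """Second way: pitch angles of both the foot and the head, with the
    target's height taken as an empirical value (1.75 m). With downward
    angles positive, the foot angle exceeds the head angle."""
    return person_height / (math.tan(foot_pitch) - math.tan(head_pitch))

def fused_distance(d1, d2, w=0.5):
    """Fuse the two estimates; a weighted average is assumed here."""
    return w * d1 + (1.0 - w) * d2
```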
  • In some embodiments, the above S204 may be: controlling the flying height of the drone according to the position information of the hand of the target object, for example, raising or lowering the flying height of the drone.
  • Specifically, this embodiment may determine the angle of the hand relative to the drone in the pitch direction according to the position information of the hand of the target object determined in the above S203 and the position information of the drone, and control the flying height of the drone according to this angle.
  • FIG. 4 is a flowchart of a method for controlling a drone according to another embodiment of the present invention. As shown in FIG. 4, the method of this embodiment controls the flying height of the drone according to the position information of the hand, and may include:
  • S303: Determine the position information of the hand of the target object according to the position of the hand in the image.
  • S304: Determine the angle of the hand relative to the drone in the pitch direction according to the position information of the hand of the target object and the position information of the drone, determine the desired height of the drone according to the angle, and then control the drone to fly to the desired height.
  • Specifically, the desired height of the drone can be determined according to the angle and the horizontal distance between the target object and the drone: for example, an empirical value is subtracted from the horizontal distance between the target object and the drone to obtain the horizontal distance between the hand and the drone, and the desired height is then determined from the angle and the horizontal distance between the hand and the drone. The desired height of the drone can be at the same height as the hand of the target object.
  • FIG. 5 is a schematic diagram of controlling the height of the drone according to an embodiment of the present invention. As shown in FIG. 5, the angle of the hand of the target object relative to the drone in the pitch direction can be determined according to the position information of the hand and the position information of the drone, and the drone is controlled to fly to the desired height determined from that angle.
  • In some embodiments, S304 is executed when the state parameter of the target object satisfies a first preset requirement, namely: the size ratio of the target object in the image is greater than or equal to a preset first ratio threshold; and/or the distance between the target object and the drone is less than or equal to a preset first distance.
  • Specifically, it may be determined whether the size ratio of the target object in the image is less than the preset first ratio threshold; when the size ratio of the target object in the image is greater than or equal to the preset first ratio threshold, the above S304 is performed. The larger the size ratio of the target object in the image, the closer the target object is to the drone.
  • Alternatively, it may be determined whether the distance between the target object and the drone is greater than the preset first distance; when the distance between the target object and the drone is less than or equal to the preset first distance, the above S304 is performed. The distance between the target object and the drone can be obtained by a binocular camera configured on the drone. When the state parameter of the target object satisfies the first preset requirement, the target object is relatively close to the drone (a near-field state), and the drone can accurately obtain the angle of the hand relative to the drone in the pitch direction, so as to precisely control the flying height of the drone.
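  • A hedged sketch of S304 under the assumptions stated in the comments; the function name and example numbers are illustrative, not from the patent:

```python
import math

def desired_height(drone_height, hand_pitch, target_horiz_dist,
                   hand_offset=0.65):
    """Hedged sketch of S304: fly the drone to the height of the hand.

    hand_pitch: angle (radians) of the hand relative to the drone in the
    pitch direction, positive when the hand appears above the drone.
    target_horiz_dist: horizontal distance to the target object; the
    empirical offset (0.65 m, from the description above) converts it to
    the horizontal distance to the hand.
    """
    hand_dist = target_horiz_dist - hand_offset
    return drone_height + hand_dist * math.tan(hand_pitch)

# Example: the drone hovers at 2.0 m, the hand appears 10 degrees above it,
# and the target stands 4.65 m away horizontally.
print(desired_height(2.0, math.radians(10), 4.65))  # ~2.71 m
```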
  • In this embodiment, the flying height of the drone is controlled through the captured image. This avoids requiring the user to operate a control device to control the drone, overcomes the problem that a user unfamiliar with the control device cannot control the drone, simplifies the control mode and operation process of controlling the drone, improves the efficiency of controlling the flight of the drone, and makes human-computer interaction more entertaining.
  • FIG. 6 is a flowchart of a method for controlling a drone according to another embodiment of the present invention. As shown in FIG. 6, the method of this embodiment controls the flying height of the drone according to the position information of the hand, and may include:
  • S402: Determine the position of the hand of the target object in the image.
  • S403. Determine position information of the target object's hand according to the position of the hand in the image.
  • S404: Determine, according to the position information of a preset part of the target object and the position information of the hand, the angle of the hand relative to the preset part in the pitch direction.
  • The preset part may be, for example, at least one of the head, a shoulder, and the chest.
  • The position information of the preset part may be determined according to the position information of the target object. Taking the head as the preset part, for example, the top fifth of the target object may be taken as the head, so that the position information of the head is determined from the position information of the top fifth of the target object.
  • In this embodiment, after the image captured by the photographing device is acquired, the position of the target object in the image is further determined, and the position information of the target object is then determined according to that position. Specifically, the position information of the target object may be determined according to the position of the target object in the image, the attitude of the pan/tilt carrying the photographing device, the horizontal distance between the target object and the drone, and the position information of the drone; the process is similar to determining the position information of the hand and is not repeated here.
  • The desired height of the drone can be determined according to the angle and the horizontal distance between the target object and the drone, and the drone is then controlled to fly to the desired height.
  • In the following, the head is taken as the preset part for illustration.
  • FIG. 7 is a schematic diagram of controlling the height of a drone according to another embodiment of the present invention. As shown in FIG. 7, according to the position information of the hand of the target object and the position information of the preset part of the target object, the angle of the hand relative to the preset part in the pitch direction is determined to be θ2. With the horizontal distance between the target object and the drone being D2, the height difference between the desired height of the drone and the preset part is D2*tanθ2. Given the height difference Δh between the current height Pc2 of the drone and the preset part of the target object, the drone is controlled to fly from the current height Pc2 to the desired height shown in FIG. 7. In other words, this embodiment can control the drone to fly to the line connecting the hand and the preset part of the target object.
  • the flying height of the drone can be controlled according to the movement of the hand.
  • FIG. 8 is a schematic diagram of controlling the height of the drone according to another embodiment of the present invention. As shown in FIG. 8, according to the position information of the hand of the target object before the movement and the position information of the preset part, the angle of the hand relative to the preset part in the pitch direction before moving is determined to be θ3; according to the position information of the hand after the movement and the position information of the preset part, the angle after moving is determined to be θ3'. If the hand crosses the preset part during the movement, the change in the angle is θ3+θ3'; if the hand stays below or above the preset part before and after the movement, the change in the angle is the difference between θ3 and θ3'. With the horizontal distance between the target object and the drone being D3, the height change of the hand is obtained from the change in the angle and D3. This embodiment maps the height change of the hand to a height change of the drone in the vertical direction, and controls the flying height of the drone according to that vertical height change.
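  • A minimal sketch of the FIG. 7 and FIG. 8 computations, assuming signed pitch angles (positive when the hand is above the preset part), which collapses the crossing and non-crossing cases into one formula; the function names are illustrative:

```python
import math

def desired_height_far_field(preset_part_height, hand_pitch, horiz_dist):
    """FIG. 7: the desired height differs from the preset part (e.g. the
    head) by D2 * tan(theta2), with hand_pitch positive above the part."""
    return preset_part_height + horiz_dist * math.tan(hand_pitch)

def height_change_from_hand_motion(pitch_before, pitch_after, horiz_dist):
    """FIG. 8: the change in the hand's angle relative to the preset part
    is converted into a vertical height change for the drone."""
    return horiz_dist * (math.tan(pitch_after) - math.tan(pitch_before))
```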
  • In some embodiments, the above S404 is executed when the state parameter of the target object satisfies a second preset requirement, namely: the size ratio of the target object in the image is less than or equal to a preset second ratio threshold; and/or the distance between the target object and the drone is greater than or equal to a preset second distance. In this case the drone can accurately obtain the angle of the hand relative to the preset part of the target object in the pitch direction, so as to precisely control the flying height of the drone.
  • The preset second ratio threshold may be equal to the preset first ratio threshold, and the preset second distance may be equal to the preset first distance.
  • The embodiments shown in FIG. 4 and FIG. 6 can be combined: the embodiment shown in FIG. 6 is executed when the state parameter of the target object satisfies the second preset requirement, and the embodiment shown in FIG. 4 is executed when it satisfies the first preset requirement.
  • In this embodiment, the flying height of the drone is controlled through the captured image. This avoids requiring the user to operate a control device to control the drone, overcomes the problem that a user unfamiliar with the control device cannot control the drone, simplifies the control mode and operation process of controlling the drone, improves the efficiency of controlling the flight of the drone, and makes human-computer interaction more entertaining.
  • FIG. 9 is a flowchart of a method for controlling a drone according to another embodiment of the present invention. As shown in FIG. 9, the method of this embodiment controls the drone to fly around the target object according to the position information of the hand, and may include:
  • S503. Determine position information of the target object's hand according to the position of the hand in the image.
  • S504: Determine the angle of the hand relative to the target object in the yaw direction according to the position information of the hand of the target object and the position information of the target object, determine the desired position information of the drone according to the angle, and then control the drone to fly around the target object to the desired position.
  • FIG. 10 is a schematic diagram of controlling a drone to fly around a target object according to an embodiment of the present invention. As shown in FIG. 10, the angle of the hand of the target object relative to the target object in the yaw direction is determined to be θ1. The angle θ1 can be used as the desired angle of the drone relative to the target object in the yaw direction. With the current angle of the drone relative to the target object in the yaw direction being θ2, this embodiment controls the drone to fly around the target object from the current position Pc4 to the desired position Pt4; when the drone has orbited to the desired position Pt4, its angle relative to the target object in the yaw direction is θ1. In other words, this embodiment can control the drone to fly around the target object until it has the same yaw direction relative to the target object as the hand.
  • In some embodiments, the drone can be controlled to fly around the target object according to the movement of the hand.
  • FIG. 11 is a schematic diagram of controlling a drone to fly around a target object according to another embodiment of the present invention. As shown in FIG. 11, according to the position information of the hand before moving and the position information of the target object, the angle of the hand relative to the target object in the yaw direction before moving is determined to be θ3; according to the position information of the hand after moving and the position information of the target object, the angle after moving is likewise determined. The angle change Δθ between the two can be used as the angle through which the drone flies around the target object. With the current angle of the drone relative to the target object in the yaw direction being θ4, this embodiment controls the drone to fly around the target object from the current position Pc5 to the desired position Pt5; when the drone has orbited to the desired position, the change in its yaw angle relative to the target object equals Δθ. In other words, this embodiment controls the angle through which the drone orbits the target object according to the angle change of the hand relative to the target object in the yaw direction.
  • the direction in which the drone is flying around the target object may be the same as the direction in which the hand moves relative to the target object.
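  • A minimal sketch of the orbit control, assuming the desired position is computed on a horizontal circle around the target in the navigation frame; the function names and the circle-radius parameter are illustrative:

```python
import math

def orbit_position(target_pos, radius, yaw_angle, drone_height):
    """Place the drone on a circle of the given radius around the target
    at the desired yaw angle (radians, in the horizontal plane)."""
    x = target_pos[0] + radius * math.cos(yaw_angle)
    y = target_pos[1] + radius * math.sin(yaw_angle)
    return x, y, drone_height

def desired_yaw(current_yaw, hand_yaw_before, hand_yaw_after):
    """FIG. 11 variant: orbit by the change in the hand's yaw angle
    relative to the target, in the same direction as the hand's motion."""
    return current_yaw + (hand_yaw_after - hand_yaw_before)
```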
  • In this embodiment, the drone is controlled to fly around the target object through the captured image. This avoids requiring the user to operate a control device to control the drone, overcomes the problem that a user unfamiliar with the control device cannot control the drone, simplifies the control mode and operation process of controlling the drone, improves the efficiency of controlling the flight of the drone, and makes human-computer interaction more entertaining.
  • FIG. 12 is a flowchart of a method for controlling a drone according to another embodiment of the present invention. As shown in FIG. 12, the method of this embodiment controls the drone to fly away from or close to the target object according to the position information of the hands, and may include:
  • S602: Determine the positions of the two hands of the target object in the image. What is determined in this embodiment is the position, in the image, of each of the two hands (the left hand and the right hand) of the target object. Since the above embodiments have described how to determine the position of a hand in the image, the determination for each of the two hands can refer to that description and is not repeated here.
  • S603: Determine the position information of the two hands of the target object according to the positions of the two hands in the image. The position information of each hand is determined according to that hand's position in the image; for the specific process, refer to the description in the above embodiments of determining the position information of the hand according to its position in the image, which is not repeated here.
  • After the position information of the two hands of the target object is determined, the drone is controlled to fly away from or close to the target object according to the position information of the two hands.
  • Specifically, this embodiment may determine the distance between the two hands according to their position information; the distance between the two hands may be, for example, the distance between them in the horizontal direction. The drone is then controlled to fly away from or close to the target object according to this distance. The drone may be controlled to fly away from or toward the target object along the line connecting it with the target object, or to keep its height unchanged and fly away from or toward the target object in the horizontal direction.
  • FIG. 13 is a schematic diagram of controlling a drone to fly away from or close to a target object according to an embodiment of the present invention. As shown in FIG. 13, this embodiment controls the drone to fly away from the target object from the current position Pc6 by a distance ΔD to reach the desired position Pt6; the case in which the drone flies close to the target object is similar.
  • FIG. 14 is a schematic diagram of controlling a drone to fly away from or close to a target object according to another embodiment of the present invention. The distance between the two hands before relative movement is determined according to the position information of the two hands before the movement, and the distance after relative movement according to their position information after the movement. If the two hands move toward each other, that is, the distance between them decreases by an amount Δd as shown in FIG. 14, the drone is controlled to fly a distance ΔD closer to the target object, where ΔD and Δd satisfy a certain functional relationship, for example ΔD = Δd*C2 with C2 a preset value. If the two hands move apart, that is, the distance between them increases by an amount Δd, the drone is controlled to fly a distance ΔD away from the target object, with ΔD and Δd satisfying the same kind of functional relationship. In some embodiments the mapping is reversed: when the distance between the two hands decreases, the drone is controlled to fly away from the target object, and when the distance increases, the drone is controlled to fly close to the target object.
  • This embodiment also defines a maximum distance of the drone from the target object and a minimum distance of the drone to the target object. While the drone flies away from the target object, the distance between the drone and the target object is monitored; if it is greater than or equal to the maximum distance, the drone is controlled to stop flying away from the target object. While the drone flies toward the target object, the distance is likewise monitored; if it is less than or equal to the minimum distance, the drone is controlled to stop flying closer to the target object.
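  • A minimal sketch of this two-hand control, with C2 and the range limits as placeholder preset values (the numbers are not from the patent):

```python
def radial_step(dist_before, dist_after, current_range,
                c2=1.0, min_range=2.0, max_range=20.0):
    """dist_before / dist_after: horizontal distances between the two
    hands; current_range: drone-to-target distance.

    Returns the new desired drone-to-target distance: hands moving apart
    drive the drone away, hands moving together bring it closer, clamped
    to [min_range, max_range]."""
    delta_d = dist_after - dist_before       # >0: hands moved apart
    desired = current_range + c2 * delta_d   # delta_D = delta_d * C2
    return max(min_range, min(max_range, desired))
```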
  • In this embodiment, the drone is controlled to fly away from or close to the target object through the captured image. This avoids requiring the user to operate a control device to control the drone, overcomes the problem that a user unfamiliar with the control device cannot control the drone, simplifies the control mode and operation process of controlling the drone, improves the efficiency of controlling the flight of the drone, and makes human-computer interaction more entertaining.
  • In some embodiments, the gesture of the hand of the target object in the image captured by the photographing device can also be recognized. One implementation of controlling the flight of the drone according to the position information of the hand is then: when the gesture of the hand is a preset gesture, controlling the flight of the drone according to the position information of the hand of the target object. In other words, only when the gesture is a preset gesture does this embodiment control the flying height of the drone, control the drone to fly around the target object, or control the drone to fly away from or close to the target object according to the position information of the hand. The preset gesture is, for example, an OK gesture, a greeting (hi) gesture, or an outstretched palm.
  • FIG. 15 is a flowchart of a method for controlling a drone according to another embodiment of the present invention. As shown in FIG. 15, the method in this embodiment may include:
  • The target object can be recognized by identifying a feature part of its body. The feature part of the target object in the image is therefore determined from the image captured by the photographing device; the feature part can represent the target object and may be at least one of the head of a human body, the head and shoulders of a human body, and the human body itself. Specifically, the feature parts in the image may be identified first, and the feature part of the target object then selected from them, for example the feature part closest to the center of the image, so that the target object is the person closest to the image center. After the feature part of the target object is determined, a tracking algorithm can find the feature part of the target object in each new image: the position of the feature part in the previous frame determines a target image region in the next frame, and the image region within it that is most similar to the feature part in the previous frame is taken as the feature part of the target object in the next frame.
  • The hands in the image are also recognized from the image captured by the photographing device. S702 and S703 may be executed in either order.
  • The hand of the target object is then determined, from the hands identified in the image, according to the feature part of the target object. There may be multiple hands in the captured image, all of which may be recognized, but some of them are not the hand of the target object; therefore, this embodiment determines the hand of the target object from the identified hands based on the feature part of the target object. Specifically, the joint points of the target object may be determined according to its feature part; the joint points include joint points of the hand, the arm, the head, the shoulder, and so on. The hand of the target object is then determined from the hands identified in the image according to these joint points: the joint point of the hand is selected from the joint points of the target object, the identified hand closest to this hand joint point is found, and that hand is determined to be the hand of the target object.
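  • A minimal sketch of this association step, assuming the hand detector returns bounding boxes and a pose estimator returns the target's hand (wrist) joint point in pixel coordinates; the function names are illustrative:

```python
import math

def hand_of_target(hand_boxes, target_hand_joint):
    """Among all detected hands, pick the one whose center is closest to
    the target object's hand joint point.

    hand_boxes: list of ((u1, v1), (u2, v2)) bounding boxes from the hand
    detector; target_hand_joint: (u, v) joint point of the target's hand
    estimated from its feature part."""
    def center(box):
        (u1, v1), (u2, v2) = box
        return ((u1 + u2) / 2.0, (v1 + v2) / 2.0)

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    return min(hand_boxes, key=lambda b: dist(center(b), target_hand_joint))
```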
  • When the gesture of the hand of the target object is a control gesture, the drone is controlled to perform the action indicated by the gesture, for example: taking off, landing, controlling the flying height of the drone, controlling the drone to fly around the target object, controlling the drone to fly away from or close to the target object, taking a photo, recording video, and so on. The control gesture is, for example, an OK gesture, a greeting (hi) gesture, or an outstretched palm. For controlling the flying height, controlling the drone to fly around the target object, and controlling the drone to fly away from or close to the target object, refer to the description in the above embodiments.
  • When the gesture of the hand of the target object is a takeoff gesture, which is a control gesture indicating that the drone should take off, this embodiment controls the drone to take off according to the gesture. In some embodiments, the drone is controlled to take off and hover at a preset height, as shown in FIG. 16, a schematic diagram of controlling the drone to take off according to an embodiment of the present invention.
  • In some embodiments, a first operation of the user is further detected after the above S701 is performed; after the first operation is detected, the pan/tilt carrying the photographing device is controlled to drive the photographing device to scan within a preset angle range, in order to recognize the takeoff gesture of the target object. The first operation includes clicking or double-clicking the battery switch, shaking the drone, or issuing a voice command to the drone; this operation causes the drone to enter a gesture recognition mode.
  • In some embodiments, after the takeoff gesture is recognized, the drone is controlled to take off only if the takeoff gesture is a stable gesture. Specifically, a multi-frame sequence of images from the photographing device is acquired, and the position of the hand of the target object in each frame is determined (for how this position is determined, refer to the description in the above embodiments). It is then judged whether the position of the hand in each frame lies within a preset range of a reference position, where the reference position is the position of the hand of the target object in the previous frame's image. If the position of the hand in each frame is within the preset range of the reference position, the position of the hand is stable across the frames: the takeoff gesture is a stable gesture, the target object intends to control the drone to take off, and this embodiment controls the drone to take off. If not, the takeoff gesture is not a stable gesture and is treated as a misoperation; the gesture is ignored, that is, the drone is not controlled to take off.
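  • A minimal sketch of this multi-frame stability check, with the pixel tolerance standing in for the preset range (the value is illustrative):

```python
def takeoff_gesture_stable(hand_positions, max_drift=20.0):
    """hand_positions: pixel positions (u, v) of the target's hand in
    consecutive frames; each frame's reference position is the hand
    position in the previous frame. max_drift (pixels) stands in for the
    preset range of the description."""
    for prev, cur in zip(hand_positions, hand_positions[1:]):
        if abs(cur[0] - prev[0]) > max_drift or abs(cur[1] - prev[1]) > max_drift:
            return False  # the gesture moved too much: treat as misoperation
    return True

# Example: a hand that stays within a few pixels across five frames.
frames = [(660, 380), (662, 381), (661, 379), (663, 382), (660, 380)]
print(takeoff_gesture_stable(frames))  # True -> control the drone to take off
```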
  • FIG. 17 is a schematic diagram of controlling a drone to land according to an embodiment of the present invention. In some embodiments, a height value measured by the distance sensor is obtained, the height value representing the flying height of the drone, and whether the drone may land is determined according to this height value. The specific process is: the drone is controlled to land when the gesture of the hand of the target object is a landing gesture and the height value is less than or equal to a preset height threshold. If the height value is greater than the preset height threshold, the flying height of the drone is too great and the current situation is not suitable for landing; to ensure the flight safety of the drone, the landing gesture is then ignored, that is, the drone is not controlled to land.
  • In some embodiments, the flatness of the ground below the drone is also detected, for example by the binocular camera, and whether the drone may land is further determined according to the flatness. The specific process is: the drone is controlled to land when the gesture of the hand of the target object is a landing gesture and the flatness is greater than or equal to a preset flatness threshold. If the flatness is less than the preset flatness threshold, the ground below the drone is not flat enough to ensure a safe landing; the landing gesture is then ignored, that is, the drone is not controlled to land.
  • In some embodiments, it is also detected whether there is a water surface below the drone, and whether the drone may land is determined accordingly. The specific process is: the drone is controlled to land when the gesture of the hand of the target object is a landing gesture and there is no water surface below the drone. If there is a water surface below the drone, the drone would sink into the water after settling on the surface and be damaged; the landing gesture is then ignored, that is, the drone is not controlled to land.
In a fourth possible implementation, this embodiment also detects the flight speed of the drone, which can be measured by a speed sensor, and determines from the flight speed whether the drone may land. The specific process is: when the gesture of the target object's hand is a landing gesture and the flight speed of the drone is less than or equal to a preset speed threshold, the drone is controlled to land. If the flight speed is greater than the preset speed threshold, touching down while still moving could damage the drone; therefore, when the flight speed is greater than the preset speed threshold, the landing gesture is ignored, that is, the drone is not controlled to land.
In a fifth possible implementation, this embodiment also detects whether the height of the target object's hand is lower than the height of the target object's head, where the height of the hand can be determined from the position information of the hand and the height of the head can be determined from the position information of the target object. The relationship between the height of the hand and the height of the head determines whether the drone may land. The specific process is: when the gesture of the target object's hand is a landing gesture and the height of the target object's hand is lower than the height of the target object's head, the drone is controlled to land. If the gesture is a landing gesture but the height of the hand is not lower than the height of the head, the landing gesture is ignored, that is, the drone is not controlled to land.

It should be noted that at least two of the first through fifth possible implementations above may also be combined to control the landing of the drone.
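Taken together, the five checks act as a safety gate in front of the landing command. The following sketch combines them for illustration only; the `FlightState` fields, the thresholds and the `drone.land()` call are assumed placeholders, and, as noted above, a real system might use any subset or combination of these conditions.

```python
from dataclasses import dataclass

@dataclass
class FlightState:
    height_m: float          # from the distance sensor
    ground_flatness: float   # e.g. 0..1, from the binocular camera
    water_below: bool
    speed_mps: float         # from the speed sensor
    hand_height_m: float
    head_height_m: float

HEIGHT_MAX_M = 3.0     # preset height threshold (assumed)
FLATNESS_MIN = 0.8     # preset flatness threshold (assumed)
SPEED_MAX_MPS = 0.5    # preset speed threshold (assumed)

def may_land(s: FlightState) -> bool:
    return (s.height_m <= HEIGHT_MAX_M              # first implementation
            and s.ground_flatness >= FLATNESS_MIN   # second implementation
            and not s.water_below                   # third implementation
            and s.speed_mps <= SPEED_MAX_MPS        # fourth implementation
            and s.hand_height_m < s.head_height_m)  # fifth implementation

def on_gesture(gesture, state, drone):
    if gesture == "landing" and may_land(state):
        drone.land()  # hypothetical flight-controller call
    # otherwise the landing gesture is ignored
```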
FIG. 18 is a flowchart of a method for controlling a drone according to another embodiment of the present invention. As shown in FIG. 18, the method in this embodiment may include:

S801: an image captured by the photographing device is acquired.

S802: the feature portion of the target object in the image is identified.

S803: the hand of the target object in the image is identified. S802 and S803 may be performed in either order; the specific implementation of S801 can be found in the embodiment shown in FIG. 2, and that of S802 and S803 in the embodiment shown in FIG. 15, so they are not repeated here.

S804: when the feature portion of the target object is recognized but the hand of the target object is not recognized, the captured image contains the target object but no hand, so there is no need to control the drone according to the hand. The drone is then controlled in a hands-free follow mode, that is, the drone is controlled to track the target object so that the photographing device can capture the target object and the target object is in the shooting picture of the photographing device.
In some embodiments, in order to keep the target object in the shooting picture of the photographing device, this embodiment may control the drone to track the target object by adjusting at least one of the position information of the drone, the attitude of the drone, and the attitude of the pan/tilt (gimbal) carrying the photographing device.
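As a rough illustration of the hands-free follow mode, the adjustment can be driven by the offset of the target from the image center. The proportional law below is an assumption for the sketch, not the patent's controller; the image size, the gains and the `drone` interface are likewise hypothetical.

```python
# Minimal proportional sketch: keep the tracked target near the image
# center by adjusting the drone's yaw and the gimbal pitch. The image
# size, gains and controller methods are assumed for the example.

IMG_W, IMG_H = 1280, 720
K_YAW, K_PITCH = 0.002, 0.002  # assumed proportional gains

def follow_step(target_bbox, drone):
    """target_bbox: (u_min, v_min, u_max, v_max) of the feature portion."""
    cu = (target_bbox[0] + target_bbox[2]) / 2.0
    cv = (target_bbox[1] + target_bbox[3]) / 2.0
    # Horizontal offset from the image center drives yaw; vertical
    # offset drives the gimbal pitch, so the target stays in frame.
    drone.set_yaw_rate(K_YAW * (cu - IMG_W / 2.0))
    drone.set_gimbal_pitch_rate(-K_PITCH * (cv - IMG_H / 2.0))
```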
In some embodiments, the embodiment shown in FIG. 18 can be combined with any of the above embodiments of FIG. 2 to FIG. 15: if the hand of the target object is not recognized, the solution of the embodiment shown in FIG. 18 is executed; if the hand of the target object is recognized, the solution of any of the above embodiments of FIG. 2 to FIG. 15 is executed.
In this embodiment, the above solution achieves following of the target object by controlling the drone from the captured images, which avoids the situation where the user must operate a control device to control the drone, overcomes the defect that a user unfamiliar with the control device cannot control the drone, simplifies the control mode and operation process of controlling the drone, improves the efficiency of controlling the flight of the drone, and enhances the entertainment of human-computer interaction.
In some embodiments, the feature portion in the above embodiments may refer to at least one of: the head of a human body, the head and shoulders of a human body, and the human body.

In some embodiments, when the state parameter of the target object satisfies a preset first state parameter condition, the feature portion is the head and shoulders of the human body. The preset first state parameter condition includes: the size ratio of the target object in the image is greater than or equal to a preset first ratio threshold, and/or the distance between the target object and the drone is less than or equal to a preset first distance. In this embodiment, it can be determined whether the size ratio of the target object in the image is smaller than the preset first ratio threshold; when the size ratio is greater than or equal to the preset first ratio threshold, the feature portion in the above embodiments is the head and shoulders of the human body, since the larger the proportion of the target object in the image, the closer the target object is to the drone. It can also be determined whether the distance between the target object and the drone is greater than the preset first distance; when the distance is less than or equal to the preset first distance, the feature portion in the above embodiments is the head and shoulders of the human body, where the distance between the target object and the drone can be obtained by ranging with a binocular camera. Therefore, when the state parameter of the target object satisfies the first preset requirement, the target object is relatively close to the drone, in a near-field state, and the drone can accurately recognize the head and shoulders of the target object.
In some embodiments, when the state parameter of the target object satisfies a preset second state parameter condition, the feature portion is the human body. The preset second state parameter condition includes: the size ratio of the target object in the image is less than or equal to a preset second ratio threshold, and/or the distance between the target object and the drone is greater than or equal to a preset second distance. In this embodiment, it can be determined whether the size ratio of the target object in the image is greater than the preset second ratio threshold; when the size ratio is less than or equal to the preset second ratio threshold, the feature portion of the above embodiments is the human body, since the smaller the proportion of the target object in the image, the farther the target object is from the drone. It can also be determined whether the distance between the target object and the drone is smaller than the preset second distance; when the distance is greater than or equal to the preset second distance, the feature portion of the above embodiments is the human body. Therefore, when the state parameter of the target object satisfies the second preset requirement, the target object is relatively far from the drone, in a far-field state, and the drone can recognize the human body of the target object.
In some embodiments, the preset first ratio threshold may be equal to the preset second ratio threshold, and the preset first distance may be equal to the preset second distance.
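The near-field/far-field switch can be sketched as a simple selection rule. The concrete threshold values below are placeholders, and, per the note above, the two ratio thresholds (and the two distances) may simply be equal.

```python
# Choosing which feature portion to detect, based on the state
# parameters of the target object. All threshold values are assumed.

FIRST_RATIO = 0.25    # preset first ratio threshold
SECOND_RATIO = 0.25   # preset second ratio threshold (may equal the first)
FIRST_DIST_M = 5.0    # preset first distance
SECOND_DIST_M = 5.0   # preset second distance (may equal the first)

def select_feature_portion(size_ratio, distance_m):
    # Near-field state: detect head and shoulders.
    if size_ratio >= FIRST_RATIO or distance_m <= FIRST_DIST_M:
        return "head_and_shoulders"
    # Far-field state: detect the whole human body.
    if size_ratio <= SECOND_RATIO or distance_m >= SECOND_DIST_M:
        return "human_body"
    return "human_body"  # fallback between the two regimes
```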
In summary, in the embodiments of the present invention the flight of the drone can be controlled directly according to the hand in the image captured by the photographing device, covering a whole sequence of behaviours including take-off, flying height, surround flight, flying away or approaching, following, landing, and the like. This avoids the situation where the user must operate a control device to control the drone, overcomes the defect that a user unfamiliar with the control device cannot control the drone, simplifies the control mode and operation process of controlling the drone, improves the efficiency of controlling the flight of the drone, and enhances the entertainment of human-computer interaction.
An embodiment of the present invention further provides a computer storage medium, where the computer storage medium stores program instructions; when executed, the program may perform some or all of the steps of the control method of the drone in the above embodiments.
FIG. 19 is a schematic structural diagram of a control device for a drone according to an embodiment of the present invention. As shown in FIG. 19, the control device 1900 of the drone of this embodiment may include a memory 1901 and a processor 1902, connected by a bus. The memory 1901 may include a read-only memory and a random access memory, and provides instructions and data to the processor 1902; a portion of the memory 1901 may also include a non-volatile random access memory.

The processor 1902 may be a central processing unit (CPU); the processor 1902 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or any conventional processor, or the like.
The memory 1901 is configured to store program code.

In some embodiments, the processor 1902 is configured to invoke the program code to execute: acquiring an image captured by a photographing device; determining the position of the target object's hand in the image; determining the position information of the hand of the target object according to the position of the hand in the image; and controlling the flight of the drone according to the position information of the hand of the target object.

Optionally, the processor 1902 is specifically configured to: determine the position information of the hand of the target object according to the position of the hand in the image, the attitude of the pan/tilt carrying the photographing device, the horizontal distance between the target object and the drone, and the position information of the drone.

Optionally, the processor 1902 is specifically configured to: determine the orientation of the hand relative to the drone according to the position of the hand in the image and the attitude of the pan/tilt carrying the photographing device; and determine the position information of the hand of the target object according to the orientation, the horizontal distance between the target object and the drone, and the position information of the drone.
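The two-step computation, a bearing from the pixel position and the pan/tilt attitude followed by a position from that bearing, the horizontal distance and the drone's own position, might look like the following. The field-of-view values, the linear pixel-to-angle mapping and the navigation-frame conventions are assumptions for the sketch; the 0.65 m body-to-hand offset follows the empirical value given in the description.

```python
import math

# Sketch of recovering the hand's position from its pixel location.
# HFOV/VFOV and the image size are assumed camera parameters.

HFOV_DEG, VFOV_DEG = 80.0, 50.0
IMG_W, IMG_H = 1280, 720

def hand_position(hand_px, gimbal_yaw_deg, gimbal_pitch_deg,
                  target_horiz_dist_m, drone_pos):
    u, v = hand_px
    # Angles of the hand relative to the optical axis, from its offset
    # against the image center (linear approximation across the FOV).
    d_yaw = (u / IMG_W - 0.5) * HFOV_DEG
    d_pitch = (0.5 - v / IMG_H) * VFOV_DEG
    yaw = math.radians(gimbal_yaw_deg + d_yaw)      # bearing of the hand
    pitch = math.radians(gimbal_pitch_deg + d_pitch)
    d = target_horiz_dist_m - 0.65  # hand-to-drone horizontal distance
    x0, y0, z0 = drone_pos          # e.g. in the navigation frame
    return (x0 + d * math.cos(yaw),   # north
            y0 + d * math.sin(yaw),   # east
            z0 + d * math.tan(pitch)) # up
```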
Optionally, the processor 1902 is specifically configured to: control the flying height of the drone according to the position information of the hand of the target object.

Optionally, the processor 1902 is specifically configured to: determine the angle of the hand relative to the drone in the pitch direction according to the position information of the hand of the target object and the position information of the drone; and control the flying height of the drone according to the angle.

Optionally, the processor 1902 is specifically configured to: when the state parameter of the target object satisfies a first preset requirement, determine the angle of the hand relative to the drone in the pitch direction according to the position of the hand of the target object and the position information of the drone. Here, the state parameter of the target object satisfying the first preset requirement includes: the size ratio of the target object in the image is greater than or equal to a preset first ratio threshold; and/or the distance between the target object and the drone is less than or equal to a preset first distance.

Optionally, the processor 1902 is specifically configured to: determine the angle of the hand relative to a preset part of the target object in the pitch direction according to the position information of the preset part and the position information of the hand; and control the flying height of the drone according to the angle. The preset part includes at least one of the head, the shoulders and the chest.

Optionally, the processor 1902 is specifically configured to: when the state parameter of the target object satisfies a second preset requirement, determine the angle of the hand relative to the preset part in the pitch direction according to the position information of the preset part of the target object and the position information of the hand. Here, the state parameter of the target object satisfying the second preset requirement includes: the size ratio of the target object in the image is less than or equal to a preset second ratio threshold; and/or the distance between the target object and the drone is greater than or equal to a preset second distance.

Optionally, the position information of the preset part is determined according to the position information of the target object.
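For the near-field case this reduces to the geometry of FIG. 5: the desired climb is the hand's pitch angle projected over the hand-to-drone horizontal distance, i.e. h = d * tan(alpha). A minimal sketch, with the interface assumed:

```python
import math

# Near-field height control (FIG. 5): fly to the height of the hand.
# target_dist_m is the target-to-drone horizontal distance; 0.65 m is
# the empirical body-to-hand offset from the description.

def desired_height(current_height_m, pitch_angle_rad, target_dist_m):
    d = target_dist_m - 0.65           # hand-to-drone horizontal distance
    h = d * math.tan(pitch_angle_rad)  # climb (negative: descend)
    return current_height_m + h
```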
Optionally, the processor 1902 is specifically configured to: control the drone to fly around the target object according to the position information of the hand of the target object.

Optionally, the processor 1902 is specifically configured to: determine the angle of the hand relative to the target object in the yaw direction according to the position information of the hand of the target object and the position information of the target object; and control the drone to fly around the target object according to the angle.

Optionally, the processor 1902 is specifically configured to: control the drone to fly away from or near the target object according to the position information of the hand of the target object.

Optionally, the processor 1902 is specifically configured to: determine the positions of the two hands of the target object in the image; determine the position information of the two hands of the target object according to the positions of the two hands in the image; and control the drone to fly away from or near the target object according to the position information of the two hands.

Optionally, the processor 1902 is specifically configured to: determine the distance between the two hands according to the position information of the two hands; and control the drone to fly away from or near the target object according to the distance.

Optionally, the processor 1902 is further configured to: determine the position of the target object in the image; and determine the position information of the target object according to the position of the target object in the image.

Optionally, the processor 1902 is specifically configured to: determine the position information of the target object according to the position of the target object in the image, the attitude of the pan/tilt carrying the photographing device, the horizontal distance between the target object and the drone, and the position information of the drone.
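The away/approach behaviour maps the separation of the two hands to a desired stand-off distance, as in FIG. 13, clamped between a minimum and a maximum as described above. The linear factor `C1` and the limit values below are assumed example values, not prescribed constants.

```python
# FIG. 13-style distance control: the desired drone-to-target distance
# is a function of the horizontal separation of the two hands,
# D_desired = d_hands * C1, bounded by the min/max stand-off limits.

C1 = 8.0            # assumed scale factor
MIN_DIST_M = 2.0    # minimum approach distance (assumed)
MAX_DIST_M = 20.0   # maximum retreat distance (assumed)

def distance_step(hand_l, hand_r, current_dist_m):
    """hand_l, hand_r: (x, y) horizontal positions of the two hands."""
    dx = hand_l[0] - hand_r[0]
    dy = hand_l[1] - hand_r[1]
    d_hands = (dx * dx + dy * dy) ** 0.5   # hand separation
    desired = min(max(d_hands * C1, MIN_DIST_M), MAX_DIST_M)
    return desired - current_dist_m  # > 0: fly away; < 0: fly closer
```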
Optionally, the processor 1902 is further configured to: recognize the gesture of the hand of the target object in the image. When controlling the flight of the drone according to the position information of the hand of the target object, the processor is specifically configured to: when the gesture of the hand is a preset gesture, control the flight of the drone according to the position information of the hand of the target object.

The device in this embodiment may be used to implement the technical solutions of FIG. 2 to FIG. 12 and their corresponding embodiments; the implementation principles and technical effects are similar and are not repeated here.
In some embodiments, the processor 1902 is configured to invoke the program code to execute: acquiring an image captured by a photographing device; determining the feature portion of the target object in the image; recognizing the hands in the image; determining the hand of the target object from the hands recognized in the image according to the feature portion of the target object; and, when the gesture of the hand of the target object is a control gesture, controlling the drone to perform the action indicated by the gesture.

Optionally, the processor 1902 is specifically configured to: when the gesture is a take-off gesture, control the drone to take off.

Optionally, the processor 1902 is specifically configured to: control the drone to take off and hover at a preset height.

Optionally, the processor 1902 is further configured to: after detecting a first operation of the user, control the pan/tilt carrying the photographing device to drive the photographing device to scan within a preset angle range. The first operation includes at least one of: clicking or double-clicking a battery switch, shaking the drone, and issuing a voice command to the drone.

Optionally, the processor 1902 is specifically configured to: acquire multiple frames of images captured by the photographing device; determine the position of the target object's hand in the image in each frame of the multi-frame images; and control the drone to take off when the gesture of the target object is a take-off gesture and the position of the target object's hand in the image in each frame is within a preset range of a reference position. Optionally, the reference position is the position of the hand of the target object in the image in the previous frame.

Optionally, the processor 1902 is specifically configured to: when the gesture is a landing gesture, control the drone to land.

Optionally, the processor 1902 is further configured to: obtain a height value measured by a distance sensor. In controlling the drone to land when the gesture is a landing gesture, the processor is specifically configured to: control the drone to land when the gesture is a landing gesture and the height value is less than or equal to a preset height threshold.
Optionally, the processor 1902 is further configured to: detect the flatness of the ground below the drone. In controlling the drone to land when the gesture is a landing gesture, the processor 1902 is specifically configured to: control the drone to land when the gesture is a landing gesture and the flatness is greater than or equal to a preset flatness threshold.

Optionally, the processor 1902 is further configured to: detect whether there is a water surface below the drone. In controlling the drone to land when the gesture is a landing gesture, the processor 1902 is specifically configured to: control the drone to land when the gesture is a landing gesture and there is no water surface below the drone.

Optionally, the processor 1902 is further configured to: detect the flight speed of the drone. In controlling the drone to land when the gesture is a landing gesture, the processor 1902 is specifically configured to: control the drone to land when the gesture is a landing gesture and the speed is less than or equal to a preset speed threshold.
Optionally, the processor 1902 is specifically configured to: recognize the feature portions in the image; and determine the feature portion of the target object from the feature portions recognized in the image.

Optionally, the processor 1902 is specifically configured to: determine the feature portion closest to the center of the image as the feature portion of the target object.

Optionally, the processor 1902 is specifically configured to: determine the joint points of the target object according to the feature portion of the target object; and determine the hand of the target object from the hands recognized in the image according to the joint points of the target object.

Optionally, the processor 1902 is specifically configured to: determine, among the hands recognized in the image, the hand closest to the hand joint point among the joint points of the target object; and determine the hand closest to the hand joint point as the hand of the target object.
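Selecting the target object's hand among all detected hands can be illustrated as a nearest-neighbour match against the target's hand joint point (e.g. the wrist). The data layout is assumed: both the detected hands and the joint point are pixel coordinates here, and the joint-point estimator itself is outside the sketch.

```python
# Pick the detected hand nearest to the target object's hand joint
# point. `hands` is a list of (u, v) pixel positions of all hands
# recognized in the image; `hand_joint` is the target object's hand
# joint point, also (u, v). Returns None if no hand was detected.

def select_target_hand(hands, hand_joint):
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(hands, key=lambda h: dist2(h, hand_joint), default=None)
```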
Optionally, the feature portion includes at least one of: the head of a human body, the head and shoulders of a human body, and the human body. Optionally, when the state parameter of the target object satisfies a preset first state parameter condition, the feature portion is the head and shoulders; the preset first state parameter condition includes: the size ratio of the target object in the image is greater than or equal to a preset first ratio threshold, and/or the distance between the target object and the drone is less than or equal to a preset first distance. Optionally, when the state parameter of the target object satisfies a preset second state parameter condition, the feature portion is the human body; the preset second state parameter condition includes: the size ratio of the target object in the image is less than or equal to a preset second ratio threshold, and/or the distance between the target object and the drone is greater than or equal to a preset second distance.
The device in this embodiment may be used to implement the technical solutions of FIG. 15 and its corresponding embodiments; the implementation principles and technical effects are similar and are not repeated here.
In some embodiments, the processor 1902 is configured to invoke the program code to execute: acquiring an image captured by a photographing device; recognizing the feature portion of the target object in the image; recognizing the hand of the target object in the image; and, when the feature portion of the target object is recognized but the hand of the target object is not recognized, controlling the drone to track the target object so that the target object is in the shooting picture of the photographing device.

Optionally, the processor 1902 is specifically configured to: adjust at least one of the position information of the drone, the attitude of the drone, and the attitude of the pan/tilt carrying the photographing device to track the target object so that the target object is in the shooting picture of the photographing device.

Optionally, the feature portion includes at least one of: the head of a human body, the head and shoulders of a human body, and the human body. Optionally, when the state parameter of the target object satisfies a preset first state parameter condition, the feature portion is the head and shoulders; the preset first state parameter condition includes: the size ratio of the target object in the image is greater than or equal to a preset first ratio threshold, and/or the distance between the target object and the drone is less than or equal to a preset first distance. Optionally, when the state parameter of the target object satisfies a preset second state parameter condition, the feature portion is the human body; the preset second state parameter condition includes: the size ratio of the target object in the image is less than or equal to a preset second ratio threshold, and/or the distance between the target object and the drone is greater than or equal to a preset second distance.

The device in this embodiment may be used to implement the technical solutions of FIG. 18 and its corresponding embodiments; the implementation principles and technical effects are similar and are not repeated here.
FIG. 20 is a schematic structural diagram of a drone according to an embodiment of the present invention. As shown in FIG. 20, the drone of this embodiment may include: a control device 2001 of the drone, a photographing device 2002 and a power system (not shown in the figure), connected by a bus. The control device 2001 of the drone is used to control the flight of the drone; it may adopt the structure of the embodiment shown in FIG. 19 and, correspondingly, may execute the technical solutions of any of the method embodiments of FIG. 2 to FIG. 18 and their corresponding embodiments; the implementation principles and technical effects are similar and are not repeated here. The photographing device 2002 is used to capture images. The power system is used to provide flight power and drive the drone to fly; the power system includes electronic speed controllers, motors, propellers and the like. In some embodiments, the drone may further include a pan/tilt 2003 for carrying the photographing device 2002. In some embodiments, the drone may further include a positioning sensor, a distance sensor, a speed sensor, and the like.
A person of ordinary skill in the art can understand that all or part of the steps of the above method embodiments can be implemented by hardware related to program instructions. The foregoing program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The foregoing storage medium includes various media that can store program code, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Human Computer Interaction (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Astronomy & Astrophysics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the present invention provide a drone control method, a control device and a drone. The method includes: acquiring an image captured by a photographing device; determining the position of the target object's hand in the image; determining the position information of the hand of the target object according to the position of the hand in the image; and controlling the flight of the drone according to the position information of the hand of the target object. The flight of the drone can therefore be controlled according to the target object's hand in the image captured by the photographing device. This avoids the situation where the user must operate a control device to control the drone, overcomes the defect that a user unfamiliar with the control device cannot control the drone, simplifies the control mode and operation process of controlling the drone, improves the efficiency of controlling the flight of the drone, and enhances the entertainment of human-computer interaction.

Description

无人机的控制方法、设备和无人机 技术领域
本发明实施例涉及无人机技术领域,尤其涉及一种无人机的控制方法、设备和无人机。
背景技术
随着无人机应用的越来越广泛,对无人机的操控方式也变得越来越多样。从一开始利用遥控器手柄对无人机进行控制,慢慢也演化出了利用手机、平板电脑等触摸式设备对无人机进行控制。但是这些对无人机的操控方式均需要依赖除无人机之外的额外设备,不仅会增加成本,而且还需用户学会熟练操作这些设备才能控制无人机,控制方式复杂,对用户要求较高,还会使得人机互动性较差。
发明内容
本发明实施例提供一种无人机的控制方法、设备和无人机,以简化无人机的控制方式。
第一方面,本发明实施例提供一种无人机的控制方法,包括:
获取拍摄装置拍摄的图像;
确定所述图像中目标对象的手在所述图像中的位置;
根据所述手在所述图像中的位置确定所述目标对象的手的位置信息;
根据所述目标对象的手的位置信息控制无人机的飞行。
第二方面,本发明实施例提供一种无人机的控制方法,包括:
获取拍摄装置拍摄的图像;
确定所述图像中目标对象的特征部位;
识别所述图像中的手;
根据所述目标对象的特征部位从所述图像中识别的手中确定所述目标对象的手;
当所述目标对象的手的手势为控制手势时,控制无人机执行所述手势指示的动作。
第三方面,本发明实施例提供一种无人机的控制方法,包括:
获取拍摄装置拍摄的图像;
识别所述图像中目标对象的特征部位;
识别所述图像中所述目标对象的手;
当识别出所述目标对象的特征部位且识别不到所述目标对象的手时,控制无人机对所述目标对象进行跟踪以使所述目标对象在所述拍摄装置的拍摄画面中。
第四方面,本发明实施例提供一种无人机的控制设备,包括:存储器和处理器;
所述存储器,用于存储程序代码;
所述处理器,用于调用所述程序代码执行:
获取拍摄装置拍摄的图像;
确定所述图像中目标对象的手在所述图像中的位置;
根据所述手在所述图像中的位置确定所述目标对象的手的位置信息;
根据所述目标对象的手的位置信息控制无人机的飞行。
第五方面,本发明实施例提供一种无人机的控制设备,包括:存储器和处理器;
所述存储器,用于存储程序代码;
所述处理器,用于调用所述程序代码执行:
获取拍摄装置拍摄的图像;
确定所述图像中目标对象的特征部位;
识别所述图像中的手;
根据所述目标对象的特征部位从所述图像中识别的手中确定所述目标对象的手;
当所述目标对象的手的手势为控制手势时,控制无人机执行所述手势指示的动作。
第六方面,本发明实施例提供一种无人机的控制设备,包括:存储器和处理器;
所述存储器,用于存储程序代码;
所述处理器,用于调用所述程序代码执行:
获取拍摄装置拍摄的图像;
识别所述图像中目标对象的特征部位;
识别所述图像中所述目标对象的手;
当识别出所述目标对象的特征部位且识别不到所述目标对象的手时,控制无人机对所述目标对象进行跟踪以使所述目标对象在所述拍摄装置的拍摄画面中。
第七方面,本发明实施例提供一种无人机,包括:
如第四方面、第五方面、第六方面中至少一方面本发明实施例所述的无人机的控制设备;
拍摄装置,用于拍摄图像;
以及动力系统,用于提供飞行动力。
第八方面,本发明实施例提供一种可读存储介质,所述可读存储介质上存储有计算机程序;所述计算机程序在被执行时,实现如第一方面、第二方面、第三方面中至少一方面本发明实施例所述的无人机的控制方法。
本发明实施例提供的无人机的控制方法、设备和无人机,通过获取拍摄装置拍摄的图像,确定图像中目标对象的手在图像中的位置,根据手在图像中的位置确定目标对象的手的位置信息,根据所述目标对象的手的位置信息控制无人机的飞行。因此本实施例可根据拍摄装置拍摄的图像中目标对象的手来控制无人机的飞行。避免了用户必须操作控制设备才能控制无人机的情况,克服了用户不熟悉控制设备而无法控制无人机的缺陷,简化了控制无人机的控制方式和操作过程,提高了控制无人机飞行的效率,增强了人机互动的娱乐性。
附图说明
图1为根据本发明的实施例的无人机的示意性架构图;
图2为本发明一实施例提供的无人机的控制方法的流程图;
图3为本发明一实施例提供的拍摄装置拍摄的图像中目标对象的示意图;
图4为本发明另一实施例提供的无人机的控制方法的流程图;
图5为本发明一实施例提供的控制无人机的高度的示意图;
图6为本发明另一实施例提供的无人机的控制方法的流程图;
图7为本发明另一实施例提供的控制无人机的高度的示意图;
图8为本发明另一实施例提供的控制无人机的高度的示意图;
图9为本发明另一实施例提供的无人机的控制方法的流程图;
图10为本发明一实施例提供的控制无人机对目标对象环绕飞行的示意图;
图11为本发明另一实施例提供的控制无人机对目标对象环绕飞行的示意图;
图12为本发明另一实施例提供的无人机的控制方法的流程图;
图13为本发明一实施例提供的控制无人机远离或靠近目标对象飞行的示意图;
图14为本发明另一实施例提供的控制无人机远离或靠近目标对象飞行的示意图;
图15为本发明另一实施例提供的无人机的控制方法的流程图;
图16为本发明一实施例提供的控制无人机起飞的示意图;
图17为本发明一实施例提供的控制无人机降落的示意图;
图18为本发明另一实施例提供的无人机的控制方法的流程图;
图19为本发明一实施例提供的无人机的控制设备的一种结构示意图;
图20为本发明一实施例提供的无人机的一种结构示意图。
具体实施方式
本发明的实施例提供了无人机的控制方法、设备和无人机。其中无人机可以是旋翼飞行器(rotorcraft),例如,由多个推动装置通过空气推动的多旋翼飞行器,本发明的实施例并不限于此。
图1为根据本发明的实施例的无人机的示意性架构图。本实施例以旋翼无人飞行器为例进行说明。
无人机100可以包括动力系统150、飞行控制系统160和机架。
机架可以包括机身和脚架(也称为起落架)。机身可以包括中心架以及与中心架连接的一个或多个机臂,一个或多个机臂呈辐射状从中心架延伸出。脚架与机身连接,用于在无人机100着陆时起支撑作用。
动力系统150可以包括一个或多个电子调速器(简称为电调)151、一个或多个螺旋桨153以及与一个或多个螺旋桨153相对应的一个或多个电机152,其中电机152连接在电子调速器151与螺旋桨153之间,电机152和螺旋桨153设置在无人机100的机臂上;电子调速器151用于接收飞行控制系统160产生的驱动信号,并根据驱动信号提供驱动电流给电机152,以控制电机152的转速。电机152用于驱动螺旋桨旋转,从而为无人机100的飞行提供动力,该动力使得无人机100能够实现一个或多个自由度的运动。在某些实施例中,无人机100可以围绕一个或多个旋转轴旋转。例如,上述旋转轴可以包括横滚轴(Roll)、偏航轴(Yaw)和俯仰轴(pitch)。应理解,电机152可以是直流电机,也可以交流电机。另外,电机152可以是无刷电机,也可以是有刷电机。
飞行控制系统160可以包括飞行控制器161和传感系统162。传感系统162用于测量无人机的姿态信息,即无人机100在空间的位置信息和状态信息,例如,三维位置、三维角度、三维速度、三维加速度和三维角速度等。传感系统162例如可以包括陀螺仪、超声传感器、电子罗盘、惯性测量单元(Inertial Measurement Unit,IMU)、视觉传感器、全 球导航卫星系统和气压计等传感器中的至少一种。例如,全球导航卫星系统可以是全球定位系统(Global Positioning System,GPS)。飞行控制器161用于控制无人机100的飞行,例如,可以根据传感系统162测量的姿态信息控制无人机100的飞行。应理解,飞行控制器161可以按照预先编好的程序指令对无人机100进行控制,也可以通过拍摄画面对无人机100进行控制。
无人机100还包括云台120,云台120可以包括电机122。云台用于携带拍摄装置123。飞行控制器161可以通过电机122控制云台120的运动。可选地,作为另一实施例,云台120还可以包括控制器,用于通过控制电机122来控制云台120的运动。应理解,云台120可以独立于无人机100,也可以为无人机100的一部分。应理解,电机122可以是直流电机,也可以是交流电机。另外,电机122可以是无刷电机,也可以是有刷电机。还应理解,云台可以位于无人机的顶部,也可以位于无人机的底部。
拍摄装置123例如可以是照相机或摄像机等用于捕获图像的设备,拍摄装置123可以与飞行控制器通信,并在飞行控制器的控制下进行拍摄,飞行控制器也可以根据拍摄装置123拍摄的图像控制无人机100。本实施例的拍摄装置123至少包括感光元件,该感光元件例如为互补金属氧化物半导体(Complementary Metal Oxide Semiconductor,CMOS)传感器或电荷耦合元件(Charge-coupled Device,CCD)传感器。可以理解,拍摄装置123也可直接固定于无人机100上,从而云台120可以省略。
应理解,上述对于无人机各组成部分的命名仅是出于标识的目的,并不应理解为对本发明的实施例的限制。
图2为本发明一实施例提供的无人机的控制方法的流程图,如图2所示,本实施例的方法可以包括:
S201、获取拍摄装置拍摄的图像。
其中,拍摄装置是无人机的重要组成部分,可以用于拍摄无人机周围的图像,在无人机进入手势控制模式下,无人机可以根据拍摄装置拍摄的图像来控制无人机。在拍摄装置拍摄到图像后,无人机获取拍摄装置拍摄的图像。其中拍摄设备可以为前述部分的拍摄装置123,此处不再赘述。
S202、确定图像中目标对象的手在图像中的位置。
本实施例主要是根据图像中的手来控制无人机,用户可以位于拍摄装置能拍摄到的地方,用户可称为目标对象,拍摄装置拍摄的图像中具有目标对象,因此,本实施例在获取拍摄装置拍摄的图像之后,确定图像中目标对象的手在图像中的位置,该位置可以用图像坐标系统UOV中的像素坐标来表示,其中,如图3所示,图3为本发明一实施例提供的拍摄装置拍摄的图像中目标对象的示意图,其中,可以将图像的左上角作为原点(0,0),图像的长和宽均用像素数为表示,基于此可以确定目标对象的手在图像中的像素坐标。其中,在获取拍摄装置拍摄的目标对象的图像后,可以通过已经训练好的能够识别手的神经网络来识别目标对象的手,具体地,神经网络可以返回目标对象的手在图像的位置。在某些情况中,神经网络可以返回图像中目标对象的手对应的图像区域左上角和右下角的坐标。
其中,在进入手势控制模式后,若图像中拍摄到多个人,则本实施例可以将位置最靠近图像中心位置的人确定为目标对象。
S203、根据手在图像中的位置确定目标对象的手的位置信息。
本实施例中,在确定图像中目标对象的手在图像中的位置之后,根据该手在图像中的位置确定目标对象的手的位置信息。该位置信息可以用三维坐标(x,y,z)来表示,该三维坐标例如可以是无人机的导航坐标系下的坐标,该导航坐标系下的原点O为无人机的起飞点,其中,导航坐标系的X轴的正轴指向朝北方向,导航坐标系的Y轴的正轴指向朝东方向,导航坐标系的Z轴垂直于XOY平面背离地面。在一些实施例中,该三维坐标也可以是其他坐标系统下的坐标,在这里,不作具体的限定。
S204、根据所述目标对象的手的位置信息控制无人机的飞行。
本实施例中,在确定目标对象的手的位置信息之后,根据目标对象的手的位置信息控制无人机的飞行,例如可以控制无人机执行各种飞行操作:控制无人机的飞行高度,或者,可以控制无人机对目标对象环绕飞行,或者,可以控制无人机远离或靠近目标对象飞行,其中,不同的飞行操作中无人机飞行轨迹、无人机实现的功能可以不一样。其中,目标对象的手的不同位置信息可以对应不同的飞行操作,因此,目标对象可以通过控制手处于不同的位置,来控制无人机执行不同的飞行操作。本实施例会控制无人机执行目标对象的手的位置信息对应的飞行操作。
本实施例提供的无人机的控制方法,通过获取拍摄装置拍摄的图像,确定图像中目标对象的手在图像中的位置,根据手在图像中的位置确定目标对象的手的位置信息,根据所述目标对象的手的位置信息控制无人机的飞行。因此本实施例可根据拍摄装置拍摄的图像中目标对象的手来控制无人机的飞行。避免了用户必须操作控制设备才能控制无人机的情况,克服了用户不熟悉控制设备而无法控制无人机的缺陷,简化了控制无人机的控制方式和操作过程,提高了控制无人机飞行的效率,增强了人机互动的娱乐性。
在一些实施例中,上述S203的一种可能的实现方式为:根据手在图像中的位置、承载拍摄装置的云台的姿态、目标对象与无人机之间的水平距离和无人机的位置信息确定目标对象的手的位置信息。其中,承载拍摄装置的云台的姿态决定了拍摄装置的姿态,即决定了拍摄装置的姿态角,如俯仰角和偏航角;因此,承载拍摄装置的云台的姿态不同,也会造成目标对象的手的位置信息不同。目标对象与无人机之间的水平距离决定了手与无人机之间的水平距离,从而影响到手的位置信息,因此,目标对象与无人机之间的水平距离不同,也会造成目标对象的手的位置信息不同。另外,本实施例需要参照无人机的位置信息得到目标对象的手的位置信息,其中,无人机的位置信息可以通过无人机上配置的定位传感器获取,其中,所述定位传感器可以为GPS接收机或北斗接收机,在某些情况下,定位传感器还可以包括惯性测量单元、视觉传感器等。综上,本实施例根据手在图像中的位置、承载拍摄装置的云台的姿态、目标对象与无人机之间的水平距离和无人机的位置信息来确定准确的目标对象的手的位置信息。
具体地,本实施例先根据手在图像中的位置、承载拍摄装置的云台的姿态确定手相对于无人机的朝向;再根据所述朝向、目标对象与无人机之间的水平距离和无人机的位置信息确定目标对象的手的位置信息。其中,拍摄装置具有的视场角(FOV)是已知的,根据手在图像中的位置可以确定手相对于拍摄装置的光轴的角度,例如:若手在图像的正中心,则说明手相对于拍摄装置的光轴的角度为0,若拍摄装置的FOV在水平方向为20度,若手在图像的最左边,则说明手相对于拍摄装置的光轴的水平角度为10度,垂直方向上也 类似,此处不再赘述;而且拍摄装置的云台的姿态也决定了拍摄装置的光轴的朝向,结合手相对于拍摄装置的光轴的角度以及光轴的朝向,可以获得手相对于无人机的朝向。根据目标对象与无人机之间的水平距离可以获得手与无人机之间的水平距离,例如可以将目标对象与无人机之间的水平距离减去一个经验值(例如0.65米),即得到手与无人机之间的水平距离。再根据上述获得的朝向、手与无人机之间的距离以及无人机的位置信息,即可确定手的位置信息。
在一些实施例中,确定目标对象与无人机之间的水平距离可以通过如下方式实现:
一种可行的方式:确定图像中目标对象的脚部在图像中的位置,根据所述脚部在图像中的位置信息和承载拍摄装置的云台的姿态即可以确定目标对象的脚部相对于无人机的朝向,根据所述朝向即可以确定目标对象的脚部相对于无人机在俯仰方向上的角度,然后,可以获取无人机上配置的距离传感器测量的高度值,根据所述在俯仰方向上的角度和距离传感器测量的高度值即可以确定目标对象与无人机之间的水平距离。
另一种可行的方式:确定图像中目标对象的脚部和头部在图像中的位置,脚部和头部在图像中的位置和承载拍摄装置的云台的姿态即可以确定目标对象的脚部和头部相对于无人机的朝向,根据所述朝向即可以确定目标对象的脚部和脚部相对于无人机在俯仰方向上的角度,可以设定目标对象的身高可以为一个经验值,例如1.75m,根据目标对象的脚部和脚部相对于无人机在俯仰方向上的角度和所述设定的目标对象的身高即可以确定目标对象与无人机之间的水平距离。
可以理解的是,目标对象与无人机之间的水平距离可以是将以上两种可行的方式确定的水平距离进行融合之后得到的水平距离。
在一些实施例中,上述S204可以为:根据所述目标对象的手的位置信息控制无人机的飞行高度,例如:可以控制将无人机的飞行高度调高或者得调低。
在一种可能的实现方式中,本实施例可以根据上述S203中确定的目标对象的手的位置信息和无人机的位置信息确定所述手相对于无人机在俯仰方向上的角度;然后根据所述角度控制无人机的飞行高度。
其中,根据目标对象的手的位置信息以及无人机的位置信息,可以确定手相对于无人机在俯仰方向上的角度,
图4为本发明另一实施例提供的无人机的控制方法的流程图,如图4所示,本实施例的方法以根据手的位置信息控制无人机的飞行高度为例,本实施例的方法可以包括:
S301、获取拍摄装置拍摄的图像。
S302、确定图像中目标对象的手在图像中的位置。
S303、根据手在图像中的位置确定目标对象的手的位置信息。
本实施例中,S303的具体实现过程可以参见图2所示实施例中的相关描述,此处不再赘述。
S304、根据所述目标对象的手的位置信息和所述无人机的位置信息确定所述手相对于所述无人机在俯仰方向上的角度。
S305、根据所述角度控制无人机的飞行高度。
本实施例中,在确定目标对象的手的位置信息之后,根据目标对象的手的位置信息和无人机的位置信息确定手相对于无人机在俯仰方向上的角度,然后根据该角度确定无人机 的期望高度,然后控制无人机飞行至该期望高度。其中,可以根据该角度以及目标对象与无人机之间的水平距离,确定该无人机的期望高度,例如:将目标对象与无人机之间的水平距离减去一个经验值,获得目标对象的手与无人机之间的水平距离,然后根据该角度和该目标对象的手与无人机之间的水平距离,确定无人机的期望高度,其中,该无人机的期望高度可以与目标对象的手处于同一高度。其中,图5为本发明一实施例提供的控制无人机的高度的示意图,如图5所示,根据目标对象的手的位置信息和无人机的位置信息可以确定目标对象的手相对于无人机在俯仰方向上的角度为α1,其中,目标对象与无人机之间的水平距离为D1,则可以获得目标对象的手与无人机之间的水平距离为d1,d1=D1-0.65m;另外,无人机的当前高度为Pc1,由图5可以获得无人机从当前高度Pc1飞行至期望高度Pt1的高度差为h1,其中,h1=d1*tanα1,因此,可以获得无人机的期望高度为Pt1=Pc1+h1。由图5可知,本实施例可以控制无人机飞行至与手处于同一高度。
在一些实施例中,在目标对象的状态参数满足第一预设要求时,本实施例执行上述S304。其中,所述目标对象的状态参数满足第一预设要求包括:目标对象在所述图像中的尺寸占比大于或等于预设第一占比阈值;和/或,所述目标对象与所述无人机的距离小于或等于预设第一距离。本实施例中,可以判断目标对象在图像中的尺寸占比是否小于预设第一占比阈值,在目标对象在图像中的尺寸占比大于或等于预设第一占比阈值时,执行上述S304,其中,目标对象在图像中的尺寸占比越大,说明目标对象与无人机之间的距离越近。也可以判断目标对象与无人机的距离是否大于预设第一距离,在目标对象与无人机的距离小于或等于预设第一距离时,执行上述S304,其中,目标对象与无人机的距离可以通过无人机上配置的双目摄像头进行测距的方式获得。因此,在目标对象的状态参数满足第一预设要求时,说明目标对象与无人机之间的距离较近,处于近场状态,此时无人机可以准确地获得手相对于无人机在俯仰方向上的角度,从而精确地控制无人机的飞行高度。
本实施例中,通过上述方案,实现了通过拍摄图像来控制无人机的飞行高度,避免了用户必须操作控制设备才能控制无人机的情况,克服了用户不熟悉控制设备而无法控制无人机的缺陷,简化了控制无人机的控制方式和操作过程,提高了控制无人机飞行的效率,增强了人机互动的娱乐性。
图6为本发明另一实施例提供的无人机的控制方法的流程图,如图6所示,本实施例的方法以根据手的位置信息控制无人机的飞行高度为例,本实施例的方法可以包括:
S401、获取拍摄装置拍摄的图像。
S402、确定图像中目标对象的手在图像中的位置。
S403、根据手在图像中的位置确定目标对象的手的位置信息。
本实施例中,S403的具体实现过程可以参见图2所示实施例中的相关描述,此处不再赘述。
S404、根据目标对象的预设部位的位置信息和所述手的位置信息确定所述手相对于所述预设部位在俯仰方向上的角度。
S405、根据所述角度控制无人机的飞行高度。
本实施例中,在确定目标对象的手的位置信息之后,根据目标对象的手的位置信息和目标对象的预设部位的位置信息确定手相对于该预设部位在俯仰方向上的角度,然后根据 该角度确定无人机的期望高度,然后控制无人机飞行至该期望高度。
其中,该预设部位例如可以是:头部、肩部、胸部中的至少一个部位。其中,目标对象的预设部位的位置信息可以根据目标对象的位置信息确定的,以预设部位为头部为例,则可以将目标对象的最上方的五分之一作为头部,从而根据目标对象的位置信息,确定目标对象中最上方的五分之一的位置信息。
在一些实施例中,本实施例在获取拍摄装置拍摄的图像之后,还确定图像中目标对象在图像中的位置,然后根据该目标对象在图像中的位置确定目标对象的位置信息。具体地,本实施例可以是根据所述目标对象在所述图像中的位置、承载所述拍摄装置的云台的姿态、所述目标对象与所述无人机之间的水平距离和所述无人机的位置信息确定目标对象的位置信息。其中,确定目标对象的位置信息与确定手的位置信息的过程类似,此处不再赘述。
其中,可以根据该角度以及目标对象与无人机之间的水平距离,确定该无人机的期望高度,例如:根据该角度和该目标对象与无人机之间的水平距离,确定无人机的期望高度。这里为了方便说明,以预设部位为头部为例来说明。
在一种实现方式中,图7为本发明另一实施例提供的控制无人机的高度的示意图,如图7所示,根据目标对象的手的位置信息和目标对象的预设部位的位置信息可以确定目标对象的手相对于预设部位在俯仰方向上的角度为α2,其中,目标对象与无人机之间的水平距离为D2,则可以获得无人机的期望高度与目标对象的预设部位之间的高度差为D2*tanα2,而且无人机的当前高度Pc2与目标对象的预设部位之间的高度差为△h,由图7可以获得无人机从当前高度Pc2飞行至期望高度Pt2的高度差为h2,其中,h2=△h+D2*tanα2,因此,可以获得无人机的期望高度为Pt2=Pc2+h2。由图7可知,本实施例可以控制无人机飞行至与目标对象的手与预设部位处于同一连线上。
在另一种实现方式中,可以根据手的移动来控制无人机的飞行高度。图8为本发明另一实施例提供的控制无人机的高度的示意图,如图8所示,根据目标对象的手移动前的位置信息和目标对象的预设部位的位置信息可以确定目标对象的手在移动前相对于预设部位在俯仰方向上的角度为α3,根据目标对象的手移动后的位置信息和目标对象的预设部位的位置信息可以确定目标对象的手在移动后相对于预设部位在俯仰方向上的角度为α3',因此,目标对的手相对于预设部位在俯仰方向上的角度的变化量为α3+α3',需要说明的是,若手在移动前和移动后均位于预设部位的下方或者上方,则上述角度的变化量为︱α3-α3'︱。其中,目标对象与无人机之间的水平距离为D3,因此,根据该角度的变化量以及目标对象与无人机之间的水平距离,可以获得手移动前与目标对象之间的连线在无人机所在垂直方向(Z坐标)上的交点P0,与,手移动后与目标对象之间的连线在无人机的垂直方向上的交点P0'之间的高度差为h3=P0'-P0。并将该高度差h3作为无人机的期望高度Pt3与无人机的当前高度Pc3之间的高度差,因此,可以获得无人机的期望高度为Pt3=Pc3+h3,其中,如图8所示,h3=D2*(tanα3+tanα3'),从而Pt3=Pc3+D2*(tanα3+tanα3')。由图8可知,本实施例可以根据手的高度变化映射到无人机所在垂直方向上的高度变化,从而根据该无人机所在垂直方向上的高度变化来控制无人机的飞行高度。
在一些实施例中,在目标对象的状态参数满足第二预设要求时,本实施例执行上述S404。其中,所述目标对象的状态参数满足第二预设要求包括:目标对象在所述图像中的 尺寸占比小于或等于预设第二占比阈值;和/或,所述目标对象与所述无人机的距离大于或等于预设第二距离。本实施例中,可以判断目标对象在图像中的尺寸占比是否大于预设第二占比阈值,在目标对象在图像中的尺寸占比小于或等于预设第二占比阈值时,执行上述S404,其中,目标对象在图像中的尺寸占比越小,说明目标对象与无人机之间的距离越远。也可以判断目标对象与无人机的距离是否小于预设第二距离,在目标对象与无人机的距离大于或等于预设第二距离时,执行上述S404。因此,在目标对象的状态参数满足第二预设要求时,说明目标对象与无人机之间的距离较远,处于远场状态,此时无人机可以准确地获得手相对于目标对象的预设部位在俯仰方向上的角度,从而精确地控制无人机的飞行高度。
在一些实施例中,上述的预设第二占比阈值可以等于上述的预设第一占比阈值。上述的预设第二距离可以等于上述的预设第一距离。相应地,图4与图6所示的实施例可以结合,即在目标对象的状态参数满足第二预设要求时,执行图6所示实施例,在目标对象的状态参数满足第一预设要求时,执行图4所示实施例。
本实施例中,通过上述方案,实现了通过拍摄图像来控制无人机的飞行高度,避免了用户必须操作控制设备才能控制无人机的情况,克服了用户不熟悉控制设备而无法控制无人机的缺陷,简化了控制无人机的控制方式和操作过程,提高了控制无人机飞行的效率,增强了人机互动的娱乐性。
图9为本发明另一实施例提供的无人机的控制方法的流程图,如图9所示,本实施例的方法以根据手的位置信息控制无人机对目标对象进行环绕飞行为例,本实施例的方法可以包括:
S501、获取拍摄装置拍摄的图像。
S502、确定图像中目标对象的手在图像中的位置。
S503、根据手在图像中的位置确定目标对象的手的位置信息。
本实施例中,S501-S503的具体实现过程可以参见图2所示实施例中的相关描述,此处不再赘述。
S504、根据目标对象的手的位置信息和目标对象的位置信息确定手相对于目标对象在偏航方向上的角度。
S505、根据所述角度控制无人机对目标对象进行环绕飞行。
本实施例中,在确定目标对象的手的位置信息之后,根据目标对象的手的位置信息和目标对象的位置信息确定手相对于该目标对象在偏航方向上的角度,然后根据该角度确定无人机相对于目标对象在偏航方向上的期望角度,然后控制无人机对目标对象环绕飞行至该期望角度。在某些实施例中,在确定目标对象的手的位置信息之后,根据目标对象的手的位置信息和目标对象的位置信息确定手相对于该目标对象在偏航方向上的角度,根据所述角度确定无人机的期望位置信息,然后再控制无人机然后控制无人机对目标对象环绕飞行至该期望位置信息。
在一种实现方式中,图10为本发明一实施例提供的控制无人机对目标对象环绕飞行的示意图,如图10所示,根据目标对象的手的位置信息和目标对象的位置信息可以确定目标对象的手相对于目标对象在偏航方向上的角度为β1,本实施例可以将该角度β1作为无人机相对于目标对象在偏航方向上的期望角度,然后本实施例将无人机从当前位置Pc4 对目标对象环绕飞行至期望位置Pt4,无人机环绕飞行至期望位置Pt4时,该无人机相对于目标对象在偏航方向上的角度为β1。其中,无人机相对于目标对象在偏航方向上的当前角度为β2,由于已确定无人机相对于目标对象在偏航方向上的期望角度为β1,因此,可以获得无人机对目标对象环绕飞行的角度为△β=β1-β2。由图10可知,本实施例可以控制无人机飞行对目标对象环绕飞行至与目标对象的手相对于目标对象处于同一偏航方向上。
在另一种实现方式中,可以根据手的移动来控制无人机对目标对环绕飞行。图11为本发明另一实施例提供的控制无人机对目标对象环绕飞行的示意图,如图11所示,根据目标对象的手移动前的位置信息和目标对象的位置信息可以确定目标对象的手在移动前相对于目标对象在偏航方向上的角度为β3,根据目标对象的手移动后的位置信息和目标对象的位置信息可以确定目标对象的手在移动后相对于目标对象在偏航方向上的角度为β3',因此,目标对的手相对于目标对象在偏航方向上的角度的变化量△β=β3'-β3。本实施例可以将该角度变化量△β作为无人机对目标对象环绕飞行的角度。其中,无人机相对于目标对象在偏航方向上的当前角度为β4,然后本实施例将无人机从当前位置Pc5对目标对象环绕飞行至期望位置Pt5,无人机环绕飞行至期望位置Pt5时,该无人机相对于目标对象在偏航方向上的角度为β4',其中,β4'=β4+△β。由图11可知,本实施例可以根据手相对于目标对象在偏航方向上的角度变化来控制无人机对目标对象环绕飞行的角度。其中,无人机对目标对象环绕飞行的方向可以与手相对于目标对象移动的方向相同。
本实施例中,通过上述方案,实现了通过拍摄图像来控制无人机对目标对象环绕飞行,避免了用户必须操作控制设备才能控制无人机的情况,克服了用户不熟悉控制设备而无法控制无人机的缺陷,简化了控制无人机的控制方式和操作过程,提高了控制无人机飞行的效率,增强了人机互动的娱乐性。
图12为本发明另一实施例提供的无人机的控制方法的流程图,如图12所示,本实施例的方法以根据手的位置信息控制无人机远离或靠近目标对象飞行为例,本实施例的方法可以包括:
S601、获取拍摄装置拍摄的图像。
本实施例中,S601-S603的具体实现过程可以参见图2所示实施例中的相关描述,此处不再赘述。
S602、确定图像中目标对象的两个手在图像中的位置。
本实施例中确定的是图像中目标对象的两个手(左手和右手)在图像中的位置,由于上述实施例中已描述了如何确定手在图像中的位置的方案,因此,确定两个手中每个手在图像的位置可以参见上述实施例的记载,此处不再赘述。
S603、根据两个手在图像中的位置确定目标对象的两个手的位置信息。
在确定两个手在图像中的位置之后,再根据每个手在图像中的位置确定每个手的位置信息,具体实现过程可以参见上述实施例中根据手在图像中的位置确定目标对象的手的位置信息的相关描述,此处不再赘述。
S604、根据两个手的位置信息控制无人机远离或靠近目标对象飞行。
本实施例中,在确定目标对象的两个手的位置信息之后,根据这两个手的位置信息控制无人机远离或靠近目标对象飞行。其中,本实施例可以根据所述两个手的位置信息确定 两个手之间的距离,该两个手之间的距离例如可以是这两个手在水平方向上的距离;然后根据所述距离控制无人机远离或靠近目标对象飞行,例如:可以控制无人机沿与目标对象之间的连线方向远离或靠近目标对象飞行,或者,可以保持无人机的高度不变控制无人机沿水平方向远离或者目标对象飞行。
在一种可能的实现方式中,图13为本发明一实施例提供的控制无人机远离或靠近目标对象飞行的示意图,如图13所示,根据目标对象两个手之间的距离d2,确定无人机与目标对象之间的期望距离D4',其中d2与D4'满足一定的函数关系,例如:D4'=d2*C1,C1为预设值。若无人机与目标对象之间的当前距离D4大于期望距离D4',则距离差值△D=D4-D4',因此本实施例控制无人机从当前位置Pc6以靠近目标对象的方向飞行距离△D到达期望位置Pt6。若无人机与目标对象之间的当前距离D4小于期望距离D4',则距离差值△D=D4'-D4,因此本实施例控制无人机从当前位置Pc6以远离目标对象的方向飞行距离△D到达期望位置Pt6。其中,图13中示出无人机靠近目标对象飞行。
在另一种实现方式中,可以根据两个手的相对移动来控制无人机远离或靠近目标对象飞行。图14为本发明另一实施例提供的控制无人机远离或靠近目标对象飞行的示意图,如图14所示,根据目标对象的两个手相对移动前的位置信息,确定两个手相对移动前之间的距离;以及根据目标对象的两个手相对移动后的位置信息,确定两个手相对移动后之间的距离。若两个手相向移动,即两个手之间的距离减少,如图14所示,两个手相对移动前之间的距离为d3,两个手相对移后之间的距离d4,然后根据d3和d4可以确定两个手之间的距离的变化量△d=d3-d4,然后根据两个手之间的距离的变化量△d,确定无人机与目标对象之间的距离的变化量△D,其中△D与△d满足一定的函数关系,例如:△D=△d*C2,C2为预设值;然后再控制无人机靠近目标对象飞行△D。若两个手背向移动,即两个手之间的距离增加,如图14所示,两个手相对移动前之间的距离为d4,两个手相对移后之间的距离d3,然后根据d3和d4可以确定两个手之间的距离的变化量△d=d3-d4,然后根据两个手之间的距离的变化量△d,确定无人机与目标对象之间的距离的变化量△D,其中△D与△d满足一定的函数关系,例如:△D=△d*C2,C2为预设值;然后再控制无人机远离目标对象飞行△D。在一些实施例中,也可以是两个手之间的距离减少时,控制无人机远离目标对象飞行,两个手之间的距离增大时,控制无人机靠近目标对象飞行。
在一些实施例中,本实施例还限定了无人机远离目标对象的最大距离,以及无人机靠近目标对象的最小距离。在控制无人机远离目标对象飞行时,还检测无人机与目标对象之间的距离,若该距离大于或等于最大距离时,控制无人机停止远离目标对象飞行。在控制无人机靠近目标对象飞行时,还检测无人机与目标对象之间的距离,若该距离小于或等于最小距离时,控制无人机停止靠近目标对象飞行。
本实施例中,通过上述方案,实现了通过拍摄图像来控制无人机远离或者靠近目标对象飞行,避免了用户必须操作控制设备才能控制无人机的情况,克服了用户不熟悉控制设备而无法控制无人机的缺陷,简化了控制无人机的控制方式和操作过程,提高了控制无人机飞行的效率,增强了人机互动的娱乐性。
在一些实施例中,在上述各实施例的基础上,本实施例还可以识别拍摄装置拍摄的图像中目标对象的手的手势。相应地,本实施例根据所述目标对象的手的位置信息控制无人机的飞行的一种方式为:当所述手的手势为预设手势时,根据所述目标对象的手的位置信 息控制无人机的飞行。当识别到的目标对象的手的手势为预设手势时,本实施例再根据目标对象的手的位置信息控制无人机的飞行高度、控制无人机对目标对象环绕飞行、控制无人机远离或靠近目标对象飞行等。其中,该预设手势例如为:ok,yeah,伸出手掌等手势。
图15为本发明另一实施例提供的无人机的控制方法的流程图,如图15所示,本实施例的方法可以包括:
S701、获取拍摄装置拍摄的图像。
本实施例中,S701的具体实现过程可以参见图2所示实施例中的相关描述,此处不再赘述。
S702、确定图像中目标对象的特征部位。
本实施例中,需要从图像中识别目标对象,其中,目标对象的识别可以通过识别目标对象的身体的特征部位。因此,从拍摄装置拍摄的图像中确定图像中目标对象的特征部位,该特征部位可以用于表征目标对象,该目标对象为人时,该特征部位可以是人体的头部、人体的头部和肩部、人体中的至少一种。
在一些实施例中,在进入手势控制模式后,可以先识别图像中的特征部位,然后从图像中的特征部位确定目标对象的特征部位。某些情况中,有可能拍摄到的图像中存在多个人物,在识别特征部位时,也可能将这些人物的特征部位也识别到,因此,本实施例需要从图像中的特征部位确定目标对象的特征部位,例如可以是将距离图像的中心最近的特征部位确定为目标对象的特征部位,以使得目标对象最靠近图像的中心。通过这种方式,在进入手势控制模式后,即可以从图像中找到目标对象的特征部位。在图像中找到目标对象的特征部位后,当拍摄装置获取到新的图像后,即可以利用追踪算法从新的图像中找到目标对象的特征部位。例如,以上一帧图像中目标对象的特征部位在图像中的位置确定一个目标图像区域,在下一帧图像中的目标图像区域中找到一个与上一帧中目标对象的特征部位最相似的图像区域作为所述下一帧中的目标对象的特征部位。
S703、识别图像中的手。
本实施例中,还从拍摄装置拍摄的图像中识别图像中的手。其中,S702与S703的执行顺序不分先后。
S704、根据所述目标对象的特征部位从图像中识别的手中确定目标对象的手。
本实施例中,在确定图像中目标对象的特征部位以及识别图像中的手之后,根据目标对象的特征部位从图像中识别的手中确定目标对象的手。有可能拍摄到的图像中存在多个手,在识别手时,也可能将这些手均识别到,但是其中一些手并不是目标对象的手,因此,本实施例根据目标对象的特征部位从图像中的手中确定该目标对象的手。其中,本实施例可以根据目标对象的特征部位确定目标对象的关节点,这些关节点包括:手的关节点、胳膊的关节点、头的关节点、肩的关节点等,再根据目标对象的关节点从图像中识别的手中确定目标对象的手。其中,可以从目标对象的关节点中确定出目标对象的手的关节点,本实施例可以确定从图像中识别的手中距离该目标对象的关节点中手的关节点最近的手,然后将该距离手的关节点最近的手确定为目标对象的手。
S705、当所述目标对象的手的手势为控制手势时,控制无人机执行所述手势指示的动作。
本实施例中,在识别到目标对象的手之后,当该目标对象的手的手势为控制手势时, 控制无人机执行手势指示的动作,例如:控制无人机起飞、降落、控制无人机的飞行高度、控制无人机对目标对象环绕飞行、控制无人机远离或靠近目标对象飞行、拍照或者录像等等。其中,该控制手势例如为ok,yeah,伸出手掌等手势。
其中,控制无人机的飞行高度、控制无人机对目标对象环绕飞行、控制无人机远离或靠近目标对象飞行可以参见上述各实施例中的记载。
下面对控制无人机起飞和降落进行描述。
在一些实施例中,在该目标对象的手的手势为起飞手势时,该起飞手势为控制手势,该起飞手势指示无人机起飞,本实施例根据手势为起飞手势,控制无人机起飞。可选地,本实施例控制无人机起飞并悬停在预设高度,其中,如图16所示,图16为本发明一实施例提供的控制无人机起飞的示意图。在一些实施例中,本实施例在执行上述S701之后还检测用户的第一操作,在检测到用户的第一操作后,控制承载拍摄装置的云台以带动拍摄装置在预设的角度范围内扫描,其中,云台的转动带动拍摄装置的转动,从而使得拍摄装置在预设的角度范围内扫描以拍摄预设的角度范围内的图像,以便执行后续S701-S705的操作。其中,所述第一操作包括:单击或双击电池开关、晃动所述无人机、向无人机发出语音指令中的至少一种,本实施例通过这些操作使得无人机进入手势识别模式,在无人机进行手势识别模式后控制承载拍摄装置的云台以带动拍摄装置在预设的角度范围内扫描,以便识别到目标对象的起飞手势。
在一些实施例中,在识别到起飞手势之后,还需要在该起飞手势为稳定的手势之后,才控制无人机起飞。其中,本实施例在执行S701时是获取拍摄装置的多帧图像。在获取到多帧图像之后,本实施例还确定多帧图像中每一帧中目标对象的手在图像中的位置,其中如何确定目标对象的手在图像中的位置可以参见上述各实施例中的描述,此处不再赘述。然后在目标对象的手为起飞手势时,确定每一帧图像中目标对象的手在图像的位置是否在参考位置的预设范围内,若每一帧图像中该目标对象的手在图像中的位置在参考位置的预设范围内时,说明该起飞手势为稳定的手势,目标对象要控制无人机起飞,然后本实施例控制无人机起飞;若每一帧图像中该目标对象的手在图像中的位置不在参考位置的预设范围内时,说明该起飞手势不是稳定的手势,目标对象是误操作,不是要控制无人机起飞,然后本实施例忽略该手势,即不控制无人机起飞。在一些实施例中,该参考位置为上一帧图像中目标对象的手在图像中的位置,也说明每一帧图像中手的位置稳定。
在一些实施例中,在该目标对象的手的手势为降落手势时,该降落手势为控制手势,该降落手势指示无人机降落,本实施例根据手势为降落手势,控制无人机降落,其中,如图17所示,图17为本发明一实施例提供的控制无人机降落的示意图。
在第一种可能的实现方式中,本实施例还获取距离传感器测量得到的高度值,该高度值表示无人机的飞行高度。本实施例还根据该高度值来判断无人机是否可以降落,具体过程为:在目标对象的手的手势为降落手势时并且该高度值小于或等于预设高度阈值时,控制无人机降落。若该高度值大于预设高度阈值时,说明该无人机的飞行高度较高,当前状况下不适合无人机降落,为了保证无人机的飞行安全,在该高度值大于预设高度阈值时,忽略该降落手势,即不控制无人机降落。
在第二种可能的实现方式中,本实施例还检测无人机下方地面的平整度,其中,该平整度可以通过双目摄像头来检测。本实施例还根据该平整度来判断无人机是否可以降落, 具体过程为:在目标对象的手的手势为降落手势时并且该平整度大于或等于预设平整度阈值时,控制无人机降落。若该平整度小于预设平整度阈值时,说明该无人机的下方地面不够平整,无法保证无人机安全降落,因此,在该平整度小于预设平整度阈值时,忽略该降落手势,即不控制无人机降落。
在第三种可能的实现方式中,本实施例还检测无人机下方是否存在水面。本实施例还根据该无人机下方是否存在水面来判断无人机是否可以降落,具体过程为:在目标对象的手的手势为降落手势时并且该无人机下方不存在水面时,控制无人机降落。若该无人机下方存在水面时,无人机停在水面上后会掉入水中,造成无人机损坏,因此,在该无人机下方存在水面时,忽略该降落手势,即不控制无人机降落。
在第四种可能的实现方式中,本实施例还检测无人机的飞行速度,无人机的飞行速度可以通过速度传感器来检测。本实施例还根据该无人机的飞行速度来判断无人机是否降落,具体过程为:在目标对象的手的手势为降落手势时并且该无人机的飞行速度小于或等于预设速度阈值时,控制无人机降落。若该无人机的飞行速度大于预设速度阈值时,为了避免无人机在降落在地面上仍有飞行速度造成无人机损坏,因此,在该无人机的飞行速度大于预设速度阈值时,忽略该降落手势,即不控制无人机降落。
在第五种可能的实现方式中,本实施例还检测目标对象的手的高度是否低于目标对象的头部的高度,其中,手的高度可以通过手的位置信息来确定,目标对象的头部的高度可以通过目标对象的位置信息来确定,如何确定手的位置信息和目标对象的位置信息可以参见上述实施例中的相关描述,此处不再赘述。本实施例还根据手的高度与头部的高度之间的关系来判断无人机是否降落,具体过程为:在目标对象的手的手势为降落手势时并且该目标对象的手的高度低于目标对象的头部的高度时,控制无人机降落。若目标对象的手的手势为降落手势时并且该目标对象的手的高度不低于目标对象的头部的高度,则忽略该降落手势,即不控制无人机降落。
需要说明的是,上述第一种至第五种可能的实现方式中的至少两种可能的实现方式也可以结合来控制无人机降落。
图18为本发明另一实施例提供的无人机的控制方法的流程图,如图18所示,本实施例的方法可以包括:
S801、获取拍摄装置拍摄的图像。
本实施例中,S801的具体实现过程可以参见图2所示实施例中的相关描述,此处不再赘述。
S802、识别图像中目标对象的特征部位。
本实施例中,识别图像中目标对象的特征部位的具体实现过程可以参见图15所示实施例中的相关描述,此处不再赘述。
S803、识别图像中目标对象的手。
本实施例中,识别图像中目标对象的手的具体实现过程可以参见图15所示实施例中的相关描述,此处不再赘述。其中,S802与S803的执行顺序不分先后。
S804、当识别出目标对象的特征部位且识别不到目标对象的手时,控制无人机对目标对象进行跟踪以使所述目标对象在拍摄装置的拍摄画面中。
本实施例中,当通过S802识别目标对象的特征部位,识别出目标对象的特征部位, 以及当通过S803识别目标对象的手,识别不到目标对象的手时,说明拍摄的图像中存在目标对象但不存在手,无需根据手来控制无人机,然后控制无人机进行无手跟随模式,即控制无人机对目标对象进行跟踪以使该拍摄装置能拍摄到该目标对象,并且该目标对象在拍摄装置的拍摄画面中。
在一些实施例中,为了使得目标对象在拍摄装置的拍摄画面中,本实施例可以通过调整无人机的位置信息、姿态和承载拍摄装置的云台的姿态中的至少一种,控制无人机对目标对象进行跟踪。
在一些实施例中,图18所示实施例可以与上述图2-图15任一实施例结合,即如果识别不到目标对象的手时,执行图18所示实施例的方案,如果识别到对目标对象的手,则执行上述图2-图15任一种实施例的方案。
本实施例中,通过上述方案,实现了通过拍摄图像来控制无人机对目标对象的跟随,避免了用户必须操作控制设备才能控制无人机的情况,克服了用户不熟悉控制设备而无法控制无人机的缺陷,简化了控制无人机的控制方式和操作过程,提高了控制无人机飞行的效率,增强了人机互动的娱乐性。
在一些实施例中,上述各实施例中的特征部位可以是指:人体的头部、人体的头部和肩部、人体中的至少一种。
在一些实施例中,在目标对象的状态参数满足预设的第一状态参数条件时,所述特征部位为人体的头部和肩部。其中,所述预设的第一状态参数条件包括:所述目标对象在图像中的尺寸占比大于或等于预设第一占比阈值,和/或,所述目标对象与所述无人机的距离小于或等于预设第一距离。本实施例中,可以判断目标对象在图像中的尺寸占比是否小于预设第一占比阈值,在目标对象在图像中的尺寸占比大于或等于预设第一占比阈值时,上述各实施例中的特征部位为人体的头部和肩部。其中,目标对象在图像中的尺寸占比越大,说明目标对象与无人机之间的距离越近。也可以判断目标对象与无人机的距离是否大于预设第一距离,在目标对象与无人机的距离小于或等于预设第一距离时,上述各实施例中的特征部位为人体的头部和肩部,其中,目标对象与无人机的距离可以通过双目摄像头进行测距的方式获得。因此,在目标对象的状态参数满足第一预设要求时,说明目标对象与无人机之间的距离较近,处于近场状态,此时无人机可以准确识别到目标对象的头部和肩部。
在一些实施例中,在目标对象的状态参数满足预设的第二状态参数条件时,所述特征部位为人体。其中,所述预设的第二状态参数条件包括:目标对象在所述图像中的尺寸占比小于或等于预设第二占比阈值;和/或,所述目标对象与所述无人机的距离大于或等于预设第二距离。本实施例中,可以判断目标对象在图像中的尺寸占比是否大于预设第二占比阈值,在目标对象在图像中的尺寸占比小于或等于预设第二占比阈值时,上述各实施例的特征部位为人体,其中,目标对象在图像中的尺寸占比越小,说明目标对象与无人机之间的距离越远。也可以判断目标对象与无人机的距离是否小于预设第二距离,在目标对象与无人机的距离大于或等于预设第二距离时,上述各实施例的特征部位为人体。因此,在目标对象的状态参数满足第二预设要求时,说明目标对象与无人机之间的距离较远,处于远场状态,此时无人机可以识别到目标对象的人体。
在一些实施例中,上述的预设第一占比阈值可以等于上述的预设第二占比阈值。上述 的预设第一距离可以等于上述的预设第二距离。
综上所述,本发明实施例中可以直接根据拍摄装置拍摄的图像中的手来控制无人机的飞行,包括起飞、飞行高度、环绕飞行、远离或靠近、跟随、降落等一系列过程,避免了用户必须操作控制设备才能控制无人机的情况,克服了用户不熟悉控制设备而无法控制无人机的缺陷,简化了控制无人机的控制方式和操作过程,提高了控制无人机飞行的效率,增强了人机互动的娱乐性。
本发明实施例中还提供了一种计算机存储介质,该计算机存储介质中存储有程序指令,所述程序执行时可包括上述各实施例中的无人机的控制方法的部分或全部步骤。
图19为本发明一实施例提供的无人机的控制设备的一种结构示意图,如图19所示,本实施例的无人机的控制设备1900可以包括:存储器1901和处理器1902。上述存储器1901与处理器1902通过总线连接。存储器1901可以包括只读存储器和随机存取存储器,并向处理器1902提供指令和数据。存储器1901的一部分还可以包括非易失性随机存取存储器。
上述处理器1902可以是中央处理单元(Central Processing Unit,CPU),该处理器1902还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现成可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
其中,所述存储器1901用于存储程序代码;
在一些实施例中,所述处理器1902,用于调用所述程序代码执行:
获取拍摄装置拍摄的图像;
确定所述图像中目标对象的手在所述图像中的位置;
根据所述手在所述图像中的位置确定所述目标对象的手的位置信息;
根据所述目标对象的手的位置信息控制无人机的飞行。
可选的,所述处理器1902,具体用于:
根据所述手在所述图像中的位置、承载所述拍摄装置的云台的姿态、所述目标对象与所述无人机之间的水平距离和所述无人机的位置信息确定所述目标对象的手的位置信息。
可选的,所述处理器1902,具体用于:
根据手在图像中的位置、承载拍摄装置的云台的姿态确定手相对于无人机的朝向;
根据所述朝向、目标对象与无人机之间的水平距离和无人机的位置信息确定目标对象的手的位置信息。
可选的,所述处理器1902,具体用于:根据所述目标对象的手的位置信息控制无人机的飞行高度。
可选的,所述处理器1902,具体用于:
根据所述目标对象的手的位置信息和所述无人机的位置信息确定所述手相对于所述无人机在俯仰方向上的角度;
根据所述角度控制所述无人机的飞行高度。
可选的,所述处理器1902,具体用于:
当所述目标对象的状态参数满足第一预设要求时,根据所述目标对象的手的位置和所述无人机的位置信息确定所述手相对于所述无人机在俯仰方向上的角度。
可选的,所述目标对象的状态参数满足第一预设要求包括:
所述目标对象在所述图像中的尺寸占比大于或等于预设第一占比阈值;和/或,
所述目标对象与所述无人机的距离小于或等于预设第一距离。
可选的,所述处理器1902,具体用于:
根据所述目标对象的预设部位的位置信息和所述手的位置信息确定所述手相对于所述预设部位在俯仰方向上的角度;
根据所述角度控制所述无人机的飞行高度。
可选的,所述预设部位包括头部、肩部、胸部中的至少一个部位。
可选的,所述处理器1902,具体用于:
当目标对象的状态参数满足第二预设要求时,根据目标对象的预设部位的位置信息和所述手的位置信息确定所述手相对于所述特征部位在俯仰方向上的角度。
可选的,所述目标对象的状态参数满足第二预设要求包括:
所述目标对象在所述图像中的尺寸占比小于或等于预设第二占比阈值;和/或,
所述目标对象与所述无人机的距离大于或等于预设第二距离。
可选的,所述预设部位的位置信息是根据所述目标对象的位置信息确定的。
可选的,所述处理器1902,具体用于:
根据所述目标对象的手的位置信息控制所述无人机对所述目标对象进行环绕飞行。
可选的,所述处理器1902,具体用于:
根据所述目标对象的手的位置信息和所述目标对象的位置信息确定所述手相对于所述目标对象在偏航方向上的角度;
根据所述角度控制所述无人机对所述目标对象进行环绕飞行。
可选的,所述处理器1902,具体用于:
根据所述目标对象的手的位置信息控制所述无人机远离或靠近所述目标对象飞行。
可选的,所述处理器1902具体用于:
确定所述目标对象的两个手在所述图像中的位置;
根据所述两个手在所述图像中的位置确定所述目标对象的两个手的位置信息;
根据所述两个手的位置信息控制所述无人机远离或靠近所述目标对象飞行。
可选的,所述处理器1902具体用于:
根据所述两个手的位置信息确定所述两个手之间的距离;
根据所述距离控制所述无人机远离或靠近所述目标对象飞行。
可选的,所述处理器还用于:
确定所述图像中所述目标对象在所述图像中的位置;
根据所述目标对象在所述图像中的位置确定所述目标对象的位置信息。
可选的,所述处理器1902具体用于:
根据所述目标对象在所述图像中的位置、承载拍摄装置的云台的姿态、所述目标对象与所述无人机之间的水平距离和所述无人机的位置信息确定所述目标对象的位置信息。
可选的,所述处理器1902还用于:识别图像中目标对象的手的手势;
所述处理器在根据所述目标对象的手的位置信息控制无人机的飞行时,具体用于:当所述手的手势为预设手势时,根据所述目标对象的手的位置信息控制无人机的飞行。
本实施例的设备,可以用于执行图2-图12及其对应实施例的技术方案,其实现原理和技术效果类似,此处不再赘述。
在一些实施例中,所述处理器1902,用于调用所述程序代码执行:
获取拍摄装置拍摄的图像;
确定所述图像中目标对象的特征部位;
识别所述图像中的手;
根据所述目标对象的特征部位从所述图像中识别的手中确定所述目标对象的手;
当所述目标对象的手的手势为控制手势时,控制无人机执行所述手势指示的动作。
可选的,所述处理器1902,具体用于:
当所述手势为起飞手势时,控制所述无人机起飞。
可选的,所述处理器1902,具体用于:
控制所述无人机起飞并悬停在预设高度。
可选的,所述处理器1902,还用于:在检测到用户的第一操作后,控制承载拍摄装置的云台以带动所述拍摄装置在预设的角度范围内扫描。
可选的,所述第一操作包括:单击或双击电池开关、晃动所述无人机、向所述无人机发出语音指令中的至少一种。
可选的,所述处理器1902,具体用于:
获取所述拍摄装置拍摄的多帧图像;
确定所述多帧图像中每一帧中所述目标对象的手在所述图像中的位置;
当所述目标对象的手势为起飞手势且每一帧图像中所述目标对象的手在所述图像中的位置在参考位置的预设范围内时,控制所述无人机起飞。
可选的,所述参考位置为上一帧图像中所述目标对象的手在所述图像中的位置。
可选的,所述处理器1902,具体用于:
当所述手势为降落手势时,控制所述无人机降落。
可选的,所述处理器1902还用于:
获取距离传感器测量得到的高度值;
所述处理器1902用于当所述手势为降落手势时,控制所述无人机降落,包括:所述处理器用于当所述手势为降落手势且所述高度值小于或等于预设高度阈值时,控制所述无人机降落。
可选的,所述处理器1902还用于:检测所述无人机下方地面的平整度;
所述处理器1902用于当所述手势为降落手势时,控制所述无人机降落包括:
所述处理器1902用于当所述手势为降落手势且所述平整度大于或等于预设平整度阈值时,控制所述无人机降落。
可选的,所述处理器1902还用于:检测所述无人机下方是否存在水面;
所述处理器1902用于当所述手势为降落手势时,控制所述无人机降落包括:
所述处理器1902用于当所述手势为降落手势且所述无人机下方不存在水面时,控制所述无人机降落。
可选的,所述处理器1902,还用于:检测所述无人机的飞行速度;
所述处理器1902用于当所述手势为降落手势时,控制所述无人机降落包括:
所述处理器1902用于当所述手势为降落手势且所述速度小于或等于预设速度阈值时,控制所述无人机降落。
可选的,所述处理器1902,具体用于:
识别所述图像中的特征部位;
从所述图像中识别的特征部位确定所述目标对象的特征部位。
可选的,所述处理器1902,具体用于:
将距离所述图像的中心最近的特征部位确定为所述目标对象的特征部位。
可选的,所述处理器1902,具体用于:
根据所述目标对象的特征部位确定所述目标对象的关节点;
根据所述目标对象的关节点从所述图像中识别的手中确定所述目标对象的手。
可选的,所述处理器1902,具体用于:
确定从所述图像中识别的手中距离所述目标对象的关节点中手的关节点最近的手;
将所述距离手的关节点最近的手确定为所述目标对象的手。
可选的,所述特征部位包括人体的头部、人体的头部和肩部、人体中的至少一种。
可选的,在所述目标对象的状态参数满足预设的第一状态参数条件时,所述特征部位为所述头部和肩部。
可选的,所述预设的第一状态参数条件包括:所述目标对象在图像中的尺寸占比大于或等于预设第一占比阈值,和/或,所述目标对象与所述无人机的距离小于或等于预设第一距离。
可选的,在所述目标对象的状态参数满足预设的第二状态参数条件时,所述特征部位为所述人体。
可选的,所述预设的第二状态参数条件包括:所述目标对象在图像中的尺寸占比小于或等于预设第二占比阈值,和/或,所述目标对象与所述无人机的距离大于或等于预设第二距离。
本实施例的设备,可以用于执行图15及其对应实施例的技术方案,其实现原理和技术效果类似,此处不再赘述。
在一些实施例中,所述处理器1902,用于调用所述程序代码执行:
获取拍摄装置拍摄的图像;
识别所述图像中目标对象的特征部位;
识别所述图像中所述目标对象的手;
当识别出所述目标对象的特征部位且识别不到所述目标对象的手时,控制无人机对所述目标对象进行跟踪以使所述目标对象在所述拍摄装置的拍摄画面中。
可选的,所述处理器1902,具体用于:
调整所述无人机的位置信息、姿态和承载所述拍摄装置的云台的姿态中的至少一种对所述目标对象进行跟踪以使所述目标对象在所述拍摄装置的拍摄画面中。
可选的,所述特征部位包括人体的头部、人体的头部和肩部、人体中的至少一种。
可选的,在所述目标对象的状态参数满足预设的第一状态参数条件时,所述特征部位 为所述头部和肩部。
可选的,所述预设的第一状态参数条件包括:所述目标对象在图像中的尺寸占比大于或等于预设第一占比阈值,和/或,所述目标对象与所述无人机的距离小于或等于预设第一距离。
可选的,在所述目标对象的状态参数满足预设的第二状态参数条件时,所述特征部位为所述人体。
可选的,所述预设的第二状态参数条件包括:所述目标对象在图像中的尺寸占比小于或等于预设第二占比阈值,和/或,所述目标对象与所述无人机的距离大于或等于预设第二距离。
本实施例的设备,可以用于执行图18及其对应实施例的技术方案,其实现原理和技术效果类似,此处不再赘述。
图20为本发明一实施例提供的无人机的一种结构示意图,如图20所示,本实施例的无人机可以包括:无人机的控制设备2001、拍摄装置2002和动力系统(图中未标示)。无人机的控制设备2201、拍摄装置2002和动力系统通过总线连接。其中,无人机的控制设备2201用于控制无人机的飞行,可以采用图19所示实施例的结构,其对应地,可以执行图2~图18中任一方法实施例及其对应实施例的技术方案,其实现原理和技术效果类似,此处不再赘述。拍摄装置2002,用于拍摄图像。动力系统,用于提供飞行动力,动力驱动无人机飞行,其中,动力系统包括电调、电机、螺旋桨等。在一些实施例中,该无人机还可以包括云台2003,该云台2003用于承载拍摄装置2002。在一些实施例中,无人机还可以包括:定位传感器、距离传感器、速度传感器等。
本领域普通技术人员可以理解:实现上述方法实施例的全部或部分步骤可以通过程序指令相关的硬件来完成,前述的程序可以存储于一计算机可读取存储介质中,该程序在执行时,执行包括上述方法实施例的步骤;而前述的存储介质包括:只读内存(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
最后应说明的是:以上各实施例仅用以说明本发明的技术方案,而非对其限制;尽管参照前述各实施例对本发明进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例技术方案的范围。

Claims (97)

  1. 一种无人机的控制方法,其特征在于,包括:
    获取拍摄装置拍摄的图像;
    确定所述图像中目标对象的手在所述图像中的位置;
    根据所述手在所述图像中的位置确定所述目标对象的手的位置信息;
    根据所述目标对象的手的位置信息控制无人机的飞行。
  2. 根据权利要求1所述的方法,其特征在于,
    所述根据所述手在所述图像中的位置确定所述目标对象的手的位置信息包括:
    根据所述手在所述图像中的位置、承载所述拍摄装置的云台的姿态、所述目标对象与所述无人机之间的水平距离和所述无人机的位置信息确定所述目标对象的手的位置信息。
  3. 根据权利要求2所述的方法,其特征在于,
    所述根据手在图像中的位置、承载拍摄装置的云台的姿态、目标对象与无人机之间的水平距离和无人机的位置信息确定目标对象的手的位置信息包括:
    根据手在图像中的位置、承载拍摄装置的云台的姿态确定手相对于无人机的朝向;
    根据所述朝向、目标对象与无人机之间的水平距离和无人机的位置信息确定目标对象的手的位置信息。
  4. 根据权利要求1-3任一项所述的方法,其特征在于,
    所述根据所述目标对象的手的位置信息控制无人机的飞行包括:
    根据所述目标对象的手的位置信息控制无人机的飞行高度。
  5. 根据权利要求4所述的方法,其特征在于,
    所述根据所述目标对象的手的位置信息控制无人机的飞行高度包括:
    根据所述目标对象的手的位置信息和所述无人机的位置信息确定所述手相对于所述无人机在俯仰方向上的角度;
    根据所述角度控制所述无人机的飞行高度。
  6. 根据权利要求5所述的方法,其特征在于,
    所述根据目标对象的手的位置信息和无人机的位置信息确定所述手相对于无人机在俯仰方向上的角度包括:
    当所述目标对象的状态参数满足第一预设要求时,根据所述目标对象的手的位置和所述无人机的位置信息确定所述手相对于所述无人机在俯仰方向上的角度。
  7. 根据权利要求6所述的方法,其特征在于,
    所述目标对象的状态参数满足第一预设要求包括:
    所述目标对象在所述图像中的尺寸占比大于或等于预设第一占比阈值;和/或,
    所述目标对象与所述无人机的距离小于或等于预设第一距离。
  8. 根据权利要求4所述的方法,其特征在于,
    所述根据所述目标对象的手的位置信息控制无人机的飞行高度包括:
    根据所述目标对象的预设部位的位置信息和所述手的位置信息确定所述手相对于所述预设部位在俯仰方向上的角度;
    根据所述角度控制所述无人机的飞行高度。
  9. 根据权利要求8所述的方法,其特征在于,
    所述预设部位包括头部、肩部、胸部中的至少一个部位。
  10. 根据权利要求8或9所述的方法,其特征在于,
    所述根据目标对象的预设部位的位置信息和所述手的位置信息确定所述手相对于所述预设部位在俯仰方向上的角度包括:
    当目标对象的状态参数满足第二预设要求时,根据目标对象的预设部位的位置信息和所述手的位置信息确定所述手相对于所述特征部位在俯仰方向上的角度。
  11. 根据权利要求10所述的方法,其特征在于,
    所述目标对象的状态参数满足第二预设要求包括:
    所述目标对象在所述图像中的尺寸占比小于或等于预设第二占比阈值;和/或,
    所述目标对象与所述无人机的距离大于或等于预设第二距离。
  12. 根据权利要求8-11任一项所述的方法,其特征在于,所述预设部位的位置信息是根据所述目标对象的位置信息确定的。
  13. 根据权利要求1-3任一项所述的方法,其特征在于,
    所述根据所述目标对象的手的位置信息控制无人机的飞行包括:
    根据所述目标对象的手的位置信息控制所述无人机对所述目标对象进行环绕飞行。
  14. 根据权利要求13所述的方法,其特征在于,
    所述根据所述目标对象的手的位置信息控制无人机对目标对象进行环绕飞行包括:
    根据所述目标对象的手的位置信息和所述目标对象的位置信息确定所述手相对于所述目标对象在偏航方向上的角度;
    根据所述角度控制所述无人机对所述目标对象进行环绕飞行。
  15. 根据权利要求1-3任一项所述的方法,其特征在于,
    所述根据所述目标对象的手的位置信息控制无人机的飞行包括:
    根据所述目标对象的手的位置信息控制所述无人机远离或靠近所述目标对象飞行。
  16. 根据权利要求15所述的方法,其特征在于,
    所述确定图像中目标对象的手在图像中的位置包括:
    确定所述目标对象的两个手在所述图像中的位置;
    根据手在图像中的位置确定目标对象的手的位置信息包括:
    根据所述两个手在所述图像中的位置确定所述目标对象的两个手的位置信息;
    所述根据所述目标对象的手的位置信息控制所述无人机远离或靠近所述目标对象飞行包括:
    根据所述两个手的位置信息控制所述无人机远离或靠近所述目标对象飞行。
  17. 根据权利要求16所述的方法,其特征在于,
    所述根据所述两个手的位置信息控制所述无人机远离或靠近目标对象飞行包括:
    根据所述两个手的位置信息确定所述两个手之间的距离;
    根据所述距离控制所述无人机远离或靠近所述目标对象飞行。
  18. 根据权利要求12或14所述的方法,其特征在于,所述方法还包括:
    确定所述图像中所述目标对象在所述图像中的位置;
    根据所述目标对象在所述图像中的位置确定所述目标对象的位置信息。
  19. 根据权利要求18所述的方法,其特征在于,
    所述根据所述目标对象在所述图像中的位置确定所述目标对象的位置信息包括:
    根据所述目标对象在所述图像中的位置、承载拍摄装置的云台的姿态、所述目标对象与所述无人机之间的水平距离和所述无人机的位置信息确定所述目标对象的位置信息。
  20. 根据权利要求1-19任一项所述的方法,其特征在于,所述方法还包括:
    识别图像中目标对象的手的手势;
    所述根据所述目标对象的手的位置信息控制无人机的飞行包括:
    当所述手的手势为预设手势时,根据所述目标对象的手的位置信息控制无人机的飞行。
  21. 一种无人机的控制方法,其特征在于,包括:
    获取拍摄装置拍摄的图像;
    确定所述图像中目标对象的特征部位;
    识别所述图像中的手;
    根据所述目标对象的特征部位从所述图像中识别的手中确定所述目标对象的手;
    当所述目标对象的手的手势为控制手势时,控制无人机执行所述手势指示的动作。
  22. 根据权利要求21所述的方法,其特征在于,当所述目标对象的手的手势为控制手势时,控制无人机执行所述手势指示的动作,包括:
    当所述手势为起飞手势时,控制所述无人机起飞。
  23. 根据权利要求22所述的方法,其特征在于,所述控制无人机起飞包括:
    控制所述无人机起飞并悬停在预设高度。
  24. 根据权利要求22或23所述的方法,其特征在于,所述方法还包括:
    在检测到用户的第一操作后,控制承载拍摄装置的云台以带动所述拍摄装置在预设的角度范围内扫描。
  25. 根据权利要求24所述的方法,其特征在于,所述第一操作包括:单击或双击电池开关、晃动所述无人机、向所述无人机发出语音指令中的至少一种。
  26. 根据权利要求22-25任一项所述的方法,其特征在于,包括:
    所述获取拍摄装置拍摄的图像包括:
    获取所述拍摄装置拍摄的多帧图像;
    所述方法还包括:
    确定所述多帧图像中每一帧中所述目标对象的手在所述图像中的位置;
    所述当所述目标对象的手势为起飞手势时,控制所述无人机起飞包括:
    当所述目标对象的手势为起飞手势且每一帧图像中所述目标对象的手在所述图像中的位置在参考位置的预设范围内时,控制所述无人机起飞。
  27. 根据权利要求26所述的方法,其特征在于,
    所述参考位置为上一帧图像中所述目标对象的手在所述图像中的位置。
  28. 根据权利要求21所述的方法,其特征在于,当所述目标对象的手的手势为控制手势时,控制无人机执行所述手势指示的动作,包括:
    当所述手势为降落手势时,控制所述无人机降落。
  29. 根据权利要求28所述的方法,其特征在于,所述方法还包括:
    获取距离传感器测量得到的高度值;
    所述当所述手势为降落手势时,控制所述无人机降落包括:
    当所述手势为降落手势且所述高度值小于或等于预设高度阈值时,控制所述无人机降落。
  30. 根据权利要求28或29所述的方法,其特征在于,所述方法还包括:
    检测所述无人机下方地面的平整度;
    所述当所述手势为降落手势时,控制所述无人机降落包括:
    当所述手势为降落手势且所述平整度大于或等于预设平整度阈值时,控制所述无人机降落。
  31. 根据权利要求28-30任一项所述的方法,其特征在于,所述方法还包括:
    检测所述无人机下方是否存在水面;
    所述当所述手势为降落手势时,控制所述无人机降落包括:
    当所述手势为降落手势且所述无人机下方不存在水面时,控制所述无人机降落。
  32. 根据权利要求28-31任一项所述的方法,其特征在于,所述方法还包括:
    检测所述无人机的飞行速度;
    所述当所述手势为降落手势时,控制所述无人机降落包括:
    当所述手势为降落手势且所述速度小于或等于预设速度阈值时,控制所述无人机降落。
  33. 根据权利要求21-32任一项所述的方法,其特征在于,
    所述确定所述图像中目标对象的特征部位包括:
    识别所述图像中的特征部位;
    从所述图像中识别的特征部位确定所述目标对象的特征部位。
  34. 根据权利要求33所述的方法,其特征在于,
    所述从所述图像中识别的特征部位确定所述目标对象的特征部位包括:
    将距离所述图像的中心最近的特征部位确定为所述目标对象的特征部位。
  35. 根据权利要求21-34任一项所述的方法,其特征在于,
    所述根据所述目标对象的特征部位从所述图像中识别的手中确定所述目标对象的手包括:
    根据所述目标对象的特征部位确定所述目标对象的关节点;
    根据所述目标对象的关节点从所述图像中识别的手中确定所述目标对象的手。
  36. 根据权利要求35所述的方法,其特征在于,
    所述根据所述目标对象的关节点从所述图像中识别的手中确定所述目标对象的手包括:
    确定从所述图像中识别的手中距离所述目标对象的关节点中手的关节点最近的手;
    将所述距离手的关节点最近的手确定为所述目标对象的手。
  37. 根据权利要求21-36任一项所述的方法,其特征在于,
    所述特征部位包括人体的头部、人体的头部和肩部、人体中的至少一种。
  38. 根据权利要求37所述的方法,其特征在于,
    在所述目标对象的状态参数满足预设的第一状态参数条件时,所述特征部位为所述头部和肩部。
  39. 根据权利要求38所述的方法,其特征在于,
    所述目标对象的状态参数满足预设的第一状态参数条件包括:所述目标对象在图像中的尺寸占比大于或等于预设第一占比阈值,和/或,所述目标对象与所述无人机的距离小于或等于预设第一距离。
  40. 根据权利要求37-39任一项所述的方法,其特征在于,
    在所述目标对象的状态参数满足预设的第二状态参数条件时,所述特征部位为所述人体。
  41. 根据权利要求40所述的方法,其特征在于,
    所述目标对象的状态参数满足预设的第二状态参数条件包括:所述目标对象在图像中的尺寸占比小于或等于预设第二占比阈值,和/或,所述目标对象与所述无人机的距离大于或等于预设第二距离。
  42. 一种无人机的控制方法,其特征在于,包括:
    获取拍摄装置拍摄的图像;
    识别所述图像中目标对象的特征部位;
    识别所述图像中所述目标对象的手;
    当识别出所述目标对象的特征部位且识别不到所述目标对象的手时,控制无人机对所述目标对象进行跟踪以使所述目标对象在所述拍摄装置的拍摄画面中。
  43. 根据权利要求42所述的方法,其特征在于,
    所述控制无人机对所述目标对象进行跟踪以使所述目标对象在所述拍摄装置的拍摄画面中包括:
    调整所述无人机的位置信息、姿态和承载所述拍摄装置的云台的姿态中的至少一种对所述目标对象进行跟踪以使所述目标对象在所述拍摄装置的拍摄画面中。
  44. 根据权利要求42或43所述的方法,其特征在于,
    所述特征部位包括人体的头部、人体的头部和肩部、人体中的至少一种。
  45. 根据权利要求44所述的方法,其特征在于,
    在所述目标对象的状态参数满足预设的第一状态参数条件时,所述特征部位为所述头部和肩部。
  46. 根据权利要求45所述的方法,其特征在于,
    所述目标对象的状态参数满足预设的第一状态参数条件包括:所述目标对象在图像中的尺寸占比大于或等于预设第一占比阈值,和/或,所述目标对象与所述无人机的距离小于或等于预设第一距离。
  47. 根据权利要求44-46任一项所述的方法,其特征在于,
    在所述目标对象的状态参数满足预设的第二状态参数条件时,所述特征部位为所述人体。
  48. 根据权利要求47所述的方法,其特征在于,
    所述目标对象的状态参数满足预设的第二状态参数条件包括:所述目标对象在图像中的尺寸占比小于或等于预设第二占比阈值,和/或,所述目标对象与所述无人机的距离大于或等于预设第二距离。
  49. 一种无人机的控制设备,其特征在于,包括:存储器和处理器;
    所述存储器,用于存储程序代码;
    所述处理器,用于调用所述程序代码执行:
    获取拍摄装置拍摄的图像;
    确定所述图像中目标对象的手在所述图像中的位置;
    根据所述手在所述图像中的位置确定所述目标对象的手的位置信息;
    根据所述目标对象的手的位置信息控制无人机的飞行。
  50. 根据权利要求49所述的设备,其特征在于,所述处理器,具体用于:
    根据所述手在所述图像中的位置、承载所述拍摄装置的云台的姿态、所述目标对象与所述无人机之间的水平距离和所述无人机的位置信息确定所述目标对象的手的位置信息。
  51. 根据权利要求50所述的设备,其特征在于,所述处理器,具体用于:
    根据手在图像中的位置、承载拍摄装置的云台的姿态确定手相对于无人机的朝向;
    根据所述朝向、目标对象与无人机之间的水平距离和无人机的位置信息确定目标对象的手的位置信息。
  52. 根据权利要求49-51任一项所述的设备,其特征在于,所述处理器,具体用于:根据所述目标对象的手的位置信息控制无人机的飞行高度。
  53. 根据权利要求52所述的设备,其特征在于,所述处理器,具体用于:
    根据所述目标对象的手的位置信息和所述无人机的位置信息确定所述手相对于所述无人机在俯仰方向上的角度;
    根据所述角度控制所述无人机的飞行高度。
  54. 根据权利要求53所述的设备,其特征在于,所述处理器,具体用于:
    当所述目标对象的状态参数满足第一预设要求时,根据所述目标对象的手的位置和所述无人机的位置信息确定所述手相对于所述无人机在俯仰方向上的角度。
  55. 根据权利要求54所述的设备,其特征在于,
    所述目标对象的状态参数满足第一预设要求包括:
    所述目标对象在所述图像中的尺寸占比大于或等于预设第一占比阈值;和/或,
    所述目标对象与所述无人机的距离小于或等于预设第一距离。
  56. 根据权利要求52所述的设备,其特征在于,所述处理器,具体用于:
    根据所述目标对象的预设部位的位置信息和所述手的位置信息确定所述手相对于所述预设部位在俯仰方向上的角度;
    根据所述角度控制所述无人机的飞行高度。
  57. 根据权利要求56所述的设备,其特征在于,
    所述预设部位包括头部、肩部、胸部中的至少一个部位。
  58. 根据权利要求56或57所述的设备,其特征在于,
    所述处理器,具体用于:
    当目标对象的状态参数满足第二预设要求时,根据目标对象的预设部位的位置信息和所述手的位置信息确定所述手相对于所述特征部位在俯仰方向上的角度。
  59. 根据权利要求58所述的设备,其特征在于,
    所述目标对象的状态参数满足第二预设要求包括:
    所述目标对象在所述图像中的尺寸占比小于或等于预设第二占比阈值;和/或,
    所述目标对象与所述无人机的距离大于或等于预设第二距离。
  60. 根据权利要求56-59任一项所述的设备,其特征在于,所述预设部位的位置信息是根据所述目标对象的位置信息确定的。
  61. 根据权利要求49-52任一项所述的设备,其特征在于,所述处理器,具体用于:
    根据所述目标对象的手的位置信息控制所述无人机对所述目标对象进行环绕飞行。
  62. 根据权利要求61所述的设备,其特征在于,所述处理器,具体用于:
    根据所述目标对象的手的位置信息和所述目标对象的位置信息确定所述手相对于所述目标对象在偏航方向上的角度;
    根据所述角度控制所述无人机对所述目标对象进行环绕飞行。
  63. 根据权利要求49-52任一项所述的设备,其特征在于,所述处理器,具体用于:
    根据所述目标对象的手的位置信息控制所述无人机远离或靠近所述目标对象飞行。
  64. 根据权利要求63所述的设备,其特征在于,所述处理器具体用于:
    确定所述目标对象的两个手在所述图像中的位置;
    根据所述两个手在所述图像中的位置确定所述目标对象的两个手的位置信息;
    根据所述两个手的位置信息控制所述无人机远离或靠近所述目标对象飞行。
  65. 根据权利要求64所述的设备,其特征在于,所述处理器具体用于:
    根据所述两个手的位置信息确定所述两个手之间的距离;
    根据所述距离控制所述无人机远离或靠近所述目标对象飞行。
  66. 根据权利要求60或62所述的设备,其特征在于,所述处理器还用于:
    确定所述图像中所述目标对象在所述图像中的位置;
    根据所述目标对象在所述图像中的位置确定所述目标对象的位置信息。
  67. 根据权利要求66所述的设备,其特征在于,所述处理器具体用于:
    根据所述目标对象在所述图像中的位置、承载拍摄装置的云台的姿态、所述目标对象与所述无人机之间的水平距离和所述无人机的位置信息确定所述目标对象的位置信息。
  68. 根据权利要求49-67任一项所述的设备,其特征在于,所述处理器还用于:识别图像中目标对象的手的手势;
    所述处理器在根据所述目标对象的手的位置信息控制无人机的飞行时,具体用于:当所述手的手势为预设手势时,根据所述目标对象的手的位置信息控制无人机的飞行。
  69. 一种无人机的控制设备,其特征在于,包括:存储器和处理器;
    所述存储器,用于存储程序代码;
    所述处理器,用于调用所述程序代码执行:
    获取拍摄装置拍摄的图像;
    确定所述图像中目标对象的特征部位;
    识别所述图像中的手;
    根据所述目标对象的特征部位从所述图像中识别的手中确定所述目标对象的手;
    当所述目标对象的手的手势为控制手势时,控制无人机执行所述手势指示的动作。
  70. 根据权利要求69所述的设备,其特征在于,所述处理器,具体用于:
    当所述手势为起飞手势时,控制所述无人机起飞。
  71. 根据权利要求70所述的设备,其特征在于,所述处理器,具体用于:
    控制所述无人机起飞并悬停在预设高度。
  72. 根据权利要求70或71所述的设备,其特征在于,所述处理器,还用于:在检测到用户的第一操作后,控制承载拍摄装置的云台以带动所述拍摄装置在预设的角度范围内扫描。
  73. 根据权利要求72所述的设备,其特征在于,所述第一操作包括:单击或双击电池开关、晃动所述无人机、向所述无人机发出语音指令中的至少一种。
  74. 根据权利要求70-73任一项所述的设备,其特征在于,所述处理器,具体用于:
    获取所述拍摄装置拍摄的多帧图像;
    所述处理器,还用于:
    确定所述多帧图像中每一帧中所述目标对象的手在所述图像中的位置;
    所述处理器,具体用于:
    当所述目标对象的手势为起飞手势且每一帧图像中所述目标对象的手在所述图像中的位置在参考位置的预设范围内时,控制所述无人机起飞。
  75. 根据权利要求74所述的设备,其特征在于,
    所述参考位置为上一帧图像中所述目标对象的手在所述图像中的位置。
  76. 根据权利要求69所述的设备,其特征在于,所述处理器,具体用于:
    当所述手势为降落手势时,控制所述无人机降落。
  77. 根据权利要求76所述的设备,其特征在于,所述处理器还用于:
    获取距离传感器测量得到的高度值;
    所述处理器用于当所述手势为降落手势时,控制所述无人机降落时,具体用于:当所述手势为降落手势且所述高度值小于或等于预设高度阈值时,控制所述无人机降落。
  78. 根据权利要求76或77所述的设备,其特征在于,所述处理器还用于:检测所述无人机下方地面的平整度;
    所述处理器用于当所述手势为降落手势时,控制所述无人机降落时,具体用于:
    当所述手势为降落手势且所述平整度大于或等于预设平整度阈值时,控制所述无人机降落。
  79. 根据权利要求76-78任一项所述的设备,其特征在于,所述处理器还用于:检测所述无人机下方是否存在水面;
    所述处理器用于当所述手势为降落手势时,控制所述无人机降落时,具体用于:
    当所述手势为降落手势且所述无人机下方不存在水面时,控制所述无人机降落。
  80. 根据权利要求76-79任一项所述的设备,其特征在于,所述处理器,还用于:检测所述无人机的飞行速度;
    所述处理器用于当所述手势为降落手势时,控制所述无人机降落时,具体用于:
    当所述手势为降落手势且所述速度小于或等于预设速度阈值时,控制所述无人机降落。
  81. 根据权利要求76-80任一项所述的设备,其特征在于,所述处理器,具体用于:
    识别所述图像中的特征部位;
    从所述图像中识别的特征部位确定所述目标对象的特征部位。
  82. 根据权利要求81所述的设备,其特征在于,所述处理器,具体用于:
    将距离所述图像的中心最近的特征部位确定为所述目标对象的特征部位。
  83. 根据权利要求69-82任一项所述的设备,其特征在于,所述处理器,具体用于:
    根据所述目标对象的特征部位确定所述目标对象的关节点;
    根据所述目标对象的关节点从所述图像中识别的手中确定所述目标对象的手。
  84. 根据权利要求83所述的设备,其特征在于,所述处理器,具体用于:
    确定从所述图像中识别的手中距离所述目标对象的关节点中手的关节点最近的手;
    将所述距离手的关节点最近的手确定为所述目标对象的手。
  85. 根据权利要求69-84任一项所述的设备,其特征在于,
    所述特征部位包括人体的头部、人体的头部和肩部、人体中的至少一种。
  86. 根据权利要求85所述的设备,其特征在于,
    在所述目标对象的状态参数满足预设的第一状态参数条件时,所述特征部位为所述头部和肩部。
  87. 根据权利要求86所述的设备,其特征在于,
    所述目标对象的状态参数满足预设的第一状态参数条件包括:所述目标对象在图像中的尺寸占比大于或等于预设第一占比阈值,和/或,所述目标对象与所述无人机的距离小于或等于预设第一距离。
  88. 根据权利要求85-87任一项所述的设备,其特征在于,
    在所述目标对象的状态参数满足预设的第二状态参数条件时,所述特征部位为所述人体。
  89. 根据权利要求88所述的设备,其特征在于,
    所述目标对象的状态参数满足预设的第二状态参数条件包括:所述目标对象在图像中的尺寸占比小于或等于预设第二占比阈值,和/或,所述目标对象与所述无人机的距离大于或等于预设第二距离。
  90. A control device for a UAV, comprising: a memory and a processor;
    the memory is configured to store program code;
    the processor is configured to call the program code to:
    obtain an image captured by a photographing device;
    recognize a feature part of a target object in the image;
    recognize a hand of the target object in the image;
    when the feature part of the target object is recognized and the hand of the target object cannot be recognized, control the UAV to track the target object so that the target object remains within a shooting frame of the photographing device.
  91. The device according to claim 90, wherein the processor is specifically configured to:
    adjust at least one of position information of the UAV, an attitude of the UAV, and an attitude of a gimbal carrying the photographing device to track the target object so that the target object remains within the shooting frame of the photographing device.
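Editor's illustration: claims 90-91 keep a target whose hand is lost inside the shooting frame by adjusting the UAV or the gimbal. One common realization is proportional control on the target's pixel offset from the image centre; the gains and sign conventions below are assumed, not specified by the patent.

```python
def track_offsets(target_px, image_size, yaw_gain=0.002, pitch_gain=0.002):
    """Proportional corrections that re-centre the target in the frame.

    Returns (yaw_rate, gimbal_pitch_rate): positive yaw turns the UAV
    right, positive pitch tilts the gimbal down, for a target detected
    at pixel `target_px` in an image of size `image_size`.
    """
    err_x = target_px[0] - image_size[0] / 2.0
    err_y = target_px[1] - image_size[1] / 2.0
    return yaw_gain * err_x, pitch_gain * err_y

# Target right of and above centre -> yaw right, tilt gimbal up
print(track_offsets((400, 200), (640, 480)))  # (0.16, -0.08)
```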
  92. The device according to claim 90 or 91, wherein
    the feature part includes at least one of: a head of a human body, a head and shoulders of a human body, and a human body.
  93. The device according to claim 92, wherein
    when a state parameter of the target object satisfies a preset first state parameter condition, the feature part is the head and shoulders.
  94. The device according to claim 93, wherein
    the state parameter of the target object satisfying the preset first state parameter condition includes: a size proportion of the target object in the image is greater than or equal to a preset first proportion threshold, and/or a distance between the target object and the UAV is less than or equal to a preset first distance.
  95. The device according to any one of claims 92-94, wherein
    when the state parameter of the target object satisfies a preset second state parameter condition, the feature part is the human body.
  96. The device according to claim 95, wherein
    the state parameter of the target object satisfying the preset second state parameter condition includes: a size proportion of the target object in the image is less than or equal to a preset second proportion threshold, and/or a distance between the target object and the UAV is greater than or equal to a preset second distance.
  97. A UAV, comprising:
    the control device for a UAV according to any one of claims 49-96;
    a photographing device configured to capture images;
    and a power system configured to provide flight power.
PCT/CN2018/073803 2018-01-23 2018-01-23 UAV control method, device and UAV WO2019144271A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
PCT/CN2018/073803 WO2019144271A1 (zh) 2018-01-23 2018-01-23 UAV control method, device and UAV
CN201880001655.7A CN109074168B (zh) 2018-01-23 2018-01-23 UAV control method, device and UAV
CN202210589359.1A CN114879715A (zh) 2018-01-23 2018-01-23 UAV control method, device and UAV
US16/934,910 US12125229B2 (en) 2018-01-23 2020-07-21 UAV control method, device and UAV
US18/920,216 US20250045949A1 (en) 2018-01-23 2024-10-18 Uav control method, device and uav

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/073803 WO2019144271A1 (zh) 2018-01-23 2018-01-23 UAV control method, device and UAV

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/934,910 Continuation US12125229B2 (en) 2018-01-23 2020-07-21 UAV control method, device and UAV

Publications (1)

Publication Number Publication Date
WO2019144271A1 (zh)

Family

ID=64789396

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/073803 WO2019144271A1 (zh) 2018-01-23 2018-01-23 无人机的控制方法、设备和无人机

Country Status (3)

Country Link
US (2) US12125229B2 (zh)
CN (2) CN109074168B (zh)
WO (1) WO2019144271A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021073733A1 (en) * 2019-10-16 2021-04-22 Supsi Method for controlling a device by a human

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102032067B1 * (ko) 2018-12-05 2019-10-14 세종대학교산학협력단 Reinforcement-learning-based unmanned aerial vehicle remote control method and apparatus
CN110097592B * (zh) 2019-04-07 2020-12-15 杭州晶一智能科技有限公司 Semantic description method for ground information
JP7435599B2 * (ja) 2019-04-08 2024-02-21 ソニーグループ株式会社 Information processing device, information processing method, and program
CN110385324B * (zh) 2019-06-03 2021-07-27 浙江大华技术股份有限公司 Cleaning method and system for a camera device, readable storage medium, and device
CN110426970B * (zh) 2019-06-25 2021-05-25 西安爱生无人机技术有限公司 Unmanned aerial vehicle photographing system and control method thereof
CN110262539A * (zh) 2019-07-05 2019-09-20 深圳市道通智能航空技术有限公司 Unmanned aerial vehicle takeoff and landing control method, flight controller, and unmanned aerial vehicle
CN112154390A * (zh) 2019-07-30 2020-12-29 深圳市大疆创新科技有限公司 Aircraft landing method, unmanned aerial vehicle, and computer-readable storage medium
WO2021026782A1 * (zh) 2019-08-13 2021-02-18 深圳市大疆创新科技有限公司 Control method and control apparatus for a handheld gimbal, handheld gimbal, and storage medium
CN114585985A * (zh) 2020-11-05 2022-06-03 深圳市大疆创新科技有限公司 Unmanned aerial vehicle control method and apparatus, unmanned aerial vehicle, and computer-readable storage medium
JP7174748B2 * (ja) 2020-12-18 2022-11-17 楽天グループ株式会社 Travel control system, control method, and control device
US11615639B1 (en) * 2021-01-27 2023-03-28 Jackson Klein Palm vein identification apparatus and method of use
CN113091752A * (zh) 2021-04-16 2021-07-09 中山大学 Real-time target pose measurement method and system based on multiple unmanned aerial vehicles
CN113870345B * (zh) 2021-09-24 2022-10-18 埃洛克航空科技(北京)有限公司 Flight positioning method and apparatus based on a three-dimensional scene, storage medium, and electronic device
US20230306715A1 (en) * 2022-03-24 2023-09-28 AO Kaspersky Lab System and method for detecting and recognizing small objects in images using a machine learning algorithm
US20230350427A1 (en) * 2022-04-27 2023-11-02 Snap Inc. Landing an autonomous drone with gestures
CN116189308B * (zh) 2023-03-09 2023-08-01 杰能科世智能安全科技(杭州)有限公司 Unmanned aerial vehicle pilot detection method, system, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090222149A1 (en) * 2008-02-28 2009-09-03 The Boeing Company System and method for controlling swarm of remote unmanned vehicles through human gestures
CN106200657A * 2016-07-09 2016-12-07 东莞市华睿电子科技有限公司 Unmanned aerial vehicle control method
CN106249888A * 2016-07-28 2016-12-21 纳恩博(北京)科技有限公司 Gimbal control method and apparatus
CN106339079A * 2016-08-08 2017-01-18 清华大学深圳研究生院 Computer-vision-based method and apparatus for realizing virtual reality with an unmanned aerial vehicle
CN106377228A * 2016-09-21 2017-02-08 中国人民解放军国防科学技术大学 Kinect-based unmanned aerial vehicle operator state monitoring and hierarchical control method
CN106843489A * 2017-01-24 2017-06-13 腾讯科技(深圳)有限公司 Flight route control method for an aircraft, and aircraft

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105518576B (zh) * 2013-06-28 2019-04-16 陈家铭 Control device operation according to gestures
US9836053B2 (en) * 2015-01-04 2017-12-05 Zero Zero Robotics Inc. System and method for automated aerial system operation
WO2017060782A1 (en) * 2015-10-07 2017-04-13 Lee Hoi Hung Herbert Flying apparatus with multiple sensors and gesture-based operation
US9758246B1 (en) * 2016-01-06 2017-09-12 Gopro, Inc. Systems and methods for adjusting flight control of an unmanned aerial vehicle
US10191496B2 (en) * 2016-04-21 2019-01-29 Foundation Of Soongsil University-Industry Cooperation Unmanned aerial vehicle and a landing guidance method using the same
CN106094861B (zh) * 2016-06-02 2024-01-12 零度智控(北京)智能科技有限公司 Unmanned aerial vehicle, and unmanned aerial vehicle control method and apparatus
KR20180051996A (ko) * 2016-11-09 2018-05-17 삼성전자주식회사 Unmanned aerial device and method of photographing a subject using the same
US10409276B2 (en) * 2016-12-21 2019-09-10 Hangzhou Zero Zero Technology Co., Ltd. System and method for controller-free user drone interaction
CN107203215A (zh) * 2017-05-04 2017-09-26 西北工业大学 Method for controlling a quadrotor aircraft by gesture and voice
CN107463181A (zh) * 2017-08-30 2017-12-12 南京邮电大学 AprilTag-based adaptive tracking system for a quadrotor aircraft



Also Published As

Publication number Publication date
US20200346753A1 (en) 2020-11-05
US20250045949A1 (en) 2025-02-06
CN114879715A (zh) 2022-08-09
US12125229B2 (en) 2024-10-22
CN109074168B (zh) 2022-06-17
CN109074168A (zh) 2018-12-21

Similar Documents

Publication Publication Date Title
WO2019144271A1 (zh) UAV control method, device and UAV
US11724805B2 (en) Control method, control device, and carrier system
US11914370B2 (en) System and method for providing easy-to-use release and auto-positioning for drone applications
US11604479B2 (en) Methods and system for vision-based landing
CN111596649B (zh) Single-hand remote control device for an aerial system
JP2022554248A (ja) Structure scanning using an unmanned aerial vehicle
WO2018098784A1 (zh) Unmanned aerial vehicle control method, apparatus and device, and unmanned aerial vehicle control system
WO2020172800A1 (zh) Inspection control method for a movable platform, and movable platform
WO2019155335A1 (en) Unmanned aerial vehicle including an omnidirectional depth sensing and obstacle avoidance aerial system and method of operating same
WO2021168819A1 (zh) Return flight control method and device for unmanned aerial vehicle
WO2019183789A1 (zh) Unmanned aerial vehicle control method and apparatus, and unmanned aerial vehicle
WO2020019106A1 (zh) Gimbal and unmanned aerial vehicle control method, gimbal, and unmanned aerial vehicle
CN113795803B (zh) Flight assistance method, device, chip, system and medium for an unmanned aerial vehicle
WO2021217371A1 (zh) Control method and apparatus for a movable platform
WO2019227289A1 (zh) Time-lapse photography control method and device
CN110568860A (zh) Return flight method and apparatus for an unmanned aerial vehicle, and unmanned aerial vehicle
WO2020042159A1 (zh) Gimbal rotation control method and apparatus, control device, and mobile platform
WO2019189381A1 (ja) Mobile body, control device, and control program
WO2020048365A1 (zh) Flight control method and apparatus for an aircraft, terminal device, and flight control system
CN108450032B (zh) Flight control method and apparatus
CN109754420B (zh) Target distance estimation method and apparatus, and unmanned aerial vehicle
WO2020154942A1 (zh) Unmanned aerial vehicle control method and unmanned aerial vehicle
WO2022205294A1 (zh) Unmanned aerial vehicle control method and apparatus, unmanned aerial vehicle, and storage medium
WO2021223176A1 (zh) Unmanned aerial vehicle control method and device
WO2020237429A1 (zh) Control method for a remote control device, and remote control device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18902911
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 18902911
    Country of ref document: EP
    Kind code of ref document: A1
Kind code of ref document: A1