CN114079727A - Unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition - Google Patents
Unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition
- Publication number
- CN114079727A (application CN202010816328.6A)
- Authority
- CN
- China
- Prior art keywords
- unmanned aerial
- aerial vehicle
- face
- target object
- face recognition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 230000033001 locomotion Effects 0.000 claims abstract description 17
- 230000006399 behavior Effects 0.000 claims abstract 2
- 238000000034 method Methods 0.000 claims description 8
- 238000004458 analytical method Methods 0.000 claims description 5
- 238000013499 data model Methods 0.000 claims description 4
- 230000003068 static effect Effects 0.000 claims description 3
- 230000003993 interaction Effects 0.000 claims 3
- 230000002159 abnormal effect Effects 0.000 claims 1
- 230000004888 barrier function Effects 0.000 claims 1
- 238000004364 calculation method Methods 0.000 claims 1
- 230000026676 system process Effects 0.000 claims 1
- 238000006073 displacement reaction Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 208000033999 Device damage Diseases 0.000 description 1
- 230000001133 acceleration Effects 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000015572 biosynthetic process Effects 0.000 description 1
- 210000000988 bone and bone Anatomy 0.000 description 1
- 230000001010 compromised effect Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 230000008451 emotion Effects 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- 230000001815 facial effect Effects 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000000926 separation method Methods 0.000 description 1
- 238000003786 synthesis reaction Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/08—Systems for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/66—Radar-tracking systems; Analogous systems
- G01S13/72—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/933—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Electromagnetism (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses an unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition. The system determines the target object to be tracked and shot by acquiring and processing face information; by acquiring body information of the target object, a node-and-line information model is established for the target object, so that motion behaviors can be predicted and the unmanned aerial vehicle can prepare for rapid movement; a central point (reference point) is determined through face modeling or line intersection, the distance is measured by a laser radar, and the flight track of the unmanned aerial vehicle is controlled according to the change of the central point (reference point) along the time axis and the distance measured by the radar; the environment and obstacles are identified by the laser radar, and a sphere-reference feasible safety path with a suitable radius is established with the unmanned aerial vehicle as the center. The invention can reduce the workload of outdoor photographers and can be used for reality-show photography, outdoor photography, photography in crowds and the like.
Description
Technical Field
The invention relates to the technical field of automatic control of unmanned aerial vehicles for shooting based on face recognition, in particular to an unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition.
Background
In recent years, the production processes of major television stations have required two or more photographers for every outdoor shot, increasing the investment of manpower and material resources; when many people surround one person to record a program, the 'reality' of a reality show cannot be conveyed well. Likewise, during the shooting of some film and television works, actors cannot fully release the emotion required by a scene because of the interference of too many external personnel, which reduces the resonance of the works.
With the adoption of the 5G standard, video transmission can reach higher definition and the latency of remote real-time control can be greatly reduced, just as automatic driving, remote control and other technologies are becoming popular. Existing unmanned aerial vehicles mainly perform wide-area aerial photography at relatively high altitude, and one person must control the aircraft while another watches the picture; a photographer must also hold the camera by hand, and during strenuous running the stability of the picture is greatly compromised.
Disclosure of Invention
The invention aims to provide unmanned aerial vehicle automatic tracking shooting based on 5G and face recognition, so as to solve the practical problems of unmanned aerial vehicle shooting and manual shooting described in the background art, reduce the interference that shooting conditions impose on reality shows and other film and television works during production, and at the same time liberate manpower and reduce material resources.
The invention establishes two task areas, face information acquisition and surrounding environment information acquisition, and realizes the following four functional modules: a face recognition module, a follow-shooting flight path planning module, a body point-and-line motion information analysis module, and a safe flight path planning module, which jointly control the unmanned aerial vehicle to perform automatic tracking shooting.
Face information acquisition task area: acquiring face information features for face recognition and establishing a data model;
surrounding environment information acquisition task area: acquiring surrounding environment data used for judging safe flight conditions, establishing a three-dimensional model of the surrounding environment in real time, and defining three flight safety levels: safe, general and dangerous;
face recognition module: after the face information is collected, accurately and continuously tracking and shooting the target object;
follow-shooting flight path planning module: used for automatically planning the flight path of the unmanned aerial vehicle, calculating and planning the flight path according to the movement of the central point (reference point) of the target object's face;
body point-and-line motion information analysis module: assisting the unmanned aerial vehicle in continuously tracking and shooting the target;
safe flight path planning module: a spherical safety channel with a suitable radius is established with the unmanned aerial vehicle as the center, so that the unmanned aerial vehicle can track and shoot the target object in a complex environment.
The invention provides 5G and face recognition based unmanned aerial vehicle automatic tracking shooting. Through face information acquisition and environment information acquisition, it establishes a target object for automatic tracking shooting and plans a safe flight path, which reduces the manpower and material resources needed for shooting work and reduces the influence of camera shake caused by violent movement.
Drawings
The drawings are briefly described below in order to more clearly illustrate the technical solution and implementation of the present invention; obviously, the drawings in the following description show only some implementations of the present invention, and other drawings can be derived from them by those skilled in the art without inventive effort.
FIG. 1 is a schematic view illustrating a center point (reference point) according to the present invention;
FIG. 2 is a schematic diagram illustrating a sphere datum safe passage according to the present invention;
fig. 3 is a flow chart of the unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments are described in detail below with reference to the accompanying figures; apparently, the described embodiments are only a part of the embodiments of the present invention, not all of them.
The invention provides 5G and face recognition based unmanned aerial vehicle automatic tracking shooting. By acquiring face information and surrounding environment information and performing joint calculation in the face recognition module, the follow-shooting flight path planning module, the body point-and-line motion information analysis module and the safe flight path planning module, continuous tracking shooting and automatic planning of the flight path of the unmanned aerial vehicle are realized. A simplified flow chart of the system is shown in fig. 3.
Face information acquisition task area: the facial feature information of the target object is acquired through the configured lens or other face recognition sensors, and a face data model is established.
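As an illustrative sketch (not part of the original disclosure), the face data model can be approximated by a set of face encodings computed from enrollment photos of the target object; the use of the open-source face_recognition library below is an assumption, since the patent does not name a specific algorithm.

```python
# Sketch only: approximate the "face data model" as 128-d face encodings
# computed from enrollment photos with the face_recognition library.
import face_recognition
import numpy as np

def build_face_data_model(image_paths):
    """Return an array of face encodings for the target object."""
    encodings = []
    for path in image_paths:
        image = face_recognition.load_image_file(path)
        found = face_recognition.face_encodings(image)
        if found:                              # keep the first face found in each photo
            encodings.append(found[0])
    return np.array(encodings)

# model = build_face_data_model(["target_front.jpg", "target_side.jpg"])
```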
Surrounding environment information acquisition task area: surrounding environment information is collected by a laser radar in cooperation with a depth sensor and the like, surrounding environment data used for judging safe flight conditions are obtained, and a three-dimensional model of the surrounding environment is established in real time. Different safe flight levels are set for different degrees of environmental complexity: first, when the unmanned aerial vehicle is in an open outdoor area and there is no obstacle within the low-altitude range required for visual line-of-sight flight, or the obstacle influence factor indicates that the influence of obstacles on the flight of the unmanned aerial vehicle is low (below 0.3), the safe flight level is defined as safe; second, in relatively spacious indoor public places such as hotels, shopping malls, schools and convention and exhibition centers, where the obstacle influence factor is calculated to be between 0.3 and 0.7, the safe flight level is defined as general; third, in rooms with narrow space, or in structurally complex environments that are unfavorable for flight and contain many aerial interferences, where the obstacle influence factor is calculated to be between 0.7 and 1, the safe flight level is defined as dangerous; when the safety level is dangerous, warning information is sent to the administrator.
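The three safety levels can be summarized by a small classification routine. The sketch below assumes that the obstacle influence factor has already been computed in the range [0, 1] from the real-time three-dimensional environment model; the patent does not give its formula, and the notification interface is hypothetical.

```python
# Minimal sketch of the three safety levels described above.
def safety_level(obstacle_influence_factor: float) -> str:
    if obstacle_influence_factor < 0.3:
        return "safe"         # open outdoor area, little or no obstruction
    if obstacle_influence_factor < 0.7:
        return "general"      # spacious indoor public places
    return "dangerous"        # narrow or structurally complex spaces

def assess_environment(factor: float, notify_admin) -> str:
    level = safety_level(factor)
    if level == "dangerous":
        notify_admin("Safety level DANGEROUS: manual supervision required.")
    return level
```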
Face recognition module: the target object to be tracked and shot is determined by a face recognition algorithm according to the face data model established in the face information acquisition task area; when the target is lost, a crowd range in which the target object may exist is set, and the target object is searched for again within that range through face recognition.
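A minimal matching routine along these lines might look as follows; the face_recognition calls and the distance tolerance are assumptions used only for illustration, and widening the search to the crowd range is left to the caller.

```python
# Sketch: match detected faces in the current frame against the face data model.
import face_recognition
import numpy as np

def find_target(frame, model_encodings, tolerance=0.6):
    """Return the (top, right, bottom, left) box of the target face, or None if lost."""
    locations = face_recognition.face_locations(frame)
    encodings = face_recognition.face_encodings(frame, locations)
    best_box, best_dist = None, tolerance
    for box, enc in zip(locations, encodings):
        dist = float(np.min(face_recognition.face_distance(model_encodings, enc)))
        if dist < best_dist:
            best_box, best_dist = box, dist
    return best_box   # None => target lost; the caller re-searches a wider crowd range
```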
Follow-shooting flight path planning module: the first part determines a central point (reference point, see fig. 1) of the target object's face, or determines a central point (reference point) in a HOG image of the face, and establishes a two-dimensional plane coordinate system parallel to the face at that point; the rotation direction and distance of the target object are determined from the displacement of this point between successive moments, and the moving path of the unmanned aerial vehicle in that plane is determined accordingly, so that the unmanned aerial vehicle follows the target object and the shooting lens stays in the same parallel plane as the person's face. The second part obtains the distance between the unmanned aerial vehicle and the target object through the laser radar and a distance sensor, so that the unmanned aerial vehicle keeps the point-to-point distance relatively static while it and the target object move in the same horizontal plane. Finally, a real-time refreshed three-dimensional space coordinate system is established by combining the two-dimensional plane coordinate system of the first part with the relatively static point-to-point connecting line of the second part; the system controls the flight path of the unmanned aerial vehicle in the form of coordinate points through a single-chip microcomputer or another embedded system, so that the unmanned aerial vehicle automatically plans its flight path in this coordinate system and performs surrounding tracking shooting of the target object.
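The two-part computation can be sketched as a simple proportional mapping from the reference-point displacement and the measured range to a velocity command; the gains, sign conventions and desired range below are illustrative assumptions rather than the patent's method.

```python
# Sketch: keep the lens in the plane parallel to the face (from the centre-point
# shift in the image) and hold the point-to-point distance (from the lidar range).
from dataclasses import dataclass

@dataclass
class VelocityCommand:
    vx: float  # towards/away from the face, m/s
    vy: float  # lateral, m/s
    vz: float  # vertical, m/s

def follow_command(center_px, prev_center_px, lidar_range_m,
                   desired_range_m=3.0, k_plane=0.002, k_range=0.8):
    dx_px = center_px[0] - prev_center_px[0]    # horizontal shift of the reference point
    dy_px = center_px[1] - prev_center_px[1]    # vertical shift of the reference point
    return VelocityCommand(
        vx=k_range * (lidar_range_m - desired_range_m),  # hold the measured distance
        vy=k_plane * dx_px,                              # follow in the parallel plane
        vz=-k_plane * dy_px,                             # image y axis points downwards
    )
```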
Body point-and-line motion information analysis module: body point-and-line information of the target object is obtained by placing nodes at the joints and connecting them along the direction of the human skeleton to form body point-and-line diagrams. A large number of point-and-line diagrams in different motion states are collected to build a big-data comparison library. When analysis of the target object's body point-and-lines indicates a trend toward vigorous motion, the unmanned aerial vehicle prepares to accelerate in advance, preventing the target object from being lost and preventing serious damage to the motors and other components caused by sudden speed changes. This allows the unmanned aerial vehicle to solve the problems of target loss and device damage that arise when the target moves quickly, with large changes or large displacement, and assists the unmanned aerial vehicle in continuously tracking and shooting the target.
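A minimal version of the motion-trend pre-judgement is sketched below, assuming two-dimensional skeleton keypoints have already been extracted for consecutive frames; the simple speed threshold stands in for the big-data comparison library described above and is purely illustrative.

```python
# Sketch: flag a vigorous-motion trend from joint speeds between two frames.
import numpy as np

def vigorous_motion(keypoints_prev: np.ndarray, keypoints_curr: np.ndarray,
                    dt: float, speed_threshold: float = 2.0) -> bool:
    """keypoints_*: (num_joints, 2) arrays in metres; True if any joint moves fast."""
    speeds = np.linalg.norm(keypoints_curr - keypoints_prev, axis=1) / dt
    return bool(np.max(speeds) > speed_threshold)

# if vigorous_motion(prev_pose, pose, dt=0.05):
#     flight_controller.pre_accelerate()   # hypothetical call: spool up in advance
```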
Safe flight path planning module: combining the three-dimensional model established in the surrounding environment information acquisition task area, a sphere with a suitable safe radius is established with the unmanned aerial vehicle as the center to form a sphere-reference feasible channel (see fig. 2), which is used for flight judgment in relatively complex structures. If the flight path calculated by the follow-shooting flight path planning module lies within the safe range of the sphere-reference safety channel established by the safe flight path planning module, the unmanned aerial vehicle continues tracking and shooting along the planned path; if the calculated flight path exceeds the safe range of the sphere-reference feasible channel, the unmanned aerial vehicle immediately hovers and sends a hovering warning message to the administrator, so that the administrator can control the flight manually or select a feasible flight scheme. In this way the unmanned aerial vehicle can track and shoot the target object with maximum safety even in a relatively complex environment.
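The sphere-reference safety check can be sketched as follows, assuming the drone position, the planned waypoints and the modelled obstacle points are all expressed in the same three-dimensional coordinate system; the safe radius, clearance value and flight-controller calls are hypothetical.

```python
# Sketch: accept the planned path only if every waypoint stays inside the
# sphere-reference channel and keeps clearance from modelled obstacles.
import numpy as np

def path_is_safe(drone_pos, waypoints, obstacle_points,
                 safe_radius=5.0, clearance=1.0) -> bool:
    drone_pos = np.asarray(drone_pos, dtype=float)
    obstacles = np.asarray(obstacle_points, dtype=float).reshape(-1, 3)
    for wp in np.asarray(waypoints, dtype=float):
        if np.linalg.norm(wp - drone_pos) > safe_radius:
            return False                      # leaves the sphere-reference channel
        if len(obstacles) and np.min(np.linalg.norm(obstacles - wp, axis=1)) < clearance:
            return False                      # too close to a modelled obstacle
    return True

def execute_or_hover(drone, planned_path, obstacles, notify_admin):
    if path_is_safe(drone.position(), planned_path, obstacles):
        drone.follow(planned_path)            # hypothetical flight-controller calls
    else:
        drone.hover()
        notify_admin("Path exceeds the sphere-reference safety channel; manual control needed.")
```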
From the above description of the embodiments, it is clear to those skilled in the art that the present invention can be implemented by software together with the necessary hardware and a single-chip system. Based on this understanding, all or part of the technical solution of the present invention that contributes beyond the background art can be embodied in the form of a software product or a combination of software and hardware; the computer software product can be stored in a storage medium such as a ROM/RAM, a magnetic disk or an optical disk, and includes several instructions that enable a computer device (which may be a personal computer, a server, a network device or the like) to execute the method described in each embodiment or in some parts of the embodiments of the present invention.
The present invention has been described in detail above; the description of the embodiments is only intended to help understand the method and core idea of the invention. Meanwhile, a person skilled in the art may, following the idea of the present invention, vary the specific embodiments and the scope of application; in summary, the content of this specification should not be construed as a limitation of the present invention.
Claims (7)
1. An unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition, characterized by comprising: the system determines the target object to be tracked and shot by acquiring and processing face information; by acquiring body information of the target object, a node-and-line information model is established for the target object, so that motion behaviors are predicted and preparation is made for rapid movement of the unmanned aerial vehicle; a central point (reference point) is determined through face modeling or line intersection, the distance is measured by a laser radar, and the flight track of the unmanned aerial vehicle is controlled according to the change of the central point (reference point) along the time axis and the distance measured by the laser radar; the environment and obstacles are identified by the laser radar, and a sphere-reference safety channel with a suitable radius is established with the unmanned aerial vehicle as the center.
2. The unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition is characterized in that: through the face information acquisition task area, face information features are acquired for face recognition and a data model is established; through the surrounding environment information acquisition task area, surrounding environment data used for judging safe flight conditions are acquired, a three-dimensional model of the surrounding environment is established in real time, and three flight safety levels of safe, general and dangerous are established.
3. The unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition is characterized in that the system mainly comprises the following modules: a face recognition module, a follow-shooting flight path planning module, a body point-and-line motion information analysis module and a safe flight path planning module.
4. The unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition is characterized in that: a sphere-reference safety channel is established with the unmanned aerial vehicle as the center, and is used to provide judgment conditions for safe flight of the unmanned aerial vehicle in places with relatively complex structures or relatively narrow spaces.
5. The unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition is characterized in that: the body point-and-line motion information of the target object is collected and compared with the point-and-line information in the big database to judge the motion trend of the target object and prepare for rapid movement of the unmanned aerial vehicle, so that the unmanned aerial vehicle avoids losing the target object because its hardware starts too slowly and avoids damage to the hardware caused by abnormal, abrupt operation.
6. The unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition is characterized in that: the system establishes a real-time refreshed three-dimensional space coordinate system by combining the two-dimensional plane coordinate system, parallel to the face and built on the central point (reference point) of the face, with the connecting line between the unmanned aerial vehicle and the face kept in a point-to-point relatively static state, and controls the flight path of the unmanned aerial vehicle through a single-chip microcomputer or another embedded system in the form of coordinate points according to the path calculation result, so that the unmanned aerial vehicle automatically plans its flight path in the three-dimensional coordinate system and performs surrounding tracking shooting of the target object.
7. The unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition is characterized in that: the system further comprises an interaction module, wherein the interaction module is used for obtaining an operation instruction through an interaction interface, receiving the operation instruction, and manually setting the flight path of the unmanned aerial vehicle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010816328.6A CN114079727A (en) | 2020-08-17 | 2020-08-17 | Unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010816328.6A CN114079727A (en) | 2020-08-17 | 2020-08-17 | Unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114079727A true CN114079727A (en) | 2022-02-22 |
Family
ID=80280691
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010816328.6A Pending CN114079727A (en) | 2020-08-17 | 2020-08-17 | Unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114079727A (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180046188A1 (en) * | 2015-08-19 | 2018-02-15 | Eyedea Inc. | Unmanned aerial vehicle having automatic tracking function and method of controlling the same |
CN109977770A (en) * | 2019-02-21 | 2019-07-05 | 安克创新科技股份有限公司 | A kind of auto-tracking shooting method, apparatus, system and storage medium |
CN110162102A (en) * | 2019-05-17 | 2019-08-23 | 广东技术师范大学 | Unmanned plane automatic identification tracking and system based on cloud platform and machine vision |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111958592B (en) | Image semantic analysis system and method for transformer substation inspection robot | |
Karakostas et al. | Shot type constraints in UAV cinematography for autonomous target tracking | |
US20110115909A1 (en) | Method for tracking an object through an environment across multiple cameras | |
CN107390704B (en) | IMU attitude compensation-based multi-rotor unmanned aerial vehicle optical flow hovering method | |
Bian et al. | A novel monocular-based navigation approach for UAV autonomous transmission-line inspection | |
CN111080679A (en) | Method for dynamically tracking and positioning indoor personnel in large-scale place | |
CN107093171A (en) | A kind of image processing method and device, system | |
US11475671B2 (en) | Multiple robots assisted surveillance system | |
CA2809888A1 (en) | System and method for tracking | |
CN112085003A (en) | Automatic identification method and device for abnormal behaviors in public places and camera equipment | |
CN111679695A (en) | Unmanned aerial vehicle cruising and tracking system and method based on deep learning technology | |
CN109547769B (en) | Highway traffic dynamic three-dimensional digital scene acquisition and construction system and working method thereof | |
CN110334701A (en) | Collecting method based on deep learning and multi-vision visual under the twin environment of number | |
CN105760846A (en) | Object detection and location method and system based on depth data | |
CN112115607A (en) | Mobile intelligent digital twin system based on multidimensional Sayboat space | |
US20210221502A1 (en) | Method and a system for real-time data processing, tracking, and monitoring of an asset using uav | |
Karakostas et al. | UAV cinematography constraints imposed by visual target tracking | |
CN113191388A (en) | Image acquisition system for target detection model training and sample generation method | |
CN114897988A (en) | Multi-camera positioning method, device and equipment in hinge type vehicle | |
CN112669205A (en) | Three-dimensional video fusion splicing method | |
CN107607939B (en) | Optical target tracking and positioning radar device based on real map and image | |
Wong et al. | Visual gaze analysis of robotic pedestrians moving in urban space | |
CN113110597A (en) | Indoor unmanned aerial vehicle autonomous flight system based on ROS system | |
CN114079727A (en) | Unmanned aerial vehicle automatic tracking shooting system based on 5G and face recognition | |
CN112785564A (en) | Pedestrian detection tracking system and method based on mechanical arm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||