
CN108646776B - Imaging system and method based on unmanned aerial vehicle - Google Patents

Imaging system and method based on unmanned aerial vehicle

Info

Publication number
CN108646776B
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
head-mounted display screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810637815.9A
Other languages
Chinese (zh)
Other versions
CN108646776A (en)
Inventor
芦振华
甘靖山
陈康兴
蒋晓光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Jinshan Shiyou Technology Co.,Ltd.
Original Assignee
Zhuhai Kingsoft Online Game Technology Co Ltd
Chengdu Xishanju Interactive Entertainment Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Kingsoft Online Game Technology Co Ltd and Chengdu Xishanju Interactive Entertainment Technology Co Ltd
Priority to CN201810637815.9A
Publication of CN108646776A
Application granted
Publication of CN108646776B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08: Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808: Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides an imaging system based on an unmanned aerial vehicle, comprising an unmanned aerial vehicle and an imaging control device. The unmanned aerial vehicle includes: three or more drone cameras; a multi-channel video processing module; and a drone communication module. The imaging control device includes: a head-mounted display screen for displaying the images sent by the drone; an eyeball tracking unit for tracking changes in the azimuth and elevation of the user's iris; and a control device communication unit for communicating with the drone communication module. An unmanned aerial vehicle-based imaging method is also provided. The invention improves the response speed of viewing-angle switching, so the viewing angle of the head-mounted display screen can change rapidly, achieving an immersive experience.

Description

Imaging system and method based on unmanned aerial vehicle
Technical Field
The invention relates to the technical field of image processing, in particular to an imaging system and method based on an unmanned aerial vehicle.
Background
Unmanned aerial vehicles (UAVs) and Augmented Reality (AR) are two of the hottest technologies at present. In the field of remote-controlled UAVs, a wide variety of drones, such as toy drones, aerial-photography drones, and agricultural drones, compete for attention; in the field of augmented reality, new AR games emerge one after another and are extremely popular. However, how to combine the two has long been a problem.
For example, one could develop an augmented reality game based on a UAV: the drone serves as a carrier, the operator plays the role of a pilot, and a realistic air-combat experience is delivered to the operator by synchronizing the drone's (camera) view with the operator's (display) view through augmented reality technology.
Disclosure of Invention
In order to solve the above technical problem, according to a first aspect of the present invention, there is provided a drone-based imaging system comprising: an unmanned aerial vehicle and an imaging control device. The unmanned aerial vehicle includes: three or more drone cameras; a multi-channel video processing module; and a drone communication module. The imaging control device includes: a head-mounted display screen for displaying the images sent by the drone; an eyeball tracking unit for tracking changes in the azimuth and elevation of the user's iris; and a control device communication unit for communicating with the drone communication module.
Further, the imaging control device includes: a head posture sensing unit for capturing changes in the azimuth and elevation of the user's head; and a shoulder posture sensing unit for capturing changes in the azimuth of the user's shoulders.
Further, the imaging control device also comprises an image processing unit for superimposing a virtual image on the image sent by the drone and sending the result to the head-mounted display screen for display.
Furthermore, the eyeball tracking unit comprises two or more infrared cameras; the head posture sensing unit comprises an angular acceleration sensor; and the shoulder posture sensing unit comprises two or more acceleration sensors.
Further, the field of view of each drone camera in the horizontal plane is less than 180 degrees, while the coincident field of view of the plurality of drone cameras in the horizontal plane exceeds 180 degrees; the field of view of each drone camera overlaps that of the adjacent drone camera.
Further, the overlap between the fields of view of adjacent drone cameras is greater than or equal to the field of view of the image that the drone sends to the imaging control device.
According to a second aspect of the present invention, there is provided a drone-based imaging method for use with the imaging system of the first aspect of the present invention, the method comprising the steps of: tracking the postures of the user's eyeballs and head; based on those postures, locating the position of the display area of the head-mounted display screen within the coincident field of view of the plurality of drone cameras; setting any drone camera whose field of view covers the display area of the head-mounted display screen to a high-definition capture mode, and setting any drone camera whose field of view lies outside the display area to a low-definition capture mode; when the display area of the head-mounted display screen includes an overlap between the fields of view of several drone cameras, stitching the images captured by those cameras and locating the position of the display area within the stitched image; and sending the image corresponding to the display area to the head-mounted display screen of the imaging control device.
Further, the method comprises the steps of: tracking the shoulder posture of the user; and configuring the flight attitude of the drone based on that shoulder posture.
According to a third aspect of the present invention, there is provided a drone-based imaging method for use with the imaging system of the first aspect of the present invention, the method comprising the steps of: tracking the postures of the user's eyeballs and head; based on those postures, locating the position of the display area of the head-mounted display screen within the coincident field of view of the plurality of drone cameras; setting any drone camera whose field of view covers the display area of the head-mounted display screen to a high-definition capture mode, and setting any drone camera whose field of view lies outside the display area to a low-definition capture mode; when the display area of the head-mounted display screen lies entirely within an overlap between the fields of view of the drone cameras, switching between drone cameras based on the movement direction of the eyeballs and the head posture, and locating the position of the display area within the image of the camera switched to; and sending the image corresponding to the display area to the head-mounted display screen of the imaging control device.
Further, the method comprises the steps of: tracking the shoulder posture of the user; and configuring the flight attitude of the drone based on that shoulder posture.
The invention has the following beneficial effects: the response speed of viewing-angle switching is improved, so the viewing angle of the head-mounted display screen can change rapidly, achieving an immersive experience; the amount of transmitted data is reduced; and the cost of manufacturing ultra-wide-angle lenses exceeding 180 degrees is avoided.
Drawings
fig. 1 shows an imaging control device according to an embodiment of the present invention;
fig. 2 shows a drone according to an embodiment of the invention;
fig. 3 shows the fields of view, in the horizontal plane, of the drone cameras of the drone in fig. 2.
Detailed Description
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct combination or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and other media capable of storing program code.
It will be understood by those skilled in the art that, within the specification of the present application, terms such as "first," "second," "A," "B," and the like do not denote a limitation of order unless specifically stated. For example, "step three" may precede "step one," and "D" may be performed simultaneously with "A."
Unmanned aerial vehicles and Augmented Reality (AR) are two of the hottest technologies at present. In the field of remote-controlled UAVs, a wide variety of drones, such as toy drones, aerial-photography drones, and agricultural drones, compete for attention; in the field of augmented reality, new AR games emerge one after another and are extremely popular. However, how to combine the two has long been a problem. For example, one could develop an augmented reality game based on a UAV: the drone serves as a carrier, the operator plays the role of a pilot, and a realistic air-combat experience is delivered by synchronizing the drone's (camera) view with the operator's (display) view through augmented reality technology. In the traditional synchronization method, a camera carried by the drone transmits an image with a fixed viewing angle to a control device on the ground. With the rise of panoramic camera technology, some drones instead use an on-board panoramic camera (equipped with two back-to-back ultra-wide-angle lenses, each with a viewing angle above 180 degrees) to provide the head-mounted display with a panoramic image whose viewing angle can be changed, offering a more interactive, immersive synchronization experience. However, on the one hand, a high-quality ultra-wide-angle lens (e.g., with a viewing angle of about 180 degrees) is expensive, and its distortion and chromatic dispersion are difficult to control; on the other hand, the data stream generated by the panoramic image is very large, which poses a serious challenge to the transmission rate and endurance time of the drone.
In addition, the viewing-angle change of current head-mounted displays is accomplished mainly by head rotation. As technology develops, people's expectations for immersive experience grow ever higher, and an interaction mode that realizes viewing-angle changes only through head rotation, relying solely on a gyroscope, cannot deliver a satisfying product experience. Moreover, it has been found that people are more accustomed to observing the picture within the display range by rotating the eyeballs rather than by turning the head. When a scene contradicts real habits, not only is the sense of immersion greatly diminished, but vertigo may also result.
The inventors of the present invention recognized that combining eyeball tracking with head rotation can solve this problem: eyeball tracking and head rotation cooperate to jointly control the viewing-angle change so as to simulate viewing-angle changes in reality. Some menu operations can be completed through eye control, reducing the frequency of head rotation and freeing people from unnatural head movements and swaying pictures. Human visual range refers to the angular extent of the human field of view. The horizontal viewing angle of a single human eye can reach 156 degrees at most, and the theoretical maximum horizontal viewing angle of both eyes is 188 degrees. The two eyes share a coincident field of view of 124 degrees, and the comfortable field of view of a single eye is 60 degrees. When attention is focused, this range shrinks greatly, to about one fifth of the original. In addition, people commonly observe a range of about ±25° horizontally and ±10° vertically through eyeball rotation alone. Therefore, without rotating the head, the human eye can quickly reach a comfortable field of view of about 150 degrees horizontally and 70 degrees vertically, roughly elliptical in shape (a nearly rounded rectangle); we define the region in which high-speed viewing-angle switching is achieved by eye movement alone, without head rotation, as the high-speed region. When head rotation is added, a comfortable, approximately elliptical field of view expanding to about 200 degrees horizontally and about 80 degrees vertically can be achieved; the field of view gained by adding head rotation is defined as the medium-speed region. When body motion, mainly of the shoulders, is further added, the field of view can expand to a full 360-degree panorama both horizontally and vertically; the range excluding the high-speed and medium-speed regions is defined as the low-speed region. In addition, the camera mounted on the drone can itself change the shooting angle through the motion of the drone (e.g., rolling, yawing, pitching, and the like within a spherical coordinate system). When the user's viewing angle moves within the high-speed and medium-speed regions, the required response speed for switching the displayed viewing angle is high, and low-lag image synchronization is difficult to achieve by drone motion alone. When the user's viewing angle moves within the low-speed region, the required response speed is lower, and the viewing angle can be switched through the motion of the drone itself. It follows that, rather than adopting a panoramic camera, the drone need only provide high-speed image synchronization over the high-speed region covered by eye movement and the medium-speed region covered by head movement.
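To make these region definitions concrete, the following sketch classifies a gaze direction into the high-, medium-, or low-speed region. It models the regions as nested ellipses in (azimuth, elevation) space using the approximate angular extents given above; the elliptical model and all names are illustrative assumptions, not the exact geometry of the invention.

```python
# Angular half-extents in degrees, taken from the description above:
# high-speed region: ~150 deg horizontal x ~70 deg vertical (eye movement only);
# medium-speed region: ~200 deg horizontal x ~80 deg vertical (eye + head).
HIGH_SPEED_HALF = (150 / 2, 70 / 2)
MID_SPEED_HALF = (200 / 2, 80 / 2)

def classify_view_region(azimuth_deg: float, elevation_deg: float) -> str:
    """Classify a gaze direction, measured from the torso's forward axis,
    into the high-, medium-, or low-speed region defined above."""
    def inside(half_h: float, half_v: float) -> bool:
        # Point-in-ellipse test in (azimuth, elevation) space.
        return (azimuth_deg / half_h) ** 2 + (elevation_deg / half_v) ** 2 <= 1.0

    if inside(*HIGH_SPEED_HALF):
        return "high-speed"    # reachable by eye movement alone
    if inside(*MID_SPEED_HALF):
        return "medium-speed"  # requires adding head rotation
    return "low-speed"         # requires shoulder/body or drone motion

print(classify_view_region(30, 5))    # -> high-speed
print(classify_view_region(95, 10))   # -> medium-speed
print(classify_view_region(170, 0))   # -> low-speed
```

In such a model, only the high- and medium-speed regions need camera coverage; low-speed gaze changes can be served by moving the drone itself.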
Referring to figs. 1-2, according to a first embodiment of the present disclosure, there is provided a drone-based imaging system comprising: a drone 200 and an imaging control device 100. The drone 200 includes: three or more drone cameras 211, 212A, 212B; a multi-channel video processing module (not shown); and a drone communication module 104. The imaging control device 100 includes: a head-mounted display screen 101 for displaying the images transmitted by the drone; an eyeball tracking unit 102 for tracking changes in the azimuth and elevation of the user's iris; and a control device communication unit 103 for communicating with the drone communication module 104.
The three or more drone cameras are arranged side by side in the horizontal plane. The field of view of each drone camera in the horizontal plane is less than 180 degrees, while the coincident field of view formed by the plurality of drone cameras in the horizontal plane exceeds 180 degrees; the field of view of each drone camera has an overlap zone with that of the adjacent drone camera, so that the coincident field of view formed by these three or more drone cameras essentially covers the high-speed region defined in this disclosure and at least part of the medium-speed region. Illustratively, A, B, and C in fig. 3 represent the viewing angles of the drone cameras 211, 212A, and 212B of fig. 2, respectively. The viewing angles A, B, and C are each 120 degrees and include two 30-degree overlap zones, one on the left and one on the right. However, those skilled in the art will appreciate that the drone cameras may be set to other viewing angles such as 140, 160, or 180 degrees, that the overlap zones may span other ranges such as 45, 60, or 90 degrees, and that each drone camera may be set to a different viewing-angle range. In this way, the required processing precision of ultra-wide-angle lenses can be relaxed, and the data volume of image transmission can be reduced. The multi-channel video processing module is configured to simultaneously receive the multiple image channels sent by the three or more drone cameras (in this example, 3 drone cameras and therefore 3 image channels), process them, and send the result to the imaging control device 100 through the drone communication module 104 for output on the head-mounted display screen 101. The multi-channel video processing module can process the images in two ways. The first is to merge the 3 image channels, in the horizontal direction, into a roughly rectangular stitched image whose width is greater than its height, fitting the range over which the user's viewing angle can change quickly (i.e., the high-speed and medium-speed regions); the second, switching between cameras, is described in the third embodiment below. Using several cameras with viewing angles below 180 degrees reduces lens manufacturing cost; moreover, the merged viewing angle of the stitched image leaves no blind zone for fast viewing-angle switching by the user, reduces the amount of data transmitted, and effectively extends the endurance time of the drone system. The multi-channel video processing module may be implemented using a dedicated DSP or FPGA.
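The horizontal coverage arithmetic implied by this layout can be sketched as follows, using the example values from fig. 3 (three 120-degree cameras with 30-degree overlap zones); the display FOV value and helper names are illustrative assumptions.

```python
# Example values from fig. 3; DISPLAY_FOV_DEG is an assumed value for the
# field of view of the image transmitted to the head-mounted display.
CAMERA_FOV_DEG = 120   # per-camera horizontal field of view (< 180)
OVERLAP_DEG = 30       # overlap zone between adjacent cameras
NUM_CAMERAS = 3
DISPLAY_FOV_DEG = 60

def combined_fov(num_cameras: int, fov: float, overlap: float) -> float:
    """Total horizontal coverage of side-by-side cameras whose adjacent
    fields of view overlap by `overlap` degrees."""
    return num_cameras * fov - (num_cameras - 1) * overlap

total = combined_fov(NUM_CAMERAS, CAMERA_FOV_DEG, OVERLAP_DEG)
print(total)                    # 300 degrees of coincident field of view

assert CAMERA_FOV_DEG < 180     # each lens is cheaper than an ultra-wide lens
assert total > 180              # combined coverage exceeds 180 degrees

# The alternative constraint (overlap >= transmitted-image FOV) decides
# whether the system can hand off between cameras (third embodiment)
# instead of stitching (second embodiment):
print(OVERLAP_DEG >= DISPLAY_FOV_DEG)   # False -> stitching is required here
```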
The eyeball tracking unit 102 is an optical sensing device placed in front of the eyes. It may track eyeball movement by one of the following methods: first, tracking based on feature changes of the eyeball and its surroundings; second, tracking according to changes in the iris angle; third, actively projecting an infrared beam or the like onto the iris and extracting features. In this example, the eyeball tracking unit 102 includes two or more infrared cameras and performs tracking by capturing images of the change in iris angle of each eyeball, thereby obtaining the change in the user's field of view.
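A minimal sketch of how a tracked iris position could be converted into the azimuth and elevation changes used here, assuming a linear pixel-to-angle mapping (real trackers substitute a per-user calibration; all names and values are illustrative):

```python
def gaze_angles_from_iris(iris_px, calib_center_px, px_per_degree=4.0):
    """Convert a detected iris/pupil center in an infrared camera image
    into approximate azimuth and elevation changes of the gaze.

    Assumes a linear pixel-to-angle mapping, valid for small eye
    rotations.
    """
    dx = iris_px[0] - calib_center_px[0]
    dy = iris_px[1] - calib_center_px[1]
    azimuth_deg = dx / px_per_degree
    elevation_deg = -dy / px_per_degree  # image y axis grows downward
    return azimuth_deg, elevation_deg

# Iris detected 24 px right of and 8 px above the calibrated center:
print(gaze_angles_from_iris((344, 232), (320, 240)))  # (6.0, 2.0)
```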
In one or more embodiments, the imaging control device 100 further includes: a head posture sensing unit for capturing changes in the azimuth and elevation of the user's head; and a shoulder posture sensing unit for capturing changes in the azimuth of the user's shoulders. Exemplarily, the head posture sensing unit is arranged at the back of the neck and comprises an angular acceleration sensor; the shoulder posture sensing unit comprises two or more acceleration sensors arranged on the user's shoulders.
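As a sketch of how the angular acceleration sensor's output could yield the head azimuth tracked here, the following integrates twice (acceleration to rate to angle). This simple Euler integration is an assumption for illustration; a real implementation would fuse additional sensors to bound drift.

```python
def head_azimuth_from_angular_accel(samples_deg_s2, dt_s):
    """Recover the head azimuth change from angular-acceleration samples
    by integrating twice (acceleration -> angular rate -> angle)."""
    rate_deg_s = 0.0
    angle_deg = 0.0
    for alpha in samples_deg_s2:
        rate_deg_s += alpha * dt_s    # integrate acceleration to rate
        angle_deg += rate_deg_s * dt_s  # integrate rate to angle
    return angle_deg

# A brief head turn sampled at 100 Hz: accelerate, coast, decelerate.
samples = [200.0] * 20 + [0.0] * 40 + [-200.0] * 20
print(round(head_azimuth_from_angular_accel(samples, 0.01), 1))  # ~24.0 deg
```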
The imaging control device 100 further includes an image processing unit configured to superimpose virtual images on the image sent by the drone and send the result to the head-mounted display screen 101 for display, for example, generating a virtual cockpit interface or displaying the remaining battery power.
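A minimal sketch of this compositing step, assuming a plain per-pixel alpha blend of an RGBA virtual layer (e.g., the cockpit interface) over the drone frame; the blend choice and array layout are assumptions, not specified by the patent.

```python
import numpy as np

def overlay_virtual_image(frame: np.ndarray, overlay_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend a virtual RGBA layer over the video frame received from
    the drone, before the result is sent to the head-mounted display.

    frame is HxWx3 uint8; overlay_rgba is HxWx4 uint8.
    """
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = (frame.astype(np.float32) * (1.0 - alpha)
               + overlay_rgba[..., :3].astype(np.float32) * alpha)
    return blended.astype(np.uint8)
```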
Further, the field of view of each drone camera in the horizontal plane is less than 180 degrees, and the coincident field of view of the plurality of drone cameras in the horizontal plane exceeds 180 degrees; the field of view of each drone camera overlaps that of the adjacent drone camera. Alternatively, the overlap between the fields of view of adjacent drone cameras is greater than or equal to the field of view of the image transmitted by the drone to the imaging control device 100.
For the method of operating the imaging system of the present embodiment, refer to the second embodiment described below.
According to a second embodiment of the present disclosure, there is provided a drone-based imaging method for the imaging system of the first embodiment, comprising the following steps. Track the postures of the user's eyeballs and head using the eyeball tracking unit 102 and the head posture sensing unit. Based on those postures, locate the position of the display area of the head-mounted display screen 101 (i.e., which part of the coincident field of view is shown on the display) within the coincident field of view of the plurality of drone cameras (i.e., the union of the images captured by the 3 drone cameras). Set the drone camera whose field of view covers the display area of the head-mounted display screen 101 to a high-definition capture mode, and set the drone cameras whose fields of view lie outside the display area to a low-definition capture mode; for example, when the user's view faces straight ahead (the position of camera 211 in the initial stage), the display area of the head-mounted display screen 101 falls within the field of view of camera 211, so camera 211 is in the high-definition capture mode (e.g., higher bit rate, frame rate, or resolution) while cameras 212A and 212B are in the low-definition capture mode. When the display area of the head-mounted display screen 101 includes an overlap between the fields of view of several drone cameras, stitch the images captured by those cameras and locate the position of the display area within the stitched image; for example, when the user, by moving the eyeballs or the head, moves the display area of the head-mounted display screen 101 into the overlap between the fields of view of cameras 211 and 212A (the overlap of A and B in fig. 3), camera 212A is set to the high-definition capture mode like camera 211, and the images captured by cameras 211 and 212A are stitched into a new image by the multi-channel video processing module. Finally, send the image corresponding to the display area to the head-mounted display screen 101 of the imaging control device 100. By this method, the viewing angle of the head-mounted display screen 101 can be changed rapidly, realizing an immersive experience.
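The selection-and-stitch logic of this embodiment can be sketched as follows. The camera spans follow fig. 3's example geometry, and `set_mode` / `stitch` stand in for the drone's camera control and multi-channel video processing module; all values and names are illustrative assumptions.

```python
# Horizontal spans (degrees) of each camera within the coincident field,
# following fig. 3's example geometry; DISPLAY_FOV is an assumed value.
CAMERAS = {
    "212A": (0, 120),     # view B
    "211":  (90, 210),    # view A (front)
    "212B": (180, 300),   # view C
}
DISPLAY_FOV = 60          # assumed width of the headset display area

def cameras_covering(display_center_deg):
    """Return the cameras whose spans intersect the display area."""
    lo = display_center_deg - DISPLAY_FOV / 2
    hi = display_center_deg + DISPLAY_FOV / 2
    return [cam for cam, (a, b) in CAMERAS.items() if a < hi and b > lo]

def update_frame(display_center_deg, set_mode, stitch):
    """One control-loop iteration: set high/low-definition modes, stitch
    when the display area spans an overlap zone, and return the source
    plus the angular window to crop and send to the headset."""
    active = cameras_covering(display_center_deg)
    for cam in CAMERAS:
        set_mode(cam, "high" if cam in active else "low")
    source = stitch(active) if len(active) > 1 else active[0]
    window = (display_center_deg - DISPLAY_FOV / 2,
              display_center_deg + DISPLAY_FOV / 2)
    return source, window

# Shifting the display center to 105 deg enters the A/B overlap,
# so cameras 212A and 211 are both set to high definition and stitched:
src, win = update_frame(105, set_mode=lambda c, m: None,
                        stitch=lambda cams: "+".join(cams))
print(src, win)   # 212A+211 (75.0, 135.0)
```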
Further, the method comprises the steps of: tracking the shoulder posture of the user; and configuring the flight attitude of the drone based on that shoulder posture, thereby changing the shooting angle of the cameras.
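As an illustration of one possible mapping from shoulder pose to a flight-attitude command (the patent states only that the flight attitude is configured from the shoulder pose; this control law and all names are assumptions):

```python
def flight_command_from_shoulders(left_tilt_deg, right_tilt_deg, gain=0.5):
    """Map the two shoulder sensors' readings to a flight-attitude command:
    shoulders moving together command pitch, shoulders moving oppositely
    command yaw."""
    pitch_cmd = gain * (left_tilt_deg + right_tilt_deg) / 2.0
    yaw_cmd = gain * (left_tilt_deg - right_tilt_deg)
    return {"pitch": pitch_cmd, "yaw": yaw_cmd}

print(flight_command_from_shoulders(10.0, 10.0))  # lean forward -> pitch
print(flight_command_from_shoulders(8.0, -8.0))   # twist -> yaw
```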
According to a third embodiment of the present disclosure, there is provided a drone-based imaging method for the imaging system of the first embodiment, comprising the following steps. Track the postures of the user's eyeballs and head using the eyeball tracking unit 102 and the head posture sensing unit. Based on those postures, locate the position of the display area of the head-mounted display screen 101 (i.e., which part of the coincident field of view is shown on the display) within the coincident field of view of the plurality of drone cameras (i.e., the union of the images captured by the 3 drone cameras). Set the drone camera whose field of view covers the display area of the head-mounted display screen 101 to a high-definition capture mode, and set the drone cameras whose fields of view lie outside the display area to a low-definition capture mode; for example, when the user's view faces straight ahead (the position of camera 211 in the initial stage), the display area of the head-mounted display screen 101 falls within the field of view of camera 211, so camera 211 is in the high-definition capture mode (e.g., higher bit rate, frame rate, or resolution) while cameras 212A and 212B are in the low-definition capture mode. When the display area of the head-mounted display screen 101 lies entirely within an overlap between the fields of view of the drone cameras, switch between drone cameras based on the movement direction of the eyeballs and the head posture, and locate the position of the display area within the image of the camera switched to; for example, when the user, by moving the eyeballs or the head, moves the display area of the head-mounted display screen 101 into the overlap between the fields of view of cameras 211 and 212A (the overlap of A and B in fig. 3), camera 212A is set to the high-definition capture mode like camera 211, and the image output to the head-mounted display screen 101 is switched by the multi-channel video processing module from the image from camera 211 to the image captured by camera 212A. Finally, send the image corresponding to the display area to the head-mounted display screen 101 of the imaging control device 100. By this method, the viewing angle of the head-mounted display screen 101 can be changed rapidly, realizing an immersive experience.
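The switching decision of this third embodiment might look like the following sketch: once the display area fits entirely inside an overlap zone, output is handed off to the neighboring camera in the gaze's direction of motion rather than stitched. The tie-breaking rule and all names are assumptions for illustration.

```python
def switch_camera(current, gaze_velocity_deg_s, display_span, camera_spans):
    """Hand off to a neighboring camera once the display area fits entirely
    inside an overlap zone, choosing the camera that extends furthest in
    the gaze's direction of motion."""
    lo, hi = display_span
    # Cameras whose field of view fully contains the display area.
    candidates = [cam for cam, (a, b) in camera_spans.items()
                  if a <= lo and hi <= b]
    if len(candidates) <= 1:
        # Not inside an overlap zone: keep (or adopt) the covering camera.
        return candidates[0] if candidates else current
    if gaze_velocity_deg_s > 0:   # gaze moving toward larger angles
        return max(candidates, key=lambda c: camera_spans[c][1])
    return min(candidates, key=lambda c: camera_spans[c][0])

SPANS = {"212A": (0, 120), "211": (90, 210), "212B": (180, 300)}
# Display area fully inside the A/B overlap, gaze moving left (negative):
print(switch_camera("211", -20.0, (95, 115), SPANS))   # -> 212A
```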
Further, the method comprises the steps of: tracking the shoulder posture of the user; and configuring the flight attitude of the drone based on that shoulder posture, thereby changing the shooting angle of the cameras.
The above description covers only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or substitution that a person skilled in the art could readily conceive within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Description of the reference numerals
100 imaging control device
101 head-mounted display screen
102 eyeball tracking unit
103 control device communication unit
104 unmanned aerial vehicle communication module
200 unmanned plane
211 unmanned aerial vehicle camera
212A unmanned aerial vehicle camera
212B unmanned aerial vehicle camera

Claims (6)

1. An unmanned aerial vehicle-based imaging system, characterized in that it comprises:
an unmanned aerial vehicle and an imaging control device;
wherein, unmanned aerial vehicle includes:
three or more unmanned aerial vehicle cameras;
a multi-channel video processing module; and
an unmanned aerial vehicle communication module;
the imaging control device comprises:
the head-mounted display screen is used for displaying the image sent by the unmanned aerial vehicle;
an eyeball tracking unit for tracking changes in azimuth and elevation of an iris of a user; and
a control device communication unit for communicating with the drone communication module;
wherein the imaging control apparatus further includes:
a head posture sensing unit for capturing changes in the azimuth and elevation of the user's head;
a shoulder posture sensing unit for capturing changes in the azimuth of the user's shoulders;
the visual field of each unmanned aerial vehicle camera on the horizontal plane is less than 180 degrees, and the coincident visual field of the plurality of unmanned aerial vehicle cameras on the horizontal plane is more than 180 degrees; the view field of each unmanned aerial vehicle camera and the view field of the adjacent unmanned aerial vehicle camera have an overlapping area;
wherein the coincidence area between the vision fields of the cameras of the adjacent unmanned aerial vehicles is more than or equal to the vision field of the image sent by the unmanned aerial vehicle to the imaging control device;
the display area of the head-mounted display screen can be located in the position of the overlapped vision areas of the unmanned aerial vehicle cameras based on the postures of eyeballs and the head;
the unmanned aerial vehicle camera that the field of vision is located the wear-type display screen display area is the high definition mode of making a video recording, and the unmanned aerial vehicle camera that the field of vision is located outside the wear-type display screen display area is the low definition mode of making a video recording.
2. The imaging system of claim 1, wherein the imaging control device further comprises an image processing unit for superimposing a virtual image on the image sent by the drone and sending the result to the head-mounted display screen for display.
3. The imaging system of claim 1, wherein the eyeball tracking unit comprises two or more infrared cameras; the head posture sensing unit comprises an angular acceleration sensor; and the shoulder posture sensing unit comprises two or more acceleration sensors.
4. A drone-based imaging method for use with the imaging system of any one of claims 1-3, characterized in that the method comprises the steps of:
tracking the postures of the user's eyeballs and head;
based on the postures of the eyeballs and the head, locating the position of the display area of the head-mounted display screen within the coincident field of view of the plurality of unmanned aerial vehicle cameras;
setting an unmanned aerial vehicle camera whose field of view covers the display area of the head-mounted display screen to a high-definition capture mode, and setting an unmanned aerial vehicle camera whose field of view lies outside the display area to a low-definition capture mode;
when the display area of the head-mounted display screen includes an overlap zone between the fields of view of the unmanned aerial vehicle cameras, stitching the images captured by those cameras and locating the position of the display area within the stitched image;
and sending the image corresponding to the display area of the head-mounted display screen to the head-mounted display screen of the imaging control device.
5. A drone-based imaging method for use with the imaging system of any one of claims 1-3, characterized in that the method comprises the steps of:
tracking the postures of the user's eyeballs and head;
based on the postures of the eyeballs and the head, locating the position of the display area of the head-mounted display screen within the coincident field of view of the plurality of unmanned aerial vehicle cameras;
setting an unmanned aerial vehicle camera whose field of view covers the display area of the head-mounted display screen to a high-definition capture mode, and setting an unmanned aerial vehicle camera whose field of view lies outside the display area to a low-definition capture mode;
when the display area of the head-mounted display screen lies entirely within an overlap zone between the fields of view of the unmanned aerial vehicle cameras, switching between the unmanned aerial vehicle cameras based on the movement direction of the eyeballs and the head posture, and locating the position of the display area within the image of the camera switched to;
and sending the image corresponding to the display area of the head-mounted display screen to the head-mounted display screen of the imaging control device.
6. The imaging method according to claim 4 or 5, further comprising the steps of:
tracking a shoulder pose of the user;
based on the shoulder attitude, configuring the flight attitude of the unmanned aerial vehicle.
CN201810637815.9A 2018-06-20 2018-06-20 Imaging system and method based on unmanned aerial vehicle Active CN108646776B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810637815.9A CN108646776B (en) 2018-06-20 2018-06-20 Imaging system and method based on unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810637815.9A CN108646776B (en) 2018-06-20 2018-06-20 Imaging system and method based on unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN108646776A CN108646776A (en) 2018-10-12
CN108646776B (en) 2021-07-13

Family

ID=63752989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810637815.9A Active CN108646776B (en) 2018-06-20 2018-06-20 Imaging system and method based on unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN108646776B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109672837A (en) * 2019-01-24 2019-04-23 深圳慧源创新科技有限公司 Equipment of taking photo by plane real-time video method for recording, mobile terminal and computer storage medium
CN115442510A (en) * 2021-06-02 2022-12-06 影石创新科技股份有限公司 Video display method and system for view angle of unmanned aerial vehicle
CN115022611B (en) * 2022-03-31 2023-12-29 青岛虚拟现实研究院有限公司 VR picture display method, electronic device and readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104111658A (en) * 2014-07-17 2014-10-22 金陵科技学院 Unmanned aerial vehicle capable of performing monitoring shooting and controlling through smart glasses
CN105334864A (en) * 2015-11-24 2016-02-17 杨珊珊 Intelligent glasses and control method for controlling unmanned aerial vehicle
CN105373137A (en) * 2015-11-03 2016-03-02 上海酷睿网络科技股份有限公司 Unmanned system
CN107065905A (en) * 2017-03-23 2017-08-18 东南大学 A kind of immersion unmanned aerial vehicle control system and its control method
CN107256027A (en) * 2017-06-29 2017-10-17 北京小米移动软件有限公司 The helmet and its control method for unmanned plane
CN107589837A (en) * 2017-08-22 2018-01-16 努比亚技术有限公司 A kind of AR terminals picture adjusting method, equipment and computer-readable recording medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000029617A (en) * 1998-07-08 2000-01-28 Olympus Optical Co Ltd Video display device
US10339711B2 (en) * 2013-03-15 2019-07-02 Honda Motor Co., Ltd. System and method for providing augmented reality based directions based on verbal and gestural cues
KR101683275B1 (en) * 2014-10-14 2016-12-06 (주)세이프텍리서치 Remote navigating simulation system for unmanned vehicle
FR3028767B1 (en) * 2014-11-26 2017-02-10 Parrot VIDEO SYSTEM FOR DRIVING A DRONE IN IMMERSIVE MODE

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104111658A (en) * 2014-07-17 2014-10-22 金陵科技学院 Unmanned aerial vehicle capable of performing monitoring shooting and controlling through smart glasses
CN105373137A (en) * 2015-11-03 2016-03-02 上海酷睿网络科技股份有限公司 Unmanned system
CN105334864A (en) * 2015-11-24 2016-02-17 杨珊珊 Intelligent glasses and control method for controlling unmanned aerial vehicle
CN107065905A (en) * 2017-03-23 2017-08-18 东南大学 A kind of immersion unmanned aerial vehicle control system and its control method
CN107256027A (en) * 2017-06-29 2017-10-17 北京小米移动软件有限公司 The helmet and its control method for unmanned plane
CN107589837A (en) * 2017-08-22 2018-01-16 努比亚技术有限公司 A kind of AR terminals picture adjusting method, equipment and computer-readable recording medium

Also Published As

Publication number Publication date
CN108646776A (en) 2018-10-12

Similar Documents

Publication Publication Date Title
US20200142480A1 (en) Immersive displays
CN105103034B (en) Display
US9747725B2 (en) Video system for piloting a drone in immersive mode
US10268276B2 (en) Autonomous computing and telecommunications head-up displays glasses
US7429997B2 (en) System and method for spherical stereoscopic photographing
CN104781873A (en) Image display device and image display method, mobile body device, image display system, and computer program
JP6576536B2 (en) Information processing device
CN108646776B (en) Imaging system and method based on unmanned aerial vehicle
JP2020530971A (en) Head-mounted display and its display screen, head-mounted bracket and video
US11143876B2 (en) Optical axis control based on gaze detection within a head-mountable display
CN106168855B (en) Portable MR glasses, mobile phone and MR glasses system
JP2002176661A (en) Image display device
KR20160102845A (en) Flight possible omnidirectional image-taking camera system
CN111602391B (en) Method and apparatus for customizing a synthetic reality experience from a physical environment
JP6649010B2 (en) Information processing device
CN113941138A (en) AR interaction control system, device and application
US20240290087A1 (en) Video display method and display system based on unmanned aerial vehicle viewing angle
KR20240052823A (en) Eyewear synchronized with UAV image capturing system
CN115617160A (en) Video processing and playback system and method
JP6718928B2 (en) Video output system
JP6826082B2 (en) Programs, information processing equipment, and methods
WO2021131935A1 (en) Program, method, and information processing device
JP6718930B2 (en) Program, information processing apparatus, and method
KR102724897B1 System for providing augmented reality
CN115346025A (en) AR interaction control system, device and application

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20211215

Address after: 430000 Room 408, floor 4, building B24, phase 2.7, financial background service center base construction project, No. 77, Guanggu Avenue, Donghu New Technology Development Zone, Wuhan, Hubei Province

Patentee after: Wuhan Jinshan Shiyou Technology Co.,Ltd.

Address before: 519000 building 3, Jinshan Software Park, 325 Qiandao Ring Road, Xiangzhou District, Zhuhai City, Guangdong Province

Patentee before: ZHUHAI KINGSOFT ONLINE GAME TECHNOLOGY Co.,Ltd.

Patentee before: Chengdu Xishanju Interactive Entertainment Technology Co., Ltd