Disclosure of Invention
In order to solve the above technical problem, according to a first aspect of the present invention, there is provided a drone-based imaging system comprising an unmanned aerial vehicle (drone) and an imaging control device. The drone includes: three or more drone cameras; a multi-channel video processing module; and a drone communication module. The imaging control device includes: a head-mounted display screen for displaying the images sent by the drone; an eyeball tracking unit for tracking changes in the azimuth angle and elevation angle of the user's irises; and a control device communication unit for communicating with the drone communication module.
Further, the imaging control device includes: a head posture sensing unit for capturing changes in the azimuth angle and elevation angle of the user's head; and a shoulder posture sensing unit for capturing changes in the azimuth angle of the user's shoulders.
Further, the imaging control device includes an image processing unit for superimposing a virtual image on the image sent by the drone and sending the result to the head-mounted display screen for display.
Further, the eyeball tracking unit comprises two or more infrared cameras; the head posture sensing unit comprises an angular acceleration sensor; and the shoulder posture sensing unit comprises two or more acceleration sensors.
Further, the field of view of each drone camera on the horizontal plane is less than 180 degrees, while the combined field of view of the drone cameras on the horizontal plane exceeds 180 degrees; the field of view of each drone camera overlaps that of the adjacent drone camera.
Further, the overlap zone between the fields of view of adjacent drone cameras is greater than or equal to the field of view of the image sent by the drone to the imaging control device.
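As an illustration only, the camera-layout constraints of this aspect can be checked as in the minimal Python sketch below; the function name, parameter names and example numbers are assumptions, not part of the invention.

```python
# Minimal sketch: checking the layout constraints stated above. The
# function name and the example numbers are illustrative assumptions.

def layout_is_valid(camera_fovs_deg, overlap_deg, display_fov_deg):
    """camera_fovs_deg: horizontal field of view of each camera, in order.
    overlap_deg: width of each overlap zone between adjacent cameras.
    display_fov_deg: horizontal field of view of the image sent to the
    imaging control device."""
    if any(f >= 180 for f in camera_fovs_deg):
        return False                       # each lens stays under 180 degrees
    combined = sum(camera_fovs_deg) - overlap_deg * (len(camera_fovs_deg) - 1)
    if combined <= 180:
        return False                       # combined coverage must exceed 180
    return overlap_deg >= display_fov_deg  # overlap covers the streamed view

# 120-degree cameras with 30-degree overlaps, assuming a 30-degree stream:
print(layout_is_valid([120, 120, 120], overlap_deg=30, display_fov_deg=30))
```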
According to a second aspect of the present invention, there is provided a drone-based imaging method for use with the imaging system of the first aspect, the method comprising the steps of: tracking the postures of the user's eyeballs and head; based on the eyeball and head postures, locating the position of the display area of the head-mounted display screen within the combined field of view of the drone cameras; setting each drone camera whose field of view lies within the display area of the head-mounted display screen to a high-definition shooting mode, and setting each drone camera whose field of view lies outside the display area to a low-definition shooting mode; when the display area of the head-mounted display screen includes an overlap zone between the fields of view of multiple drone cameras, stitching the images shot by those cameras and locating the position of the display area within the stitched image; and sending the image corresponding to the display area to the head-mounted display screen of the imaging control device.
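By way of a non-authoritative illustration, the "locating" step might combine head and eye angles as sketched below; the composition rule and the window size are assumptions, since the aspect above does not fix either.

```python
# Minimal sketch, assuming the gaze direction is the sum of head and eye
# azimuth/elevation and the display area is a fixed-size window centred
# on it; neither assumption is specified in the text above.

def locate_display_area(eye_az, eye_el, head_az, head_el,
                        width_deg=30.0, height_deg=20.0):
    """All angles in degrees, measured in the drone's camera frame.
    Returns (az_min, az_max, el_min, el_max) of the display area."""
    az = head_az + eye_az
    el = head_el + eye_el
    return (az - width_deg / 2, az + width_deg / 2,
            el - height_deg / 2, el + height_deg / 2)
```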
Further, the method comprises the steps of: tracking the user's shoulder posture; and, based on the shoulder posture, configuring the flight attitude of the drone.
According to a third aspect of the present invention, there is provided a drone-based imaging method for use with the imaging system of the first aspect, the method comprising the steps of: tracking the postures of the user's eyeballs and head; based on the eyeball and head postures, locating the position of the display area of the head-mounted display screen within the combined field of view of the drone cameras; setting each drone camera whose field of view lies within the display area of the head-mounted display screen to a high-definition shooting mode, and setting each drone camera whose field of view lies outside the display area to a low-definition shooting mode; when the display area of the head-mounted display screen is positioned completely within an overlap zone between the fields of view of the drone cameras, switching between the drone cameras based on the direction of movement of the eyeballs and the head posture, and locating the position of the display area within the image of the newly selected camera; and sending the image corresponding to the display area to the head-mounted display screen of the imaging control device.
Further, the method comprises the steps of: tracking the user's shoulder posture; and, based on the shoulder posture, configuring the flight attitude of the drone.
The invention has the following beneficial effects: the response speed of viewing-angle switching is improved, so the viewing angle of the head-mounted display screen can be changed rapidly, providing an immersive experience; the amount of transmitted data is reduced; and the cost associated with manufacturing ultra-wide-angle lenses with viewing angles above 180 degrees is reduced.
Detailed Description
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct combination or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
It will be understood by those skilled in the art that, within the specification of the present application, terms such as "first," "second," "A," "B," and the like, unless specifically stated otherwise, do not denote a limitation of order. For example, "step three" may precede "step one," and "B" may be performed simultaneously with "A."
Unmanned aerial vehicles (drones) and Augmented Reality (AR) are two of the hottest technologies at present. In the field of remote-controlled drones, toy drones, aerial-photography drones, agricultural drones and many other varieties are flourishing; in the field of augmented reality, new AR games emerge one after another and are extremely popular. How to combine the two, however, has remained a problem. Consider, for example, a drone-based augmented reality game in which the drone serves as the carrier, the operator plays the pilot, and synchronizing the drone's (camera) view with the operator's (display) view through augmented reality technology gives the operator a realistic air-combat experience. In the traditional synchronization method, a camera carried by the drone transmits an image with a fixed viewing angle to a control device on the ground. With the rise of panoramic camera technology, some drones use an onboard panoramic camera (equipped with two opposed ultra-wide-angle lenses, each with a viewing angle above 180 degrees, facing front and back) to provide the head-mounted display with a panoramic image whose viewing angle can be changed, offering a more interactive, immersive synchronization experience. However, on the one hand, a high-quality ultra-wide-angle lens (e.g. with a viewing angle of about 180 degrees) is expensive, and its distortion and chromatic dispersion are difficult to control; on the other hand, the data stream generated by the panoramic image is very large, which severely challenges the transmission rate and endurance time of the drone.
In addition, the viewing-angle change of current head-mounted displays is accomplished mainly by head rotation. As technology develops, people's expectations for immersive experiences keep rising, and an interaction mode that realizes viewing-angle changes only through head rotation, relying solely on a gyroscope, cannot deliver a satisfying, impressive product experience. Moreover, it has been found that people are more accustomed to observing the picture within the display range by rotating their eyeballs rather than turning their heads. When a scene contradicts real-world habits, not only is the sense of immersion greatly diminished, but vertigo may also result.
The inventors of the present invention have recognized that combining eyeball tracking with head rotation can solve this problem: eyeball tracking and head rotation cooperate to jointly control the viewing-angle change so as to simulate viewing-angle changes in reality. Some menu operations can also be completed through eye control, reducing the head-rotation frequency and freeing people from unnatural head movements and swaying pictures. Human visual range refers to the angular extent of the human visual field. The horizontal viewing angle of a single human eye can reach 156 degrees at most, and the theoretical maximum horizontal viewing angle of both eyes is 188 degrees. The two eyes share a 124-degree overlapping field, and the comfortable field of a single eye is about 60 degrees. When attention is focused, this range shrinks greatly, to about one fifth of the original. In addition, people habitually observe a range of about ±25° horizontally and ±10° vertically by rotating the eyeballs alone. Therefore, without rotating the head, the human eye can rapidly cover a comfortable field of view of about 150 degrees horizontally and 70 degrees vertically, roughly elliptical in shape (a near-rounded rectangle); we define the region in which high-speed viewing-angle switching is achieved by eye movement alone, without head rotation, as the high-speed region. When head rotation is added, the comfortable, approximately elliptical field expands to about 200 degrees horizontally and about 80 degrees vertically; the field gained by adding head rotation is defined as the medium-speed region. When body motion, mainly of the shoulders, is further added, the field can expand to a full 360-degree panorama both horizontally and vertically; the range excluding the high-speed and medium-speed regions is defined as the low-speed region. In addition, the camera mounted on the drone can itself change its shooting angle through the motion of the drone (e.g. rolling, yawing, pitching, etc. within a spherical coordinate system). When the user's viewing angle moves within the high-speed or medium-speed region, the required response speed of viewing-angle switching is high, and low-latency image synchronization is difficult to achieve by drone motion alone. When the user's viewing angle moves within the low-speed region, the required response speed is lower, and the viewing angle can be switched by moving the drone itself. It follows that even when the drone adopts a panoramic camera, only high-speed image synchronization for the high-speed region covered by eye movement and the medium-speed region covered by head movement needs to be satisfied. A sketch of this region classification follows.
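The following minimal sketch classifies a viewing direction into the three regions just defined; the elliptical boundaries use the approximate half-angles from the text (150×70 degrees for eyes alone, 200×80 degrees with head rotation) and are illustrative only.

```python
# Minimal sketch of the high/medium/low-speed regions defined above.
# The ellipse half-angles are taken from the discussion and are assumed
# approximations, not exact values from the invention.

def speed_region(az_deg, el_deg):
    """az_deg, el_deg: viewing direction relative to the shoulder axis."""
    def inside(az_half, el_half):
        # treat the comfortable field as an ellipse in angle space
        return (az_deg / az_half) ** 2 + (el_deg / el_half) ** 2 <= 1.0
    if inside(75, 35):       # eye movement alone
        return "high-speed"
    if inside(100, 40):      # eye movement plus head rotation
        return "medium-speed"
    return "low-speed"       # requires shoulder/body or drone motion

print(speed_region(30, 5))   # -> high-speed
print(speed_region(95, 10))  # -> medium-speed
```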
Referring to fig. 1-2, according to a first embodiment of the present disclosure, there is provided a drone-based imaging system comprising a drone 200 and an imaging control device 100. The drone 200 includes: three or more drone cameras 211, 212A, 212B; a multi-channel video processing module (not shown); and a drone communication module 104. The imaging control device 100 includes: a head-mounted display screen 101 for displaying the images transmitted by the drone; an eyeball tracking unit 102 for tracking changes in the azimuth angle and elevation angle of the user's irises; and a control device communication unit 103 for communicating with the drone communication module 104.
The three or more drone cameras are arranged side by side on a horizontal plane. The field of view of each drone camera on the horizontal plane is less than 180 degrees, while the combined field of view of the drone cameras on the horizontal plane exceeds 180 degrees; the field of view of each drone camera has an overlap zone with the field of view of the adjacent drone camera, so that the combined field of view formed by the three or more drone cameras essentially covers the high-speed region defined in this disclosure and at least part of the medium-speed region. Illustratively, A, B and C of fig. 3 represent the viewing angles of the drone cameras 211, 212A and 212B of fig. 2, respectively. The viewing angles A, B and C are each 120 degrees and include two 30-degree overlap zones, left and right. Those skilled in the art should appreciate, however, that the drone cameras may be set to other viewing angles such as 140, 160 or 180 degrees, that the overlap zone may span other ranges such as 45, 60 or 90 degrees, and that each drone camera may be set to a different viewing-angle range. In this way, the machining precision demanded of an ultra-wide-angle lens is relaxed and the amount of transmitted image data is reduced. The multi-channel video processing module is configured to simultaneously receive the multiple image channels sent by the three or more drone cameras (in this example, 3 drone cameras, hence 3 channels), process them, and send the result to the imaging control device 100 through the drone communication module 104 for output on the head-mounted display screen 101. The multi-channel video processing module processes the images in one of two modes. The first mode stitches the 3 channels in the horizontal direction into an approximately rectangular stitched image whose width exceeds its height, fitting the range within which the user's viewing angle can change quickly (i.e. the high-speed and medium-speed regions); several cameras with viewing angles below 180 degrees cost less to manufacture than an ultra-wide-angle lens, the stitched image covers the user's rapid viewing-angle-switching range without blind areas, and the amount of transmitted data is reduced, effectively extending the endurance time of the drone system. The multi-channel video processing module may be implemented with a dedicated DSP or an FPGA.
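As a minimal sketch of the first processing mode, assuming frames of equal height and pixel-aligned overlap zones (assumptions the text does not fix), the horizontal stitching could look like this:

```python
import numpy as np

# Minimal sketch of the first processing mode: the strip duplicated by
# the overlap zone is dropped before horizontal concatenation. Real
# stitching would also blend and warp; that is omitted here.

def stitch_horizontal(frames, overlap_px):
    """frames: list of HxWx3 arrays ordered left to right.
    overlap_px: width in pixels of each overlap zone."""
    parts = [frames[0]]
    for f in frames[1:]:
        parts.append(f[:, overlap_px:])  # skip the strip already covered
    return np.concatenate(parts, axis=1)
```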
The eyeball tracking unit 102 is an optical sensing device placed in front of the eye. It may track eyeball movement by one of the following methods: first, tracking based on feature changes of the eyeball and its surroundings; second, tracking based on changes of the iris angle; third, actively projecting an infrared beam or the like onto the iris and extracting features. In this example, the eyeball tracking unit 102 includes two or more infrared cameras and performs tracking by capturing images of the change in iris angle of each eyeball, thereby obtaining the change in the user's field of view.
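For illustration, here is a minimal sketch of one way an infrared frame could be turned into gaze-angle changes using OpenCV; the threshold value and degrees-per-pixel calibration are assumed, and a real tracker would calibrate them per user.

```python
import cv2

# Minimal sketch: find the pupil (darkest blob) in an IR frame and map
# its offset from the calibrated rest position to azimuth/elevation.
# Threshold and calibration constants are illustrative assumptions.

def gaze_angles(ir_frame, rest_xy, deg_per_px=(0.12, 0.12)):
    """ir_frame: 8-bit grayscale IR image of one eye.
    rest_xy: pupil position when the user looks straight ahead."""
    _, mask = cv2.threshold(ir_frame, 40, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(mask)                     # centroid of the dark blob
    if m["m00"] == 0:
        return None                           # pupil not found (e.g. a blink)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    d_az = (cx - rest_xy[0]) * deg_per_px[0]
    d_el = (rest_xy[1] - cy) * deg_per_px[1]  # image y axis points down
    return d_az, d_el
```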
In one or more embodiments, the imaging control device 100 further includes: a head posture sensing unit for capturing changes in the azimuth angle and elevation angle of the user's head; and a shoulder posture sensing unit for capturing changes in the azimuth angle of the user's shoulders. Illustratively, the head posture sensing unit is arranged at the back of the neck and comprises an angular acceleration sensor; the shoulder posture sensing unit comprises two or more acceleration sensors arranged on the user's shoulders.
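Given the stated angular acceleration sensor, head azimuth and elevation can in principle be recovered by double integration, as in the minimal sketch below; drift correction (e.g. fusing other sensors) is omitted and would be needed in practice.

```python
# Minimal sketch: double-integrating angular acceleration into head
# azimuth/elevation. Uncorrected integration drifts; this is only an
# illustration of the sensing principle described above.

class HeadPoseTracker:
    def __init__(self):
        self.az = self.el = 0.0            # angles, degrees
        self.az_rate = self.el_rate = 0.0  # angular velocities, deg/s

    def update(self, az_accel, el_accel, dt):
        """az_accel, el_accel: angular accelerations in deg/s^2; dt: s."""
        self.az_rate += az_accel * dt
        self.el_rate += el_accel * dt
        self.az += self.az_rate * dt
        self.el += self.el_rate * dt
        return self.az, self.el
```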
The imaging control device 100 further includes an image processing unit configured to superimpose a virtual image on the image sent by the drone and send the result to the head-mounted display screen 101 for display, for example generating a virtual cockpit interface, displaying the remaining battery power, and so on.
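A minimal sketch of this overlay step, assuming the virtual layer is rendered locally as an RGBA image (an assumption; the text does not specify the representation):

```python
import numpy as np

# Minimal sketch: alpha-blend a locally rendered virtual layer (e.g. a
# cockpit interface showing remaining battery power) over the video
# frame received from the drone before display.

def composite(video_rgb, overlay_rgba):
    """video_rgb: HxWx3 uint8 frame from the drone.
    overlay_rgba: HxWx4 uint8 virtual image with an alpha channel."""
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = overlay_rgba[..., :3] * alpha + video_rgb * (1.0 - alpha)
    return blended.astype(np.uint8)
```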
Further, the field of view of each drone camera on the horizontal plane is less than 180 degrees, and the combined field of view of the drone cameras on the horizontal plane exceeds 180 degrees; the field of view of each drone camera overlaps that of the adjacent drone camera. Alternatively, the overlap zone between the fields of view of adjacent drone cameras is greater than or equal to the field of view of the image transmitted by the drone to the imaging control device 100.
For the method of operating the imaging system of this embodiment, refer to the second embodiment described below.
According to a second embodiment of the present disclosure, there is provided a drone-based imaging method for the imaging system of the first embodiment, comprising the following steps. The postures of the user's eyeballs and head are tracked using the eyeball tracking unit 102 and the head posture sensing unit. Based on these postures, the position of the display area of the head-mounted display screen 101 (i.e. which part of the combined field of view is shown on the display) is located within the combined field of view of the drone cameras (i.e. the sum of the images captured by the 3 drone cameras). Each drone camera whose field of view lies within the display area of the head-mounted display screen 101 is set to a high-definition shooting mode, and each drone camera whose field of view lies outside the display area is set to a low-definition shooting mode; for example, when the user's view faces straight ahead (the initial position of camera 211), the display area of the head-mounted display screen 101 is covered by the field of view of camera 211, so camera 211 is in high-definition mode (e.g. higher bit rate, frame rate or resolution) while cameras 212A and 212B are in low-definition mode. When the display area of the head-mounted display screen 101 includes an overlap zone between the fields of view of multiple drone cameras, the images shot by those cameras are stitched and the position of the display area is located within the stitched image; for example, when the user moves the display area into the overlap zone of cameras 211 and 212A (the overlap of A and B in fig. 3) by moving the eyeballs or head, camera 212A is set to high-definition mode like camera 211, and the multi-channel video processing module stitches the images shot by cameras 211 and 212A into a new stitched image. Finally, the image corresponding to the display area is transmitted to the head-mounted display screen 101 of the imaging control device 100. By this method, the viewing angle of the head-mounted display screen 101 can be changed rapidly, realizing an immersive experience. A sketch of the per-frame decision follows.
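The sketch below is one plausible reading of the per-frame control loop; the interval representation, camera identifiers and azimuth numbers are assumptions layered on the fig. 3 example.

```python
# Minimal sketch: cameras whose field of view intersects the headset
# display area run in high-definition mode, the rest in low-definition
# mode; stitching is triggered when more than one camera is visible.

def plan_frame(display_az_range, camera_az_ranges):
    """display_az_range: (lo, hi) azimuth of the headset display area.
    camera_az_ranges: {camera_id: (lo, hi)} coverage of each camera.
    Returns per-camera modes and whether stitching is needed."""
    lo, hi = display_az_range
    visible = [cid for cid, (clo, chi) in camera_az_ranges.items()
               if clo < hi and lo < chi]       # intervals intersect
    modes = {cid: ("high" if cid in visible else "low")
             for cid in camera_az_ranges}
    return modes, len(visible) > 1             # True -> stitch the frames

# Display area inside the overlap of cameras 211 and 212A:
cams = {"211": (-60, 60), "212A": (-150, -30), "212B": (30, 150)}
modes, need_stitch = plan_frame((-40, -10), cams)
print(modes, need_stitch)  # 211 and 212A high-definition, stitching on
```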
Further, the method comprises the steps of: tracking the user's shoulder posture; and, based on the shoulder posture, configuring the flight attitude of the drone so as to change the shooting angle of the cameras.
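The text leaves the exact shoulder-to-flight mapping open; the following is only one plausible choice, with an assumed gain and rate limit.

```python
# Minimal sketch, with assumed gain and limit, of turning a change in
# shoulder azimuth into a yaw-rate command so the drone's own motion
# re-points the cameras (sufficient for the low-speed region).

def shoulder_to_yaw_rate(d_shoulder_az, gain=0.8, max_rate=90.0):
    """d_shoulder_az: change in shoulder azimuth (degrees) since the
    last sample; returns a clamped yaw-rate command in deg/s."""
    return max(-max_rate, min(max_rate, gain * d_shoulder_az))
```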
According to a third embodiment of the present disclosure, there is provided a drone-based imaging method for the imaging system of the first embodiment, comprising the following steps. The postures of the user's eyeballs and head are tracked using the eyeball tracking unit 102 and the head posture sensing unit. Based on these postures, the position of the display area of the head-mounted display screen 101 (i.e. which part of the combined field of view is shown on the display) is located within the combined field of view of the drone cameras (i.e. the sum of the images captured by the 3 drone cameras). Each drone camera whose field of view lies within the display area of the head-mounted display screen 101 is set to a high-definition shooting mode, and each drone camera whose field of view lies outside the display area is set to a low-definition shooting mode; for example, when the user's view faces straight ahead (the initial position of camera 211), the display area of the head-mounted display screen 101 is covered by the field of view of camera 211, so camera 211 is in high-definition mode (e.g. higher bit rate, frame rate or resolution) while cameras 212A and 212B are in low-definition mode. When the display area of the head-mounted display screen 101 is positioned completely within an overlap zone between the fields of view of the drone cameras, the drone cameras are switched based on the direction of movement of the eyeballs and the head posture, and the position of the display area is located within the image of the newly selected camera; for example, when the user moves the display area into the overlap zone of cameras 211 and 212A (the overlap of A and B in fig. 3) by moving the eyeballs or head, camera 212A is set to high-definition mode like camera 211, and the multi-channel video processing module switches the image output to the head-mounted display screen 101 from the image output by camera 211 to the image shot by camera 212A. Finally, the image corresponding to the display area is transmitted to the head-mounted display screen 101 of the imaging control device 100. By this method, the viewing angle of the head-mounted display screen 101 can be changed rapidly without stitching, realizing an immersive experience. A sketch of the handover logic follows.
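The following minimal sketch illustrates the handover; conventions follow the earlier plan_frame sketch, and the names and numbers remain assumptions.

```python
# Minimal sketch: once the display area lies completely inside a
# neighbouring camera's coverage, the headset feed switches to the
# camera the eyes/head are moving toward, so no stitching is required.

def maybe_switch(display_az_range, active_cam, camera_az_ranges, gaze_dir):
    """gaze_dir: +1 if the eyes/head move right, -1 if they move left."""
    lo, hi = display_az_range
    active_hi = camera_az_ranges[active_cam][1]
    for cid, (clo, chi) in camera_az_ranges.items():
        if cid == active_cam:
            continue
        if clo <= lo and hi <= chi:           # display area fully covered
            moving_into = (gaze_dir > 0) == (chi > active_hi)
            if moving_into:
                return cid                    # hand the feed over
    return active_cam

# Gaze drifting left from camera 211 into 212A's coverage:
cams = {"211": (-60, 60), "212A": (-150, -30), "212B": (30, 150)}
print(maybe_switch((-55, -35), "211", cams, gaze_dir=-1))  # -> 212A
```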
Further, the method comprises the steps of: tracking the user's shoulder posture; and, based on the shoulder posture, configuring the flight attitude of the drone. By this method as well, the viewing angle of the head-mounted display screen 101 can be changed rapidly, realizing an immersive experience.
The above description covers only specific embodiments of the present invention, but the scope of the present invention is not limited thereto; any person skilled in the art can readily conceive of changes or substitutions within the technical scope disclosed by the present invention, and all such changes or substitutions shall fall within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Description of the reference numerals
100 imaging control device
101 head-mounted display screen
102 eyeball tracking unit
103 control device communication unit
104 drone communication module
200 drone
211 drone camera
212A drone camera
212B drone camera