Precise Quadrotor Autonomous Landing with SRUKF Vision Perception

2015 IEEE International Conference on Robotics and Automation (ICRA), Washington State Convention Center, Seattle, Washington, May 26-30, 2015

Shuo Yang, Jiahang Ying, Yang Lu, Zexiang Li

Shuo Yang and Jiahang Ying are with DJI Innovations and the Department of Electronic & Computer Engineering, Hong Kong University of Science & Technology ({syangag, jying}@ust.hk). Yang Lu is with DJI Innovations and Shandong University at Weihai, China (gaokai123@mail.sdu.edu.cn). Zexiang Li is with the Department of Electronic & Computer Engineering, Hong Kong University of Science & Technology (eezxli@ust.hk).

Abstract— We present an autonomous quadrotor system that performs high-precision landing on a small platform in both indoor and outdoor environments. Its takeoff and landing processes are fully autonomous. We use a vision sensor to detect the landing platform, and the vision measurement is enhanced by the IMU through an SRUKF-based sensor fusion method. All computation is done in real time and on board. We implement the system and carry out a series of experiments under various environmental conditions. The experimental results confirm the robustness and precision of our system in realistic use cases.

I. INTRODUCTION

In recent years, as technologies related to micro Unmanned Aerial Vehicles (UAVs) have matured, more and more applications involving micro UAVs have emerged. Among all kinds of UAVs, the multirotor is especially suitable for applications where a runway for takeoff and landing is not available, because of its agility and vertical take-off and landing (VTOL) ability. However, multirotors inherently have limited flight time, normally ranging between 10 and 30 minutes. For applications that require the UAV to patrol large areas of land, the UAV must therefore land and recharge its battery while traveling. To make this procedure feasible for commercial products, the landing and recharging action must be done autonomously. The key step is to let the UAV land on a recharging device so that the battery-charging mechanical and electrical structures on the UAV and on the recharging device come into contact exactly. The autonomous landing procedure must therefore be highly precise.

The only commercialized autonomous landing and recharging UAV system is Skycatch [8]; its structure is shown in Figure 1. The recharging device is located in a cubic ground station, with the landing zone being a cone-shaped hole. The UAV is equipped with a trapezoidal frame on its bottom, which guides the UAV to slide into the landing zone to pick up recharged batteries. The UAV obtains the location of the ground station from radio beacons. Several features of this autonomous landing system shed light on future autonomous landing system design. First of all, the recharging device should be on a standalone platform higher than the ground, because rotorcraft suffer from ground effect [3] during landing. Ground effect causes the UAV to gain additional nonlinear lifting force, making the autonomous landing controller hard to stabilize. A platform above the ground reduces the ground effect. Secondly, the precision requirement of landing on the recharging device can be relaxed by adding guiding mechanical structures. This design enables the Skycatch UAV to land quickly and robustly.
Nevertheless, the Skycatch system has very low landing precision, and the supporting frame makes it hard to utilize the bottom space of the UAV to mount other equipment.

Fig. 1: The Skycatch landing and recharging system (image source: Skycatch website video clip).

In the research community, the UAV autonomous landing problem has been extensively studied. According to a recent survey [2], more than twenty autonomous landing techniques involving different controllers and perception models have been proposed, and many of them have real system implementations. However, to the best of the authors' knowledge, none of the existing autonomous landing techniques achieves landing precision within 5 centimeters using a large quadrotor system (>1.5 kg). In this paper we describe the design and implementation of a quadrotor UAV system that performs autonomous landing with no more than 5 centimeters of repeated error. Both the position and the orientation of the quadrotor are controlled. The sensing and control algorithms let the UAV realize quick and accurate landing with minimal mechanical support. The UAV frame design and the power of the propulsion system leave enough bottom space and payload capacity for mounting additional equipment.

The paper is organized as follows. In Section II we present the mechanical design, the hardware structure and the general software structure of our system. Section III explains the algorithm that observes the position of the landing marker on the landing platform. Section IV discusses the control of the UAV based on the observed marker position. Section V demonstrates the system performance with data and results from a series of experiments. Section VI concludes the paper.

II. QUADROTOR DESIGN

We implemented the quadrotor system mainly to demonstrate autonomous landing, but the design also considered extensibility and scalability: the system is able to carry more payload and run more sophisticated algorithms while maintaining similar landing precision.

Fig. 2: Our quadrotor system.

A. Mechanical Structure

Our quadrotor, shown in Figure 2, has the following mechanical specifications:

    Minimum Takeoff Weight    1.8 kg
    Maximum Takeoff Weight    3.8 kg
    Flight Duration           15 minutes
    Propeller Blade Size      13 inches
    Rotor Shaft Distance      500 mm to 600 mm
    Number of Cameras         Two

Our landing pad is a 50 cm × 60 cm metal plate. The landing marker is generated using the method presented in [1]. For battery recharging applications, the landing pad will be put on a platform higher than the ground, so we emphasize the landing performance of the quadrotor when it lands on a high platform.

B. Hardware

The major flight-related components on the quadrotor are listed in the following table:

    Autopilot, IMU & GPS    DJI A2
    Computing platform      Hardkernel Odroid XU
    Rotor                   DJI E600 Propulsion System
    Propeller               DJI E600 Propulsion System
    ESC                     DJI E600 Propulsion System
    Camera 1                ZWO ASC120MC
    Camera 2                DJI Optical Flow Sensor
    Ground station          Common laptop computer
    Communication           Standard WiFi module

Camera 1 is used to detect the landing pad, while Camera 2 measures the speed of the quadrotor relative to the ground. Both cameras are downward looking; their configuration is shown in Figure 4. Figure 3 illustrates the general hardware structure.

Fig. 3: The hardware configuration of our quadrotor system.

Fig. 4: The configuration of the downward looking cameras.
C. Software

The quadrotor attitude control algorithm runs on the DJI A2 autopilot system. The sensor data processed by the A2 include IMU data, GPS data and optical flow sensor velocity data. We modified the original A2 controller program to add automatic takeoff and landing functions. We also added a control API to the A2 that sends out sensor data and receives attitude commands via a serial port. The Odroid XU microcomputer connects to the A2 through this control API. The image data from Camera 1 are also handled by the XU. The core part of the landing algorithm executes on the XU, including the SRUKF position estimation and the position PID controller. Figure 5 gives an overview of the software architecture; the details of each building block are explained in the following sections, and the software implementation is discussed in Section V.

Fig. 5: Software architecture of the quadrotor system.

In Figure 5, the quadrotor controller resides in the A2 autopilot while the high-level controller runs on the Odroid XU. The communication protocol between them is designed as follows:

XU to autopilot:
    package header
    pitch angle command (−15° to +15°)
    roll angle command (−15° to +15°)
    yaw angular rate command (−100°/s to +100°/s)
    vertical velocity (−2 m/s to +2 m/s)
    package tail with CRC

Autopilot to XU:
    package header
    IMU orientation quaternion
    IMU linear acceleration vector
    optical flow linear velocity vector
    IMU angular velocity vector
    ultrasonic height
    package tail with CRC
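To make the framing concrete, the sketch below shows one way the XU-to-autopilot command packet could be serialized. The header bytes, the little-endian float encoding, and the CRC-32 tail are our own illustrative assumptions; the paper does not publish the actual A2 wire format.

```python
import struct
import zlib

# Hypothetical framing for the XU-to-autopilot command packet described
# above. The header value, field encoding and CRC choice are assumptions.
HEADER = b'\xAA\x55'

def pack_command(pitch_deg, roll_deg, yaw_rate_dps, v_z_mps):
    """Serialize one command packet, clamping fields to the documented ranges."""
    clamp = lambda v, lo, hi: max(lo, min(hi, v))
    body = struct.pack('<4f',
                       clamp(pitch_deg, -15.0, 15.0),
                       clamp(roll_deg, -15.0, 15.0),
                       clamp(yaw_rate_dps, -100.0, 100.0),
                       clamp(v_z_mps, -2.0, 2.0))
    crc = struct.pack('<I', zlib.crc32(HEADER + body))
    return HEADER + body + crc

def unpack_command(packet):
    """Validate the CRC tail and return the four command fields."""
    header, body, crc = packet[:2], packet[2:-4], packet[-4:]
    assert header == HEADER
    assert struct.unpack('<I', crc)[0] == zlib.crc32(header + body)
    return struct.unpack('<4f', body)

# Example: command a gentle descent at 0.3 m/s with level attitude.
pkt = pack_command(0.0, 0.0, 0.0, -0.3)
print(unpack_command(pkt))
```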
III. LANDING PAD POSITION ESTIMATION

The landing problem can be divided into two parts. The first part is an estimation problem: the UAV uses its downward looking camera to identify the landing pad in its view, and then estimates the relative position between the UAV and the landing pad. The second part is a position control problem: the UAV moves according to its current status and its position relative to the landing pad. In this section we address the estimation problem.

A. Problem Formation

The quadrotor has a body frame attached to its center of mass, abbreviated as b. This body frame is the standard pitch-roll-yaw coordinate frame of UAVs. The landing pad has a coordinate frame denoted by t; the directions of the axes of frame t are shown in Figure 6. The position estimation problem aims to robustly estimate the rotation $R^{tb}$ and translation $T^{tb}$ between frame b and frame t. Since the landing platform is parallel to the ground, we can assume that when the quadrotor is in the vicinity of the landing platform, the north-east-ground (NEG) frame of the quadrotor and the landing marker frame t differ only by a yaw rotation.

Fig. 6: Problem formation of the landing pad position estimation problem.

B. Marker Detection

We use the ArUco library for marker generation and detection [1]. The detection and identification steps are:
• image segmentation,
• contour extraction and filtering,
• marker code extraction,
• marker identification and error correction.

The result of the identification step is the set of corner points of the detected marker. Since the real size of the marker is known, the detected corner points in the camera image are easily linked to the real corners. The correspondences between image points and real points then form a PnP problem, from which $R^{tb}$ and $T^{tb}$ can be solved. However, due to image noise and motion blur, the solved rotation and translation are noisy. We use two processing steps to stabilize and denoise the final $R^{tb}$ and $T^{tb}$: the first is rotation compensation, and the second is SRUKF filtering.

C. Rotation Compensation

The rotation measured by the IMU is considerably more accurate than the one recovered by the PnP solver, because the IMU fuses multiple dedicated sensors specialized for rotation measurement. We also found that within the PnP-solved rotation, the yaw angle is less affected by noise than the pitch and roll angles. Therefore, we can use the IMU rotation to compensate the PnP-solved rotation. Figure 7 explains the procedure. In this figure, frame e is the NEG frame and frame i is an intermediate frame. $R^{eb}$ is the rotation measured by the IMU, and $R^{tb}$ is the rotation measured with the PnP solver. By yaw rotations with different angles, the NEG frame and the marker frame can both be rotated to the intermediate frame. The rotation from the intermediate frame to the body frame only involves rotations in the pitch and roll directions. Therefore we have

$$R^{eb} = R^{ei} R^{ib}, \qquad R^{tb} = R^{ti} R^{ib}.$$

Fig. 7: Frame transformations for rotation compensation.

In the above equations, the known values are $R^{eb}$ and $R^{tb}$, so we have to decompose the two rotations. We utilize a decomposition method called tilt-torsion decomposition: for any rotation $R$ we have

$$R = R_{torsion} R_{tilt},$$

where $R_{torsion}$ only involves rotation around the yaw axis, while $R_{tilt}$ contains the rotation about the pitch and roll axes. The calculation of $R_{tilt}$ uses the inner product and cross product of the yaw axes of the current frame and the target frame. Let $z_c = z_t = [0\ 0\ 1]^T$ be the yaw axis of the target frame; then $R^T z_t$ is the coordinate of this yaw axis in the current frame. The cross product of $z_c$ and $R^T z_t$ is the rotation axis $r$ of $R_{tilt}$, and the angle $\phi$ between these two axes is the rotation angle. Hence $R_{tilt}$ can be constructed from $r$ and $\phi$. This calculation is summarized in Figure 8.

Fig. 8: To calculate $R_{tilt}$, find the rotation axis and angle between the yaw axis of the current frame and the yaw axis of the intermediate frame.

With this decomposition we can use the $R^{ib}$ component of $R^{eb}$ to replace the same term in $R^{tb}$, resulting in a more precise rotation measurement.
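The compensation step can be made concrete with a short sketch. The code below follows the Fig. 8 construction under our own conventions (rotation matrices map body-frame coordinates into the e or t frame, axes are unit column vectors); it is an illustration of the technique, not the authors' implementation.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix from a unit axis and an angle (Rodrigues' formula)."""
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def tilt_torsion(R):
    """Split R = R_torsion @ R_tilt (torsion = yaw only, tilt = pitch/roll only)."""
    z = np.array([0.0, 0.0, 1.0])       # yaw axis of the target frame
    v = R.T @ z                          # the same axis seen in the current frame
    r = np.cross(z, v)
    s, c = np.linalg.norm(r), float(np.dot(z, v))
    if s < 1e-9:                         # frames already share a yaw axis: no tilt
        return R.copy(), np.eye(3)
    Q = rodrigues(r / s, np.arctan2(s, c))   # rotates z onto v
    R_tilt = Q.T
    R_torsion = R @ Q                    # R @ R_tilt^{-1}
    return R_torsion, R_tilt

def compensate(R_tb_pnp, R_eb_imu):
    """Keep the less noisy torsion (yaw) of the PnP rotation, but replace its
    tilt with the accurate IMU tilt: compensated R^{tb} = R^{ti} @ R^{ib}."""
    R_ti, _ = tilt_torsion(R_tb_pnp)
    _, R_ib = tilt_torsion(R_eb_imu)
    return R_ti @ R_ib

# Example: same yaw, slightly different pitch estimates from PnP and IMU.
R_pnp = rodrigues(np.array([0.0, 1.0, 0.0]), 0.10)
R_imu = rodrigues(np.array([0.0, 1.0, 0.0]), 0.08)
print(compensate(R_pnp, R_imu))
```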
In the following text, we will (1) The state update equation of the UKF xi = f (xi−1 ) is  tb   tb  T i−1 + v ti−1 ∆t T i b   q tb  q tb i  i−1 × q(ω i−1 ∆t) xi =  (2)  v ti  =  v ti−1 + ati−1 ∆t  ω bi ω bi−1 where function q() is a quaternion defined by the rotation vector ω bi−1 ∆t. The symbol × means quaternion multiplication. ati−1 is obtained from IMU measurement. 2) Measurement Update Equation: By solving the PnP problem we get T tb and Rtb . Rtb is then processed by rotation compensation step. So T tb and q tb can be measured. Linear acceleration and angular velocity are be obtained from IMU. For the velocity measurement, one method is differentiate T tb , the other is using optical flow sensor. Our optical flow sensor is developed by DJI [6]. Both velocity measurement method have advantages and disadvantages. The first method is sensitive to noise, while the second method may generate incorrect measurement if there are moving objects in the field of view of the optical flow sensor. Our measurement function yi = h(xi ) is as follows:   T tb tb   q  (3) yi =   1 [w1 (v t + at ∆t) + w2 T tb −T tbi ] w1 +w2 i ∆t ωb In the equation, T tb and q tb are the result of PnP problem and rotation compensation. at and ω b are from IMU data. v ti + at ∆t represents the integration of IMU linear acceleration T tb −T tbi and velocity from optical flow sensor, while is the ∆t differentiation of position measurement from camera. Since we have two sources of velocity measurements, we use two adjustable weights w1 and w2 to give different trusts to the two term. The estimated T tb and q tb will be the feedback information of the landing position controller. Moreover, They can be put 2199 UKF IMU 110 current position current velocity current acceleration 100 90 80 target position Position Controller target velocity Velocity Controller target acceleration Acceleration Controller 70 60 50 −40 −20 0 −30 target angle −20 −10 20 0 10 20 30 40 x − axis y − axis (a) Measurement record of a horizontal back-and-forth movement. The movement is only in pitch direction. Command Generator Fig. 9: The cascade position controller. into the PnP solver in next step as initial extrinsic guess to speed up the convergence of Gauss-Newton Optimization [4]. 110 100 90 IV. L ANDING C ONTROL 80 20 15 15 The landing process involves position based visual servoing of 3-dimension position and yaw angle. A complete landing control action sequence is: 1) Approach the landing platform with GPS navigation information and detect the landing pad marker with camera. 2) Obtain estimated position and orientation relative to the landing pad by Camera 1 with SRUKF. 3) Rotate to target yaw angle 0. So the head of the quadrotor is aligned with the direction of x-axis of the landing pad frame t. 4) Control to target position (0, 0, 50), expressed in landing pad frame t. In other words, eliminate xy-plane displacements and reduce height to 50cm above the landing pad. 5) Control to target speed (0, 0, -30), expressed in body frame b. This action only controls velocity, because at this height the landing marker has large probability being unobservable, while the velocity information coming from optical flow sensor is still accurate in this situation. The position and velocity controllers are standard PID controllers. For any target position or target velocity, they will be processed by cascaded PID controllers to generate final target angle. 
IV. LANDING CONTROL

The landing process involves position-based visual servoing of the 3-dimensional position and the yaw angle. A complete landing control action sequence is:
1) Approach the landing platform using GPS navigation information and detect the landing pad marker with the camera.
2) Obtain the estimated position and orientation relative to the landing pad from Camera 1 with the SRUKF.
3) Rotate to target yaw angle 0, so that the head of the quadrotor is aligned with the direction of the x-axis of the landing pad frame t.
4) Control to target position (0, 0, 50), expressed in the landing pad frame t; in other words, eliminate the xy-plane displacements and reduce the height to 50 cm above the landing pad.
5) Control to target speed (0, 0, −30), expressed in the body frame b. This action only controls velocity, because at this height the landing marker has a large probability of being unobservable, while the velocity information coming from the optical flow sensor is still accurate.

The position and velocity controllers are standard PID controllers. Any target position or target velocity is processed by the cascaded PID controllers to generate a final target angle. The quadrotor then changes its attitude according to the target angle, which in turn drives the position or velocity to the targets. Figure 9 outlines the principle of the cascade PID controllers, in which position, velocity and acceleration loops feed a command generator; a minimal code sketch of the cascade is given below. Additionally, since the orientation is estimated, a yaw rate control command is generated along with the position control.

Fig. 9: The cascade position controller.
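As an illustration of the cascade in Figure 9, the sketch below chains a position loop into a velocity loop whose output is a lean-angle command within the ±15° range accepted by the autopilot API. All gains and limits are placeholders, not the tuned values used on the vehicle.

```python
class PID:
    """Minimal scalar PID with output clamping."""
    def __init__(self, kp, ki, kd, limit):
        self.kp, self.ki, self.kd, self.limit = kp, ki, kd, limit
        self.integral, self.prev_err = 0.0, None

    def update(self, err, dt):
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        return max(-self.limit, min(self.limit, out))

# Outer loop: position error -> velocity setpoint.
# Inner loop: velocity error -> lean-angle command tracked by the A2
# attitude controller. Gains and limits below are illustrative only.
pos_pid = PID(kp=1.2, ki=0.0, kd=0.0, limit=1.0)    # m  -> m/s setpoint
vel_pid = PID(kp=8.0, ki=0.5, kd=0.1, limit=15.0)   # m/s -> degrees

def position_step(pos_err_m, vel_now_mps, dt):
    """One cycle of the cascade: returns the lean-angle command in degrees."""
    vel_target = pos_pid.update(pos_err_m, dt)
    return vel_pid.update(vel_target - vel_now_mps, dt)

# Example: 0.5 m lateral offset, currently hovering, 50 Hz control loop.
print(position_step(0.5, 0.0, dt=0.02))
```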
V. IMPLEMENTATION AND EXPERIMENT RESULTS

We built the quadrotor system with the hardware components listed in Section II. On the software side, most programs of the landing process run on the Odroid XU and are organized with ROS [5].

A. Runtime Resource Consumption

Figure 5 contains the output frequency of each component. On each camera frame, the marker detection program detects the landing marker within 40 ms. The UKF takes in IMU data (100 Hz) and marker information (50 Hz) to generate output estimates at 50 Hz. The estimates are used to generate a 50 Hz control command to the autopilot. During the landing procedure, the CPU and memory usage of the software is recorded below:

    CPU1     CPU2     CPU3     CPU4     Memory Usage
    49.5%    12.7%    4.4%     7.6%     449 MB / 1.7 GB

In total the landing controller costs only about a quarter of the computing power, so more navigation algorithms can execute simultaneously with the landing controller.

B. SRUKF Performance

We let the quadrotor hover above the landing marker and collect the observed relative position from the camera system. We control the quadrotor to perform some movements and compare the measured position with the filtered position. Two sets of experiment records are depicted in Figure 10. In both records, red dots are the direct position measurements from the PnP solutions, while blue dots are the filtered positions. In Figure 10a, a clear path is formed by the blue dots, but the red dots do not form any regular movement pattern. For the yaw-rotation-only movement in Figure 10b, the blue dots contain little horizontal movement noise, but the red dots are too noisy to exhibit the stationary point.

Fig. 10: SRUKF experiment records. Red dots are measured positions; blue dots are filtered positions. (a) Measurement record of a horizontal back-and-forth movement, in the pitch direction only. (b) Stationary hovering with a large yaw rotation only.

C. Landing Experiments

We conducted experiments in both indoor and outdoor environments. Under each condition, 20 landing trials were repeated. The Vicon [7] system is used to measure the landing precision in the indoor experiments. Figures 11 and 12 record the results of some typical experiment runs; the Vicon data is at 200 Hz while the SRUKF position data is at 50 Hz. In the outdoor experiments, we measure the horizontal distance between the camera lens and the center of the marker in the X and Y directions. The landing errors are recorded below:

                 Mean        Variance    Max        Min
    Indoor X     3.036 cm    1.844       4.45 cm    1.15 cm
    Indoor Y     3.102 cm    2.659       5.35 cm    0.85 cm
    Outdoor X    2.882 cm    3.216       5.75 cm    0.95 cm
    Outdoor Y    2.950 cm    1.593       4.80 cm    1.35 cm

Fig. 11: Experiment data records, X-Z view. The red trajectory is the Vicon measurement; the blue line is the record of our system. (a) Experiment 1. (b) Experiment 2. (c) Experiment 3.

Fig. 12: Experiment data records, Y-Z view.

A video demonstrating several landing procedures under different environmental conditions can be viewed at https://www.youtube.com/watch?v=ZkYrlb8Nj8o.

VI. CONCLUSION & FUTURE WORK

We present a quadrotor system that can perform high-precision landing. The onboard computing platform observes the landing marker with a camera and uses an SRUKF to estimate the position of the landing pad. Experiments show that the precision of our system is within 5 centimeters in both the X and Y directions, rendering it a qualified system for applications requiring high-precision landing.

In future work, we will modify the landing pad design to make the marker observable in more conditions. Moreover, the quadrotor must deal with landing failure if the precision cannot meet application requirements.

ACKNOWLEDGMENT

This research is supported by DJI Innovations and the Automation Technology Center of HKUST. Our quadrotor mechanical CAD design was done by DJI engineer Yuan Lin. The authors would like to express sincere thanks to the other DJI engineers who provided valuable support.

REFERENCES

[1] S. Garrido-Jurado, R. Muñoz-Salinas, F. J. Madrid-Cuevas, and M. J. Marín-Jiménez. Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recognition, 47(6):2280-2292, 2014.
[2] A. Gautam, P. B. Sujit, and S. Saripalli. A survey of autonomous landing techniques for UAVs. In Unmanned Aircraft Systems (ICUAS), 2014 International Conference on, pages 1210-1218. IEEE, 2014.
[3] J. G. Leishman. Principles of Helicopter Aerodynamics. Cambridge University Press, 2006.
[4] V. Lepetit, F. Moreno-Noguer, and P. Fua. EPnP: An accurate O(n) solution to the PnP problem. International Journal of Computer Vision, 81(2):155-166, 2009.
[5] M. Quigley, J. Faust, T. Foote, and J. Leibs. ROS: an open-source Robot Operating System.
[6] K. Sun, Y. Yu, W. Zhou, G. Zhou, T. Wang, and Z. Li. A low-cost and robust optical flow CMOS camera for velocity estimation. In Robotics and Biomimetics (ROBIO), 2013 IEEE International Conference on, pages 1181-1186. IEEE, 2013.
[7] Vicon Motion Capture System. http://www.vicon.com/.
[8] Skycatch Technology. http://www.skycatch.com/technology.html.
[9] R. Van Der Merwe and E. A. Wan. The square-root unscented Kalman filter for state and parameter estimation. In Acoustics, Speech, and Signal Processing (ICASSP), 2001 IEEE International Conference on, volume 6, pages 3461-3464. IEEE, 2001.
[10] K. Xiong, H. Y. Zhang, and C. W. Chan. Performance evaluation of UKF-based nonlinear filtering. Automatica, 42(2):261-270, 2006.