3D Distance Filter for the Autonomous Navigation of UAVs in Agricultural Scenarios
Figure 1. Quadrotor dynamics reference frames: inertial frame (red) and body frame (green).
Figure 2. Example of a roto-translated approximating plane (yellow), intersecting the 3D LCG map and generating a 2D slice (green polyhedron).
Figure 3. Ultrasound measurement $d_{k-1}^{\perp}$ with respect to the 2D slice (green area), obtained by intersecting the 3D LCG map with the 2D (yellow) plane passing through the UAV CoM.
Figure 4. Work logic of the proposed sensor fusion strategy, in which the 3D distance filter is enclosed in the green area. In this scheme, the sensors' data are represented by triangular blocks, while the LCG maps are represented by an ellipsoidal block.
Figure 5. Identification of the lower ($\underline{d}_k^{O}$) and upper ($\overline{d}_k^{O}$) offsets, defining the parallel cuts (red and blue lines) with respect to the approximating plane (green line) with orientation $\beta_i$, thus identifying the section of the uncertainty ellipsoid (yellow area) whose distance complies with the measured one.
Figure 6. Propagation of the uncertainty ellipsoid $\mathcal{E}_k$ at time $k+1$, i.e., $\mathcal{E}_k'$, using the parallel-cuts method. $\mathcal{E}_k'$ is the minimum-volume ellipsoid containing the feasible point set (yellow area).
Figure 7. Evolution of the quadrotor center-of-mass position (black circles) within the LCG maps, together with the approximating planes (yellow rectangles) and the generated 2D slices (dark green areas).
Figure 8. Time evolution of the lateral (left column) and longitudinal (right column) estimation errors for different ultrasound sensor configurations. The subscript in the vertical-axis label $e_i$, with $i = 1, 2, 4$, identifies the number of ultrasound sensors equipped onboard. The estimation error obtained using only GPS and IMU (black lines) is compared with the one obtained by also including data from the ultrasound sensors and the LCG maps (red lines).
Figure 9. In (a), the UAV real position evolution (yellow circles) within the vine rows and the corresponding uncertainty ellipsoids when the LCG maps and ultrasounds are (red lines) or are not (black lines) included in the navigation filter. In (b), it can be observed that the true position always lies inside the uncertainty ellipsoids and is almost aligned with the red ellipses' centers.
Figure 10. Frequency distribution of the lateral (left histogram) and longitudinal (right histogram) estimation errors, comparing the results when the maps are combined with the onboard sensors (red bars) and when they are not exploited (blue bars).
Figure 11. Elapsed time for each phase of the proposed filter, compared to the minimum sampling time (MST). In particular: P1 = prediction phase, P2 = 3D/2D map conversion phase, P3 = row selection phase, P4 = distance filter phase, P5 = update phase, OT = overall time.
Abstract
1. Introduction
2. Selected Framework
2.1. Rotary-Wing UAV Modelling
- the thrust command is the sum of the thrust contributions generated by each rotor;
- the rolling torque is produced by a difference in the angular velocities of rotors 2 and 4;
- the pitching torque is produced instead by a difference in the angular velocities of rotors 1 and 3;
- the yawing torque derives from the drag exerted by the propellers on the quadrotor itself, with a torque direction opposite to that of the rotors' motion (see the sketch after this list).
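A hedged reconstruction of these relations, under the common quadrotor convention: the symbols $b$ (thrust coefficient), $d$ (drag coefficient), $l$ (arm length), and $\Omega_i$ (angular rate of rotor $i$) are assumed here rather than taken from the original model, and the signs depend on the rotor spin directions.

```latex
\begin{align}
  % total thrust: sum of the four rotor contributions
  u_1 &= b\left(\Omega_1^2 + \Omega_2^2 + \Omega_3^2 + \Omega_4^2\right),\\
  % rolling torque: imbalance between rotors 2 and 4
  u_2 &= l\,b\left(\Omega_4^2 - \Omega_2^2\right),\\
  % pitching torque: imbalance between rotors 1 and 3
  u_3 &= l\,b\left(\Omega_3^2 - \Omega_1^2\right),\\
  % yawing torque: reaction to propeller drag, opposite to rotor motion
  u_4 &= d\left(\Omega_2^2 + \Omega_4^2 - \Omega_1^2 - \Omega_3^2\right).
\end{align}
```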
2.2. Vine Row Modelling from LCG Maps
- Once the two vertices are defined in the body frame, the straight line passing through them is fully determined.
- From the vertex coordinates, we retrieve the coefficients of this line.
- Once the straight line passing through the vertices is identified, we compute its characteristic slope, which will later be used in the ellipsoid method (a reconstruction sketch follows this list).
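As an illustrative reconstruction with hypothetical symbols (the original vertex and coefficient names are not preserved here), let $P_1 = (x_1, y_1)$ and $P_2 = (x_2, y_2)$ denote the two vertices expressed in the body frame, and write the row line in explicit form $y = m\,x + q$; then:

```latex
\begin{align}
  % characteristic slope of the straight line through P_1 and P_2,
  % assuming the row segment is not vertical in the body frame
  m &= \frac{y_2 - y_1}{x_2 - x_1},\\
  % intercept, obtained by forcing the line through P_1
  q &= y_1 - m\,x_1 .
\end{align}
```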
2.3. Ultrasound Sensor Modelling
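A minimal sketch of one plausible ultrasound measurement model, assuming the sensor returns the perpendicular distance from the UAV center of mass to the nearest row line (cf. Figure 3), corrupted by zero-mean Gaussian noise; the function and symbol names below are hypothetical, not taken from the paper.

```python
import numpy as np

def ultrasound_range(p_com, m, q, noise_std=0.003, rng=None):
    """Perpendicular distance from the UAV center of mass p_com = (x0, y0)
    to the row line y = m*x + q, plus zero-mean Gaussian noise
    (3 mm accuracy, as in the sensor parameter table)."""
    rng = rng if rng is not None else np.random.default_rng()
    x0, y0 = p_com
    d_perp = abs(m * x0 - y0 + q) / np.hypot(m, 1.0)  # point-to-line distance
    return d_perp + rng.normal(0.0, noise_std)
```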
3. 3D Navigation Filtering Scheme
3.1. 3D Distance Filter
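For orientation, here is a sketch of the simpler single deep-cut ellipsoid update under the standard parametrization $\mathcal{E} = \{x : (x-c)^\top P^{-1}(x-c) \le 1\}$; the filter described in this section uses the parallel-cuts variant instead, which shrinks the ellipsoid between the lower and upper offsets of Figure 5 rather than with a single half-space.

```python
import numpy as np

def ellipsoid_deep_cut(c, P, g, beta):
    """Minimum-volume ellipsoid containing the portion of
    E = {x : (x - c)^T P^{-1} (x - c) <= 1} with g^T x <= beta.
    Dimension n >= 2 is assumed."""
    n = c.size
    Pg = P @ g
    gPg = float(g @ Pg)               # g^T P g > 0 for P positive definite
    alpha = float(g @ c - beta) / np.sqrt(gPg)
    if alpha >= 1.0:                  # the cut excludes the whole ellipsoid
        raise ValueError("empty intersection")
    if alpha <= -1.0 / n:             # the cut removes too little: keep E
        return c, P
    tau = (1 + n * alpha) / (n + 1)
    delta = (n**2 / (n**2 - 1.0)) * (1 - alpha**2)
    sigma = 2 * (1 + n * alpha) / ((n + 1) * (1 + alpha))
    c_new = c - tau * Pg / np.sqrt(gPg)
    P_new = delta * (P - sigma * np.outer(Pg, Pg) / gPg)
    return c_new, P_new
```

Applying two such cuts with opposite normals would bound the same slab as a parallel cut, but the dedicated parallel-cuts formulas yield a smaller (minimum-volume) result.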
3.2. Kalman Filter for Sensor Fusion
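As a reference for the fusion step, a textbook linear Kalman filter alternates the prediction and update below; this is a generic sketch, not the exact filter of the paper, which additionally injects the map-based distance corrections produced by the 3D distance filter.

```python
import numpy as np

def kf_predict(x, P, A, Q):
    """Propagate the state estimate and its covariance one step ahead."""
    return A @ x, A @ P @ A.T + Q

def kf_update(x_pred, P_pred, z, H, R):
    """Correct the prediction with a measurement z (e.g., GPS position,
    IMU-derived states, or a map-referenced ultrasound distance)."""
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(len(x)) - K @ H) @ P_pred
    return x, P
```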
4. Results
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Parameter | Value | Parameter | Value
---|---|---|---
Diagonal wheelbase | 650 (mm) | x-axis inertia | 0.0617 (kg m²)
Weight | 2431 (g) | y-axis inertia | 0.0619 (kg m²)
Max. takeoff weight | 3600 (g) | z-axis inertia | 0.1231 (kg m²)
Nominal rotor rate | 580 (rad/s) | Rotor inertia | 0.001 (kg m²)
Sensor | Parameter | Value
---|---|---
Ultrasound | Distance range | 0.02–4.00 (m)
 | Accuracy | 3 (mm)
GPS | Position accuracy (1σ) | 1 (m)
IMU | Horizontal position accuracy (1σ) | 1 (m)
 | Vertical position accuracy (1σ) | 1.5 (m)
 | Velocity accuracy | 0.05 (m/s)
 | φ/ψ range | 0–180 (deg)
 | θ range | 0–90 (deg)
 | φ/θ accuracy (1σ) | 0.03 (deg)
 | ψ accuracy (1σ) | 0.2 (deg)
Configuration | Lateral Value | Longitudinal Value
---|---|---
with LCG map | 0.0632 (m) | 0.1529 (m)
w/o LCG map | 0.2623 (m) | 0.2616 (m)