Satellite Pose Estimation via Only a Single Spatial Circle
Figure 1. The concept of measuring and grasping the docking ring.
Figure 2. Coordinate frame definition.
Figure 3. Two angles have a common edge.
Figure 4. Two angles do not have a common edge.
Figure 5. Three points are symmetrical.
Figure 6. Four points are symmetrical (case 1).
Figure 7. Four points are symmetrical (case 2).
Figure 8. Five points are symmetrical.
Figure 9. Six points are symmetrical.
Figure 10. Sparse point distribution.
Figure 11. The network structure diagram.
Figure 12. Position response heatmap.
Figure 13. Voting process.
Figure 14. Pose measurement platform.
Figure 15. Part of the images from the dataset.
Figure 16. Position error curves. (a) Error curve along the X-axis. (b) Error curve along the Y-axis. (c) Error curve along the Z-axis.
Figure 17. Rotation angle error curves. (a) Pitch angle error curve. (b) Yaw angle error curve. (c) Roll angle error curve.
Abstract
1. Introduction
2. Definition of Coordinate Frame and Ambiguity Elimination
2.1. Coordinate Frame Definition
2.2. Sparse Point Selection
- (1) Symmetry can be obtained if the central angles are equal.
- (2) At least one pair of equal central angles can be obtained if there is symmetry (a quick numeric check of this condition is sketched below).
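To make condition (2) concrete, the sketch below (not taken from the paper; the function name, inputs, and tolerance are illustrative assumptions) applies its contrapositive as a filter when choosing sparse points on the docking-ring circle: if no pair of central angles between neighbouring points is equal, the point set cannot be symmetrical and is therefore a usable asymmetric candidate.

```python
import numpy as np

def has_equal_central_angles(angles_deg, tol=1e-6):
    """Return True if any two central angles (gaps between neighbouring
    points on the circle) are equal within tol."""
    a = np.sort(np.mod(np.asarray(angles_deg, dtype=float), 360.0))
    gaps = np.diff(np.append(a, a[0] + 360.0))  # wrap around the circle
    for i in range(len(gaps)):
        for j in range(i + 1, len(gaps)):
            if abs(gaps[i] - gaps[j]) < tol:
                return True
    return False

# Contrapositive of (2): no equal central angles -> the set cannot be symmetrical.
print(has_equal_central_angles([0, 40, 130, 250]))   # False: usable asymmetric candidate
print(has_equal_central_angles([0, 90, 180, 270]))   # True: may be symmetrical, reject
```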
3. Pose Estimation Network, Hvnet, Based on Hough Voting
3.1. Backbone Feature Extraction Network
3.2. Heatmap Regression Network
3.3. Voting Strategy
4. Experiment
4.1. Measurement Parameters
4.2. Analysis of the Results
4.2.1. Analysis of the Experimental Results of the Position Error
4.2.2. Analysis of Experimental Results of the Rotation Angle Error
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Method (axis) | Max (mm) | Mean (mm) | Std (mm) | Max (%) | Mean (%) | Std (%) |
|---|---|---|---|---|---|---|
| pvnet (x) | 16.7 | 8.9 | 4.9 | - | - | - |
| pvnet (y) | 35.7 | 16.3 | 10.1 | - | - | - |
| pvnet (z) | 15.1 | 8.1 | 3.8 | 46 | 5.3 | 7.2 |
| hvnet (x) | 16.5 | 8.7 | 4.8 | - | - | - |
| hvnet (y) | 34.6 | 16.2 | 9.9 | - | - | - |
| hvnet (z) | 10.3 | 6.9 | 2.5 | 18 | 4.2 | 2.7 |
| Method (angle) | Max | Mean | Std |
|---|---|---|---|
| pvnet (pitch) | 18.6 | 6.6 | 4.8 |
| pvnet (yaw) | 5.8 | 2.4 | 1.5 |
| pvnet (roll) | 17.9 | 6.6 | 4.7 |
| hvnet (pitch) | 3.4 | 1.5 | 0.8 |
| hvnet (yaw) | 5.2 | 2.2 | 1.6 |
| hvnet (roll) | 3.4 | 1.3 | 0.9 |
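For reference, the figures in the two tables above are per-column maxima, means, and standard deviations of the absolute pose errors over the test sequence; the percentage columns are presumably errors relative to the true range. A minimal sketch of how such statistics are computed, using hypothetical data (array names and values are placeholders, not the experimental results):

```python
import numpy as np

# Hypothetical per-frame estimates and ground truth (translations in mm,
# rotations as pitch/yaw/roll angles); values are random placeholders.
rng = np.random.default_rng(0)
t_gt  = rng.uniform(500.0, 1500.0, size=(100, 3))
t_est = t_gt + rng.normal(0.0, 5.0, size=(100, 3))
r_gt  = rng.uniform(-30.0, 30.0, size=(100, 3))
r_est = r_gt + rng.normal(0.0, 1.0, size=(100, 3))

def summarize(err):
    """Per-column max / mean / std of the absolute errors, as reported above."""
    e = np.abs(err)
    return e.max(axis=0), e.mean(axis=0), e.std(axis=0)

t_stats = summarize(t_est - t_gt)   # columns: X, Y, Z
r_stats = summarize(r_est - r_gt)   # columns: pitch, yaw, roll
rel_z = np.abs(t_est[:, 2] - t_gt[:, 2]) / np.abs(t_gt[:, 2]) * 100.0  # relative Z error (%)
print(t_stats, r_stats, rel_z.max(), rel_z.mean(), rel_z.std())
```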
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).