Scanning Photogrammetry for Measuring Large Targets in Close Range
"> Figure 1
<p>Illustration of image matrix at one station (The matrix contains 6 rows and 10 columns).</p> "> Figure 2
<p>Rotation angles influenced as perspective projection.</p> "> Figure 3
<p>Calculation of rotation angles.</p> "> Figure 4
<p>Distribution of horizontal and vertical rotation angles; the focal length is 100 mm, and the camera format is <math display="inline"> <semantics> <mrow> <mn>36</mn> <mtext> mm</mtext> <mo>×</mo> <mn>24</mn> <mtext> mm</mtext> </mrow> </semantics> </math>. The rotation angles of the image at the bottom left corner and top right corner of the image matrix are <math display="inline"> <semantics> <mrow> <mrow> <mo>(</mo> <mrow> <mo>−</mo> <mn>30</mn> <mo>°</mo> <mo>,</mo> <mo>−</mo> <mn>30</mn> <mo>°</mo> </mrow> <mo>)</mo> </mrow> </mrow> </semantics> </math>, and <math display="inline"> <semantics> <mrow> <mrow> <mo>(</mo> <mrow> <mn>30</mn> <mo>°</mo> <mo>,</mo> <mn>30</mn> <mo>°</mo> </mrow> <mo>)</mo> </mrow> </mrow> </semantics> </math>, respectively. The set overlaps in horizontal and vertical directions are 80% and 60%, respectively.</p> "> Figure 5
<p>Synthetic images from image matrices with rotation angles calculated in different approaches; (<b>a</b>) is the synthetic image from the image matrix acquired using the same approach in determining the relative rotation angle between adjacent images; (<b>b</b>) is the synthetic image from the image matrix acquired with the improved method introduced in this paper; (<b>c</b>) is the image at the top left corner of the image matrix presented in <a href="#remotesensing-07-10042-f005" class="html-fig">Figure 5</a>a; (<b>d</b>) is the image at the top left corner of the image matrix presented in <a href="#remotesensing-07-10042-f005" class="html-fig">Figure 5</a>b.</p> "> Figure 6
<p>Station distribution: (<b>a</b>) station distribution of wide target measuring; (<b>b</b>) station distribution of narrow target measuring.</p> "> Figure 7
<p>Parts of the photo scanner</p> "> Figure 8
<p>Flow chart of data processing</p> "> Figure 9
<p>Model of image synthesis</p> "> Figure 10
<p>Quick view of test fields. ((<b>a</b>) flag platform of Wuhan University.; (<b>b</b>) high slope of Three Gorges Project permanent lock. The regions in the red rectangles show the measuring ranges.)</p> "> Figure 11
<p>Synthetic images: (<b>a</b>) synthetic image of one station for test field showed in <a href="#remotesensing-07-10042-f010" class="html-fig">Figure 10</a>a; (<b>b</b>) synthetic image of one station for test field showed in <a href="#remotesensing-07-10042-f010" class="html-fig">Figure 10</a>b.</p> "> Figure 12
<p>Synthetic images matching. (<b>a</b>,<b>b</b>) shows the matching results of synthetic images from the first and second station for test field in <a href="#remotesensing-07-10042-f010" class="html-fig">Figure 10</a>a; (<b>c</b>,<b>d</b>) illustrates part of the results.</p> "> Figure 13
<p>Synthetic images matching. (<b>a</b>,<b>b</b>) shows the matching results of synthetic images from the first and second station for test field in <a href="#remotesensing-07-10042-f010" class="html-fig">Figure 10</a>b; (<b>c</b>,<b>d</b>) illustrates part of the results.</p> "> Figure 14
<p>Original images matching. (<b>a</b>,<b>b</b>) shows the matching results of images from the adjacent stations as measuring the first field showed in <a href="#remotesensing-07-10042-f010" class="html-fig">Figure 10</a>a; (<b>c</b>,<b>d</b>) illustrates part of the results.</p> "> Figure 15
<p>Original images matching. (<b>a</b>,<b>b</b>) shows the matching results of adjacent images in the same row of one image matrix as measuring the first field showed in <a href="#remotesensing-07-10042-f010" class="html-fig">Figure 10</a>a; (<b>c</b>,<b>d</b>) illustrates part of the results.</p> "> Figure 16
<p>Results of stereo images matching. (<b>a</b>,<b>b</b>) shows the matching results of images from the adjacent stations as measuring the second field showed in <a href="#remotesensing-07-10042-f010" class="html-fig">Figure 10</a>b; (<b>c</b>,<b>d</b>) illustrates part of the results.</p> "> Figure 17
<p>Results of stereo images matching. (<b>a</b>,<b>b</b>) shows the matching results of adjacent images in the same row of one image matrix as measuring the second field showed in <a href="#remotesensing-07-10042-f010" class="html-fig">Figure 10</a>b; (<b>c</b>,<b>d</b>) illustrates part of the results.</p> "> Figure 18
<p>Point clouds. (<b>a</b>,<b>b</b>) shows the point clouds before and after self-calibration bundle adjustment with control points in the experiment of the first field showed in <a href="#remotesensing-07-10042-f010" class="html-fig">Figure 10</a>a.</p> "> Figure 19
<p>Point clouds. (<b>a</b>,<b>b</b>) shows the point clouds before and after bundle adjustment in the experiment of the second field showed in <a href="#remotesensing-07-10042-f010" class="html-fig">Figure 10</a>b.</p> "> Figure 20
<p>Error vectors of control points and check points. (<b>a</b>) is the error vectors of points measured at a distance of 40 m; (<b>b</b>) and (<b>c</b>) denote error vectors of points measured at a photo distance of 80 m. (b) is the result with 12 control points, and (c) is 10 control points; (<b>d</b>) and (<b>e</b>) show error vectors at a photo distance of 150 m, and they are the results with 12 control points and eight control points, respectively.</p> "> Figure 21
<p>Error vectors of control points and check points. (<b>a</b>), (<b>b</b>), and (<b>c</b>) denote error vectors of points with horizontal overlap of 80%, 60%, and 30%, respectively. The coordinate residuals of check points in the experiments with image horizontal overlap 80%, 60%, and 30%, respectively.</p> "> Figure 22
<p>Error vectors of control points and check points. (<b>a</b>) and (<b>b</b>) denote error vectors of points in the experiments of obtaining images from three and five stations Coordinates residuals of check points in the experiments with <span class="html-italic">M</span> as 3 and 5, respectively.</p> "> Figure 23
<p>Error vectors of control points and check points. (<b>a</b>) and (<b>b</b>) demonstrate error vectors of points in the experiments with designed intersection angle as 26<math display="inline"> <semantics> <mo>°</mo> </semantics> </math>and 35<math display="inline"> <semantics> <mo>°</mo> </semantics> </math>.</p> "> Figure 24
<p>Error vectors of control points and check points. (<b>a</b>) and (<b>b</b>) demonstrate error vectors of points in the experiments with 300 and 600 mm focal lens.</p> ">
Abstract
1. Introduction
2. Proposed Scheme
2.1. Scanning Photography
- (1) Obtain the range of rotation angles of the station. As mentioned above, the horizontal and vertical rotation angles each vary within a given range.
- (2) Calculate the region W × H to be covered according to the angle ranges.
- (3) Calculate temporary horizontal and vertical distances between the centers of adjacent images in the same row and column of the image matrix, and determine the numbers of rows and columns of the image matrix accordingly.
- (4) Recalculate the horizontal and vertical distances so that the region is divided equally into the determined numbers of rows and columns.
- (5) Determine the location of each image center on the focal plane of the "normal" images, and calculate the rotation angles of each image in both directions (a minimal planning sketch follows this list).
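To make the procedure concrete, a minimal planning sketch in Python follows. It assumes a simple planar division of the region with the set overlaps; the function and variable names are ours, not the paper's, and it ignores the growth of image footprints at large rotation angles that the improved method accounts for.

```python
import math

def plan_image_matrix(h_range_deg, v_range_deg, focal_mm, sensor_w_mm, sensor_h_mm,
                      overlap_h=0.8, overlap_v=0.6):
    """Plan the image matrix of one station: numbers of rows/columns and the
    rotation angles of every image, following steps (1)-(5) above. Planar
    projection onto the "normal" image plane is assumed; the original
    implementation's rounding rules may differ."""
    # Step (2): region W x H covered on the focal plane of the "normal" image.
    W = 2 * focal_mm * math.tan(math.radians(h_range_deg))
    H = 2 * focal_mm * math.tan(math.radians(v_range_deg))

    # Step (3): temporary center-to-center spacing from the set overlaps,
    # then the numbers of columns and rows needed to cover the region.
    dx_tmp = sensor_w_mm * (1 - overlap_h)
    dy_tmp = sensor_h_mm * (1 - overlap_v)
    cols = max(2, math.ceil(W / dx_tmp) + 1)
    rows = max(2, math.ceil(H / dy_tmp) + 1)

    # Step (4): recalculate the spacing to divide the region equally.
    dx = W / (cols - 1)
    dy = H / (rows - 1)

    # Step (5): image-center locations on the focal plane give the
    # horizontal/vertical rotation angles of each image.
    angles = []
    for r in range(rows):
        for c in range(cols):
            x = -W / 2 + c * dx
            y = -H / 2 + r * dy
            angles.append((math.degrees(math.atan2(x, focal_mm)),
                           math.degrees(math.atan2(y, focal_mm))))
    return rows, cols, angles

# Figure 4 setup: 100 mm lens, 36 mm x 24 mm format, +/-30 degree ranges.
rows, cols, angles = plan_image_matrix(30, 30, 100, 36, 24)
print(rows, cols, angles[0], angles[-1])   # corner angles (-30, -30) and (30, 30)
```

With the Figure 4 parameters, this sketch reproduces corner rotation angles of (−30°, −30°) and (30°, 30°), matching the caption.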
2.2. Station Distribution
- (1)
- (2) The photo distances of all stations are approximately equal when the depth variation of the target is small; otherwise, the stations should be adjusted to the depth variation of the target to keep the photo distances equal.
- (3) The baselines should all have the same length. (A simple sketch after this list relates baseline and photo distance to the resulting intersection angle.)
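As a rough planning aid, the intersection angle at a target point observed from two stations can be approximated from the baseline and the photo distance. The symmetric two-ray geometry below is our illustration, not a formula quoted from the paper.

```python
import math

def intersection_angle_deg(baseline_m, distance_m):
    """Approximate intersection angle at a target viewed from two stations
    separated by `baseline_m`, both at photo distance `distance_m`
    (symmetric geometry assumed)."""
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

print(round(intersection_angle_deg(14, 80), 1))  # one 14 m baseline at 80 m: ~10.0 deg
```

Larger intersection angles strengthen depth (Z) geometry, which is why the experiments in Section 3.5 track the mean, maximum, and minimum intersection angles of tie points.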
2.3. Photo Scanner
- (1) The instrument is set up at the proper position.
- (2) The photographic plane is set parallel to the average plane of the region to be photographed, and the rotation platform is leveled with the bubble on the instrument base. This camera position is defined as the "zero posture" of the station and is taken as the origin of the rotation angles in both the horizontal and vertical directions.
- (3) The FOV of the station is specified, and the required parameters are entered into the controller.
- (4) The numbers of rows and columns of the image matrix, as well as the horizontal and vertical rotation angles of each image, are computed, and control signals are sent to drive the rotation and exposure of the camera for automatic image acquisition (a minimal control loop is sketched below).
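A minimal control loop for one station might look like the following sketch; `rotate_to` and `trigger_exposure` are hypothetical controller methods used for illustration, not the actual API of the photo scanner.

```python
def acquire_station(controller, angles):
    """Drive the rotation platform through the planned image matrix.
    `angles` is the (horizontal_deg, vertical_deg) list from the planning
    sketch in Section 2.1; (0, 0) is the station's zero posture.
    `controller` is a hypothetical interface to the rotation platform."""
    for h_deg, v_deg in angles:
        controller.rotate_to(h_deg, v_deg)   # rotate relative to the zero posture
        controller.trigger_exposure()        # expose one image of the matrix
    controller.rotate_to(0.0, 0.0)           # return to the zero posture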
2.4. Data Processing
2.4.1. Synthetic Image
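A standard way to model the synthesis (Figure 9) is a rotation-only homography that maps each rotated image onto the focal plane of the "normal" image. The sketch below is that textbook model under assumed axis order and sign conventions; the authors' exact formulation may differ.

```python
import numpy as np

def rotation_homography(K, h_deg, v_deg):
    """Homography x_syn ~ H @ x_rot mapping pixels of an image rotated by
    (h_deg, v_deg) onto the "normal" image plane, assuming pure rotation
    about the projection center. K is the 3x3 intrinsic matrix; the axis
    order and sign conventions here are assumptions."""
    h, v = np.radians(h_deg), np.radians(v_deg)
    # Pan about the vertical (y) axis, then tilt about the horizontal (x) axis.
    Ry = np.array([[np.cos(h), 0.0, np.sin(h)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(h), 0.0, np.cos(h)]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(v), -np.sin(v)],
                   [0.0, np.sin(v), np.cos(v)]])
    return K @ Ry @ Rx @ np.linalg.inv(K)

K = np.array([[3000.0, 0.0, 1500.0],
              [0.0, 3000.0, 1000.0],
              [0.0, 0.0, 1.0]])
H = rotation_homography(K, 10.0, 5.0)
x_rot = np.array([1500.0, 1000.0, 1.0])   # principal point of the rotated image
x_syn = H @ x_rot
print(x_syn / x_syn[2])                   # its location in the synthetic image
```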
2.4.2. Image Matching and Error Detection
2.4.3. Modified Aerial Triangulation
3. Experimental Results
3.1. Experimental Data
3.2. Results of Synthetic Images
3.3. Results of Image Matching (Synthetic Images and Original Images)
3.4. Results of Point Clouds
3.5. Coordinate Residuals of Triangulation
3.5.1. Comparison of Different Photo Distances
| Cases | I | II | II | III | III |
|---|---|---|---|---|---|
| Photo distance (m) | 40 | 80 | 80 | 150 | 150 |
| Focal length (mm) | 300 | 300 | 300 | 300 | 300 |
| Ground resolution (mm) | 0.9 | 1.7 | 1.7 | 3.2 | 3.2 |
| Total number of stations N | 5 | 5 | 5 | 5 | 5 |
| Given least intersection angle (°) | 26 | 35 | 35 | 35 | 35 |
| Baseline (m) | 8.8 | 14 | 14 | 26.3 | 26.3 |
| Image amount | 858 | 171 | 171 | 41 | 41 |
| RMSE of image point residuals (pixel) | 1/2 | 1/2 | 1/2 | 1/2 | 1/2 |
| Mean intersection angle of tie points (°) | 23.7 | 18.5 | 18.5 | 18.1 | 18.1 |
| Maximum intersection angle of tie points (°) | 43.2 | 37.6 | 36.4 | 35.8 | 35.6 |
| Minimum intersection angle of tie points (°) | 4.2 | 4.4 | 4.8 | 6.3 | 6.5 |
| Number of control points | 12 | 12 | 10 | 12 | 8 |
| Control point accuracy X (mm) | 1.2 | 1.8 | 1.6 | 1.5 | 1.5 |
| Control point accuracy Y (mm) | 1.0 | 1.3 | 0.7 | 1.2 | 0.4 |
| Control point accuracy Z (mm) | 1.1 | 2.2 | 2.7 | 2.5 | 2.9 |
| Control point accuracy XY (mm) | 1.5 | 2.2 | 1.7 | 1.9 | 1.5 |
| Control point accuracy XYZ (mm) | 1.9 | 3.1 | 3.2 | 3.1 | 3.3 |
| Number of check points | 14 | 14 | 16 | 14 | 18 |
| Check point accuracy X (mm) | 2.0 | 2.1 | 2.4 | 2.5 | 2.5 |
| Check point accuracy Y (mm) | 0.8 | 1.8 | 1.9 | 1.6 | 1.7 |
| Check point accuracy Z (mm) | 1.0 | 2.5 | 2.5 | 2.8 | 2.7 |
| Check point accuracy XY (mm) | 2.1 | 2.8 | 3.0 | 2.9 | 3.0 |
| Check point accuracy XYZ (mm) | 2.3 | 3.7 | 3.9 | 4.0 | 4.1 |
3.5.2. Comparison of Different Overlaps between Images in Image Matrices
| Cases | I | II | III |
|---|---|---|---|
| Photo distance (m) | 40 | 40 | 40 |
| Overlap in horizontal direction | 80% | 60% | 30% |
| Overlap in vertical direction | 60% | 60% | 60% |
| Focal length (mm) | 300 | 300 | 300 |
| Ground resolution (mm) | 0.9 | 0.9 | 0.9 |
| Total number of stations N | 5 | 5 | 5 |
| Given least intersection angle (°) | 26 | 26 | 26 |
| Baseline (m) | 8.8 | 8.8 | 8.8 |
| Image amount | 858 | 444 | 232 |
| RMSE of image point residuals (pixel) | 1/2 | 1/2 | 1/2 |
| Mean intersection angle of tie points (°) | 23.7 | 23.4 | 22.5 |
| Maximum intersection angle of tie points (°) | 43.2 | 43.2 | 43.0 |
| Minimum intersection angle of tie points (°) | 4.2 | 5.6 | 2.4 |
| Number of control points | 12 | 12 | 12 |
| Control point accuracy X (mm) | 1.2 | 1.5 | 1.6 |
| Control point accuracy Y (mm) | 1.0 | 1.1 | 1.2 |
| Control point accuracy Z (mm) | 1.1 | 1.5 | 1.6 |
| Control point accuracy XY (mm) | 1.5 | 1.9 | 2.0 |
| Control point accuracy XYZ (mm) | 1.9 | 2.4 | 2.6 |
| Number of check points | 14 | 14 | 14 |
| Check point accuracy X (mm) | 2.0 | 2.0 | 1.9 |
| Check point accuracy Y (mm) | 0.8 | 0.7 | 0.7 |
| Check point accuracy Z (mm) | 1.0 | 1.4 | 1.8 |
| Check point accuracy XY (mm) | 2.1 | 2.1 | 2.1 |
| Check point accuracy XYZ (mm) | 2.3 | 2.5 | 2.7 |
3.5.3. Comparison of the Number of Stations M
| Cases | I | II |
|---|---|---|
| Photo distance (m) | 80 | 80 |
| Overlap in horizontal direction | 80% | 80% |
| Overlap in vertical direction | 60% | 60% |
| Focal length (mm) | 300 | 300 |
| Ground resolution (mm) | 1.7 | 1.7 |
| Total number of stations N | 3 | 5 |
| Given least intersection angle (°) | 35 | 35 |
| Baseline (m) | 28 | 14 |
| Image amount | 90 | 171 |
| RMSE of image point residuals (pixel) | 1/2 | 1/2 |
| Mean intersection angle of tie points (°) | 25.2 | 18.5 |
| Maximum intersection angle of tie points (°) | 36.3 | 36.4 |
| Minimum intersection angle of tie points (°) | 3.4 | 4.8 |
| Number of control points | 10 | 10 |
| Control point accuracy X (mm) | 1.6 | 1.6 |
| Control point accuracy Y (mm) | 0.8 | 0.7 |
| Control point accuracy Z (mm) | 2.6 | 2.7 |
| Control point accuracy XY (mm) | 1.8 | 1.7 |
| Control point accuracy XYZ (mm) | 3.2 | 3.2 |
| Number of check points | 16 | 16 |
| Check point accuracy X (mm) | 2.4 | 2.4 |
| Check point accuracy Y (mm) | 1.7 | 1.9 |
| Check point accuracy Z (mm) | 2.5 | 2.5 |
| Check point accuracy XY (mm) | 3.0 | 3.0 |
| Check point accuracy XYZ (mm) | 3.9 | 3.9 |
3.5.4. Comparison of Set Intersection Angle
| Cases | I | II |
|---|---|---|
| Photo distance (m) | 40 | 40 |
| Overlap in horizontal direction | 80% | 80% |
| Overlap in vertical direction | 60% | 60% |
| Focal length (mm) | 300 | 300 |
| Ground resolution (mm) | 0.9 | 0.9 |
| Total number of stations N | 5 | 3 |
| Given least intersection angle (°) | 26 | 35 |
| Baseline (m) | 8.8 | 17.6 |
| Image amount | 858 | 738 |
| RMSE of image point residuals (pixel) | 1/2 | 1/2 |
| Mean intersection angle of tie points (°) | 23.7 | 29.9 |
| Maximum intersection angle of tie points (°) | 43.2 | 44.8 |
| Minimum intersection angle of tie points (°) | 4.2 | 6.1 |
| Number of control points | 12 | 12 |
| Control point accuracy X (mm) | 1.1 | 1.3 |
| Control point accuracy Y (mm) | 1.0 | 1.1 |
| Control point accuracy Z (mm) | 1.2 | 1.4 |
| Control point accuracy XY (mm) | 1.4 | 1.7 |
| Control point accuracy XYZ (mm) | 1.9 | 2.2 |
| Number of check points | 14 | 16 |
| Check point accuracy X (mm) | 2.0 | 1.8 |
| Check point accuracy Y (mm) | 0.8 | 0.8 |
| Check point accuracy Z (mm) | 1.0 | 1.9 |
| Check point accuracy XY (mm) | 2.1 | 2.0 |
| Check point accuracy XYZ (mm) | 2.3 | 2.7 |
3.5.5. Comparison of Different Focal Lengths
| Cases | I | II |
|---|---|---|
| Photo distance (m) | 250 | 250 |
| Overlap in horizontal direction | 80% | 60% |
| Overlap in vertical direction | 60% | 40% |
| Focal length (mm) | 300 | 600 |
| Ground resolution (mm) | 5.3 | 2.7 |
| Total number of stations N | 3 | 3 |
| Given least intersection angle (°) | 30 | 30 |
| Baseline (m) | 55 | 55 |
| Image amount | 601 | 982 |
| RMSE of image point residuals (pixel) | 2/5 | 1/2 |
| Mean intersection angle of tie points (°) | 22.2 | 23.4 |
| Maximum intersection angle of tie points (°) | 33.1 | 29.4 |
| Minimum intersection angle of tie points (°) | 6.1 | 6.0 |
| Number of control points | 14 | 14 |
| Control point accuracy X (mm) | 3.3 | 1.7 |
| Control point accuracy Y (mm) | 3.3 | 1.6 |
| Control point accuracy Z (mm) | 3.9 | 4.4 |
| Control point accuracy XY (mm) | 4.6 | 2.3 |
| Control point accuracy XYZ (mm) | 6.0 | 5.0 |
| Number of check points | 14 | 14 |
| Check point accuracy X (mm) | 3.2 | 2.7 |
| Check point accuracy Y (mm) | 4.0 | 3.2 |
| Check point accuracy Z (mm) | 4.4 | 3.9 |
| Check point accuracy XY (mm) | 5.1 | 4.2 |
| Check point accuracy XYZ (mm) | 6.8 | 5.7 |
4. Discussion
- (1) Difference in scanning photography
- (2) Difference in rotation platform
- (3) Difference in data processing
- (1) The modified triangulation keeps the initial parameters stable, which benefits the convergence of the bundle adjustment.
- (2) In consideration of the unstable intrinsic parameters of non-metric cameras with long focal lengths, especially telephoto lenses, on-line self-calibration bundle adjustment with control points is employed in our scanning photogrammetry. We assume the intrinsic parameters remain constant during the short time of data acquisition. Further, we fixed the lens focus with tape to avoid small changes caused by vibration when the photo scanner moves. During bundle adjustment, observations are weighted according to their intersection angles to ensure convergence (a minimal weighting sketch follows this list).
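As a concrete illustration of intersection-angle-based weighting, the sketch below down-weights tie points with weak geometry. The linear ramp and the 20° threshold are our illustrative assumptions; the exact weighting function used in the adjustment is not reproduced here.

```python
import numpy as np

def tie_point_weight(point, camera_centers, full_weight_deg=20.0):
    """Weight for a tie point's observations based on its maximum
    intersection angle: points with weak (small-angle) geometry are
    down-weighted. The linear ramp and the 20 degree threshold are
    illustrative assumptions, not the paper's exact function."""
    rays = [(c - point) / np.linalg.norm(c - point) for c in camera_centers]
    max_angle = max(
        np.degrees(np.arccos(np.clip(np.dot(r1, r2), -1.0, 1.0)))
        for i, r1 in enumerate(rays) for r2 in rays[i + 1:]
    )
    return min(max_angle / full_weight_deg, 1.0)

# A point 80 m away observed from two stations on a 14 m baseline:
p = np.array([0.0, 0.0, 80.0])
stations = [np.array([-7.0, 0.0, 0.0]), np.array([7.0, 0.0, 0.0])]
print(round(tie_point_weight(p, stations), 2))  # ~0.5: weak geometry, reduced weight
```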
5. Conclusions
- (1) To apply our method to more kinds of targets, such as tunnels, we will consider projecting the image matrix onto a cylinder or a sphere in future research.
- (2) To enhance measurement accuracy, we will continue working on image distortion when using telephoto lenses.
- (3) Next, we will work toward calibration of the rotation platform to provide more precise initial parameters for the bundle adjustment.
Acknowledgments
Author Contributions
Conflicts of Interest
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).