Enhancing Image-Based Multiscale Heritage Recording with Near-Infrared Data
Figure 1. Case studies: (a) part of the archaeological site of ancient Kymissala in Rhodes; (b) interior courtyard façades at the Venaria Center of Conservation and Restoration; (c) coromandel folding screen from Castello Cavour in Santena; (d) wooden furniture painted with flowers from Palazzo Chiablese in Turin.
Figure 2. Pre-signaling and distribution of fixed-location points for geo-referencing and metric checks at the archaeological site of ancient Kymissala.
Figure 3. Near-infrared (NIR) image acquisition: (a) dataset 1; (b) dataset 2; (c) dataset 4.
Figure 4. Number of overlapping images over the area of interest for scenarios 3 (left) and 4 (right).
Figure 5. Textured models for dataset 1: (a) scenario 1; (b) scenario 2; (c) scenario 3; (d) scenario 4.
Figure 6. NIR models for dataset 2: Metashape Professional (left) and Zephyr Aerial (right).
Figure 7. Dataset 3 meshes produced with Metashape Professional: (a) untextured VIS mesh; (b) textured VIS mesh; (c) untextured NIR mesh; (d) textured NIR mesh.
Figure 8. Deviation between the VIS and NIR Metashape Professional models for dataset 3 (Hausdorff distances shown on a scale up to 8 mm, with 0.5 mm scalar-field intervals).
Figure 9. Dataset 3 meshes produced with Metashape Professional: (a) untextured VIS mesh; (b) untextured NIR mesh; (c) vertex-textured NIR mesh; (d) distances between the NIR and VIS meshes.
Figure 10. Details from the high-resolution orthomosaics produced for dataset 1 with Metashape Professional: scenario 1, VIS (left); scenario 2, NIR (right).
Figure 11. Digital terrain models, produced after vegetation filtering, for dataset 1 with Metashape Professional: scenario 1, VIS (left); scenario 2, NIR (right).
Figure 12. Comparison of details from the VIS (left) and NIR (right) textured models of dataset 2, used to evaluate the identification of bio-deterioration on the façade's surface.
Figure 13. Detail from the dataset 3 NIR textured model, where retouching and defects on the painted surface can be observed.
Figure 14. Details from the dataset 4 VIS (left) and NIR (right) textures, showing how restored and repainted areas can be observed with the beyond-VIS acquisition.
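Figures 8 and 9 report the geometric agreement between the VIS- and NIR-derived models as point-to-surface (Hausdorff-type) distances. The following is a minimal sketch of such a comparison, assuming both models have been densely sampled and exported as ASCII XYZ point files; the file names and the nearest-neighbour approximation are illustrative, not the exact mesh-comparison routine used for the figures.

```python
# Minimal sketch: one-sided nearest-neighbour distances between two sampled models.
# Assumes the VIS and NIR meshes were densely sampled and exported as ASCII XYZ files
# (file names are placeholders, not the authors' data).
import numpy as np
from scipy.spatial import cKDTree

vis = np.loadtxt("dataset3_vis_sampled.xyz")[:, :3]   # reference (VIS) points, metres
nir = np.loadtxt("dataset3_nir_sampled.xyz")[:, :3]   # compared (NIR) points, metres

# Distance from every NIR point to its nearest VIS point
tree = cKDTree(vis)
d, _ = tree.query(nir, k=1)

print(f"one-sided Hausdorff (max) : {d.max() * 1000:.2f} mm")
print(f"mean / RMS distance       : {d.mean() * 1000:.2f} / {np.sqrt(np.mean(d**2)) * 1000:.2f} mm")

# Histogram with 0.5 mm bins up to 8 mm, mirroring the scalar-field intervals of Figure 8
bins = np.arange(0.0, 8.0 + 0.5, 0.5) / 1000.0
counts, _ = np.histogram(d, bins=bins)
for lo, hi, n in zip(bins[:-1], bins[1:], counts):
    print(f"{lo * 1000:4.1f}-{hi * 1000:4.1f} mm: {n} points")
```

A symmetric Hausdorff distance would additionally take the maximum of the reverse (VIS-to-NIR) query; tools such as CloudCompare and MeshLab implement the same comparison directly on meshes and point clouds.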
Abstract
1. Introduction
2. Materials and Methods
2.1. Case Studies
2.2. Datasets
2.3. Processing Software and Hardware
3. Results
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Dataset | Scenario | Camera Model | Megapixels | Focal Length (mm) | Pixel Size (μm) | Distance (m) | Ground Sample Distance (mm) | Spectrum | Image Count
---|---|---|---|---|---|---|---|---|---
1 | 1 | IXUS 220HS | 12.0 | 4.0 | 1.55 | 86.4 | 27.8 | VIS | 82
1 | 2 | ELPH 300 HS | 12.0 | 4.0 | 1.55 | 86.9 | 27.7 | NIR | 82
1 | 3 | IXUS 220HS | 12.0 | 4.0 | 1.55 | 95.3 | 31.6 | VIS | 77
1 | 4 | ELPH 300 HS | 12.0 | 4.0 | 1.55 | 109.0 | 36.0 | NIR | 77
2 | 1 | REBEL-SL1 | 17.9 | 18 | 4.38 | 1.69 | 0.36 | VIS | 110
2 | 2 | REBEL-SL1 | 17.9 | 18 | 4.38 | 1.67 | 0.36 | NIR | 110
3 | 1 | REBEL-SL1 | 17.9 | 18 | 4.38 | 0.67 | 0.58 | VIS | 10
3 | 2 | REBEL-SL1 | 17.9 | 18 | 4.38 | 0.57 | 0.58 | NIR | 10
4 | 1 | REBEL-SL1 | 17.9 | 55 | 4.38 | 0.86 | 0.07 | VIS | 100
4 | 2 | REBEL-SL1 | 17.9 | 55 | 4.38 | 0.86 | 0.07 | NIR | 100
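The tabulated ground sample distance (GSD) values follow, to a first approximation, the pinhole relation GSD ≈ pixel size × acquisition distance / focal length; small discrepancies are expected where the distance is an average over an uneven scene or a calibrated focal length was used. A quick, purely illustrative check of this relation reproduces, for example, the 0.07 mm listed for dataset 4:

```python
def nominal_gsd_mm(pixel_size_um: float, distance_m: float, focal_length_mm: float) -> float:
    """Nominal ground sample distance (mm) for a pinhole camera."""
    pixel_size_mm = pixel_size_um / 1000.0
    distance_mm = distance_m * 1000.0
    return pixel_size_mm * distance_mm / focal_length_mm

# Dataset 4 (both scenarios): 4.38 um pixels, 0.86 m distance, 55 mm lens
print(f"{nominal_gsd_mm(4.38, 0.86, 55):.3f} mm")  # ~0.068 mm, i.e. the tabulated 0.07 mm
```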
Dataset | 1 | 2 | 3 | 4
---|---|---|---|---
**Sparse reconstruction** | | | |
Key point density | highest | high | highest | highest
Matching type | accurate | fast | accurate | accurate
Pair preselection | reference | unordered | unordered | unordered
Key point limit | 100 K | 50 K | 100 K | 100 K
**Dense reconstruction** | | | |
Masking | no | no | yes | yes
Point density | high | medium | very high | high
Depth filtering | moderate | moderate | moderate | moderate
**Mesh generation** | | | |
Max face count | 25 M | 15 M | 10 M | 20 M
Quality | high | very high | high | very high
Interpolation | enabled | disabled | enabled | disabled
**Texture generation** | | | |
Mapping mode | ortho | generic | ortho | generic
Blending mode | mosaic | mosaic | average | average
Texture size (px) | 3840 | 16,384 | 4096 | 8192
Hole filling | yes | no | yes | no
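The settings above correspond to options exposed by Agisoft Metashape Professional; comparable batch runs can also be scripted through the Metashape Python API. The sketch below is indicative only, limited to core calls of the 1.6-era API with placeholder photo paths; most quality-related options are noted as comments rather than hard-coded, because their keyword arguments vary between releases and should be verified against the API reference for the installed version.

```python
# Indicative Metashape Professional scripting sketch (API ~1.6). Quality levels, face
# counts and texture sizes are given only as comments; check the API reference before
# use (e.g., buildDenseCloud was renamed buildPointCloud in later releases).
import glob
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()
chunk.addPhotos(glob.glob("dataset1_scenario1/*.JPG"))  # placeholder path

# Sparse reconstruction: "highest" key-point density, "accurate" matching,
# reference pair preselection, 100 K key-point limit (see table above)
chunk.matchPhotos(keypoint_limit=100000, tiepoint_limit=0)
chunk.alignCameras()

# Dense reconstruction: "high" point density, moderate depth filtering
chunk.buildDepthMaps()
chunk.buildDenseCloud()

# Mesh generation: up to 25 M faces, interpolation enabled
chunk.buildModel()

# Texture generation: orthophoto mapping, mosaic blending, 3840 px texture
chunk.buildUV()
chunk.buildTexture()

doc.save("dataset1_scenario1.psx")
```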
Scenario | 1 | 2 | 3 | 4
---|---|---|---|---
**Sparse cloud** | | | |
Aligned images | 82 | 82 | 76 | 73
Matching time ¹ (hh:mm:ss) | 00:02:11 | 00:02:09 | 00:03:58 | 00:04:29
Alignment time (hh:mm:ss) | 00:00:21 | 00:00:17 | 00:00:13 | 00:00:28
Tie point count | 87,183 | 90,028 | 51,542 | 63,833
Projections | 191,675 | 197,004 | 109,756 | 139,906
Adjustment error (pixels) | 0.66 | 0.55 | 0.58 | 0.72
**Dense cloud** | | | |
Densification time (hh:mm:ss) | 00:10:24 | 00:10:06 | 00:04:42 | 00:05:53
Point count | 75,572 | 75,785 | 47,705 | 50,942
**Triangle mesh** | | | |
Meshing time (hh:mm:ss) | 00:13:23 | 00:14:33 | 00:06:18 | 00:06:04
Face count | 23,856,573 | 23,805,413 | 24,145,080 | 24,347,432
Vertex count | 11,930,446 | 11,904,337 | 12,076,150 | 12,177,636
**Texture** | | | |
Texturing time (hh:mm:ss) | 00:03:10 | 00:03:06 | 00:02:26 | 00:02:35
**Overall results** | | | |
Total time (hh:mm:ss) | 00:29:29 | 00:30:11 | 00:17:37 | 00:19:29
Scenario | 1 | 2 | 3 | 4
---|---|---|---|---
**Control points** | | | |
Count | 13 | 13 | 10 | 10
X error (cm) | 2.5 | 1.4 | 6.4 | 5.7
Y error (cm) | 3.4 | 1.6 | 4.1 | 4.9
Z error (cm) | 5.5 | 3.2 | 3.7 | 4.9
Total (cm) | 7.0 | 3.8 | 8.4 | 9.0
**Check points** | | | |
Count | 8 | 8 | 7 | 7
X error (cm) | 3.1 | 1.9 | 5.8 | 4.6
Y error (cm) | 4.1 | 1.5 | 4.1 | 6.6
Z error (cm) | 5.6 | 2.7 | 8.0 | 6.1
Total (cm) | 7.6 | 3.6 | 10.7 | 10.1
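The "Total" rows combine the per-axis errors as the root of the sum of their squares, Total = sqrt(X² + Y² + Z²); for instance, the scenario 2 check points give sqrt(1.9² + 1.5² + 2.7²) ≈ 3.6 cm, as listed. A minimal sketch of how the per-axis and combined RMSE values are obtained from control/check-point residuals is given below; the residual array is hypothetical.

```python
import numpy as np

def rmse_report(residuals_cm: np.ndarray) -> dict:
    """Per-axis RMSE and combined 3D RMSE from an (n, 3) array of X/Y/Z residuals in cm."""
    per_axis = np.sqrt(np.mean(residuals_cm**2, axis=0))   # RMSE_x, RMSE_y, RMSE_z
    total = float(np.sqrt(np.sum(per_axis**2)))            # sqrt(RMSE_x^2 + RMSE_y^2 + RMSE_z^2)
    return {"x": per_axis[0], "y": per_axis[1], "z": per_axis[2], "total": total}

# Hypothetical residuals for a handful of check points (cm)
residuals = np.array([
    [ 1.8, -1.2,  2.5],
    [-2.1,  1.7, -2.9],
    [ 1.7, -1.6,  2.6],
])
print(rmse_report(residuals))
```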
Scenario | 1 | 2 | 3 | 4
---|---|---|---|---
**Sparse cloud** | | | |
Aligned images | 68 | 69 | 76 | 73
Tie point count | 21,067 | 17,435 | 31,740 | 25,690
Projections | 56,168 | 50,025 | 96,064 | 78,329
Adjustment error (pixels) | 0.58 | 0.56 | 0.70 | 0.67
**Dense cloud** | | | |
Point count | 5,762,125 | 6,846,175 | 9,315,791 | 9,959,981
Software | Agisoft Metashape | Agisoft Metashape | 3DFlow Zephyr | 3DFlow Zephyr
---|---|---|---|---
Scenario | 1 | 2 | 1 | 2
**Sparse cloud** | | | |
Aligned images | 110/110 | 110/110 | 110/110 | 110/110
Matching time ¹ (hh:mm:ss) | 00:01:34 | 00:01:33 | 00:31:11 | 00:36:47
Alignment time (hh:mm:ss) | 00:03:06 | 00:02:37 | 00:03:58 | 00:02:52
Tie point count ² | 443 | 455 | 57 | 64
Projections | 1928 | 1850 | 481 | 423
Adjustment error (pixels) | 0.51 | 0.42 | 0.88 | 0.91
**Dense cloud** | | | |
Densification time (hh:mm:ss) | 00:28:57 | 00:21:02 | 00:38:17 | 00:36:11
Point count | 6286 | 7252 | 2842 | 2321
**Triangle mesh** | | | |
Meshing time (hh:mm:ss) | 00:02:09 | 00:01:49 | 00:00:35 | 00:00:27
**Texture** | | | |
Texturing time (hh:mm:ss) | 00:17:26 | 00:07:46 | 00:11:48 | 00:16:05
**Overall results** | | | |
Total time (hh:mm:ss) | 00:53:12 | 00:34:47 | 01:25:49 | 01:32:22
Control RMS error (mm) | 1.3 | 1.6 | 2.8 | 2.3
Check RMS error (mm) | 1.1 | 1.3 | 3.1 | 2.7
Scenario | 1 | 2
---|---|---
**Sparse cloud** | |
Aligned images | 10 | 10
Tie point count | 37,119 | 43,105
Projections | 165,662 | 192,583
Adjustment error (pixels) | 0.21 | 0.34
**Dense cloud** | |
Point count | 1,692,727 | 1,757,642
**Overall results** | |
Total time (mm:ss) | 02:14 | 01:57
Software | Agisoft Metashape | Agisoft Metashape | 3DFlow Zephyr | 3DFlow Zephyr
---|---|---|---|---
Scenario | 1 | 2 | 1 | 2
**Sparse cloud** | | | |
Aligned images | 100 | 99 | 100 | 80
Matching time (hh:mm:ss) | 00:02:54 | 00:02:42 | 00:22:36 | 00:21:56
Alignment time (hh:mm:ss) | 00:01:20 | 00:00:35 | 00:01:30 | 00:01:16
Tie point count | 212,285 | 110,606 | 123,262 | 81,541
Projections | 488,205 | 254,553 | 380,900 | 325,400
Adjustment error (pixels) | 0.36 | 0.49 | 0.45 | 0.50
**Dense cloud** | | | |
Densification time (hh:mm:ss) | 00:09:32 | 00:11:45 | 00:31:26 | 00:24:12
Point count | 120,664,933 | 198,431,339 | 11,185,124 | 8,787,781
**Triangle mesh** | | | |
Meshing time (hh:mm:ss) | 00:11:53 | 00:07:42 | 00:28:18 | 00:21:53
**Texture** | | | |
Texturing time (hh:mm:ss) | 00:05:35 | 00:03:36 | 00:06:04 | 00:05:48
**Overall results** | | | |
Total time (hh:mm:ss) | 00:31:14 | 00:26:20 | 01:29:54 | 01:15:05
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Adamopoulos, E.; Rinaudo, F. Enhancing Image-Based Multiscale Heritage Recording with Near-Infrared Data. ISPRS Int. J. Geo-Inf. 2020, 9, 269. https://doi.org/10.3390/ijgi9040269