Search Results (31)

Search Parameters:
Keywords = photo-mosaic

19 pages, 5996 KiB  
Article
Proximal Sensing for Characterising Seaweed Aquaculture Crop Conditions: Optical Detection of Ice-Ice Disease
by Evangelos Alevizos, Nurjannah Nurdin, Agus Aris and Laurent Barillé
Remote Sens. 2024, 16(18), 3502; https://doi.org/10.3390/rs16183502 - 21 Sep 2024
Viewed by 1068
Abstract
Crop monitoring is a fundamental practice in seaweed aquaculture. Seaweeds are vulnerable to several threats, such as ice-ice disease (IID), which causes a whitening of the thallus due to depigmentation. Crop condition assessment is important for minimizing yield losses and improving the biosecurity of seaweed farms. Recent technological advances have resulted in the development of precision aquaculture. The present study focuses on the exploitation of spectral reflectance in the visible and near-infrared regions for characterizing the crop condition of two of the most cultivated eucheumatoid species: Kappaphycus alvarezii and Eucheuma denticulatum. In particular, the influence of spectral resolution is examined for discriminating (a) species and morphotypes, (b) different levels of seaweed health (i.e., from healthy to completely depigmented) and (c) depigmented from silted specimens (thallus covered by a thin layer of sediment). Two spectral libraries were built at different spectral resolutions (5 and 45 spectral bands) using in situ data. In addition, proximal multispectral imagery from a drone-based sensor was utilised. In each experimental scenario, the spectral data were classified using a Random Forest algorithm for crop condition identification. The results showed good discrimination (83–99% overall accuracy) of crop conditions and morphotypes regardless of spectral resolution. According to the importance scores of the hyperspectral data, useful wavelengths were identified for discriminating healthy seaweeds from seaweeds with varying symptoms of IID (i.e., thalli whitening). These wavelengths assisted in selecting a set of vegetation indices whose ability to improve crop condition characterisation was then tested. Specifically, five vegetation indices (the RBNDVI, GLI, Hue, Green–Red ratio and NGRDI) were found to improve classification accuracy, making them recommended for seaweed health monitoring. Image-based classification demonstrated that multispectral library data can be extended to photomosaics to assess seaweed conditions at a broad scale. The results of this study suggest that proximal sensing is a first step towards effective seaweed crop monitoring, enhancing yield and contributing to aquaculture biosecurity. Full article
(This article belongs to the Special Issue Innovative UAV Applications)
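The abstract above describes computing RGB vegetation indices (e.g., NGRDI and GLI) and classifying crop condition with a Random Forest. As a minimal sketch of that kind of workflow, not the authors' actual pipeline, the Python fragment below derives two of the named indices from band reflectances and fits a classifier; the band values and condition labels are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def rgb_indices(red, green, blue):
    """Compute two of the RGB vegetation indices named in the abstract."""
    eps = 1e-9
    ngrdi = (green - red) / (green + red + eps)                        # Normalised Green-Red Difference Index
    gli = (2 * green - red - blue) / (2 * green + red + blue + eps)    # Green Leaf Index
    return np.stack([ngrdi, gli], axis=-1)

# Illustrative reflectance samples (n_pixels x 3 bands) and condition labels.
# 0 = healthy, 1 = depigmented, 2 = silted -- stand-ins for the paper's classes.
rng = np.random.default_rng(0)
rgb = rng.uniform(0.01, 0.6, size=(500, 3))
labels = rng.integers(0, 3, size=500)

features = rgb_indices(rgb[:, 0], rgb[:, 1], rgb[:, 2])
X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```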
Graphical abstract
Figure 1
Characteristic end-member spectra used for the spectral library with various crop types: (A) E. denticulatum with deep purple/brown thalli, typical of healthy specimens, (B) silted E. denticulatum, with thalli in beige colour patches due to accumulation of silt particles, (C) depigmented E. denticulatum, the typical appearance of deceased seaweed, (D) green morphotype of K. alvarezii, with branching thalli and (E) brown morphotype of K. alvarezii with light brown/orange thalli.
Figure 2
Spectral signatures of E. denticulatum and of the K. alvarezii green and brown morphotypes: (A) Average spectra of healthy thallus with no signs of depigmentation. (B) Diagram of wavelengths' relative importance for discriminating Eucheuma and Kappaphycus morphotypes.
Figure 3
Hyperspectral signatures of E. denticulatum showing a gradient of white discolouration of the thallus: (A) Average spectra of healthy, mixed and entirely white Eucheuma thallus. (B) Diagram of wavelengths' relative importance for characterising thallus whitening.
Figure 4
Hyperspectral signatures of K. alvarezii showing a gradient of white discolouration of the thallus: (A) Average spectra of healthy, mixed and entirely white thallus. (B) Diagram of wavelengths' relative importance for characterising thallus whitening.
Figure 5
(A) Comparison of silted and depigmented E. denticulatum spectra. (B) Diagram of wavelengths' relative importance for differentiating silted and depigmented thallus.
Figure 6
Spectral signatures of E. denticulatum degraded to the multispectral resolution of a DJI Phantom 4 multispectral sensor. (A) Average spectra of healthy, mixed and fully depigmented thallus. (B) Diagram of wavelengths' relative importance for characterising thallus whitening.
Figure 7
Spectral signatures of K. alvarezii degraded to the multispectral resolution of a DJI Phantom 4 multispectral sensor. (A) Average spectra of healthy, mixed and fully depigmented thallus. (B) Diagram of wavelengths' relative importance for characterising thallus whitening.
Figure 8
Spectral signatures of E. denticulatum and K. alvarezii morphotypes degraded to the multispectral resolution of a DJI Phantom 4 multispectral sensor.
Figure 9
(A) Comparison of silted and depigmented E. denticulatum spectra degraded to the multispectral resolution of a DJI Phantom 4 multispectral sensor. (B) Diagram of wavelengths' relative importance for differentiating silted and depigmented thallus.
Figure 10
(A) True-colour RGB image of healthy, silted and depigmented Eucheuma samples obtained with a DJI Phantom 4 multispectral drone hand-held 1.5 m above the ground. (B) Random Forest classification output using the drone's five multispectral bands. (C) Random Forest classification output using the four indices with the greatest importance described in Figure 9A. The area in the black rectangle is shown in the zoom-in frame on the right to better illustrate the silted specimen.
36 pages, 19294 KiB  
Article
Red Sea Coral Reef Monitoring Site in Sudan after 39 Years Reveals Stagnant Reef Growth, Continuity and Change
by Sarah Abdelhamid, Götz B. Reinicke, Rebecca Klaus, Johannes Höhn, Osama S. Saad and Görres Grenzdörffer
Diversity 2024, 16(7), 379; https://doi.org/10.3390/d16070379 - 29 Jun 2024
Viewed by 1887
Abstract
Coral reefs off the coast of the Republic of Sudan are still considered to be among the most pristine reefs in the central Red Sea. The complex coastal fringing reefs, offshore banks, and shoals of Dungonab Bay in the north and Sanganeb atoll situated further to the south, about 23 km off the Sudanese mainland coast, were inscribed on the UNESCO World Heritage List in 2016. Due to their remote location and limited access, monitoring of the status of the reefs has been sporadic. Here, we present the results of a repeated large area photomosaic survey (5 m × 5 m plots) on the Sanganeb atoll, first established and surveyed in 1980, and revisited in 1991 and most recently in 2019. The 2019 survey recovered and reinstated the four original monitoring plots. Evaluation of photographic and video records from one photomosaic plot on the seaward slope of the atoll revealed general continuity of the overall community structure and composition over 39 years. Individual colonies of Echinopora gemmacea and Lobophyllia erythraea were recorded in the exact same positions as in the 1980 and 1991 plots. The genera Acropora and Pocillopora remain dominant, although in altered proportions. Shifts in composition were detected at the species level (e.g., increase in Pocillopora verrucosa, Stylophora pistillata, Acropora hemprichii, Dipsastraea pallida, and Echinopora gemmacea, decrease in Acropora cytherea and A. superba), in addition to changes in the extent of uncolonized substrate (e.g., increase from 43.9% in 1980 to 52.2% in 2019), and other scleractinian, hydrozoan, and soft coral living cover. While the temporal resolution only includes three sampling events over 39 years (1980, 1991, 2019), this study presents one of the longest time series of benthic community surveys available for the entire Red Sea. A semi-quantitative estimate of vertical reef growth in the studied test plot indicates a reduction in net accretion rates of more than 80%, from 2.27–2.72 cm/yr between 1980 and 1991 to 0.28–0.42 cm/yr between 1991 and 2019. We carefully conclude that the changes observed in the coral community in the plot in 2019 (Acropora–Pocillopora shift, increase in Montipora and calcareous algae) are representative of impacts at the community level, including rising sea surface temperatures and recent bleaching events. Full article
(This article belongs to the Section Marine Diversity)
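The accretion figures quoted above can be checked with a quick back-of-the-envelope calculation. The sketch below assumes an overgrowth of roughly 25–30 cm over the 11 years between 1980 and 1991 (inferred from the quoted rates, not stated in the abstract) and uses the 8–12 cm of overgrowth between 1991 and 2019 mentioned in the Figure 11 caption.

```python
# Rough check of the net accretion rates and the >80% reduction quoted in the abstract.
def accretion_rate(overgrowth_cm, years):
    return overgrowth_cm / years

early = [accretion_rate(t, 1991 - 1980) for t in (25.0, 30.0)]   # ~2.27-2.73 cm/yr (1980-1991)
late = [accretion_rate(t, 2019 - 1991) for t in (8.0, 12.0)]     # ~0.29-0.43 cm/yr (1991-2019, Figure 11: 8-12 cm)

reduction = 1 - max(late) / min(early)
print(f"net accretion dropped by roughly {reduction:.0%}")        # prints a value above 80%
```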
Figure 1
Map showing the location of Sanganeb atoll, about 20 km off the coast of Sudan, to the northeast of the city of Port Sudan (central Red Sea). Water depths are shown as small numbers. The inset map (bottom right) shows the position of the four test plots [TQs 1–4] around Sanganeb atoll, labelled I–IV. Lh = lighthouse, Bc = Beacon (after Mergner and Schuhmacher, 1985).
Figure 2
Matching and mosaicking of the series of individual images using Agisoft Metashape Professional v. 1.7.1 resulted in a digital orthophoto mosaic (see Figure 8).
Figure 3
Remnants of the 1991 survey line grid within the TQ4 plot in 2019 (photo: G.B. Reinicke).
Figure 4
View of TQ1 on 3 October 2019. The site morphology appears unaltered compared to 1980, with sand and gravel loads drifting downslope in the bottom left and right sectors. Non-living substrata, including dead coral skeletons, dominate, while xeniid and other soft corals colonise current-exposed positions protected from sedimentation risk. Smaller, younger scleractinian colonies (Acropora, Pocillopora, Porites spp.) are common, reflecting frequent perturbation (line square 5 × 5 m, photo credit: G.B. Reinicke).
Figure 5
View of the upper sector of TQ2 on 30 September 2019. The original morphology of the sloping plot is characterised by the build-up of a Lobophyllia corymbosa framework of about 1.5–2 m height (bottom left). Colonies of Acropora hemprichii occupy similar positions as in 1980 and 1991 (photo credit: G.B. Reinicke).
Figure 6
View of TQ3 on 3 October 2019. The morphology of the solitary substrate pillar has retained its shape and dimensions, and the community is characterised by lush and diverse cnidarian coverage; in the bottom right sector of the TQ, about 15% of the plot area was covered by coral rubble sliding down the atoll's lagoon-side slope (video and photo: J. Höhn).
Figure 7
View of TQ4 (from the north) on 1 October 2019. The substrate cover at the windward slope between 8 and 12 m depth shows a good representation of the upper slope habitats. Structural variability of the framework build-up is caused by underlying morphological anomalies such as a deep trench (in the background) with gravel sloping about 2–3 m wide from a fracture in the reef crest above (photo credit: G.B. Reinicke).
Figure 8
Mosaic orthophoto of test plot TQ4, Sanganeb atoll; all photos taken on 1 October 2019 (photo credit: G.B. Reinicke).
Figure 9
Colour-coded 5 × 5 m map of the test area TQ4 (NE outer reef of the Sanganeb atoll) in the 2019 survey; scale: c. 1:40. The legend of the colour-coded map of the TQ4 plot in 2019 shows the Hexacorallia: Acropora (red); Pocilloporidae (pink-red); Psammocora (orange); Fungiidae (salmon); Montipora (purple); Merulinidae (green to blue); Agariciidae (dark brown); Oculinidae (grey). Hydrozoa: Milleporidae (black); Alcyonacea: Alcyoniidae (yellow); Porifera (P); and calcareous algae (ka). Taxa marked by an asterisk (*) were identified on close-up photographs but were too small to figure in the map.
Figure 10
Colonies of Echinopora gemmacea (A, top right) and Lobophyllia erythraea (B, bottom right corners of the TQ4 plot) have persisted in the same positions within TQ4 since 1980 (photo credit: G.B. Reinicke, 1 October 2019).
Figure 11
Schematic diagram illustrating estimated vertical reef growth based on the surface levels during the three surveys (vertical axis to scale). Substrate overgrowth is shaded for 1980 (dark grey), 1991 (mid-grey), and 2019 (light grey). The first net grid level of 1980, recovered in 1991, is shown as a red line. The green line represents the net grid level of 1991 and the turquoise line represents the overgrowth of 8–12 cm between 1991 and 2019.
Figure 12
(A) Percent cover of the main benthic categories, including substrata, non-cnidarians and cnidarians, in test plot TQ4 in 1980, 1991, and 2019 on the Sanganeb atoll. (B) Changes in the percent cover of selected cnidarian taxa in test plot TQ4 in 1980, 1991, and 2019, respectively.
Figure 13
Rank abundance plots for the 1980, 1991, and 2019 surveys.
Figure 14
Dendrogram showing results of cluster analysis of cnidarian cover within TQ4 (standardised and square-root transformed), including SIMPROF results.
Figure 15
Shade plot of cnidarian cover data (standardised and square-root transformed), showing the top 50 contributory species across all years and the clustering of these species (index of association) and samples (Bray–Curtis similarity).
Figure 16
Reference documentation for the identification of Acropora "superba" (Klz., sensu M and S, 1985) from survey plot TQ4 at the Sanganeb atoll. (A) Detail of the b/w photographic documentation of TQ4, sector "V-d" (1 m × 1 m), with colonies of dominant A. "superba", Pocillopora verrucosa, and Millepora platyphylla (i.a.); (B) close-up colour slide (1:3) of live A. "superba" (photo credit and identification: H. Schuhmacher, 1980).
Figure 17
Reference specimens of Acropora "superba" from TQ4 in 1980. (A) GOM-HS-coll. #226: ID noted in H. Schuhmacher's handwriting. (B) Detail of #226; (C) GOM-HS-coll. #228: ID noted in H. Schuhmacher's handwriting. (D) Detail of #228; all scale bars 1 cm (photo credit: T. Moritz).
Figure 18
Reference documentation for the identification of Acropora "superba" (Klz., sensu M and S, 1985) in survey plot TQ4 at the Sanganeb atoll in 2019. (A,B) Colony details in plot sectors "IV/V-c/d" (2 m × 2 m) showing the dominant but as yet unnamed Acropora species (photo credit: J. Höhn).
Figure 19
Photographs of (A) Acropora squarrosa (Ehrenberg, 1834) and (B) Acropora maryae Veron, 2000 colonies from TQ4, Sanganeb atoll (Sudan) (photo credits: J. Höhn, 2019).
Figure 20
A total of 39 years of reef life encapsulated within plots of the coral community at test plot TQ4, Sanganeb atoll (Sudan); (A,B) show the photomosaic and coral community map from the initial survey in 1980. The colour codes of taxa agree with the subsequent plots; (C) plot TQ4 in 1991 and (D) in 2019 (see legend in Figure 9). The comparison reveals persistence at the individual and assemblage levels. A certain degree of consistency as well as dynamic changes in the community are reflected against the background of a stable geomorphological setup. While 86 colonies of Pocillopora verrucosa indicate recovery after a recent disturbance event, individual colonies demonstrate suitable conditions over 39 years. Arrows indicate Echinopora gemmacea in coordinate Va (top right) and Lobophyllia erythraea in Ve (bottom right) of the plots. Also, preferred positions of Millepora spp. (black) appear stable over almost four decades.
25 pages, 11761 KiB  
Article
A New Coastal Crawler Prototype to Expand the Ecological Monitoring Radius of OBSEA Cabled Observatory
by Ahmad Falahzadeh, Daniel Mihai Toma, Marco Francescangeli, Damianos Chatzievangelou, Marc Nogueras, Enoc Martínez, Matias Carandell, Michael Tangerlini, Laurenz Thomsen, Giacomo Picardi, Marie Le Bris, Luisa Dominguez, Jacopo Aguzzi and Joaquin del Río
J. Mar. Sci. Eng. 2023, 11(4), 857; https://doi.org/10.3390/jmse11040857 - 18 Apr 2023
Cited by 6 | Viewed by 2139
Abstract
The use of marine cabled video observatories with multiparametric environmental data collection capability is becoming relevant for ecological monitoring strategies. Their ecosystem surveying can be carried out in real time, remotely, and continuously, over consecutive days, seasons, and even years. Unfortunately, as most observatories perform such monitoring with fixed cameras, the ecological value of their data is limited to a narrow field of view, possibly not representative of the local habitat heterogeneity. Docked mobile robotic platforms could be used to extend data collection to larger, and hence more ecologically representative, areas. Among the various state-of-the-art underwater robotic platforms available, benthic crawlers are excellent candidates for performing ecological monitoring tasks in combination with cabled observatories. Although they are normally used in the deep sea, their high positioning stability, low acoustic signature, and low energy consumption, especially during stationary phases, make them suitable for coastal operations. In this paper, we present the integration of a benthic crawler into a coastal cabled observatory (OBSEA) to extend its monitoring radius and collect more ecologically representative data. The extension of the monitoring radius was obtained by remotely operating the crawler on back-and-forth drives along specific transects while recording videos with the onboard cameras. The ecological relevance of the monitoring-radius extension was demonstrated by performing a visual census of the species observed with the crawler's cameras in comparison to the observatory's fixed cameras, revealing non-negligible differences. Additionally, the videos recorded from the crawler's cameras during the transects were used to demonstrate automated photo-mosaicking of the seabed for the first time on this class of vehicles. In the present work, the crawler travelled up to 40 m away from the OBSEA, extending the monitored field of view (FOV) and covering an area approximately 230 times larger than that of OBSEA's camera. The analysis of the videos obtained from the crawler's and the observatory's cameras revealed differences in the species observed. Future implementation scenarios are also discussed in relation to mission autonomy to perform imaging across spatial heterogeneity gradients around the OBSEA. Full article
(This article belongs to the Section Marine Environmental Science)
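The abstract mentions an automated photo-mosaic built from the crawler's transect videos, and a later figure caption refers to a Python flow diagram for producing it. The snippet below is only a generic OpenCV sketch of that idea, not the authors' code: it samples frames from a transect video (the file name is a placeholder) and stitches them in the planar "SCANS" mode.

```python
import cv2

def mosaic_from_video(video_path, frame_step=30):
    """Sample frames from a transect video and stitch them into a photo-mosaic."""
    cap = cv2.VideoCapture(video_path)
    frames, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % frame_step == 0:          # keep roughly one frame per second at 30 fps
            frames.append(frame)
        idx += 1
    cap.release()

    # SCANS mode assumes an approximately planar scene (the seabed seen from above).
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, mosaic = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return mosaic

# mosaic = mosaic_from_video("crawler_transect.mp4")   # placeholder file name
# cv2.imwrite("seabed_mosaic.png", mosaic)
```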
Figure 1
Superior (A) and near-seafloor lateral (B) views of the OBSEA cabled observatory, with its camera within the glass crystal dome on top of the reticular infrastructure. The seabed-laying fibre-optic cable connecting the platform to the shore is also visible, anchored to the seabed with white weight bags (visible in image (A)). Different images of the nearby concrete artificial reef are also provided to spatially characterise the monitoring scenario [41,49].
Figure 2
Schematised global view of the OBSEA monitoring area with artificial concrete columns for its protection from illegal trawling. Circles indicate the fixed (yellow) and mobile crawler (red) cameras, and arrows represent their time-lapse (fixed) or continuous footage-based imaging (with fish silhouettes in the above images as an example of the different individuals and species). Notably, the second fixed camera, mounted on a satellite tripod, was not operational at the time of crawler deployment and testing.
Figure 3
The crawler (A) and its components (B), with numbers indicating: (1) the camera dome, (2) the tracks, (3) the control cylinder for crawler functioning control, (4) the lights, (5) the junction box, and (6) the umbilical cable that connects the crawler to the seabed station.
Figure 4
Control cylinder components: power supply board (1), main controller board (2), motor drivers (3), and compass (4).
Figure 5
The web architecture for the remote control of the crawler through the OBSEA portal. Notably, the OBSEA is endowed with an acoustic USBL for wireless communication with the crawler, which allows for modular control expansion in relation to future sensors.
Figure 6
The web user interface for crawler control within the three different functioning modes: manual (A), advanced (B), and automatic (C).
Figure 7
Procedure for performing the first calibration test. The initial positioning, the correction attempt in terms of light, the new location without reflections, the marking of each position according to distance, and the calculation of each distance are represented.
Figure 8
Calibration trials at the OBSEA, with a photo of the checkerboard from the centre viewpoint of the remote vehicle (A) and from a higher perspective (B).
Figure 9
Flow diagram of the Python code for the automated production of photo-mosaics.
Figure 10
Different phases of the testing of the crawler: (A) in the hyperbaric chamber, (B) in the swimming pool of the SARTI facilities, and (C) in the marine environment close to the OBSEA platform. The platform provided a view of the marine seascape in remote mode (D), where the inferior part of the dome is visible, along with the timestamp coordinates.
Figure 11
The crawler's automatic navigation test at the OBSEA facility, showing the platform approaching the plastic tags (indicated by the white arrows): (A) the crawler approaching a circular tag and (B) the imaging of the tag within the crawler's FOV; (C) the crawler approaching another rhomboidal tag (sharp lateral view) and (D) the same rhomboidal tag spotted within the crawler's FOV. Note the high rendering quality of the crawler camera (plates (B,D)) relative to the ROV camera (plates (A,C), a different camera type and model).
Figure 12
The limited OBSEA underwater observatory monitoring area, expanded by the underwater crawler.
Figure 13
Comparison of species rarefaction curves for the OBSEA (A) and the crawler (B) cameras. Richness rarefaction (blue curve) with 95% confidence interval (light blue band) and Coleman's rarefaction (red curve with ± SD red bars) for (A) the OBSEA, with samples added from images every 30 min, and (B) the crawler, with samples from video segments of 10 s. The horizontal dashed line corresponds to 95% of the estimated number of species for each platform.
Figure 14
Checkerboard photo-mosaic generated after the calibration (A) and photo-mosaic (B) with an enlarged view of a detected fish ((C); SPECIES).
16 pages, 15677 KiB  
Article
BotanicX-AI: Identification of Tomato Leaf Diseases Using an Explanation-Driven Deep-Learning Model
by Mohan Bhandari, Tej Bahadur Shahi, Arjun Neupane and Kerry Brian Walsh
J. Imaging 2023, 9(2), 53; https://doi.org/10.3390/jimaging9020053 - 20 Feb 2023
Cited by 21 | Viewed by 5153
Abstract
Early and accurate tomato disease detection using easily available leaf photos is essential for farmers and stakeholders, as it helps reduce yield loss due to possible disease epidemics. This paper aims to visually identify nine different infectious diseases (bacterial spot, early blight, Septoria leaf spot, late blight, leaf mold, two-spotted spider mite, mosaic virus, target spot, and yellow leaf curl virus) in tomato leaves, in addition to healthy leaves. We implemented EfficientNetB5 with a tomato leaf disease (TLD) dataset without any segmentation, and the model achieved an average training accuracy of 99.84% ± 0.10%, average validation accuracy of 98.28% ± 0.20%, and average test accuracy of 99.07% ± 0.38% over 10 cross-validation folds. The use of gradient-weighted class activation mapping (GradCAM) and local interpretable model-agnostic explanations is proposed to provide model interpretability, which is essential to predictive performance, helpful in building trust, and required for integration into agricultural practice. Full article
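As a rough illustration of the kind of EfficientNetB5 classifier described above, the sketch below builds a ten-class model with Keras. The input size, frozen backbone, dropout rate and optimizer are assumptions for illustration, not the paper's reported configuration; GradCAM or LIME explanations would then be computed on top of such a model.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10  # nine diseases + healthy

def build_tld_model(input_shape=(456, 456, 3)):
    """EfficientNetB5 backbone with a small classification head for tomato leaves."""
    base = tf.keras.applications.EfficientNetB5(
        include_top=False, weights="imagenet", input_shape=input_shape)
    base.trainable = False                              # optionally fine-tune later
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dropout(0.3)(x)
    out = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return models.Model(base.input, out)

model = build_tld_model()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=20)   # train_ds/val_ds are placeholders
```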
Figure 1
Distinct sample images from the dataset for individual diseases. (a–j) denote the "healthy", "bacterial spot", "early blight", "late blight", "leaf mold", "mosaic virus", "Septoria leaf spot", "target spot", "two-spotted spider mite", and "yellow leaf curl virus" classes, respectively.
Figure 2
Proposed high-level conceptual architecture of the explanation-driven DL model (BotanicX-AI) for TLD detection.
Figure 3
Training and validation results. (a) 99.84% ± 0.10% average training accuracy and 99.07% ± 0.38% average validation accuracy over 10 folds. (b) 0.18 ± 0.01 training loss and 0.24 ± 0.02 validation loss.
Figure 4
Confusion matrix. 'BS', 'EB', 'LB', 'LM', 'SP', 'SM', 'TS', 'YV', 'MV', and 'HL' stand for bacterial spot, early blight, late blight, leaf mold, Septoria leaf spot, spider mite, target spot, yellow curl virus, mosaic virus, and healthy leaves, respectively.
Figure 5
The AUC-ROC results of the proposed model with an AUC score of 1.0.
16 pages, 14078 KiB  
Article
New Perspectives of Earth Surface Remote Detection for Hydro-Geomorphological Monitoring of Rivers
by Marina Zingaro, Marco La Salandra and Domenico Capolongo
Sustainability 2022, 14(21), 14093; https://doi.org/10.3390/su142114093 - 28 Oct 2022
Cited by 7 | Viewed by 1805
Abstract
In the current scenario of climate change and its increasingly visible effects around the world, the monitoring of geomorphological processes and flood dynamics becomes more and more necessary for disaster risk reduction. During recent decades, the advantages offered by remote sensing for Earth surface observation have been widely exploited, producing images, digital elevation models (DEM), maps, and other tools useful for detecting hydro-geomorphological parameters, monitoring flood extent, and forecasting. However, advanced technologies and integrated methodologies do not yet provide the near-real-time (NRT) and very-high-resolution (VHR) observations of a river that are needed for risk evaluation and the identification of the correct operational strategy. This work presents an advanced remote detection analysis system (ARDAS) based on the combination of multiple technologies, such as Unmanned Aerial Vehicle (UAV) systems, Structure from Motion (SfM) techniques, and a cloud computing environment. The system produces VHR products, such as ortho-photomosaics and DEM, for in-depth observation of river conditions, morphological modifications, and evolution trends. The test of ARDAS in the Basento river catchment area (Basilicata, South Italy) showed that the innovative system (i) proves to be advantageous in river monitoring due to its high accuracy, quickness, and data flexibility; (ii) could represent an NRT solution for timely support of flood hazard assessments; and (iii) can be further developed by integrating other technologies for direct application in land planning and safeguard activities, contributing to the value chain of the new space economy and sustainable development. Full article
(This article belongs to the Section Resources and Sustainable Utilization)
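One of the ARDAS outputs discussed below (Figure 5) is a DEM of Difference highlighting deposition and erosion between two survey dates. A minimal sketch of that computation, assuming two co-registered GeoTIFF DEMs and an arbitrary uncertainty threshold (file names are placeholders), could look like this:

```python
import numpy as np
import rasterio

def dem_of_difference(dem_t1_path, dem_t2_path, threshold=0.1):
    """Subtract two co-registered DEMs; positive values = deposition, negative = erosion."""
    with rasterio.open(dem_t1_path) as d1, rasterio.open(dem_t2_path) as d2:
        z1 = d1.read(1, masked=True).astype("float64")
        z2 = d2.read(1, masked=True).astype("float64")
        profile = d1.profile
    dod = z2 - z1
    dod = np.ma.where(np.abs(dod) < threshold, 0.0, dod)  # suppress changes below DEM uncertainty
    return dod, profile

# dod, profile = dem_of_difference("dem_2019.tif", "dem_2021.tif")   # placeholder file names
# pixel_area = abs(profile["transform"].a * profile["transform"].e)
# deposition_m3 = float(np.clip(dod.filled(0.0), 0, None).sum()) * pixel_area
```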
Figure 1
Examples of river boundary extraction from RS data through different image-processing algorithms. (1) River extraction along the Zambezi reach (Mozambique) from a Sentinel-1 SAR image acquired on 15 July 2015 (a), by applying texture analysis through pixel-based homogeneity computation (b) and thresholding the resulting values into the set region boundaries (c). (2) River extraction along the Zambezi reach (Mozambique) from a Sentinel-2 multispectral image acquired on 28 September 2016 (a), by applying the normalized difference vegetation index (NDVI) (b) and thresholding the resulting values into the set region boundaries (c). (3) River extraction along the Basento reach (Italy) from a very high spatial resolution (1 cm) ortho-photomosaic derived from drone images (a) (through the ARDAS system, see below), by applying edge-detection algorithms through a Sobel filter (b) and unsupervised k-means clustering (c) that distinguish the bars in the channel.
Figure 2
Examples of RS data integration in flood monitoring. (1) Photographs of flood event effects (a–d) at the historical site of Metaponto (Basilicata, Italy) and maps of the inundated area (A), with the different water extents detected by optical and SAR images (B–D). Figure from [27]. (2) Flood map derived from the integration of multi-frequency SAR data. Figure from [28]. (3) Map of the sediment flow connectivity index of an area of the Severn River catchment (UK), overlaid with the water extent borders (blue line) of the flood event (left) and a comparison with aerial photography (right), showing the flooded areas (top, red box) corresponding to high sediment connectivity areas (in red) and non-flooded areas (bottom, yellow box) corresponding to low sediment connectivity areas (in green). Figure from [11].
Figure 3
Detection and processing sequences of the ARDAS system: (1) acquisition of UAV data; (2) NRT processing in the cloud environment; (3) application of the photogrammetric workflow through SfM techniques; (4) processing of the multi-temporal VHR ortho-photomosaic and DEM for hydro-geomorphological analysis.
Figure 4
Output generated by the ARDAS application in Basilicata (Southern Italy) (a) along a reach of the Basento River catchment (respectively, the red spots and green borders in (b)): ortho-photomosaic (c) and DEM (d).
Figure 5
(a) DEM of Difference between 2019 and 2021, showing channel deposition (blue) and erosion (red) along the reach. The black segment indicates the trace of the cross-section shown below. Basemap: Google Satellite. (b) Multi-temporal river cross-section comparison (2019–2021) and flood discharge area estimation (A) for each river cross-section (2019 and 2021).
Figure 6
Map of the channel displacement with the extraction and overlap of the channel location in 2019 (green) and 2021 (light blue) and the indication of the corresponding sinuosity index. Basemap: Google Satellite.
Figure 7
Future development of the ARDAS system with the integration of additional sequences (indicated by orange circles): (1) Galileo and EGNOS systems; (2) acquisition of UAV fleet data; (3) RT transmission to the cloud environment (see Section 2.1); (4) application of the photogrammetric workflow through SfM techniques (see Section 2.1); (5) processing of the multi-temporal VHR ortho-photomosaic and DEM (see Section 2.1); (6) automation algorithms for extrapolation and analysis of the hydro-geomorphological features and parameters (7).
21 pages, 7586 KiB  
Article
Fast Tree Detection and Counting on UAVs for Sequential Aerial Images with Generating Orthophoto Mosaicing
by Pengcheng Han, Cunbao Ma, Jian Chen, Lin Chen, Shuhui Bu, Shibiao Xu, Yong Zhao, Chenhua Zhang and Tatsuya Hagino
Remote Sens. 2022, 14(16), 4113; https://doi.org/10.3390/rs14164113 - 22 Aug 2022
Cited by 13 | Viewed by 4214
Abstract
Individual tree counting (ITC) is a popular topic in the remote sensing application field. The number and planting density of trees are significant for estimating yield and for further planning, etc. Although existing studies have already achieved great performance on tree detection with satellite imagery, the quality is often negatively affected by clouds and heavy fog, which limits the application of high-frequency inventory. Nowadays, with ultra-high spatial resolution and convenient usage, Unmanned Aerial Vehicles (UAVs) have become promising tools for obtaining statistics from plantations. However, for large-scale areas, a UAV cannot capture the whole region of interest in one photo session. In this paper, a real-time orthophoto mosaicing-based tree counting framework is proposed to detect trees using sequential aerial images, which is very effective for the fast detection of large areas. Firstly, to guarantee speed and accuracy, a multi-planar assumption constrained graph optimization algorithm is proposed to estimate the camera pose and generate the orthophoto mosaic simultaneously. Secondly, to avoid time-consuming box or mask annotations, a point-supervised method is designed for the tree counting task, which greatly speeds up the entire workflow. We demonstrate the effectiveness of our method by performing extensive experiments on oil-palm and acacia trees. To avoid the delay between data acquisition and processing, the proposed framework algorithm is embedded into the UAV for completing tree counting tasks, which also reduces the quantity of data transmitted from the UAV system to the ground station. We evaluate the proposed pipeline using sequential UAV images captured in Indonesia. The proposed pipeline achieves an F1-score of 98.2% for acacia tree detection and 96.3% for oil-palm tree detection with online orthophoto mosaicing generation. Full article
(This article belongs to the Special Issue Deep Learning in Remote Sensing Application)
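A later figure caption (Figure 8) explains that detections from neighbouring tiles are fused when their centre points fall within a preset merged distance, with d_α = 40 reported as the best choice. The sketch below is a simple greedy version of that idea, not the authors' implementation.

```python
import numpy as np
from scipy.spatial.distance import cdist

def merge_detections(points, merge_dist=40.0):
    """Fuse tree-centre detections that fall within merge_dist pixels of each other
    (e.g., duplicates along the boundaries of neighbouring tiles)."""
    points = np.asarray(points, dtype=float)
    merged, used = [], np.zeros(len(points), dtype=bool)
    dists = cdist(points, points)
    for i in range(len(points)):
        if used[i]:
            continue
        group = np.where((dists[i] < merge_dist) & (~used))[0]
        used[group] = True
        merged.append(points[group].mean(axis=0))   # replace the cluster by its centroid
    return np.array(merged)

dets = [(100, 100), (110, 105), (400, 380)]          # two near-duplicates plus one isolated tree
print(merge_detections(dets, merge_dist=40))          # -> two merged centres
```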
Figure 1
The algorithm presented in this paper is an integrated vision framework consisting of "real-time stitching" and "fast tree counting" modules. The framework can easily be deployed on the embedded system of a UAV, enabling online georeferenced stitching of large-scale, low-overlap aerial images and fast tree counting, which greatly improves work efficiency.
Figure 2
The blue curve represents the real terrain, the green line represents the full-scene plane assumption, and the orange lines represent the multi-planar assumption.
Figure 3
The relationship among the following parameters: flight speed V (m/s), flight height h (m), horizontal field of view (FOV) θ, overlapping rate α, and interval between consecutive photographs t (s). To guarantee the overlapping rate, these parameters must satisfy the equation shown in the figure.
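The equation referred to in this caption is not reproduced in the listing. Under the simplifying assumptions of a nadir-pointing camera over flat terrain, with the quoted FOV spanning the flight direction, a plausible form is: the image footprint is L = 2·h·tan(θ/2), the distance flown between shots is d = V·t, and keeping an overlap of at least α therefore requires V·t ≤ (1 − α)·2·h·tan(θ/2).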
Figure 4
Mosaic results of the proposed method on the acacia dataset. Some screenshots are highlighted to demonstrate the mosaic details. The stitching quality is generally high compared with the offline results of the state-of-the-art commercial software Pix4DMapper.
Figure 5
Orthophoto results of the proposed stitching method on the oil-palm dataset. The results show that the quality of live mosaicing is comparable to that of Pix4DMapper. Some details even show that the output live mosaic looks better than the offline-based commercial reconstruction software Pix4DMapper. The blue bounding box of the Pix4DMapper result shows a blurred area, which does not appear in the proposed method.
Figure 6
Images with a low overlapping rate (30%) collected for stitching comparison experiments between the proposed method (a) and Pix4DMapper (b).
Figure 7
Influence of fast-changing light conditions and a low overlapping rate on the proposed stitching method. The yellow box shows the stitching result under slight light changes, and the blue box shows the misaligned stitching result under a low overlapping rate and fast-changing conditions.
Figure 8
Description of the merged distance. If the center-point distance between two adjacent tree masks is less than the preset merged distance, two or more points on the boundaries of neighbouring tiles are fused into one point.
Figure 9
Precision, Recall, and F1 score under different values of d_α over the acacia and oil-palm confused dataset. A value of d_α = 40 is the best parameter choice for the following experiments.
Figure 10
Qualitative results of the tree counting network trained with different terms of the proposed loss function. (a) Test images selected from the acacia and oil-palm datasets. (b) Inference results using only the point loss. (c) Prediction results improved with the point loss and the first term of the separate loss. (d) Performance using both the point loss and the separate loss.
Figure 11
Examples of random annotations on the acacia and oil-palm datasets.
Figure 12
Annotation visualization for the different annotation types, including point, bounding box, and mask annotation, with the labelme toolbox.
16 pages, 24944 KiB  
Article
Spectrometry of the Urban Lightscape
by Christopher Small
Technologies 2022, 10(4), 93; https://doi.org/10.3390/technologies10040093 - 13 Aug 2022
Cited by 2 | Viewed by 2352
Abstract
NASA’s Gateway to Astronaut Photography of Earth contains over 30,000 photos of ~2500 cataloged urban lightscapes (anthropogenic night light) taken from the International Space Station. A subset of over 100 of these multispectral DSLR photos is of sufficient spatial resolution, sharpness and exposure to be potentially useful for broadband spectral characterization of urban lightscapes. Spectral characterization of multiple urban lightscapes can provide a basis for quantifying intra- and interurban variability in night light brightness, color and extent, as well as the potential for change analyses. A comparative analysis of simulated atmospheric transmissivity from the MODTRAN radiative transfer model indicates that the spectral slopes of transmissivity spectra are relatively insensitive to the model atmosphere, with variations in atmospheric path length and aerosol optical depth primarily affecting the bias of the spectrum rather than the slope. A mosaic of 18 intercalibrated, transmissivity-compensated RGB photos renders a spectral feature space bounded by four clearly defined spectral endmembers corresponding to white, yellow and red light sources, with brightness modulated by a dark background endmember. These four spectral endmembers form the basis of a linear spectral mixture model which can be inverted to provide estimates of the areal fraction of each endmember present within every pixel field of view. The resulting spectral feature spaces consistently show two distinct mixing trends extending from the dark endmember to flat spectrum (white–yellow) and warm spectrum (orange) sources. The distribution of illuminated pixels is strongly skewed toward a lower luminance background of warm spectrum street lighting, with brighter lights generally corresponding to point sources and major thoroughfares. Full article
(This article belongs to the Section Environmental Technology)
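The abstract describes inverting a four-endmember (white, yellow, red, dark) linear spectral mixture model to estimate areal fractions per pixel. The sketch below shows one standard way to do this with non-negative least squares; the endmember spectra are illustrative placeholders, not the values derived in the paper, and the sum-to-one normalisation is a simplification.

```python
import numpy as np
from scipy.optimize import nnls

# Endmember matrix: columns are white, yellow, red and dark RGB spectra (placeholder values).
E = np.array([[1.00, 0.95, 0.90, 0.02],   # R
              [1.00, 0.85, 0.35, 0.02],   # G
              [1.00, 0.45, 0.10, 0.02]])  # B

def unmix_pixel(rgb):
    """Estimate non-negative areal fractions of each endmember for one RGB pixel,
    then normalise so the fractions sum to one."""
    fractions, _ = nnls(E, np.asarray(rgb, dtype=float))
    total = fractions.sum()
    return fractions / total if total > 0 else fractions

print(unmix_pixel([0.60, 0.40, 0.15]))  # a mostly warm-spectrum (yellow/red) pixel
```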
Figure 1
Urban lightscape comparison of Beijing photographed at different dates, local times, view geometries, focal lengths, exposures and white balance settings. The 2010 and 2011 shots (top) were taken with the same camera (Nikon D3S) less than 4 months apart and differ primarily in lens focal length (hence, spatial resolution) and white balance temperature. The 2016 and 2020 shots (bottom) were taken with different cameras (Nikon D4 & D5) using the same lens, exposure and ISO and similar white balance temperature, but different local times and view geometries. The greater number and brightness of white lights within the four inner ring roads in the 2020 shot may be a combined result of the earlier local time and more oblique view geometry, as well as more illuminated facades and commercial lighting not seen in the near-nadir shot taken after midnight, local time. Arrows in the UL corners show north.
Figure 2
Spatially variable atmospheric scattering effects for Las Vegas (top) and Beijing (bottom). Translucent clouds over Las Vegas are more conspicuous because of overglow effects extending beyond the periphery of the light sources, as well as the longer exposure of the 2012 image. Spatially varying sharpness within the lighted area of Beijing is more subtle but, nonetheless, distorts both the brightness and spatial extent of individual light sources. Compare the sharpness of these Beijing images with the 2016 and 2020 shots shown in Figure 1. All 4 Beijing images were taken with the same 400 mm lens.
Figure 3
Exposure, ISO and white balance settings for 122 high-quality urban lightscape photos taken between 2003 and 2020. Most photos were taken with high ISO and relatively low shutter speed for the telephoto focal lengths of the lenses used (distribution at top). The distribution of white balance settings is skewed toward low temperatures, consistent with the widespread use of high- and low-pressure sodium light sources.
Figure 4
Atmospheric transmittance correction estimation. Nikon D3S spectral responses (bottom) are convolved with MODTRAN-derived atmospheric transmittance (top) for different visibilities and atmosphere models to produce response-weighted, channel-specific estimates (circles) for transmittance loss corrections (C_BGR).
Figure 5
Color temperature-calibrated mosaics of 18 cities. The 2200 K mosaic corresponds to the color temperature of a high-pressure sodium lamp, while the 5500 K mosaic corresponds to natural daylight, showing the preponderance of warm-spectrum light sources.
Figure 6
Three-dimensional spectral feature space and spectral fraction space for the 18-city composite (5500 K). Density-shaded orthogonal projections of the principal component distributions (left) show luminance corresponding to PC 1, with white, yellow and red endmembers bounding a triangular plane of maximum luminance perpendicular to the gray axis extending to the dark (K) endmember. Since very few pixels are pure red, the distribution trends toward orange mixtures near the red endmember. Projections of the WYRK fraction space (right) are largely bounded by linear mixtures, with a small number of blue pixels with slightly negative yellow or red fractions. The WYRK endmembers represent an optimized, more nearly orthogonal basis for the urban lightscape, as imaged by multispectral RGB sensors.
Figure 7
Spectral endmember fraction mosaics of 18 cities. The 5500 K mosaic is unmixed with a 4-endmember linear mixture model with white, yellow, red and dark endmembers. As the dark endmember fraction modulates luminance, an RGB composite of the R, Y and W fractions (top) resembles the 5500 K RGB composite in Figure 5. The RGB composite of Log10(R, Y, W) (bottom) partially offsets this dark fraction modulation to enhance the lower luminance mixed spectra. Both mosaics are displayed with a 1% linear stretch applied.
Figure 8
WYR fraction spaces for 8 contrasting lightscapes. The white–yellow projections show varying degrees of saturation on the binary mixing line, related to the varying exposures of different photographs. Regardless of exposure, all distributions are strongly skewed toward dimmer lights (near the origin), with distinct continua extending to warm-spectrum orange near the red endmember on the yellow–red projection.
Figure 9
WYR and Log10 WYR fraction composites for Paris illustrate the diversity of high-luminance colored point sources superimposed on the pervasive low-luminance warm-spectrum street lighting.
33 pages, 5349 KiB  
Article
A Multidisciplinary Approach for A Better Knowledge of the Benthic Habitat and Community Distribution in the Central and Western English Channel
by Jean-Claude Dauvin, Jean-Philippe Pezy, Emmanuel Poizot, Sophie Lozach and Alain Trentesaux
J. Mar. Sci. Eng. 2022, 10(8), 1112; https://doi.org/10.3390/jmse10081112 - 12 Aug 2022
Cited by 3 | Viewed by 1614
Abstract
About 80% of the seabed of the English Channel (EC) is covered by coarse sediment, from coarse sand to pebbles. Quantitative data on the benthic macrofauna in these types of sediment remain rare due to the difficulty of using grab corers on such hard substrates. The deepest central part of the EC (45–101 m depth) was prospected during two VIDEOCHARM surveys in June 2010 and June 2011 to increase knowledge of such sublittoral coarse-sediment benthic habitats. Sampling focussed on a longitudinal transect in the deepest part of the EC (13 boxes), extending from the western approach to the Greenwich meridian. Both indirect (side scan sonar, Remotely Operated Vehicle) and direct (grab sampling with benthos determination, and grain-size analyses) approaches were used and combined, permitting description of the benthic habitats and communities using seven methods. Five benthic EUNIS habitats (European Nature Information System) were reported: MC3215, MD3211, MC4, MC3212 and MC4215, of which two extended main habitats (MC3211 and M23212) corresponded to an eastern/western gradient from sandy gravel to sandy gravel and pebbles. Three other spatially discrete habitats were associated with poor coarse sand and gravel habitats as well as sandy gravel and pebbles with the presence of the brittle star Ophiothrix fragilis. The taxonomic richness of both extended habitats was of the same order of magnitude as the coarse sand habitat reported elsewhere in the EC, whilst abundances were among the lowest in deeper areas with low nutrient input and low primary production. The epifauna appeared relatively homogeneous in this type of sediment at the scale of the sampling area and was not used to assign a EUNIS habitat/class. ROV footage illustrated the presence of large epifauna and provided valuable ground-truth information for other sampling methods such as the side scan sonar mosaic. Grab photos showing the surface sediment were relevant for determining the sediment type, whilst granulometric analyses gave additional information on the fine-particle content (typically very low). Full article
(This article belongs to the Section Marine Biology)
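Several of the dendrograms listed below are built from Bray–Curtis similarities on Log(X + 1)-transformed abundances. A minimal sketch of that analysis with SciPy follows; the abundance matrix is invented, and the group-average linkage is an assumption (the linkage method is not stated in this listing).

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Rows = stations, columns = taxa; abundances per 0.5 m^2 (illustrative values only).
abund = np.array([[12, 0, 3, 40],
                  [10, 1, 2, 35],
                  [0, 25, 0, 1],
                  [1, 30, 0, 0]], dtype=float)

transformed = np.log1p(abund)                          # Log(X + 1) transformation
dist = pdist(transformed, metric="braycurtis")         # Bray-Curtis dissimilarity
tree = linkage(dist, method="average")                 # group-average (UPGMA) clustering
groups = fcluster(tree, t=0.76, criterion="distance")  # cut at 0.76 dissimilarity (~24% similarity)
print(groups)
```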
Figure 1
Location of the 13 boxes (1 to 13, blue rectangles) sampled during the VIDEOCHARM surveys in June 2010 and June 2011, with the map of the three main superficial sediment types in the English Channel: orange, pebbles and large gravel; yellow, gravel; and blue, sands and muds (from 31 in 29).
Figure 2
Side scan sonar profiles collected during the VIDEOCHARM surveys in June 2010 and June 2011 for the 13 boxes (1 to 13).
Figure 3
Five main sediment classes described during the VIDEOCHARM surveys in June 2010 and June 2011.
Figure 4
Examples of side scan sonar profiles collected during the VIDEOCHARM surveys in June 2010 ((a,b) boxes 1 and 3) and June 2011 ((c) box 13), with inset views of some areas of interest and snapshots of the surface sediments extracted from the ROV video footage.
Figure 5
Cluster dendrogram showing the pattern of the 40 grab sampling stations (abundance per 0.5 m² of the counted macrofauna retained on a 2 mm mesh sieve; 24% similarity) according to the Bray–Curtis similarity after Log(X + 1) transformation of the abundances.
Figure 6
Cluster dendrogram showing the pattern of the 40 grab sampling stations (abundance per 0.5 m² of the counted macrofauna retained on a 1 mm mesh sieve; 29% similarity) according to the Bray–Curtis similarity after Log(X + 1) transformation of the abundances.
Figure 7
Cluster dendrogram showing the pattern of the 40 grab sampling stations (presence/absence of all taxa recorded per 0.5 m² of the motile and sessile macrofauna; 35% similarity) according to the Sorensen similarity.
Figure 8
Cluster dendrogram showing the pattern of the 30 ROV sampling stations (motile and sessile taxa identified from the video; 35% similarity) according to the Bray–Curtis similarity calculated using three classes of abundance ((1) rare; (2) common; (3) abundant) and square-root transformation.
Figure 9
Side scan sonar profiles collected during the VIDEOCHARM surveys in June 2010 (box 4) over the SHOM sedimentary cover of the English Channel.
Figure 10
EUNIS habitat map from the library of marine habitat maps in European waters (EMODnet seabed habitats). VIDEOCHARM side scan sonar survey lines are shown with the corresponding box numbers.
22 pages, 2907 KiB  
Article
From Forest Dynamics to Wetland Siltation in Mountainous Landscapes: A RS-Based Framework for Enhancing Erosion Control
by Gonzalo Hernández-Romero, Jose Manuel Álvarez-Martínez, Ignacio Pérez-Silos, Ana Silió-Calzada, David R. Vieites and Jose Barquín
Remote Sens. 2022, 14(8), 1864; https://doi.org/10.3390/rs14081864 - 13 Apr 2022
Cited by 11 | Viewed by 3673
Abstract
Human activities have caused a significant change in the function and services that ecosystems have provided to society since historical times. In mountainous landscapes, the regulation of services such as water quality or erosion control has been impacted by land use and land cover (LULC) changes, especially the loss and fragmentation of forest patches. In this work, we develop a Remote Sensing (RS)-based modelling approach to identify areas for the implementation of nature-based solutions (NBS) (i.e., natural forest conservation and restoration) that reduce the vulnerability of aquatic ecosystems to siltation in mountainous regions. We used time series Landsat 5TM, 7ETM+, 8OLI and Sentinel 2A/2B MSI (S2) imagery to map forest dynamics and wetland distribution in Picos de Europa National Park (Cantabrian Mountains, northern Spain). We fed RS-based models with detailed in situ information based on photo-interpretation and fieldwork completed from 2017 to 2021. We estimated a forest cover increase rate of 2 ha/year by comparing current and past LULC maps against external validation data. We applied this forest gain to a scenario generator model to derive a 30-year future LULC map that defines the potential forest extent for the study area in 2049. We then modelled the distribution of wetlands to identify the areas with the greatest potential for moisture accumulation. We used an S2 mosaic and topography-derived data such as the slope and topographic wetness index (TWI), which indicate terrain water accumulation. Overall accuracy scores reached values of 86% for LULC classification and 61% for wetland mapping. At the same time, we obtained the potential erosion using the NetMap software to identify potential sediment production, transport and deposition areas. Finally, forest dynamics, wetland distribution and potential erosion were combined in a multi-criteria analysis aiming to reduce the amount of sediment reaching selected wetlands. We achieved this by identifying the most suitable locations for the conservation and restoration of natural forests on slopes and in riparian areas, which may reduce the risk of soil erosion and maximise sediment filtering, respectively. The results show a network pattern for forest management that would allow erosion effects to be controlled across space and time at three levels: one, by reducing the load that originates upslope in the absence of forest cover; two, by intercepting runoff at watercourses related to sediment transport; and three, where former barriers are lacking, by trapping sediment close to the receiving wetland systems, main river axes and contributing streams. In conclusion, the proposed methodology, which could be transferred to other mountain regions, makes it possible to optimise investment in erosion prevention and wetland conservation by using only very specific areas of the landscape for habitat management (e.g., for NBS implementation). Full article
(This article belongs to the Special Issue New Insights into Ecosystem Monitoring Using Geospatial Techniques)
Figure 1. (a) Study area location in the Cantabrian Mountains of northern Spain. (b) Shape of the area. (c) Detail showing current orthophotos of PNOA (CNIG; https://centrodedescargas.cnig.es, accessed on 15 September 2020).
Figure 2">
Figure 2. Flow diagram showing the methodological steps followed in this study, indicating the method's sections at each step.
Figure 3. Boxplot of the NDVI values of the reference points used to classify the July 2019 scene.
Figure 4. Example of wetland, GEP and area of influence in the study area for: (a) erosion at source, (b) transport and (c) sedimentation processes.
Figure 5. LULC map for the three scenarios in the study area: (a) 1989, (b) 2019 and (c) 2049, following the legend of Table 1: BLF broadleaf forest; CNF coniferous forest; SSH shrublands; AGR agricultural land, crops; GRA pasture and grassland; DEN denuded rock and bare soil; UHD urban and human-derived areas; WAE water ecosystem.
Figure 6. (a) Example of current forest distribution and future expansion. (b) Optimal areas to establish NBS related to forest conservation and restoration (e.g., rewilding) strategies.
11 pages, 2200 KiB  
Article
Underwater Hyperspectral Imaging of Arctic Macroalgal Habitats during the Polar Night Using a Novel Mini-ROV-UHI Portable System
by Natalie Summers, Geir Johnsen, Aksel Mogstad, Håvard Løvås, Glaucia Fragoso and Jørgen Berge
Remote Sens. 2022, 14(6), 1325; https://doi.org/10.3390/rs14061325 - 9 Mar 2022
Cited by 13 | Viewed by 3470
Abstract
We describe an Underwater Hyperspectral Imager (UHI) deployed on an instrument-carrying platform consisting of two interconnected mini-ROVs (Remotely Operated Vehicles) for the mapping and monitoring of Arctic macroalgal habitats in Kongsfjorden (Svalbard) during the Polar Night. The mini-ROV-UHI system is easy to transport, assemble and deploy from shore, even under the dark, icy and cold conditions of the Arctic Polar Night. The system can be operated by two persons, keeping the operational costs low. In vivo hyperspectral reflectance of collected specimens of brown, red and green macroalgae was measured with a spectrometer in the lab to provide a spectral library for supervised pigment group classification based on UHI photomosaics. The in situ UHI photomosaics provided detailed information on the areal coverage of the seafloor substrate (16%), as well as brown (51% habitat cover), red (18%), and green (14%) macroalgae, with a spatial resolution on the order of centimetres and a spectral resolution of 2 nm. The collected specimens from the mapped area were also used for species identification and health state evaluation. This innovative UHI sampling method provides significant information about macroalgal distribution and physiology, and due to its flexibility in terms of deployment, it is applicable to a variety of environments. Full article
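The supervised pigment-group classification described above starts from a spectral angle mapper (SAM) step (see Figure 5 below). The sketch that follows is a minimal illustration of SAM with a synthetic three-band library and image cube; the spectra, the band count and the angle threshold are assumptions for illustration, not the authors' data or settings.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a reference spectrum."""
    cos_theta = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def sam_classify(cube, library, max_angle=0.15):
    """Assign each pixel to the closest library spectrum, or -1 if no spectrum
    is within max_angle radians.

    cube    : (rows, cols, bands) reflectance image
    library : dict mapping class name -> (bands,) reference spectrum
    """
    rows, cols, _ = cube.shape
    names = list(library)
    labels = np.full((rows, cols), -1, dtype=int)
    for r in range(rows):
        for c in range(cols):
            angles = [spectral_angle(cube[r, c], library[n]) for n in names]
            best = int(np.argmin(angles))
            if angles[best] <= max_angle:
                labels[r, c] = best
    return labels, names

# Tiny synthetic example (three bands only, for illustration)
library = {"brown": np.array([0.10, 0.20, 0.40]),
           "red":   np.array([0.15, 0.10, 0.35]),
           "green": np.array([0.12, 0.30, 0.25])}
cube = np.random.rand(4, 4, 3)
labels, names = sam_classify(cube, library)
print(names)
print(labels)
```

In the article's workflow, pixels confidently labelled by SAM are then used as training data for a Support Vector Machine that classifies the remaining pixels (Figure 5).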
(This article belongs to the Section Ocean Remote Sensing)
Figure 1. (A,B) Geoposition of Svalbard. (C) The ROV-UHI study site outside the Marine Lab, Ny Ålesund in Kongsfjord. Current UHI survey site (red diamond) in front of the Marine Lab (black square). Aerial hyperspectral imaging study sites from [13] are shown as blue points (3–7). (D) ROV-UHI underneath pancake ice. Credits: (A,B) based on [14]; (C) N. Summers (modified from https://geokart.npolar.no/, accessed 16 September 2022) and (D) N. Summers.
Figure 2">
Figure 2. (A) Schematic front view of the double mini-ROV rig (Blueye Pioneer, Blueye) used as a carrier for an Underwater Hyperspectral Imager (UHI-4, Ecotone): 1. mini-ROV, 2. UHI, 3. altimeter, 4. underwater electronic housing, 5. buoyancy tubes (PVC tubes filled with incompressible foam). (B) Front view of the mini-ROV rig. Credits: (A) Malin Bø Nevstad, (B) Geir Johnsen.
Figure 3. In situ images of the mini-ROV-UHI survey. (A) Front view of the ROV-UHI rig showing illumination from the lamp during kelp forest mapping. (B) Side view of the mini-ROV-UHI over the kelp forest habitat during the transect. (C) Aft view of the ROV with lamp and UHI over the seafloor. Credits: (A–C) Geir Johnsen.
Figure 4. The 8 major macroalgal species sampled from the Kongsfjorden macroalgal habitat in January 2020. (A) Chlorophyte Ulva sp.; (B) Rhodophyte Palmaria palmata; (C) unknown Rhodophyte. (D–H) Phaeophytes: (D) Fucus distichus, (E) Laminaria digitata, (F) Saccharina latissima, (G) Desmarestia aculeata and (H) Alaria esculenta.
Figure 5. (A,B) Stages (1–3) for making a map of the macroalgal habitat. Stage 1: RGB visualisation of the UHI transect. Stage 2: application of the spectral angle mapper (SAM) algorithm based on the spectral library of brown, red and green macroalgal in vivo reflectance spectra taken in the lab (C). Stage 3: Support Vector Machine (SVM) classification of all remaining pixels into green, brown and red macroalgae and substrate, using the SAM-classified pixels as training data. (C) In vivo reflectance spectra (R(λ)) with standard error of the mean for each pigment group, measured with a spectrometer. Low R(λ) indicates high absorption at 650 nm (chl b) and 677 nm for green algae, 530–570 nm (phycoerythrin) and 680 nm (chl a) for red algae and, lastly, 535 nm (fucoxanthin), 630 nm (chl c) and 674 nm (chl a) for brown algae. (D) Percent areal cover of each OOI.
15 pages, 6495 KiB  
Article
Comparative Evaluation for Tracking the Capability of Solar Cell Malfunction Caused by Soil Debris between UAV Video versus Photo-Mosaic
by Young-Seok Hwang, Stephan Schlüter, Seong-Il Park and Jung-Sup Um
Remote Sens. 2022, 14(5), 1220; https://doi.org/10.3390/rs14051220 - 2 Mar 2022
Cited by 2 | Viewed by 2260
Abstract
Monitoring the malfunction of solar cells (for instance, 156 mm by 156 mm) caused by soil debris requires a very low flight altitude when taking aerial photos with the autopilot function of an unmanned aerial vehicle (UAV). The autopilot flight can only operate at an altitude that guarantees collision avoidance for flight obstacles (for instance, power lines, trees, buildings) adjacent to the place where the solar panel is installed. For this reason, aerial photos taken by autopilot flight capture a very large proportion of unnecessary objects (surrounding buildings and roads) around the solar panel. Therefore, autopilot-based thermal imaging suffers from severe data redundancy, with very few matched key-points around the malfunctioning solar cells. This study aims to compare the capability of UAV video and photo-mosaic imaging to track soil debris defects at the solar cell scale. We experimentally validated that video-based thermal imaging can track the thermal deficiency caused by solar cell malfunction at the level of the photo-mosaic in terms of the correlation of thermal signatures (0.98–0.99), detection of spatial patterns (81–100%), and distributional properties (90–95%), with 2.5–3.4 times more matched key-points on solar cells. The results of this study could serve as a valuable reference for employing video streams when investigating soil debris defects at the solar cell scale. Full article
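As a minimal illustration of how the reported correlation of thermal signatures between the two mosaics can be computed, the sketch below applies a Pearson correlation to matched hot-spot temperatures; the temperature values are invented placeholders, not measurements from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical solar-cell surface temperatures (deg C) extracted from matched
# hot spots in the photo-mosaic and the video-mosaic (placeholder values).
scst_photo = np.array([41.2, 43.5, 40.8, 45.1, 42.9, 44.0, 39.7, 46.3])
scst_video = np.array([41.0, 43.9, 40.5, 45.4, 42.6, 44.3, 39.9, 46.0])

r, p_value = stats.pearsonr(scst_photo, scst_video)
diff = scst_video - scst_photo
print(f"Pearson r = {r:.3f} (p = {p_value:.4f})")
print(f"mean difference = {diff.mean():.2f} degC, std = {diff.std(ddof=1):.2f} degC")
```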
Figure 1. Mosaic of thermal imagery and an in situ photo of the study site. The black box marks the location of the experimental targets. The magnified picture (left side) is the in situ photo of the experimental targets.
Figure 2">
Figure 2. The number of overlapping images in the thermal mosaic of the experimental site with the point cloud. The green areas indicate a degree of overlap of more than five images. The red and yellow areas indicate a low overlap, resulting in poor mosaic quality. (a) Photo-mosaic; (b) video-mosaic.
Figure 3. Differentiated distribution density of matched key-points between solar cells of the photo-mosaic and video-mosaic. (A) Photo-mosaic; (B) video-mosaic; (a1,b1) Testbed 1; (a2,b2) Testbed 2; (a3,b3) Testbed 3. A solar panel includes one or more solar modules assembled as a pre-wired, field-installable unit [48]. The typical solar module is an aggregation of 60 (6 by 10) or 72 (6 by 12) solar cells [1]. The magnified picture (red box) of the photo-mosaic (bottom-left) contains only two matched key-points, while the video (bottom-right) shows ten.
Figure 4. Hot spot boundaries on solar cells malfunctioning due to soil debris. (a1) Hot spot boundary (derived from photo-mosaic) overlaid on the field photo; (b1) hot spot boundary (derived from video-mosaic) overlaid on the field photo; (a2) hot spot boundary detected from the photo-mosaic; (b2) hot spot boundary detected from the video-mosaic. The quadrangle (black box) represents the magnified portion presented in Figure 5.
Figure 5. Comparison of the magnified image and hot spots between the photo-mosaic and video-mosaic. (a1) In situ photo and hot spot boundary detected from the photo-mosaic; (a2) photo-mosaic and hot spot boundary detected from the photo-mosaic; (b1) in situ photo and hot spots detected from the video-mosaic; (b2) video-mosaic and hot spot boundary detected from the video-mosaic. See the location of the image in Figure 4.
Figure 6. Scatter plots of SCSTs in matched hot spots between the photo-mosaic and video-mosaic. (a) Testbed 1; (b) Testbeds 2 and 3.
Figure 7. Absolute frequencies. The blue bar shows the mean-adjusted SCST values of the video-mosaic in matched hot spots. The yellow bar shows the mean-adjusted SCST values of the photo-mosaic in matched hot spots. (a) Testbed 1; (b) Testbeds 2 and 3.
22 pages, 14607 KiB  
Article
Functional Analysis for Habitat Mapping in a Special Area of Conservation Using Sentinel-2 Time-Series Data
by Simone Pesaresi, Adriano Mancini, Giacomo Quattrini and Simona Casavecchia
Remote Sens. 2022, 14(5), 1179; https://doi.org/10.3390/rs14051179 - 27 Feb 2022
Cited by 7 | Viewed by 3894
Abstract
The mapping and monitoring of natural and semi-natural habitats are crucial activities and are regulated by European policies and regulations, such as Directive 92/43/EEC. In the Mediterranean area, which is characterized by high vegetational and environmental diversity, the mapping and monitoring of habitats are particularly difficult and often exclusively based on in situ observations. In this scenario, it is necessary to automate the generation of updated maps to support the decisions of policy makers. At present, the availability of high spatiotemporal resolution data provides new possibilities for improving the mapping and monitoring of habitats. In this work, we present a methodology that, starting from remotely sensed time-series data, generates habitat maps using supervised classification supported by Functional Data Analysis. We built the methodology using Sentinel-2 data in the Mediterranean Special Area of Conservation “Gola di Frasassi” (code: IT5320003). In particular, the training set uses 308 field plots with 11 target classes (five forest, two shrubland, one grassland, one mosaic, one extensive crop, and one urban land class). Starting from vegetation index time-series data, Functional Principal Component Analysis (FPCA) was applied to derive FPCA scores and components; in the classification stage, the FPCA scores are used as features. The obtained results outperformed a previous map derived from photo-interpretation by domain experts, with an overall accuracy of 85.58% using vegetation index time series, topography, and lithology data. The main advantages of the proposed approach are its ability to efficiently compress high-dimensional data (dense remote-sensing time series) and to provide results in a compact form (e.g., FPCA scores and mean seasonal time profiles) that (i) facilitate the link between remote sensing and habitat mapping and monitoring and their ecological interpretation and (ii) could be complementary to species-based approaches in plant community ecology and phytosociology. Full article
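A rough sketch of the classification stage is given below. It approximates the FPCA step with an ordinary principal component analysis of discretely sampled vegetation index series, which is a simplification rather than the authors' implementation, and it uses synthetic plot data in place of the Sentinel-2 time series; the class labels, component count and forest size are placeholder choices.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in: 308 plots x 36 NDVI observations over the year,
# with 11 target classes (placeholder data, not the article's series).
n_plots, n_dates, n_classes = 308, 36, 11
X = rng.random((n_plots, n_dates))
y = rng.integers(0, n_classes, size=n_plots)

# PCA over the time dimension as a discrete approximation of FPCA scores
pca = PCA(n_components=5)
scores = pca.fit_transform(X)

X_train, X_test, y_train, y_test = train_test_split(scores, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_train, y_train)

# On random data the accuracy will be near chance; with real phenological
# series the scores separate the classes far better.
print("overall accuracy:", round(accuracy_score(y_test, clf.predict(X_test)), 3))
```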
(This article belongs to the Special Issue Use of Remote Sensing Techniques for Wildlife Habitat Assessment)
Figure 1. Supervised pipeline to derive plant associations and habitat maps from Sentinel-2 time-series using Functional Principal Component Analysis.
Figure 2">
Figure 2. Study area: (a) overview of the study area at the regional scale; (b) reference Digital Elevation Model with the boundary of the Gola di Frasassi (Gorge of Frasassi) Special Area of Conservation (SAC IT5320003); and (c) entry point to the Gorge of Frasassi (left: Mount Valmontangana; right: Mount Frasassi).
Figure 3. Plant associations and habitat maps of the SAC “Gola di Frasassi” (code IT5320003, Central Italy): (a) map obtained by supervised random forest classification of the main seasonal remotely sensed phenological variations together with the main topographic and lithological predictors; and (b) map derived from the SIT-REM of the Marche region [72], produced by experts through the traditional photo-interpretation phytosociological method. Legend numbers correspond to the plant associations and habitats listed in Table 1.
Figure A1. Number of components (and their fraction of variation explained) extracted by Functional Principal Component Analysis from the Sentinel-2 time-series: (a) NDVI, (b) GNDVI, (c) MCARI, (d) NDRE, (e) MNDWI, and (f) NDWI. Cumul. FVE is the cumulative Fraction of the Variance Explained.
Figure A2. Spatiotemporal pattern of mean seasonal spectral variations extracted by FPCA from the Sentinel-2 dense time-series. The first column shows the six vegetation index time-series (18,631 pixel-based time-series). FPCA treats each time-series as a single object of analysis (a function) and identifies the main contrasting modes of variation during the year between the functions (second and third columns) and the respective spatial patterns according to the FPCA scores (fourth and fifth columns). Only the first two components are shown in this figure, for illustrative purposes.
Figure A3. Seasonal temporal profiles of the target classes for the different spectral vegetation indices. The plotted MCARI values are scaled by a factor of 1000. The bold red line is the mean vegetation index value. The red polygon is the 10th–90th percentile. The black line is the mean vegetation index value of the whole study area (useful for appreciating the differences between each target class and the study-area mean). Row numbers correspond to the plant associations and habitats listed in Table 1.
19 pages, 6273 KiB  
Article
Quantifying the Intra-Habitat Variation of Seagrass Beds with Unoccupied Aerial Vehicles (UAVs)
by David M. Price, Stacey L. Felgate, Veerle A. I. Huvenne, James Strong, Stephen Carpenter, Chris Barry, Anna Lichtschlag, Richard Sanders, Abel Carrias, Arlene Young, Valdemar Andrade, Eliceo Cobb, Tim Le Bas, Hannah Brittain and Claire Evans
Remote Sens. 2022, 14(3), 480; https://doi.org/10.3390/rs14030480 - 20 Jan 2022
Cited by 12 | Viewed by 4272
Abstract
Accurate knowledge of the spatial extent of seagrass habitats is essential for monitoring and management purposes, given their ecological and economic significance. Extent data are typically presented in binary (presence/absence) or arbitrary, semi-quantitative density bands derived from low-resolution satellite imagery, which cannot resolve fine-scale features and intra-habitat variability. Recent advances in consumer-grade unoccupied aerial vehicles (UAVs) have improved our ability to survey large areas at higher resolution and at lower cost. This has improved the accessibility of mapping technologies to developing coastal nations, where a large proportion of the world’s seagrass habitats are found. Here, we present the application of UAV-gathered imagery to determine seagrass habitat extent and percent canopy cover. Four contrasting sites were surveyed in the Turneffe Atoll Marine Reserve, Belize, and seagrass canopy cover was ground-truthed from in situ quadrats. Orthomosaic images were created for each site from the UAV-gathered imagery. Three modelling techniques were tested to extrapolate the findings from quadrats to spatial information, producing binary (random forest) and canopy cover (random forest regression and beta regression) habitat maps. The most robust model (random forest regression) had an average absolute error of 6.8–11.9% (SE of 8.2–14), improving upon previous attempts at mapping seagrass density from satellite imagery, which achieved errors of approximately 15–20%. The resulting maps exhibited great intra-habitat heterogeneity and different levels of patchiness, which were attributed to site energetics and, possibly, species composition. The extra information in the canopy cover maps provides greater detail for key management decisions and forms the basis for future spatial studies and monitoring programmes. Full article
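The random forest regression from quadrat observations to percent canopy cover can be sketched as follows; the per-quadrat features (mean RGB statistics) and the cover values are synthetic placeholders rather than the survey data, and the feature choice is an assumption for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in for the ground-truth quadrats: simple orthomosaic colour
# statistics per quadrat and the observed seagrass cover in percent.
n_quadrats = 120
X = rng.random((n_quadrats, 3))  # e.g. mean R, G, B reflectance per quadrat
cover = np.clip(100 * (1 - X[:, 1]) + rng.normal(0, 5, n_quadrats), 0, 100)

X_train, X_test, y_train, y_test = train_test_split(X, cover, test_size=0.25, random_state=1)
reg = RandomForestRegressor(n_estimators=500, random_state=1)
reg.fit(X_train, y_train)
pred = np.clip(reg.predict(X_test), 0, 100)
print("mean absolute error (% cover):", round(mean_absolute_error(y_test, pred), 1))
```

Applying a fitted regressor of this kind across the orthomosaic, pixel by pixel or patch by patch, is what turns quadrat observations into the continuous canopy cover maps shown in the figures below.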
Graphical abstract
Figure 1. Map of Turneffe Atoll and the sites surveyed with a UAV. The Turneffe Atoll location is represented by the red square in the regional inset.
Figure 2">
Figure 2. Images of seagrass data collection: (a) in situ image of a quadrat; (b) image of a quadrat from a UAV flying at an altitude of 80 m; (c) high-resolution orthomosaics with some quadrats in view; and (d) same view as panel (c) at 50 cm per pixel.
Figure 3. Site A. (a) Orthomosaic: red dots represent ground truth samples; (b) binary output of seagrass presence based on a random forest classifier; (c) percentage cover of seagrass based on a random forest regression, presented in percentile classes; and (d) percentage cover of seagrass based on a beta regression, presented in percentile classes. Maps of continuous percentage cover of seagrass are presented in the Supplementary Materials.
Figure 4. Site B. (a) Orthomosaic: red dots represent ground truth samples; (b) binary output of seagrass presence based on a random forest classifier; (c) predicted percentage cover of seagrass based on a random forest regression, presented in percentile classes; and (d) predicted percentage cover of seagrass based on a beta regression, presented in percentile classes. Maps of continuous percentage cover of seagrass are presented in the Supplementary Materials.
Figure 5. Site C. (A) Orthomosaic: red dots represent ground truth samples; (B) binary output of seagrass presence based on a random forest classifier; (C) predicted percentage cover of seagrass based on a random forest regression, presented in percentile classes; and (D) predicted percentage cover of seagrass based on a beta regression, presented in percentile classes. Maps of continuous percentage cover of seagrass are presented in the Supplementary Materials.
Figure 6. Site D. (a) Orthomosaic: red dots represent ground truth samples; (b) binary output of seagrass presence based on a random forest classifier; (c) predicted percentage cover of seagrass based on a random forest regression, presented in percentile classes; and (d) predicted percentage cover of seagrass based on a beta regression, presented in percentile classes. Maps of continuous percentage cover of seagrass are presented in the Supplementary Materials.
Figure 7. Pie charts to represent the area of seagrass presence and percentage coverage as predicted by a random forest classifier, random forest regression, and a beta regression at four sites on Turneffe Atoll. Continuous data are represented within percentiles for clarity.
18 pages, 4021 KiB  
Article
Topographic Variation in Forest Expansion Processes across a Mosaic Landscape in Western Canada
by Larissa Robinov, Chris Hopkinson and Mark C. Vanderwel
Land 2021, 10(12), 1355; https://doi.org/10.3390/land10121355 - 8 Dec 2021
Cited by 2 | Viewed by 2500
Abstract
Changes to historic fire and grazing regimes have been associated with the expansion of tree cover at forest–grassland boundaries. We evaluated forest expansion across a mosaic landscape in western Canada using aerial photos, airborne laser scanning, and field transects. The annual rate of forest expansion (0.12%) was on the low end of rates documented across North America and was greater from the 1970s to the 1990s than from the 1990s to 2018. Most forest expansion occurred within 50 m of established forests, and 68% of all tree regeneration in grasslands was within 15 m of the forest edge. The intensity of cattle grazing did not affect the tree regeneration density. Despite the slow pace of land cover change, grassland areas near the forest edge had an average of 20% canopy cover and 9 m canopy height, indicating the presence of tall but sporadic trees. The rate of forest expansion, density of tree regeneration, and tree cover within grasslands were all greater at lower elevations where trembling aspen (Populus tremuloides) and white spruce (Picea glauca) were the dominant tree species. We conclude that proportions of forest–grassland cover on this landscape are not expected to change dramatically in the absence of major fire over the next several decades. Full article
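As a hedged illustration of how a distance-dependent transition probability such as the one mapped in Figure 2 below might be modelled, the sketch fits a logistic regression of 20-year grassland-to-forest conversion on distance to the forest edge; the cell data and coefficients are synthetic and do not reproduce the study's models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Synthetic stand-in: distance of grassland cells to the forest edge (m) and
# whether each cell converted to forest over a 20-year interval.
distance = rng.uniform(0, 200, size=2000)
p_true = 1 / (1 + np.exp(0.08 * (distance - 40)))  # conversion less likely far from the edge
converted = (rng.random(2000) < p_true).astype(int)

model = LogisticRegression()
model.fit(distance.reshape(-1, 1), converted)

# Predicted 20-year transition probability at selected distances from the edge
for d in (5, 25, 50, 100):
    p = model.predict_proba([[d]])[0, 1]
    print(f"{d:>3} m from edge: P(transition) = {p:.2f}")
```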
(This article belongs to the Section Landscape Ecology)
Figure 1. Land cover type and data coverage within Cypress Hills Interprovincial Park. Panels show example areas of land cover change (A) on the western plateau, (B) in the Battle Creek valley, and (C) on the eastern plateau.
Figure 2">
Figure 2. Map of the 20-year probability of grassland cells transitioning to forest in (A) time interval 1 and (B) time interval 2. Example areas of forest transition probability are shown for interval 1 (C–E) and interval 2 (F–H). Panels represent the same areas as in Figure 1: (C,F) on the western plateau, (D,G) in the Battle Creek valley, (E,H) on the eastern plateau.
Figure 3. Changes in the 20-year probability of land cover change with increasing distance from the forest edge in interval 1 (A,C,E) and interval 2 (B,D,F). Lines and shaded regions represent the mean and 95% credible interval, respectively, of posterior estimates.
Figure 4. Distribution of (A) upper canopy height and (B) canopy cover across spatial clusters (averaging 1 ha in size) of different land cover classes. Lower and upper bounds of the box represent the 25% and 75% quantiles, respectively. Upper and lower whiskers extend to the largest and smallest values, respectively, but no more than 1.5 times the inter-quartile range. The central line represents the median value.

Relationships of canopy height and cover to topographic variables varied between the different forest and grassland classes (Table 4). The models explained almost half the variation in forests established between the 1970s and 1990s, but less variation was explained for the other two categories (R² = 0.03–0.47), and only a few consistent patterns emerged.

Figure 5. Effect of distance on regeneration abundance for individual transects. Note that the y-axis is on a logarithmic scale.

Species-specific regeneration patterns generally followed the topographic distribution of each tree species (Figure 6). White spruce regeneration was highest at low elevations and on steeper slopes with east and north aspects. Trembling aspen regeneration was highest at low elevations but did not vary much with slope or aspect. Lodgepole pine regeneration was greatest at high elevations and on southerly aspects. The models showed no topographic effects on how far lodgepole pine or trembling aspen regeneration extended into the grasslands (Figure 6). There was some indication that white spruce regeneration did not drop off as strongly with distance at higher elevations and southerly aspects, but the predicted density of regeneration at 25–50 m from the edge was negligible.

Figure 6. Topographic influence on regeneration density at 5 m, 25 m, and 50 m from the forest edge for white spruce (A–C), trembling aspen (D–F), and lodgepole pine (G–I). Lines represent the posterior estimates and shaded areas around the lines represent the 95% credible interval. Panels where lines are not parallel to one another indicate that the effect of distance from the forest edge varies topographically (A,C). Note that the y-axis is on a logarithmic scale and ranges differ between panels.
15 pages, 6443 KiB  
Article
Evaluating the Correlation between Thermal Signatures of UAV Video Stream versus Photomosaic for Urban Rooftop Solar Panels
by Young-Seok Hwang, Stephan Schlüter, Jung-Joo Lee and Jung-Sup Um
Remote Sens. 2021, 13(23), 4770; https://doi.org/10.3390/rs13234770 - 25 Nov 2021
Cited by 3 | Viewed by 2345
Abstract
An unmanned aerial vehicle (UAV) autopilot flight surveying urban rooftop solar panels needs a flight altitude high enough to avoid obstacles such as high-rise buildings, street trees, telegraph poles, etc. For this reason, autopilot-based thermal imaging suffers from severe data redundancy: non-solar-panel areas occupy more than 99% of the ground target, causing a serious lack of thermal markers on solar panels. This study aims to explore the correlations between the thermal signatures of urban rooftop solar panels obtained from a UAV video stream and from an autopilot-based photomosaic. The thermal signatures of video imaging are strongly correlated (0.89–0.99) with those of the autopilot-based photomosaics. Furthermore, the differences in the thermal signatures of solar panels between the video and the photomosaic lie within the range of the noise-equivalent differential temperature at a 95% confidence level. The results of this study could serve as a valuable reference for employing video stream-based thermal imaging for urban rooftop solar panels. Full article
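The check that the video-versus-photomosaic temperature differences stay within the noise-equivalent differential temperature (NEDT) at the 95% confidence level can be sketched as below; the difference values and the NEDT figure are assumptions for illustration, not the study's measurements.

```python
import numpy as np
from scipy import stats

# Hypothetical differences (deg C) between solar-panel surface temperatures
# extracted from the video mosaic and from the photomosaic (placeholder values).
diff = np.array([0.05, -0.02, 0.08, -0.04, 0.01, 0.06, -0.03, 0.02, 0.04, -0.01])
nedt = 0.1  # assumed sensor NEDT in deg C

mean, sem = diff.mean(), stats.sem(diff)
ci_low, ci_high = stats.t.interval(0.95, df=len(diff) - 1, loc=mean, scale=sem)
print(f"mean difference = {mean:.3f} degC, 95% CI = [{ci_low:.3f}, {ci_high:.3f}]")
print("95% CI within +/- NEDT:", ci_low >= -nedt and ci_high <= nedt)
```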
Figure 1. UAV thermal photomosaic of the experimental site processed with Pix4D Mapper, and images obtained from a UAV operated in autopilot mode as well as in manual mode (video): (a) autopilot-based thermal photomosaic; (b) 7.5 frames/s (15 frames/2 s); (c) 1 frame/s (1 frame/1 s); (d) 0.5 frames/s (1 frame/2 s).
Figure 2">
Figure 2. The number of overlapping images in the thermal mosaic of the experimental site with the point cloud. The green areas indicate a degree of overlap of more than five images. The red and yellow areas indicate a low degree of overlap, resulting in poor mosaic quality: (a) photomosaic; (b) 7.5 frames/s (15 frames/2 s); (c) 1 frame/1 s; (d) 0.5 frames/s (1 frame/2 s).
Figure 3. Locations of ground control points (GCPs) for building the photomosaics of the study area: (a) UAV video visible infrared (VIR) frame mosaic; (b) UAV video thermal infrared (TIR) frame mosaic.
Figure 4. Scatter plots of SPSTs detected from the photomosaic (x-axis) and video mosaic (y-axis): (a) 7.5 frames/s (15 frames/2 s); (b) 1 frame/1 s; (c) 0.5 frames/s (1 frame/2 s).
Figure 5. QQ plot of sample data versus the standard normal. The x-axis presents the standard normal quantiles, while the y-axis presents the quantiles of the input sample: (a) temperature differences between SPST_p and SPST_7.5frames; (b) temperature differences between SPST_p and SPST_1frame; (c) temperature differences between SPST_p and SPST_0.5frame.
Figure 6. Results of the Gaussian test of the differences between the SPSTs obtained from the video and photomosaics. The blue line represents the kernel density, while the orange line represents the theoretical Gaussian density with the estimated empirical mean and standard deviation: (a) SPST_p versus SPST_7.5frames; (b) SPST_p versus SPST_1frame; (c) SPST_p versus SPST_0.5frame.