Search Results (1,740)

Search Parameters:
Keywords = radar remote sensing

18 pages, 6889 KiB  
Article
Machine Learning-Based Detection of Icebergs in Sea Ice and Open Water Using SAR Imagery
by Zahra Jafari, Pradeep Bobby, Ebrahim Karami and Rocky Taylor
Remote Sens. 2025, 17(4), 702; https://doi.org/10.3390/rs17040702 - 19 Feb 2025
Abstract
Icebergs pose significant risks to shipping, offshore oil exploration, and underwater pipelines. Detecting and monitoring icebergs in the North Atlantic Ocean, where darkness and cloud cover are frequent, is particularly challenging. Synthetic aperture radar (SAR) serves as a powerful tool to overcome these difficulties. In this paper, we propose a method for automatically detecting and classifying icebergs in various sea conditions using C-band dual-polarimetric images from the RADARSAT Constellation Mission (RCM) collected throughout 2022 and 2023 across different seasons from the east coast of Canada. This method classifies SAR imagery into four distinct classes: open water (OW), which represents areas of water free of icebergs; open water with target (OWT), where icebergs are present within open water; sea ice (SI), consisting of ice-covered regions without any icebergs; and sea ice with target (SIT), where icebergs are embedded within sea ice. Our approach integrates statistical features capturing subtle patterns in RCM imagery with high-dimensional features extracted using a pre-trained Vision Transformer (ViT), further augmented by climate parameters. These features are classified using XGBoost to achieve precise differentiation between these classes. The proposed method achieves a low false positive rate of 1% for each class and a missed detection rate ranging from 0.02% for OWT to 0.04% for SI and SIT, along with an overall accuracy of 96.5% and an area under curve (AUC) value close to 1. Additionally, when the classes were merged for target detection (combining SI with OW and SIT with OWT), the model demonstrated an even higher accuracy of 98.9%. These results highlight the robustness and reliability of our method for large-scale iceberg detection along the east coast of Canada.
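The hybrid feature pipeline this abstract describes can be illustrated with a minimal sketch. This is not the authors' code: it assumes 224×224 three-channel patches, a ViT-Base/16 backbone from timm, four placeholder patch statistics, and unspecified climate parameters; in practice the inputs would also be normalized per the backbone's configuration.

```python
# Sketch: pre-trained ViT embeddings + simple per-patch statistics + climate
# parameters, classified with XGBoost. Backbone choice and statistics are
# illustrative assumptions, not the paper's exact configuration.
import numpy as np
import torch
import timm
from xgboost import XGBClassifier

vit = timm.create_model("vit_base_patch16_224", pretrained=True, num_classes=0).eval()

def extract_features(patch_rgb, climate):
    """patch_rgb: (224, 224, 3) float array; climate: 1D array of climate parameters."""
    stats = np.array([patch_rgb.mean(), patch_rgb.std(),
                      patch_rgb.min(), patch_rgb.max()])   # simple statistical features
    x = torch.from_numpy(patch_rgb).permute(2, 0, 1)[None].float()
    with torch.no_grad():
        emb = vit(x).squeeze(0).numpy()                    # (768,) ViT embedding
    return np.concatenate([stats, emb, climate])

# X: stacked feature vectors, y: labels {0: OW, 1: OWT, 2: SI, 3: SIT}
# clf = XGBClassifier(n_estimators=300, learning_rate=0.1).fit(X_train, y_train)
```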
Figures:
Figure 1: Distribution of targets over date and location.
Figure 2: Four sample RGB images from the RCM dataset, where Red = HH, Green = HV, and Blue = (HH − HV)/2. (A,B) depict OW and SI, while (C,D) show icebergs in OW and SI. Red circles highlight icebergs; other bright pixels represent clutter or sea ice.
Figure 3: Block diagram illustrating the proposed system.
Figure 4: The impact of despeckling on iceberg images in the HH channel from the SAR dataset, using mean, bilateral, and Lee filters.
Figure 5: (A) Feature #780 exhibits the most overlap and is considered a weak feature. (B) In contrast, feature #114 is the strongest feature, displaying the least overlap.
Figure 6: ROC curves for the evaluated models: (A) ViTFM, (B) StatFM, (C) ViTStatFM, and (D) ViTStatClimFM. The curves illustrate the classification performance across the OW, OWT, SI, and SIT categories.
Figure 7: Confusion matrices depicting the classification performance of the hybrid model with climate features: (A) all four classes, (B) target-containing patches versus those without targets, and (C) sea ice (SI and SIT) versus open water (OW and OWT).
Figure 8: Application of the proposed method to a calibrated RCM image acquired on 23 June 2023. (A) The RCM image overlaid on the Labrador coast. (B) Corresponding ice chart from the Canadian Ice Service for the same region and date. (C) Probability map for OW. (D) Probability map for SI. (E) Probability map for OWT. (F) Probability map for SIT.
Figure 9: An extracted section from the full RCM image captured on 23 June 2023, showing icebergs embedded in SI. Red triangles indicate ground truth points; green circles represent model predictions.
Figure 10: Missed targets located near patch borders, illustrating boundary effects. (A) A missed target near the top-left patch border. (B) A missed target within a central region affected by boundary artifacts. (C) A missed target near the bottom-right patch border, highlighting prediction inconsistencies at patch edges.
25 pages, 4721 KiB  
Article
Human Respiration and Motion Detection Based on Deep Learning and Signal Processing Techniques to Support Search and Rescue Teams
by Özden Niyaz, Mehmet Ziya Erenoğlu, Ahmet Serdar Türk, Sultan Aldirmaz Colak, Burcu Erkmen and Nurhan Türker Tokan
Appl. Sci. 2025, 15(4), 2097; https://doi.org/10.3390/app15042097 - 17 Feb 2025
Abstract
The quick and effective detection of humans trapped under debris is crucial in search and rescue operations. This study explores the use of antennas operating within the 150–650 MHz frequency range to identify human respiration and movement under building wreckage. A debris model consisting of construction materials was constructed in the laboratory, and its attenuation characteristics were measured to determine ideal operating frequencies. Time-dependent transmission coefficient data were collected over 20 s and processed using the short-time Fourier transform, wavelet transform, and empirical mode decomposition for time-frequency analysis. To enhance signal clarity, denoising techniques were applied before the radar signals were categorized into three classes: empty debris, human respiration, and human movement. Generative adversarial networks augmented environmental noise data to enrich training datasets comprising nine subsets. Deep learning models, including temporal convolutional networks, long short-term memory networks, and convolutional neural networks, were employed for classification. Hyperparameter optimization via random search further refined model performance. Results indicate that the convolutional neural networks using short-time Fourier transform data consistently achieved the highest classification accuracy across subsets. These findings demonstrate the potential of combining radar with deep learning for reliable human detection under debris, advancing rescue efforts in disaster scenarios.
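The STFT-plus-CNN route the abstract reports as the most accurate can be sketched as follows. The sampling rate, window length, and network sizes here are illustrative assumptions, not the paper's values.

```python
# Sketch: turn a 20 s S21 time series into an STFT image and classify it with a
# small CNN into {empty debris, respiration, movement}.
import torch
import torch.nn as nn
from scipy.signal import stft

fs = 50  # assumed samples/s of the measured S21 series (illustrative)

def stft_image(sig):
    """sig: 1D numpy array of the measured signal."""
    _, _, Zxx = stft(sig, fs=fs, nperseg=64)
    return torch.log1p(torch.abs(torch.from_numpy(Zxx)).float())[None]  # (1, F, T)

cnn = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(), nn.Linear(16, 3),  # 3 classes
)
# logits = cnn(stft_image(sig)[None])  # add a batch dimension before the forward pass
```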
Figures:
Figure 1: Radar-based human detection system.
Figure 2: Physically constructed debris model in our laboratory. (a) Front view. (b) Side view.
Figure 3: Variation of the raw measured magnitude and phase of the S21 signals at 405 MHz for different scenarios. (a) Empty debris. (b) Respiration. (c) Movement. (d) Combined plots.
Figure 4: Analysis of the S21 signals measured for different orientation angles of the subject: (a) FT of the measured phase of the S21 signals at 405 MHz. (b) Average standard deviation of the phase and magnitude at different frequencies.
Figure 5: STFT images of the measured signal for 20 s: (a) when the debris is empty; (b–f) a subject inside the debris with orientation φ = 0°, 45°, 90°, 180°, and 270°, respectively.
Figure 6: CWT images of the measured signal for 20 s: (a) when the debris is empty; (b–f) a subject inside the debris with orientation φ = 0°, 45°, 90°, 180°, and 270°, respectively.
Figure 7: Selected IMFs of the EMD analysis for a measured signal of a moving subject inside the debris with a φ = 0° orientation angle.
Figure 8: Comparison of original and GAN-augmented noise signals (magnitude).
Figure 9: Comparison of original and GAN-augmented noise signals (phase).
Figure 10: LSTM architecture: (a) network; (b) memory block with one cell.
Figure 11: Convolutional neural network topology.
Figure 12: Confusion matrices of the DL models for Subset 3: (a) LSTM. (b) TCN.
Figure 13: Confusion matrices of the DL models for Subset 4: (a) LSTM. (b) TCN. (c) CNN (with STFT data). (d) CNN (with CWT data).
20 pages, 4530 KiB  
Article
Mapping Forest Aboveground Biomass Using Multi-Source Remote Sensing Data Based on the XGBoost Algorithm
by Dejun Wang, Yanqiu Xing, Anmin Fu, Jie Tang, Xiaoqing Chang, Hong Yang, Shuhang Yang and Yuanxin Li
Forests 2025, 16(2), 347; https://doi.org/10.3390/f16020347 - 15 Feb 2025
Abstract
Aboveground biomass (AGB) serves as an important indicator for assessing the productivity of forest ecosystems and exploring the global carbon cycle. However, accurate estimation of forest AGB remains a significant challenge, especially when integrating multi-source remote sensing data, and the effects of different feature combinations on AGB estimation are unclear. In this study, we proposed a method for estimating forest AGB by combining Gao Fen 7 (GF-7) stereo imagery with data from Sentinel-1 (S1), Sentinel-2 (S2), the Advanced Land Observing Satellite digital elevation model (ALOS DEM), and field survey data. The continuous tree height (TH) feature was derived using GF-7 stereo imagery and the ALOS DEM. Spectral features were extracted from S1 and S2, and topographic features were extracted from the ALOS DEM. Using these features, 15 feature combinations were constructed. The recursive feature elimination (RFE) method was used to optimize each feature combination, which was then input into the extreme gradient boosting (XGBoost) model for AGB estimation. The different feature combinations used to estimate forest AGB were compared. The best model was selected for mapping the AGB distribution at 30 m resolution. The outcomes showed that the forest AGB model was composed of 13 features, including TH, topographic features, and spectral features extracted from S1 and S2 data. This model achieved the best prediction performance, with a determination coefficient (R2) of 0.71 and a root mean square error (RMSE) of 18.11 Mg/ha. TH was found to be the most important predictive feature, followed by S2 optical features, topographic features, and S1 radar features.
(This article belongs to the Section Forest Inventory, Modeling and Remote Sensing)
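The RFE-plus-XGBoost step described above can be sketched briefly. This is a minimal illustration under assumed hyperparameters; the feature matrix layout is a placeholder, and only the final feature count (13) comes from the abstract.

```python
# Sketch: recursive feature elimination over a candidate feature set, then an
# XGBoost regressor for AGB. Hyperparameters are illustrative assumptions.
from sklearn.feature_selection import RFE
from sklearn.metrics import mean_squared_error, r2_score
from xgboost import XGBRegressor

# X: (n_plots, n_features) matrix of TH, spectral (S1/S2), and topographic features
# y: field-measured AGB in Mg/ha
def fit_agb_model(X, y, n_keep=13):
    selector = RFE(XGBRegressor(n_estimators=400), n_features_to_select=n_keep)
    selector.fit(X, y)
    model = XGBRegressor(n_estimators=400).fit(X[:, selector.support_], y)
    return model, selector.support_

# model, mask = fit_agb_model(X_train, y_train)
# pred = model.predict(X_test[:, mask])
# print(r2_score(y_test, pred), mean_squared_error(y_test, pred) ** 0.5)
```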
Figures:
Figure 1: True-color image map of the study area. The red and blue lines show the spatial coverage of the forward and backward images of GF-7, respectively. The red, green, and blue triangular markers indicate the locations of the 2012, 2022, and 2024 sampling data, respectively.
Figure 2: Flowchart of the overall workflow for estimating forest AGB using combined multi-source remote sensing data.
Figure 3: Distribution of GCPs and TPs used for DSM generation. (a) Ground control points. (b) Tie points.
Figure 4: Performance of XGBoost models for AGB estimation using different feature combinations. (a) TS1S2D, (b) TS2D, (c) TS1S2.
Figure 5: Relative importance ranking of features based on the XGBoost model built with the TS1S2D feature combination.
Figure 6: Spatial distribution of AGB in the research area. (a) Distribution of AGB predicted by the XGBoost model with the TS1S2D combination; (b) S2 true-color image of the region within the red box; (c) zoomed-in view of the red box in the AGB distribution map.
Figure 7: Forest AGB estimation based on the XGBoost model and TS1S2D feature combination. (a) Coniferous forests. (b) Broadleaf forests.
Figure 8: Feature importance rankings for different forest types based on the XGBoost model and TS1S2D feature combination. (a) Coniferous forests. (b) Broadleaf forests.
Figure 9: AGB difference maps. (a) TS2D − TS1S2D. (b) TS1S2 − TS1S2D.
Figure 10: Comparison of AGB distributions in this study and published datasets (Zhang et al. [41], Yang et al. [44], Chang et al. [52]). The horizontal line in each box plot represents the median, the black dot indicates the mean, and the width of the violin plot reflects the data proportion.
22 pages, 6150 KiB  
Article
An Unambiguous Super-Resolution Algorithm for TDM-MIMO-SAR 3D Imaging Applications on Fast-Moving Platforms
by Sheng Guan, Mingming Wang, Xingdong Liang, Yunlong Liu and Yanlei Li
Remote Sens. 2025, 17(4), 639; https://doi.org/10.3390/rs17040639 - 13 Feb 2025
Abstract
Multiple-Input Multiple-Output (MIMO) radar enjoys the advantages of a high degree of freedom and a relatively large virtual aperture, so it has found applications in several areas such as remote sensing, autonomous driving, and radar imaging. Among all multiplexing schemes, Time-Division Multiplexing (TDM)-MIMO radar has attracted wide interest, as its hardware is simple, low-cost, and easy to implement. However, the time-division nature of TDM-MIMO leads to a dilemma between a lower Pulse Repetition Interval (PRI) and more transmitters, as the PRI of a TDM-MIMO system is proportional to the number of transmitters, while the number of transmitters significantly affects the resolution of MIMO radar. Moreover, a high PRI is often needed to obtain unambiguous imaging results for MIMO-SAR 3D imaging applications on a fast-moving platform such as a car or an aircraft. Therefore, it is of vital importance to develop an algorithm which can achieve unambiguous TDM-MIMO-SAR 3D imaging even when the PRI is low. Inspired by the motion compensation problem associated with TDM-MIMO radar imaging, this paper proposes a novel imaging algorithm which utilizes the phase shift induced by the time-division nature of TDM-MIMO radar to achieve unambiguous MIMO-SAR 3D imaging. A 2D-Compressed Sensing (CS)-based method is employed, and the proposed method, called HPC-2D-FISTA, is verified with simulation data. Finally, a real-world experiment is conducted to show the unambiguous imaging ability of the proposed method compared with the ordinary matched-filter-based method. The effect of velocity error is also analyzed with simulation results.
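For context, the CS recovery at the core of such methods is typically solved with FISTA. Below is a generic, real-valued FISTA sketch for the l1-regularized least-squares problem; the paper's HPC-2D-FISTA additionally performs hybrid phase compensation and operates on 2D complex data, which is not reproduced here.

```python
# Generic FISTA sketch: minimize ||A x - b||^2 / 2 + lam * ||x||_1.
import numpy as np

def soft(x, t):
    """Soft-thresholding (proximal operator of the l1 norm)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, b, lam, n_iter=200):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1]); z = x.copy(); t = 1.0
    for _ in range(n_iter):
        x_new = soft(z - A.T @ (A @ z - b) / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + (t - 1) / t_new * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x
```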
Figures:
Figure 1: The geometry of MIMO-SAR 3D imaging: (a) a typical example of an airborne MIMO-SAR application with a linear MIMO array; (b) a simplified point-target imaging scenario.
Figure 2: The geometry of the virtual array approximation for TDM-MIMO-SAR 3D imaging.
Figure 3: The three-dimensional data cube of the echo signal is divided into N_s separate range bins after fast-time pulse compression.
Figure 4: The measured phase of the echo signal in the same range bin and the same TDM period of one point target after pulse compression: (a) the measured phase of 32 virtual elements for z_p = 0.5 m and y_p = 0 m; (b) the measured phase of 32 virtual elements for z_p = 0.5 m and y_p = 1 m.
Figure 5: The matched-filter-based imaging scheme with phase compensation: (a) the entire processing scheme; (b) the processing of the echo signal of one point target.
Figure 6: The proposed imaging scheme: (a) the entire processing scheme; (b) the processing of the echo signal of one point target.
Figure 7: Simulation results showing the difference between real targets and gating lobes: (a) the 2D-CS results for gating lobes; (b) the 2D-CS results for three real targets; (c) the extracted main diagonal of the results in (a); (d) the extracted main diagonal of the results in (b).
Figure 8: Simulation comparison of the MF-based method and the proposed method: (a) the locations of the point targets; (b) the 2D projection of the point targets; (c) MF-based imaging results at a platform velocity of 10 m/s; (d) proposed-method results at 10 m/s; (e) MF-based results at 25 m/s; (f) proposed-method results at 25 m/s; (g) MF-based results at 50 m/s; (h) proposed-method results at 50 m/s.
Figure 9: Simulation results demonstrating the effect of velocity error, with a true platform speed of 50 m/s: (a) the locations of the point targets; (b) the 2D projection of the point targets; (c) results with a measured velocity of 50 m/s; (d) 47.5 m/s; (e) 45 m/s; (f) 42.5 m/s.
Figure 10: The structure of the antenna elements on the AWR2243: (a) a photo of the AWR2243 cascaded radar; (b) the structure of the AWR2243 antenna elements and chips.
Figure 11: Experiment setup: (a,b) the AWR2243 MIMO radar installed on a vehicle reaching 3 m/s; (c) the vehicle moving straight along a road at maximum speed; (d) three corner reflectors set by the road; (e) 2D projection of the MF-based imaging result; (f) 2D projection of the HPC-2D-FISTA imaging result.
Figure 12: Experiment setup: (a) the AWR2243 MIMO radar installed on a car reaching 20 km/h; (b) the vehicle moving straight along a road at 20 km/h while the radar collects data reflected by the target car; (c) 2D projection of the MF-based imaging result; (d) 2D projection of the HPC-2D-FISTA imaging result.
17 pages, 3052 KiB  
Article
Estimation of Daylily Leaf Area Index by Synergy Multispectral and Radar Remote-Sensing Data Based on Machine-Learning Algorithm
by Minhuan Hu, Jingshu Wang, Peng Yang, Ping Li, Peng He and Rutian Bi
Agronomy 2025, 15(2), 456; https://doi.org/10.3390/agronomy15020456 - 13 Feb 2025
Abstract
Rapid and accurate leaf area index (LAI) determination is important for monitoring daylily growth, yield estimation, and field management. Because of the low estimation accuracy of empirical models based on single-source data, we proposed a machine-learning algorithm combining optical and microwave remote-sensing data, using the random forest regression (RFR) importance score to select features. A high-precision LAI estimation model for daylilies was constructed by optimizing feature combinations. The RFR importance score screened the top five features: the vegetation indices land surface water index (LSWI), generalized difference vegetation index (GDVI), and normalized difference yellowness index (NDYI), and the backscatter coefficients VV and VH. The vegetation index features characterized canopy moisture and the color of daylilies, and the backscatter coefficients reflected dielectric properties and geometric structure. The selected features were sensitive to daylily LAI. The RFR algorithm had good anti-noise performance and strong fitting ability; thus, its accuracy was better than that of the partial least squares regression and artificial neural network models. Synergistic optical and microwave data more comprehensively reflected the physical and chemical properties of daylilies, making the RFR-VI-BC05 model after feature selection better than the others (r = 0.711, RMSE = 0.498, and NRMSE = 9.10%). This study expands the methods for estimating daylily LAI by combining optical and radar data, providing technical support for daylily management.
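The importance-score screening step can be sketched as follows. The five retained names come from the abstract; the extra candidates (NDVI, EVI) and all hyperparameters are hypothetical placeholders for illustration.

```python
# Sketch: rank candidate vegetation indices and backscatter coefficients by
# random-forest importance and keep the top five, as described above.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

names = ["LSWI", "GDVI", "NDYI", "VV", "VH", "NDVI", "EVI"]  # NDVI/EVI: assumed extras

# X: (n_samples, len(names)) feature matrix; y: field-measured LAI
def top_features(X, y, k=5):
    rfr = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)
    order = np.argsort(rfr.feature_importances_)[::-1][:k]
    return [names[i] for i in order], order
```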
Figures:
Figure 1: Location and sampling distribution of the study area.
Figure 2: Technical route.
Figure 3: Feature importance scores. (a) Vegetation indices; (b) backscattering coefficients; (c) combined vegetation indices and backscattering coefficients.
Figure 4: Regression prediction models based on radar data.
Figure 5: Regression prediction models based on optical data.
Figure 6: Regression prediction models based on multisource remote-sensing data.
Figure 7: LAI inversion results of daylily and classification of LAI in each township.
21 pages, 53374 KiB  
Article
FloodKAN: Integrating Kolmogorov–Arnold Networks for Efficient Flood Extent Extraction
by Cong Wang, Xiaohan Zhang and Liwei Liu
Remote Sens. 2025, 17(4), 564; https://doi.org/10.3390/rs17040564 - 7 Feb 2025
Abstract
Flood events are among the most destructive natural catastrophes worldwide and pose serious threats to socioeconomic systems, ecological environments, and the safety of human life and property. With the advancement of remote sensing technology, synthetic aperture radar (SAR) has provided new means for flood monitoring. However, traditional methods have limitations when dealing with high noise levels and complex terrain backgrounds. To address this issue, in this study, we adopt an improved U-Net model incorporating the Kolmogorov–Arnold Network (KAN), referred to as UKAN, for the efficient extraction of flood inundation extents from multisource remote sensing data. UKAN integrates the efficient nonlinear mapping capabilities of KAN layers with the multiscale feature fusion mechanism of U-Net, enabling better capture of complex nonlinear relationships and global features. Experiments were conducted on the C2S-MS Floods and MMFlood datasets, and the results indicate that the UKAN model outperforms traditional models in terms of metrics such as the intersection over union (IoU), precision, recall, and F1 score. On the C2S-MS Floods and MMFlood datasets, UKAN achieves IoUs of 87.95% and 78.31%, respectively, representing improvements of approximately 3.5 and 3 percentage points over the traditional U-Net. Moreover, the model has significant advantages in terms of parameter efficiency and computational efficiency. These findings suggest that the UKAN model possesses greater accuracy and robustness in flood inundation area extraction tasks, which is highly important for improving flood disaster monitoring and early warning capabilities.
(This article belongs to the Section Remote Sensing in Geology, Geomorphology and Hydrology)
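The segmentation metrics quoted above are standard; a minimal sketch of how they are computed for binary flood masks is shown below (inputs are assumed to be boolean arrays of equal shape).

```python
# Sketch: IoU, precision, recall, and F1 for binary flood-extent masks.
import numpy as np

def flood_metrics(pred, truth, eps=1e-12):
    tp = np.logical_and(pred, truth).sum()     # flooded in both masks
    fp = np.logical_and(pred, ~truth).sum()    # predicted flood, actually dry
    fn = np.logical_and(~pred, truth).sum()    # missed flood
    iou = tp / (tp + fp + fn + eps)
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    f1 = 2 * precision * recall / (precision + recall + eps)
    return dict(iou=iou, precision=precision, recall=recall, f1=f1)
```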
Figures:
Graphical abstract.
Figure 1: Example of C2S-MS Floods data.
Figure 2: Example of MMFlood data.
Figure 3: Overall architecture of the flood UKAN.
Figure 4: Structures of the convolution block and KAN block.
Figure 5: Structure of the KAN layer.
Figure 6: Comparison of results for C2S-MS Floods.
Figure 7: Comparison of results for MMFlood.
Figure 8: Comparative analysis of failure cases in flood inundation prediction across models.
Figure 9: Comparison of classification performance for C2S-MS Floods.
Figure 10: Comparison of classification performance for MMFlood.
Figure 11: Evaluation of segmentation models: metrics, parameters, and computational complexity (input size = (16, 4, 256, 256)).
71 pages, 4216 KiB  
Review
Advances in Remote Sensing and Deep Learning in Coastal Boundary Extraction for Erosion Monitoring
by Marc-André Blais and Moulay A. Akhloufi
Geomatics 2025, 5(1), 9; https://doi.org/10.3390/geomatics5010009 - 6 Feb 2025
Abstract
Erosion is a critical geological process that degrades soil and poses significant risks to human settlements and natural habitats. As climate change intensifies, effective coastal erosion management and prevention have become essential for our society and the health of our planet. Given the vast extent of coastal areas, erosion management efforts must prioritize the most vulnerable and critical regions. Identifying and prioritizing these areas is a complex task that requires the accurate monitoring and forecasting of erosion and its potential impacts. Various tools and techniques have been proposed to assess the risks, impacts and rates of coastal erosion. Specialized methods, such as the Coastal Vulnerability Index, have been specifically designed to evaluate the susceptibility of coastal areas to erosion. Coastal boundaries, a critical factor in coastal erosion monitoring, are typically extracted from remote sensing images. Due to the extensive scale of coastal areas and the complexity of the data, manually extracting coastal boundaries is challenging. Recently, artificial intelligence, particularly deep learning, has emerged as a promising and essential tool for this task. This review provides an in-depth analysis of remote sensing and deep learning for extracting coastal boundaries to assist in erosion monitoring. Various remote sensing imaging modalities (optical, thermal, radar), platforms (satellites, drones) and datasets are first presented to provide the context for this field. Artificial intelligence and its associated metrics are then discussed, followed by an exploration of deep learning algorithms for extracting coastal boundaries. The presented algorithms range from basic convolutional networks to encoder–decoder architectures and attention mechanisms. An overview of how these extracted boundaries and other deep learning algorithms can be utilized for monitoring coastal erosion is also provided. Finally, the current gaps, limitations and potential future directions in this field are identified. This review aims to offer critical insights into the future of erosion monitoring and management through deep learning-based boundary extraction.
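As a reference point for the encoder-decoder family this review surveys, here is a deliberately stripped-down sketch of a segmentation network for coastal boundary extraction. It omits the skip connections, attention mechanisms, and depth of the real architectures (U-Net, DeepLabV3+, etc.); all layer sizes are arbitrary.

```python
# Minimal encoder-decoder sketch for per-pixel land/water segmentation.
import torch
import torch.nn as nn

class TinyEncoderDecoder(nn.Module):
    def __init__(self, in_ch=3, n_classes=1):
        super().__init__()
        self.enc = nn.Sequential(                 # downsample and extract features
            nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.dec = nn.Sequential(                 # upsample back to full resolution
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.Conv2d(16, n_classes, 1),          # per-pixel land/water logit
        )

    def forward(self, x):
        return self.dec(self.enc(x))

# mask_logits = TinyEncoderDecoder()(torch.rand(1, 3, 256, 256))  # -> (1, 1, 256, 256)
```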
Figures:
Figure 1: Overview of the review structure.
Figure 2: Electromagnetic spectrum.
Figure 3: Results of pan-sharpening.
Figure 4: Visualization of composites: FC1 (NIR, Red, Green), FC2 (NIR, SWIR2, Red), and true color (Red, Green, Blue).
Figure 5: Examples of indices for the same region.
Figure 6: Examples of datasets: RGB in the first row, the corresponding masks in the second.
Figure 7: (a) AI subsets showing the hierarchy of AI, ML, and DL. (b) A simple neural network diagram explaining its components.
Figure 8: Differences in algorithms.
Figure 9: UNet structure.
Figure 10: DeepLabV3+ structure.
Figure 11: DANet-SMIW structure.
Figure 12: SRMA structure.
Figure 13: LaeNet structure.
Figure 14: CSAFNet structure.
26 pages, 13415 KiB  
Article
A Methodology for the Multitemporal Analysis of Land Cover Changes and Urban Expansion Using Synthetic Aperture Radar (SAR) Imagery: A Case Study of the Aburrá Valley in Colombia
by Ahmed Alejandro Cardona-Mesa, Rubén Darío Vásquez-Salazar, Juan Camilo Parra, César Olmos-Severiche, Carlos M. Travieso-González and Luis Gómez
Remote Sens. 2025, 17(3), 554; https://doi.org/10.3390/rs17030554 - 6 Feb 2025
Abstract
The Aburrá Valley, located in the northwestern region of Colombia, has undergone significant land cover changes and urban expansion in recent decades, driven by rapid population growth and infrastructure development. This region, known for its steep topography and dense urbanization, faces considerable environmental challenges. Monitoring these transformations is essential for informed territorial planning and sustainable development. This study leverages Synthetic Aperture Radar (SAR) imagery from the Sentinel-1 mission, covering 2017–2024, to propose a methodology for the multitemporal analysis of land cover dynamics and urban expansion in the valley. The proposed methodology comprises several steps: first, monthly SAR images were acquired for every year under study from 2017 to 2024, ensuring the capture of surface changes. These images were properly calibrated, rescaled, and co-registered. Then, various multitemporal fusions using statistical operations were proposed to detect different phenomena related to land cover and urban expansion. The methodology also involved statistical fusion techniques—median, mean, and standard deviation—to capture urbanization dynamics. The kurtosis calculations highlighted areas where infrequent but significant changes occurred, such as large-scale construction projects or sudden shifts in land use, providing a statistical measure of surface variability throughout the study period. An advanced clustering technique segmented images into distinctive classes, utilizing fuzzy logic and a kernel-based method, enhancing the analysis of changes. Additionally, Pearson correlation coefficients were calculated to explore the relationships between identified land cover change classes and their spatial distribution across nine distinct geographic zones in the Aburrá Valley. The results highlight a marked increase in urbanization, particularly along the valley's periphery, where previously vegetated areas have been replaced by built environments. Additionally, the visual inspection analysis revealed areas of high variability near river courses and industrial zones, indicating ongoing infrastructure and construction projects. These findings emphasize the rapid and often unplanned nature of urban growth in the region, posing challenges to both natural resource management and environmental conservation efforts. The study underscores the need for the continuous monitoring of land cover changes using advanced remote sensing techniques like SAR, which can overcome the limitations posed by cloud cover and rugged terrain. The conclusions drawn suggest that SAR-based multitemporal analysis is a robust tool for detecting and understanding urbanization's spatial and temporal dynamics in regions like the Aburrá Valley, providing vital data for policymakers and planners to promote sustainable urban development and mitigate environmental degradation.
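The statistical fusions named above (median, mean, standard deviation, coefficient of variation, kurtosis) are straightforward per-pixel reductions over the image stack; a minimal sketch follows, assuming the monthly images are already calibrated and co-registered as the abstract describes.

```python
# Sketch: per-pixel statistical fusions over a stack of co-registered SAR images.
import numpy as np
from scipy.stats import kurtosis

def multitemporal_fusions(stack):
    """stack: (n_dates, rows, cols) array of calibrated backscatter."""
    mean = stack.mean(axis=0)
    std = stack.std(axis=0)
    return {
        "median": np.median(stack, axis=0),
        "mean": mean,
        "std": std,
        "cv": std / (mean + 1e-12),            # coefficient of variation
        "kurtosis": kurtosis(stack, axis=0),   # highlights rare, abrupt changes
    }
```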
Figures:
Figure 1: The Aburrá Valley (white line) between the valleys of the Magdalena and Cauca rivers. Data were acquired from ALOS PALSAR Terrain Corrected and from IGAC.
Figure 2: Region of interest (yellow bounding box) selected from the interior of the Aburrá Valley (red line) and its constituent municipalities (green lines). Data were acquired from IGAC.
Figure 3: Proposed methodology for the multitemporal analysis SMA1.
Figure 4: Proposed methodology for the kurtosis multitemporal analysis, SMA2.
Figure 5: Proposed methodology for the analysis of zonal land cover changes.
Figure 6: Samples of resulting images of the multitemporal analysis methodology proposed in SMA1: (a) MF_Mdn, (b) MF_σ, (c) MF_M, (d) MF_CV, (e) MF_C for the year 2018, and (f) MF_K of SMA2 for 2017–2024. Scale, coordinate frame (grid), and north correspond to the region described in the Study area section.
Figure 7: Areas of analysis of the results from the SMA1 methodological route and kurtosis. (A) Central Park in Bello. (B) Parques del Río Medellín. (C) Arkadia shopping center. (D) Peldar plant. (E) La García water supply reservoir. (F) Conasfaltos dam. (G) La Ayurá stream basin in Envigado. (H) Central Park in Bello. (I) Avenida Regional Norte. (J) Vía Distribuidora Sur.
Figure 8: Side-by-side comparison of the Aburrá Valley. (a) Division into 9 geographical zones. (b) The corresponding correlation coefficients for 5 different land cover change types across the 9 zones.
Figure 9: Color maps for every change class in the Aburrá Valley's nine geographical zones.
25 pages, 13341 KiB  
Article
Static-Aperture Synthesis Method in Remote Sensing and Non-Destructive Testing Applications
by Olha Inkarbaieva, Denys Kolesnikov, Danyil Kovalchuk, Volodymyr Pavlikov, Volodymyr Ponomaryov, Beatriz Garcia-Salgado, Valerii Volosyuk and Semen Zhyla
Mathematics 2025, 13(3), 502; https://doi.org/10.3390/math13030502 - 3 Feb 2025
Abstract
The study is dedicated to the statistical optimization of radar imaging of surfaces with the synthetic aperture radar (SAR) technique, assuming a static surface area and applying the ability to move a sensor along a nonlinear trajectory, via developing a new method and validating its operability for remote sensing and non-destructive testing. The developed models address the sensing geometry for signals reflected from a surface along with the observation signal–noise equation, including correlation properties. Moreover, the optimal procedures for coherent radar imaging of surfaces with the static SAR technology are synthesized according to the maximum likelihood estimation (MLE). The features of the synthesized algorithm are the decoherence of the received oscillations, the matched filtering of the received signals, and the possibility of using continuous signal coherence. Furthermore, the developed optimal and quasi-optimal algorithms derived from the proposed MLE have been investigated. The novel framework for radio imaging has demonstrated good overall operability and efficiency during simulation modeling (using the MATLAB environment) for real sensing scenes. The developed algorithms of spatio–temporal signal processing in systems with a synthesized antenna with nonlinear carrier trajectories open a promising direction for creating new methods of high-precision radio imaging from UAVs and helicopters.
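The matched-filtering step named in the abstract is the classical pulse-compression operation; a minimal sketch is shown below (the paper's simulations are in MATLAB, so this Python version is only illustrative, and the toy chirp is an assumption).

```python
# Sketch: matched filtering (pulse compression) of a received signal against the
# known transmitted waveform.
import numpy as np

def matched_filter(rx, tx):
    """rx: received complex signal; tx: reference transmitted waveform."""
    h = np.conj(tx[::-1])                  # matched filter = time-reversed conjugate
    return np.convolve(rx, h, mode="same")

# Example: a point echo buried in noise compresses to a sharp peak.
# t = np.arange(128); tx = np.exp(1j * np.pi * 0.01 * t**2)   # toy chirp (assumed)
# rx = np.roll(tx, 20) + 0.1 * (np.random.randn(128) + 1j * np.random.randn(128))
# peak = np.argmax(np.abs(matched_filter(rx, tx)))
```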
Figures:
Figure 1: Surface-sensing geometry in static-aperture synthesis of the antenna.
Figure 2: Block diagram of the radar for radio imaging of surfaces with the static-aperture synthesis technique.
Figure 3: Block diagram of the radar imaging in the simulation model.
Figure 4: Scanner trajectory (a) and corresponding ambiguity function (b) for linear motion along the x-coordinate.
Figure 5: Scanner trajectory (a) and corresponding ambiguity function (b) for linear diagonal motion.
Figure 6: Scanner trajectory (a) and corresponding ambiguity function (b) for L-shaped motion.
Figure 7: Scanner trajectory (a) and corresponding ambiguity function (b) for circular motion.
Figure 8: Scanner trajectory (a) and corresponding ambiguity function (b) for an hourglass-shaped motion.
Figure 9: Scanner trajectory (a) and corresponding ambiguity function (b) for a Y-shaped motion.
Figure 10: Scanner trajectory (a) and corresponding ambiguity function (b) for Z-shaped motion.
Figure 11: Scanner trajectory (a) and corresponding ambiguity function (b) for a square motion.
Figure 12: Scanner trajectory (a) and corresponding ambiguity function (b) for an "isosceles triangle" motion.
Figure 13: Scanner trajectory (a) and corresponding ambiguity function (b) for a W-shaped motion.
Figure 14: Ideal radar image (a) and radar images obtained for the following motion trajectories: (b) linear motion along the x-coordinate; (c) linear diagonal motion; (d) L-shaped motion; (e) circular motion; (f) hourglass-shaped motion; (g) Y-shaped motion; (h) Z-shaped motion; (i) square motion; (j) "isosceles triangle" motion; (k) W-shaped motion.
Figure 15: Scanner trajectories and corresponding ambiguity functions for a square motion with (a,b) 2.5 mm variation; (c,d) 5 mm variation; (e,f) 7.5 mm variation.
Figure 16: Radar images obtained for the square trajectory with variations of (a) 2.5 mm; (b) 5 mm; (c) 7.5 mm.
Figure 17: MSE as a function of the radio-scanner trajectory variations (the red line shows the trend of MSE increase with the variation value).
15 pages, 1439 KiB  
Technical Note
An Optimized Diffuse Kalman Filter for Frequency and Phase Synchronization in Distributed Radar Networks
by Xueyin Geng, Jun Wang, Bin Yang and Jinping Sun
Remote Sens. 2025, 17(3), 497; https://doi.org/10.3390/rs17030497 - 31 Jan 2025
Abstract
Distributed radar networks have emerged as a key technology in remote sensing and surveillance due to their high transmission power and robustness against node failures. When performing coherent beamforming with multiple radars, frequency and phase deviations introduced by independent oscillators lead to a decrease in transmission power. This paper proposes an optimized diffuse Kalman filter (ODKF) for frequency and phase synchronization. Specifically, each radar locally estimates its frequency and phase, then shares this information with neighboring nodes, which is used for incremental and diffusion updates to adjust local estimates. To further reduce synchronization errors, we incorporate a self-feedback strategy in the diffusion step, in which each node balances its own estimate with neighbor information by optimizing the diagonal weights in the diffusion matrix. Numerical simulations demonstrate the superior performance of the proposed method in terms of mean squared deviation (MSD) and convergence speed.
(This article belongs to the Special Issue Array and Signal Processing for Radar)
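The diffusion step the abstract describes can be sketched in a few lines. This is a simplified consensus-style illustration, not the full ODKF: the per-node Kalman updates that precede each diffusion, and the optimization of the diagonal (self-feedback) weights, are omitted.

```python
# Sketch: one diffusion update over a distributed radar network. C is a
# row-stochastic diffusion matrix with C[i, j] > 0 only if j neighbors i;
# its diagonal entries are the self-feedback weights the paper tunes.
import numpy as np

def diffusion_step(local_est, C):
    """local_est: (N,) per-node frequency (or phase) estimates."""
    return C @ local_est      # each node averages itself with its neighbors

# Repeated applications drive the nodes toward a common value; convergence speed
# is governed by the second-largest eigenvalue of C (cf. Figure 11 below).
```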
Figures:
Figure 1: (a) Schematic diagram of wireless synchronization for distributed UAV-borne radars, with N = 5 as an example; each colored line represents the transmitted waveform propagating toward a far-field point target. (b) The transmitted signals are coherently superimposed at the point target.
Figure 2: Radiated normalized energy of 5 radars randomly arranged in the x-y plane, with the statistical beamforming gain at θ = 0 (red cross) designated as the evaluation metric for synchronization performance. (a) Pattern of synchronized radars. (b) Pattern of unsynchronized radars with σ_φ = 20° phase errors.
Figure 3: Diagram of the diffusion Kalman filter.
Figure 4: Normalized statistical beamforming gain versus phase variance σ_φ.
Figure 5: Normalized statistical beamforming gain versus frequency variances σ_fD, σ_f and time t.
Figure 6: Network topology of N = 20 distributed radar nodes labeled 1–20; the dotted lines represent the wireless synchronization links.
Figure 7: Frequency synchronization deviations (in Hz) over 15 iterations for the DFPC, KF-DFPC, Metropolis-based DKF, and ODKF methods, under σ_f = 100 Hz and σ_fD = 50 Hz. Each colored line represents the frequency deviation of a different node.
Figure 8: Phase synchronization deviations (in degrees) over 15 iterations for the DFPC, KF-DFPC, Metropolis-based DKF, and ODKF methods, under σ_φ = π. Each colored line represents the phase deviation of a different node.
Figure 9: Comparison of normalized MSD for frequency synchronization using DKF, DFPC, KF-DFPC, and FA-DKF with different γ values.
Figure 10: Comparison of normalized MSD for phase synchronization using DKF, DFPC, KF-DFPC, and FA-DKF with different γ values.
Figure 11: Eigenvalue distributions of the diffusion matrix C. The red dotted line indicates the second-largest eigenvalue.
27 pages, 2703 KiB  
Article
Optimization of Autoencoders for Speckle Reduction in SAR Imagery Through Variance Analysis and Quantitative Evaluation
by Ahmed Alejandro Cardona-Mesa, Rubén Darío Vásquez-Salazar, Jean P. Diaz-Paz, Henry O. Sarmiento-Maldonado, Luis Gómez and Carlos M. Travieso-González
Mathematics 2025, 13(3), 457; https://doi.org/10.3390/math13030457 - 30 Jan 2025
Abstract
Speckle reduction in Synthetic Aperture Radar (SAR) images is a crucial challenge for effective image analysis and interpretation in remote sensing applications. This study proposes a novel deep learning-based approach using autoencoder architectures for SAR image despeckling, incorporating analysis of variance (ANOVA) for hyperparameter optimization. The research addresses significant gaps in existing methods, such as the lack of rigorous model evaluation and the absence of systematic optimization techniques for deep learning models in SAR image processing. The methodology involves training 240 autoencoder models on real-world SAR data, with performance metrics evaluated using Mean Squared Error (MSE), Structural Similarity Index (SSIM), Peak Signal-to-Noise Ratio (PSNR), and Equivalent Number of Looks (ENL). By employing Pareto frontier optimization, the study identifies models that effectively balance denoising performance with the preservation of image fidelity. The results demonstrate substantial improvements in speckle reduction and image quality, validating the effectiveness of the proposed approach. This work advances the application of deep learning in SAR image denoising, offering a comprehensive framework for model evaluation and optimization.
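The Pareto-frontier selection over the four metrics can be sketched directly: keep every model not dominated by another, treating MSE as lower-is-better and SSIM, PSNR, and ENL as higher-is-better. The metric ordering in the array is an assumed convention.

```python
# Sketch: non-dominated (Pareto-frontier) model selection over per-model metrics.
import numpy as np

def pareto_front(metrics):
    """metrics: (n_models, 4) array of [MSE, SSIM, PSNR, ENL] per model."""
    scores = metrics.copy()
    scores[:, 0] = -scores[:, 0]           # flip MSE so larger is better everywhere
    keep = []
    for i, s in enumerate(scores):
        # dominated if some model is >= on all metrics and > on at least one
        dominated = np.any(np.all(scores >= s, axis=1) &
                           np.any(scores > s, axis=1))
        if not dominated:
            keep.append(i)
    return keep
```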
Figures:
Figure 1: (a) Ground truth, (b) ground-truth zoom, (c) noisy, (d) noisy zoom.
Figure 2: General autoencoder (AE) architecture.
Figure 3: Autoencoder (AE) architecture.
Figure 4: Summary of the proposed method.
Figure 5: (a) Ground truth. (b) Ground-truth zoom. (c) Noisy. (d) Noisy zoom. (e) Filtered M94. (f) Filtered zoom M94. (g) Filtered M160. (h) Filtered zoom M160.
Figure 6: Scatter plot of metrics with Pareto-frontier models highlighted.
Figure 7: Box plot of metrics for the 17 optimum models.
Figure 8: Confidence interval of MSE.
Figure 9: Confidence interval of SSIM.
Figure 10: Confidence interval of PSNR.
Figure 11: Confidence interval of ENL.
Figure 12: (a) Ground truth. (b) Ground-truth zoom. (c) Noisy. (d) Noisy zoom. (e) Filtered M180. (f) Filtered zoom M180.
Figure 13: (a) Ground truth. (b) Ground-truth zoom. (c) Noisy. (d) Noisy zoom. (e) Filtered M0. (f) Filtered zoom M0. (g) Filtered M36. (h) Filtered zoom M36. (i) Filtered M180. (j) Filtered zoom M180.
29 pages, 21542 KiB  
Article
Study of Hydrologic Connectivity and Tidal Influence on Water Flow Within Louisiana Coastal Wetlands Using Rapid-Repeat Interferometric Synthetic Aperture Radar
by Bhuvan K. Varugu, Cathleen E. Jones, Talib Oliver-Cabrera, Marc Simard and Daniel J. Jensen
Remote Sens. 2025, 17(3), 459; https://doi.org/10.3390/rs17030459 - 29 Jan 2025
Abstract
The exchange of water, sediment, and nutrients in wetlands occurs through a complex network of channels and overbank flow. Although optical sensors can map channels at high resolution, they fail to identify narrow intermittent channels colonized by vegetation. Here we demonstrate an innovative application of rapid-repeat interferometric synthetic aperture radar (InSAR) to study hydrologic connectivity and tidal influences in Louisiana's coastal wetlands, which can provide valuable insights into water flow dynamics, particularly in vegetation-covered and narrow channels where traditional optical methods struggle. Data used were from the airborne UAVSAR L-band sensor acquired for the Delta-X mission. We applied interferometric techniques to rapid-repeat (~30 min) SAR imagery of the southern Atchafalaya basin acquired during two flights encompassing rising-to-high tides and ebbing-to-low tides. InSAR coherence is used to identify and differentiate permanent open water channels from intermittent channels in which flow occurs underneath the vegetation canopy. The channel networks at rising and ebbing tides show significant differences in the extent of flow, with vegetation-filled small channels more clearly identified at rising-to-high tide. The InSAR phase change is used to identify locations on channel banks where overbank flow occurs, which is a critical component for modeling wetland hydrodynamics. This is the first study to use rapid-repeat InSAR to monitor tidal impacts on water flow dynamics in wetlands. The results show that the InSAR method outperforms traditional optical remote sensing methods in monitoring water flow in vegetation-covered wetlands, providing high-resolution data to support hydrodynamic models and critical support for wetland protection and management.
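The coherence-based channel classification rule (see Figure 3 below) can be sketched as two thresholds applied to mean-coherence images from the nearest-neighbor (NN) and skip-pair (NN1 and 2) interferogram stacks. The threshold values here are placeholders, not the paper's.

```python
# Sketch: classify pixels into open-water channels, intermittent channels, and
# wetland from per-pixel mean InSAR coherence of two interferogram stacks.
import numpy as np

def classify_channels(coh_nn, coh_nn12, t_nn=0.35, t_nn12=0.45):
    """coh_nn, coh_nn12: per-pixel mean coherence images (same shape)."""
    open_water = coh_nn < t_nn                       # low coherence even at ~30 min
    intermittent = (coh_nn12 < t_nn12) & ~open_water # decorrelates only at longer spans
    wetland = ~open_water & ~intermittent            # stays coherent in both stacks
    return open_water, intermittent, wetland
```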
Show Figures

Figure 1. (a) Study area showing the UAVSAR image footprint (blue rectangle) and other areas discussed in the text. (b) Water level at the Amerada Pass and Berwick tide gauges during the two days of the UAVSAR acquisitions. (c,d) Plots showing the tidal variation during the ebbing-to-low tide (c) and rising-to-high tide (d) UAVSAR flights. The times of the UAVSAR acquisitions are indicated with vertical lines.

Figure 2. Mean coherence image of NN and NN1 and 2 interferograms for the Wax Lake Delta area during rising tide (a,b) and ebbing tide (c,d); geographic location and extent of the area are shown with a purple dotted rectangle in Figure 1 (see Supplement Figure S1 for the full-frame version). (e) Illustration showing the water level in a channel containing vegetation and SAR backscatter from the channel surface depending on the presence of emergent vegetation, which varies across the tidal cycle. (f) Variation in coherence with time over pixels identified as open water, wetland, and small/intermittent channels (locations identified using high-resolution NAIP imagery; Supplement Figure S4). The orange line in (d) shows the island used to identify representative locations for (f).

Figure 3. Histograms of mean coherence of (a) NN and (b) NN1 and 2 rising-tide interferograms. (c) Flowchart of the steps to derive the channel maps from coherence, showing the thresholds used. Pixels below the γ_mean_NN threshold are identified as open-water channels. Pixels below the γ_mean_NN12 threshold are identified as intermittent channels only if they are not identified as open-water pixels.

Figure 4. (a) Connected components grown over the channel mask. (b) Connected components after a morphological erosion operation. (c) Channel bank pixels obtained by subtracting (b) from (a). (d) Flowchart of the steps to derive overbank flow from the channel mask and phase time series. Geographic location and extent of the area are shown with an orange dotted rectangle in Figure 1.

Figure 5. (a) Channel mask using mean coherence of rising-tide NN interferograms; (b) channel mask using mean coherence of rising-tide NN1 and 2 interferograms; (c) combined mask retaining channels from (a) as open-water channels and channels present only in (b) as intermittent channels. The wetland category (green) contains pixels that maintained high average coherence in both NN and NN1 and 2 interferograms. (d) InSAR phase velocity from rising-tide data showing areas of water-level change. Warmer colors (yellow-red) indicate a rise in water level; cooler colors (cyan-blue) indicate a fall.

Figure 6. Comparison of optical (a) AVIRIS-NG and (b) NAIP water channel masks with the (c) UAVSAR mask. Classification accuracy indicating the location of water-class FPs and FNs in the UAVSAR mask compared to the (d) AVIRIS-NG and (e) NAIP masks. In (d) and (e), white indicates pixels whose class matches between the optical and SAR masks (see Supplement Figure S2 for the full-frame version).

Figure 7. (a) InSAR phase velocity (ϕ_vel) for the rising-tide data set containing signals from both water-level change and tropospheric water vapor delay, the latter particularly evident in the far range (right-hand side); (b) illustrated procedure to identify noisy CCs, shown for a single CC; (c–e) procedure applied to three different areas indicated with black rectangles in (a): (c) InSAR phase velocity; (d) phase velocity averaged over the interior of each CC; (e) noisy CCs removed after comparing the average CC phase velocity to the average phase velocity on the CC’s bank. Dotted ellipses highlight the CCs illustrated.

Figure 8. (a) SAR channel mask derived from rising-tide InSAR data. (b) Equivalent channel network observed with SAR using the ebbing-tide InSAR data. (c) Focus on the rectangle in (a) for the rising-tide channel network and (d) ebbing-tide channel network.

Figure 9. (a) SAR channel mask and locations of islands (white polygons) showing the extent of intermittent channels identified by (b) SAR, (c) AVIRIS-NG, and (d) NAIP-derived channel masks. The white polygon in (a) shows the location of the islands zoomed in (b–d).

Figure 10. Water-level increase and decrease identified by InSAR phase change on the banks of water channels as a function of time during (a) rising tide and (b) ebbing tide. Geographic location and extent of the area are shown with an orange dotted rectangle in Figure 1.

Figure 11. UAVSAR phase change observed in the interior of the wetland during (a,d) rising tide and (b,e) ebbing tide. The difference in where the water goes during the different tidal stages is clear: overbank flow is more extensive at rising/high tide, but small channels still transport water within the island interior at ebbing/low tide. (c,f) Optical images showing the landscape (source: Google Earth [86,87]; Service Layer Credits: © 2024 Airbus). Geographic locations of the areas in (a) and (d) are shown with a red circle and ellipse in Figure 1.

Figure 12. (a,b) UAVSAR phase velocity showing the water-level change rate for rising tide and ebbing tide; (c,d) channels derived using a threshold on phase; (e,f) channels derived using a threshold on coherence; (g,h) channels derived using both phase and coherence.

Figure 13. The influence of acquisition time, demonstrated on a continuously waterlogged area in the interior of the wetland using close-ups of (a) SAR, (b) AVIRIS-NG, and (c) NAIP-derived channel masks, and (d) the aerial image of the corresponding area [96]. Geographic location and extent of the area are shown with a white polygon in Figure 9.
45 pages, 20140 KiB  
Article
Development and Experimental Validation of a Sense-and-Avoid System for a Mini-UAV
by Marco Fiorio, Roberto Galatolo and Gianpietro Di Rito
Drones 2025, 9(2), 96; https://doi.org/10.3390/drones9020096 - 26 Jan 2025
Viewed by 821
Abstract
This paper provides an overview of the three-year effort to design and implement a prototypical sense-and-avoid (SAA) system based on a multisensory architecture leveraging data fusion between optical and radar sensors. The work was carried out within the context of the Italian research project named TERSA (electrical and radar technologies for remotely piloted aircraft systems), undertaken by the University of Pisa in collaboration with its industrial partners and aimed at the design and development of a series of innovative technologies for remotely piloted aircraft systems of small scale (MTOW < 25 kgf). The system leverages advanced computer vision algorithms and an extended Kalman filter to enhance obstacle detection and tracking capabilities. The “Sense” module processes environmental data through a radar and an electro-optical sensor, while the “Avoid” module utilizes efficient geometric algorithms for collision prediction and evasive maneuver computation. A novel hardware-in-the-loop (HIL) simulation environment was developed and used for validation, enabling the evaluation of closed-loop real-time interaction between the “Sense” and “Avoid” subsystems. Extensive numerical simulations and a flight test campaign demonstrate the system’s effectiveness in the real-time detection and avoidance of non-cooperative obstacles, ensuring compliance with UAV aeromechanical and safety constraints in terms of minimum separation requirements. The novelty of this research lies in (1) the design of an innovative and efficient visual processing pipeline tailored for SWaP-constrained mini-UAVs, (2) the formulation of an EKF-based data fusion strategy integrating optical data with a custom-built Doppler radar, and (3) the development of a unique HIL simulation environment with realistic scenery generation for comprehensive system evaluation. The findings underscore the potential for deploying such advanced SAA systems in tactical UAV operations, significantly contributing to the safety of flight in non-segregated airspaces. Full article
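As a hedged sketch of the kind of EKF-based radar/optical fusion the abstract describes, the snippet below propagates a constant-velocity intruder state and corrects it with a nonlinear measurement vector combining radar range with optical azimuth and elevation. The state layout, motion model, and noise levels are illustrative assumptions, not the TERSA implementation.

```python
import numpy as np

def f_predict(x, dt):
    """Constant-velocity motion model (an assumption, not TERSA's model)."""
    F = np.eye(6)
    F[0, 3] = F[1, 4] = F[2, 5] = dt
    return F @ x, F

def h_measure(x):
    """Measurement model: radar range + optical azimuth/elevation (assumed)."""
    px, py, pz = x[:3]
    rho = np.sqrt(px**2 + py**2 + pz**2)      # slant range
    az = np.arctan2(py, px)                   # azimuth
    el = np.arcsin(pz / rho)                  # elevation
    z = np.array([rho, az, el])
    # Jacobian of the measurement w.r.t. the 6-state [position, velocity]
    H = np.zeros((3, 6))
    rxy2 = px**2 + py**2
    rxy = np.sqrt(rxy2)
    H[0, :3] = [px / rho, py / rho, pz / rho]
    H[1, :3] = [-py / rxy2, px / rxy2, 0.0]
    H[2, :3] = [-px * pz / (rho**2 * rxy), -py * pz / (rho**2 * rxy), rxy / rho**2]
    return z, H

def ekf_step(x, P, z, dt, Q, R):
    """One predict/update cycle of a standard EKF."""
    x_pred, F = f_predict(x, dt)
    P_pred = F @ P @ F.T + Q
    z_pred, H = h_measure(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + K @ (z - z_pred)
    P_new = (np.eye(6) - K @ H) @ P_pred
    return x_new, P_new

# One fused update with illustrative (made-up) noise levels
x = np.array([500.0, 200.0, 50.0, -20.0, 0.0, 0.0])   # [position (m), velocity (m/s)]
P = np.eye(6) * 100.0
Q = np.eye(6) * 0.1
R = np.diag([5.0**2, np.radians(1.0)**2, np.radians(1.0)**2])
z = np.array([540.0, np.radians(22.0), np.radians(5.0)])
x, P = ekf_step(x, P, z, dt=0.1, Q=Q, R=R)
```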
Show Figures

Figure 1. Reference TERSA aircraft.
Figure 2. Minimum detection range analytical computation [30].
Figure 3. (a) Minimum turn radius as a function of aerodynamic, structural, and propulsive constraints for the TERSA aircraft. (b) Minimum required detection range as a function of intruder aircraft airspeed.
Figure 4. Maximum attainable flight path angle at h = 1000 m for stationary climb.
Figure 5. SAA typical system integration high-level scheme.
Figure 6. High-level schematics of the proposed SAA system.
Figure 7. Intruder aircraft azimuth and elevation angle reconstruction.
Figure 8. Sense module computer vision pipeline.
Figure 9. Visual representation of the pyramidal expansion of the algorithm.
Figure 10. Example of KLT feature matcher output.
Figure 11. Flowchart of the KLT algorithm.
Figure 12. High-level scheme of the finite state machine handling the conflict detection and resolution problem.
Figure 13. Radar signal processing architecture.
Figure 14. (a) Example of a range-Doppler map, (b) CFAR technique, (c) CA-CFAR technique.
Figure 15. (a) Bi-dimensional CFAR; (b) binary mask showing detected tracks.
Figure 16. Phase monopulse configuration.
Figure 17. Metrics used for the evaluation of avoidance maneuvers (a); initial position of the intruder aircraft (b).
Figure 18. Results of the evasive maneuver for different initial position and velocity states of the intruder aircraft: (a) minimum distance; (b) maximum normal deviation with respect to the original trajectory; (c) UAV’s 2D trajectory; (d) UAV aileron deflection.
Figure 19. Intruder position state (a–c) and velocity state (d–f) reconstruction by the EKF for a starboard encounter; V_uav = V_intr = 22 m/s. The first dashed vertical line indicates the moment the intruder becomes detectable (within radar range); the second dashed line indicates the moment the EKF has reached convergence and its output is fed into the avoidance algorithms.
Figure 20. High-level scheme of the complete simulation framework.
Figure 21. Video stream send (a) and receive (b) pipeline schematics.
Figure 22. SAA camera viewpoint as rendered in Flight Gear (flying over the city of Pisa).
Figure 23. Results of a HIL collision scenario in the complete simulation environment, with the intruder approaching from the starboard side; V_UAV = 22 m/s, V_intr = 50 m/s. Three-dimensional trajectory (a), Euler angles (b), speed components in the NED reference frame (c), load factors (d).
Figure 24. Comparison between elevation (a) and azimuth (b) measurements and ground truth for the complete simulation framework.
Figure 25. Flight Gear rendering of a starboard collision scenario over a coastal landscape, with the intruder position highlighted by a red bounding box as detected by the sense algorithms within the complete simulation framework. Subfigures (a–f) show the evolution of the evasive maneuver.
Figure 26. SAA system prototype.
Figure 27. (a) Antenna placement within the nose mockup; (b) camera sensor chosen for the system implementation; (c) Jetson Nano vision processing unit.
Figure 28. SAA system position and UAV trajectory during the flight test at Lucca-Tassignano Airport.
Figure 29. Reconstructed UAV position states vs. telemetry in the base reference frame: (a) x position state, (b) y position state, (c) z position state.
Figure 30. Relative azimuth (a,c) and elevation (b,d) angles between the target UAV and the SAA reference frame for two different flight phases.
21 pages, 7506 KiB  
Article
Radar Scattering Analysis of Multi-Scale Complex Targets by Fast VSBR-MoM Method in Urban Scenes
by Zhou Cong, Jihong Gu, Ying Zhang, Jie Yang and Dazhi Ding
Remote Sens. 2025, 17(3), 398; https://doi.org/10.3390/rs17030398 - 24 Jan 2025
Viewed by 428
Abstract
An innovative and efficient hybrid technique, which combines the Method of Moments (MoM) with Volume Meshed Shooting and Bouncing Ray (VSBR), is presented to analyze the scattering of metallic–dielectric mixed multi-scale structures in urban scenes. The technique can also rapidly generate radar images at different angles, which is useful for the video remote sensing community. The mixed multi-scale targets are divided into sub-regions, and different solvers are employed to compute the scattering contributions according to the sub-regions’ electrical sizes. For large-scale sub-regions, the VSBR method is applied to both the dielectric and metallic parts, yielding a multilevel electromagnetic field that includes both the directly induced field and the multi-reflection field. This field contributes to the integral equation in the MoM sub-regions. Interactions from the MoM region are likewise considered within the VSBR region, followed by a surface current integration to compute the scattered field. When addressing mixed multi-scale electromagnetic problems, this technique proves more efficient and easier to implement in general-purpose computer codes. Furthermore, the method is accelerated using local coupling (LC) theory and the fast multipole method (FMM). With this fast computational method, radar images of the scene at different angles can be simulated efficiently, and the numerical results demonstrate the efficiency and accuracy of the proposed method. Full article
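For readers unfamiliar with the ray side of SBR-type solvers, the core geometric step is specular bouncing: each ray is reflected off a facet via r = d − 2(d·n)n before the field it carries is re-launched. The toy sketch below shows only that kernel; it is a generic illustration, not the paper's VSBR implementation, which additionally traces rays through dielectric volumes and accumulates the multilevel fields.

```python
import numpy as np

def reflect(d, n):
    """Specular reflection of direction d off a surface with normal n."""
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

def bounce_path(origin, direction, facets, max_bounces=3):
    """Trace a ray against a sequence of (point, normal) planar facets.

    Purely geometric toy: each facet is treated as an infinite plane;
    a real SBR solver would also test facet extents and update the
    carried field (amplitude, phase, polarization) at every bounce.
    """
    p, d = np.asarray(origin, float), np.asarray(direction, float)
    path = [p.copy()]
    for q0, n in facets[:max_bounces]:
        denom = np.dot(d, n)
        if abs(denom) < 1e-12:           # ray parallel to the plane
            break
        t = np.dot(np.asarray(q0) - p, n) / denom
        if t <= 0:                       # plane lies behind the ray
            break
        p = p + t * d                    # hit point
        d = reflect(d, n)                # specular bounce
        path.append(p.copy())
    return path

# Toy dihedral: a ground plane plus a vertical wall
facets = [(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])),
          (np.array([5.0, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0]))]
print(bounce_path([0.0, 0.0, 3.0], [1.0, 0.0, -0.5], facets))
```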
Show Figures

Figure 1. (a) The trees, buildings, and rough surface model in urban scenes. (b) The metal–dielectric mixed multi-scale model in urban scenes.
Figure 2. (a) Wave propagation in buildings by rays. (b) Wave propagation in leaves by rays.
Figure 3. The divided VSBR region and MoM region for multi-scale targets in an urban scene.
Figure 4. Schematic diagram of the iterative hybrid VSBR-MoM method.
Figure 5. Diagram of local coupling theory in the fast VSBR-MoM method.
Figure 6. FMM acceleration in the fast VSBR-MoM method.
Figure 7. (a) One case of the cube model structure, (b) bistatic RCS of the cube model (case 1), (c) another case of the cube model, and (d) bistatic RCS of the cube model structure (case 2).
Figure 8. (a) One case of a one-patch leaf structure, (b) monostatic RCS of the one-patch leaf structure, (c,e,g) three cases of multiple-leaf combination models, and (d,f,h) monostatic RCS of the three cases of multiple-leaf combination models.
Figure 9. (a) Tree model structure and (b) monostatic RCS of the tree model.
Figure 10. (a) Metallic–dielectric mixed SLICY model and (b) HH-polarization monostatic RCS of the SLICY model.
Figure 11. (a) Metallic–dielectric mixed drone model, (b) four cases of the simulated multi-scale drone model, and (c) monostatic RCS of the four metallic–dielectric mixed drone model cases.
Figure 12. (a) Calculation time of the different cases for the ray tracing part, (b) calculation time of the different cases for the MoM part, and (c) calculation time of the different cases for the interacting part.
Figure 13. (a) The drone and rough surface model, (b) the VV-polarization imaging result simulated by our hybrid method, and (c) the VV-polarization imaging result simulated by the traditional ray tracing method.
Figure 14. (a) The port model, (b) the region division for the port model, and (c) the VV-polarization multi-angle imaging results simulated by our method.
19 pages, 5807 KiB  
Article
BurgsVO: Burgs-Associated Vertex Offset Encoding Scheme for Detecting Rotated Ships in SAR Images
by Mingjin Zhang, Yaofei Li, Jie Guo, Yunsong Li and Xinbo Gao
Remote Sens. 2025, 17(3), 388; https://doi.org/10.3390/rs17030388 - 23 Jan 2025
Viewed by 423
Abstract
Synthetic Aperture Radar (SAR) is a crucial remote sensing technology, offering all-weather, day-and-night imaging capability, and ship detection in SAR imagery has garnered significant attention. However, existing ship detection methods often overlook feature extraction, and the unique imaging mechanisms of SAR images hinder the direct application of conventional natural-image feature extraction techniques. Moreover, oriented bounding box-based detection methods often prioritize accuracy excessively, leading to increased parameter counts and computational costs and, in turn, elevated computational load and model complexity. To address these issues, we propose a novel two-stage detector, the Burgs-rooted vertex offset encoding scheme (BurgsVO), for detecting rotated ships in SAR images. BurgsVO consists of two key modules: the Burgs equation heuristic module, which facilitates feature extraction, and the average diagonal vertex offset (ADVO) encoding scheme, which significantly reduces computational costs. Specifically, the Burgs equation module integrates temporal information with spatial data for effective feature aggregation, establishing a strong foundation for subsequent object detection. The ADVO encoding scheme reduces parameters through anchor transformation, leveraging geometric similarities between quadrilaterals and triangles to further cut computational costs. Experimental results on the RSSDD and RSDD benchmarks demonstrate that the proposed BurgsVO outperforms state-of-the-art detectors in both accuracy and efficiency. Full article
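The ADVO details are not reproduced in this listing, but the general idea behind vertex-offset encodings for oriented boxes can be sketched generically: regress offsets from the corners of a horizontal anchor box to the vertices of the rotated box, so the network never predicts an angle directly. The snippet below shows that generic encode/decode round trip under stated assumptions; it is not the paper's ADVO formulation.

```python
import numpy as np

def hbb_corners(box):
    """Corners of a horizontal anchor box given as (cx, cy, w, h)."""
    cx, cy, w, h = box
    return np.array([[cx - w / 2, cy - h / 2], [cx + w / 2, cy - h / 2],
                     [cx + w / 2, cy + h / 2], [cx - w / 2, cy + h / 2]])

def encode(obb_vertices, anchor):
    """Offsets from anchor corners to oriented-box vertices (generic, not ADVO)."""
    return obb_vertices - hbb_corners(anchor)

def decode(offsets, anchor):
    """Recover the oriented-box vertices from the regressed offsets."""
    return hbb_corners(anchor) + offsets

# Round trip on a toy rotated ship box (40 x 10 m, rotated 30 degrees)
theta = np.radians(30.0)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
obb = (hbb_corners((0.0, 0.0, 40.0, 10.0)) @ R.T) + np.array([100.0, 50.0])
anchor = (100.0, 50.0, 40.0, 40.0)
assert np.allclose(decode(encode(obb, anchor), anchor), obb)
```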
Show Figures

Figure 1. The overall framework of our method.
Figure 2. The structure of the Burgs equation heuristic module.
Figure 3. Illustration of an OBB represented by an ADVO.
Figure 4. Decoding regression diagram of the ADVO.
Figure 5. Visualization of the detection results of different methods on RSSDD. Red rectangles indicate the actual ship targets. Green and purple rectangles represent the detection results of the five comparative methods and our method, respectively.
Figure 6. Visualization of the detection results of different methods on RSDD. Red rectangles indicate the actual ship targets. Green and blue rectangles represent the detection results of the five comparative methods and our method, respectively.
Figure 7. Algorithm performance under ship size variations. Red rectangles indicate the actual ship targets, and purple rectangles represent the detection results of our method.
Figure 8. Speed vs. accuracy on the RSSDD test set.