
Sensing and Signal Analysis in Synthetic Aperture Radar Systems

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Radar Sensors".

Deadline for manuscript submissions: closed (15 May 2024) | Viewed by 9223

Special Issue Editors


Guest Editor
School of Electronics, Information, and Electrical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Interests: deep learning for SAR image analysis; multi-source image fusion; radar imaging techniques

Guest Editor
Center of Digital Innovation, Tongji University, Shanghai 100092, China
Interests: intelligent sensing and recognition; machine learning and pattern recognition for visual and SAR image interpretation

Guest Editor
Shanghai Key Lab. of Intelligent Sensing and Recognition, School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Interests: radar target recognition; remote sensing data processing; multimodal navigation technology

Special Issue Information

Dear Colleagues, 

Synthetic aperture radar (SAR), an active imaging sensor operating in the microwave band, has developed rapidly in recent decades. Various high-resolution, multi-mode spaceborne and airborne SAR sensors have driven technological progress in many application areas. In recent years, the range of SAR systems and applications has continued to broaden, encompassing small UAV SAR, millimeter-wave vehicle-mounted SAR, and distributed/passive SAR, as well as new signal- and image-processing methods developed with artificial intelligence. This Special Issue therefore aims to bring together the latest relevant research, including new SAR system architectures and the corresponding signal-processing and image-interpretation methods.

The Guest Editors invite contributions to this Special Issue of Sensors in relation to topics including, but not limited to:

(1) New SAR system architectures;

(2) Millimeter-wave vehicle-mounted SAR;

(3) Distributed/passive SAR;

(4) SAR signal processing theory and methods;

(5) SAR image interpretation methods;

(6) SAR target detection and recognition;

(7) SAR processing with multimodal data.

Dr. Zenghui Zhang
Dr. Weiwei Guo
Prof. Dr. Wenxian Yu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (5 papers)


Research


21 pages, 5940 KiB  
Article
Sub-Nyquist SAR Imaging and Error Correction Via an Optimization-Based Algorithm
by Wenjiao Chen, Li Zhang, Xiaocen Xing, Xin Wen and Qiuxuan Zhang
Sensors 2024, 24(9), 2840; https://doi.org/10.3390/s24092840 - 29 Apr 2024
Viewed by 770
Abstract
Sub-Nyquist synthetic aperture radar (SAR) based on pseudo-random time–space modulation has been proposed to increase the swath width while preserving the azimuthal resolution. Due to the sub-Nyquist sampling, the scene can be recovered by an optimization-based algorithm. However, such methods suffer from several issues, e.g., the difficulty of manually tuning and pre-defining optimization parameters and poor resistance to a low signal-to-noise ratio (SNR). To address these issues, a reweighted optimization algorithm, named the pseudo-ℒ0-norm optimization algorithm, is proposed for the sub-Nyquist SAR system in this paper. A modified regularization model is first built by applying scene prior information to approximately obtain the number of nonzero elements based on Bayesian estimation, and this model is then solved by the Cauchy–Newton method. Additionally, an error correction method combined with our proposed pseudo-ℒ0-norm optimization algorithm is also presented to eliminate defocusing in the motion-induced model. Finally, experiments with simulated signals and strip-map TerraSAR-X images are carried out to demonstrate the effectiveness and superiority of our proposed algorithm.
(This article belongs to the Special Issue Sensing and Signal Analysis in Synthetic Aperture Radar Systems)
Figures

Figure 1. The imaging geometry of the sub-Nyquist SAR, showing the Nyquist samples and the real azimuthal samples chosen uniformly from the Nyquist samples in the sub-Nyquist SAR system based on pseudo-random time–space modulation.
Figure 2. The imaging geometry with the position error. The solid line denotes the realistic track and the dashed line the hypothetical track; R(η, ζ_R) and R_E(η, ζ_R) are the hypothetical and real slant ranges with the position error, ζ_R is the pitch angle, P is the target point, and H is the orbital height. Panel (b) is the projection of the target point P on the XOZ plane, where Δx(η) and Δz(η) are the offsets between the realistic and hypothetical positions along the x- and z-axes.
Figure 3. Sea–land interface scene in the SAR image.
Figure 4. Sea containing several boats in the SAR image.
Figure 5. NMSE vs. the iteration number under different algorithms.
Figure 6. The recovered result under different algorithms.
Figure 7. The recovered result based on the pseudo-ℒ0-norm and ℒ1-norm optimization algorithms.
Figure 8. The reconstructed results: (a) the original image; (b) the reconstructed scene without error correction; (c) the reconstructed scene with error correction based on the ℒ1-norm optimization algorithm; and (d) the reconstructed scene with error correction based on the pseudo-ℒ0-norm optimization algorithm.
Figure 9. The reconstructed result: (a) the scene reconstruction without error correction; (b) with error correction based on the ℒ1-norm optimization algorithm; and (c) with error correction based on the pseudo-ℒ0-norm optimization algorithm.
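To make the reweighting idea in the abstract above concrete, here is a minimal, self-contained sketch of a generic iteratively reweighted least-squares (IRLS) loop for recovering a sparse scene from under-sampled linear measurements. It is not the authors' pseudo-ℒ0-norm algorithm (which builds a Bayesian-informed regularization model and solves it with the Cauchy–Newton method); the function name `irls_sparse_recover` and all parameter values are illustrative assumptions.

```python
import numpy as np

def irls_sparse_recover(A, y, lam=1e-2, eps=1e-3, n_iter=30):
    """Iteratively reweighted least squares for sparse recovery.

    Solves min_x ||y - A x||^2 + lam * sum_i w_i |x_i|^2 with
    w_i = 1 / (|x_i|^2 + eps), a smooth surrogate for an l0-like penalty.
    A: (m, n) measurement matrix (m < n for sub-Nyquist sampling).
    y: (m,) measurements. Returns the recovered vector x of length n.
    """
    x = A.conj().T @ y                         # matched-filter initialization
    for _ in range(n_iter):
        w = 1.0 / (np.abs(x) ** 2 + eps)       # reweighting step
        # Normal equations of the weighted least-squares subproblem
        H = A.conj().T @ A + lam * np.diag(w)
        x = np.linalg.solve(H, A.conj().T @ y)
    return x

# Toy usage: recover a 10-sparse scene from 2x under-sampled measurements.
rng = np.random.default_rng(0)
n, m, k = 200, 100, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = irls_sparse_recover(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The reweighting makes large coefficients cheap and small ones expensive, which drives most entries toward zero; the paper's contribution lies in choosing the regularization from scene priors and in coupling this with motion-error correction.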
17 pages, 4782 KiB  
Article
Utilizing Polarization Diversity in GBSAR Data-Based Object Classification
by Filip Turčinović, Marin Kačan, Dario Bojanjac, Marko Bosiljevac and Zvonimir Šipuš
Sensors 2024, 24(7), 2305; https://doi.org/10.3390/s24072305 - 5 Apr 2024
Cited by 1 | Viewed by 1079
Abstract
In recent years, the development of intelligent sensor systems has experienced remarkable growth, particularly in the domain of microwave and millimeter-wave sensing, thanks to the increased availability of affordable hardware components. With the development of a smart Ground-Based Synthetic Aperture Radar (GBSAR) system called GBSAR-Pi, we previously explored object classification applications based on raw radar data. Building upon this foundation, in this study, we analyze the potential of utilizing polarization information to improve the performance of deep learning models based on raw GBSAR data. The data are obtained with a GBSAR operating at 24 GHz with both vertical (VV) and horizontal (HH) polarization, resulting in two matrices (VV and HH) per observed scene. We present several approaches demonstrating the integration of such data into classification models based on a modified ResNet18 architecture. We also introduce a novel Siamese architecture tailored to accommodate the dual-input radar data. The results indicate that a simple concatenation method is the most promising approach and underscore the importance of considering antenna polarization and merging strategies in deep learning applications based on radar data.
(This article belongs to the Special Issue Sensing and Signal Analysis in Synthetic Aperture Radar Systems)
Figures

Figure 1. ResNet18 modification for raw GBSAR data [16]. In (a), the matrix dimensions reduce after each convolutional group, while in (b), with the modification, the horizontal dimension remains the same throughout the process.
Figure 2. The developed GBSAR-Pi [16].
Figure 3. Scheme of GBSAR-Pi [24]. GBSAR-Pi is based on a Raspberry Pi microcomputer and an FMCW module; the module can emit horizontally or vertically polarized EM waves.
Figure 4. Examples of raw radar data recorded with horizontal (left) and vertical (right) polarization; the observed scene contained a metal cuboid (Mc).
Figure 5. Examples of the SUB, AVG, MIX_ROWS, MIX_COL, and JOIN datasets. The examples are combinations of the matrices depicted in Figure 4, i.e., they represent a metal cuboid object.
Figure 6. Minimum, maximum, and average classification accuracy of the HH, VV, SUB, AVG, MIX_ROWS, MIX_COL, and JOIN models.
Figure 7. Minimum, maximum, and average accuracy of the HH, VV, JOIN, Ensemble, Siamese, and Siamese 2 models.
Figure 8. Architecture of the Siamese model with two separate branches for the HH and VV matrices. Each branch functions as a distinct HH or VV model; the branches' feature vectors are concatenated into a Siamese feature vector, which is then fed into a fully connected layer.
Figure 9. Comparison of the average per-class accuracy of the HH, VV, and JOIN models.
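As a rough illustration of how dual-polarization GBSAR matrices can be fed to a modified ResNet18, the sketch below simply stacks the HH and VV matrices as two input channels. This is a stand-in under stated assumptions, not the paper's exact modification (which also preserves the horizontal dimension through the network and compares JOIN/SUB/AVG/MIX merging strategies and a Siamese variant); the class name `DualPolResNet` and all shapes are hypothetical.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class DualPolResNet(nn.Module):
    """Classifier taking HH and VV raw-data matrices as a two-channel input.

    Sketch only: the two polarizations are stacked as input channels and fed
    to a standard ResNet18 whose stem and head are adapted accordingly.
    """
    def __init__(self, num_classes: int):
        super().__init__()
        self.backbone = resnet18(weights=None)
        # Replace the 3-channel RGB stem with a 2-channel stem (HH + VV).
        self.backbone.conv1 = nn.Conv2d(2, 64, kernel_size=7, stride=2,
                                        padding=3, bias=False)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, num_classes)

    def forward(self, hh: torch.Tensor, vv: torch.Tensor) -> torch.Tensor:
        # hh, vv: (batch, H, W) raw GBSAR matrices of the same scene.
        x = torch.stack([hh, vv], dim=1)   # -> (batch, 2, H, W)
        return self.backbone(x)

# Toy usage with random tensors standing in for raw 24 GHz GBSAR recordings.
model = DualPolResNet(num_classes=4)
hh = torch.randn(8, 128, 256)
vv = torch.randn(8, 128, 256)
logits = model(hh, vv)
print(logits.shape)   # torch.Size([8, 4])
```

Channel stacking is the simplest way to let early convolutions mix the two polarizations; a Siamese design, by contrast, keeps separate branches and merges feature vectors late, as in the paper's Figure 8.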
21 pages, 5293 KiB  
Article
A Radar Echo Simulator for the Synthesis of Randomized Training Data Sets in the Context of AI-Based Applications
by Jonas Schorlemer, Jochen Altholz, Jan Barowski, Christoph Baer, Ilona Rolfes and Christian Schulz
Sensors 2024, 24(3), 836; https://doi.org/10.3390/s24030836 - 27 Jan 2024
Viewed by 1307
Abstract
Supervised machine learning algorithms usually require huge labeled data sets to produce sufficiently good results. For many applications, these data sets are still not available today, and the reasons for this can be manifold. As a solution, the missing training data can be generated by fast simulators. This procedure is well studied and allows possible gaps in the training data to be filled, which can further improve the results of a machine learning model. For this reason, this article deals with the development of a two-dimensional electromagnetic field simulator for modeling the response of a radar sensor in an imaging system based on the synthetic aperture radar principle. The creation of completely random scenes is essential to achieve data sets with large variance. Therefore, special emphasis is placed on the development of methods for creating random objects, which can then be assembled into an entire scene. In the context of this contribution, we focus on humanitarian demining with regard to improvised explosive devices using a ground-penetrating radar system. This is an area where the use of trained classifiers is of great importance, but in practice there are few, if any, labeled data sets for the training process. The simulation results show good agreement with the measurement results obtained in a previous contribution, demonstrating the possibility of enhancing sparse training data sets with synthetic data.
(This article belongs to the Special Issue Sensing and Signal Analysis in Synthetic Aperture Radar Systems)
Figures

Figure 1. Procedure of the training process. The description parameters are passed to a randomizer, which generates the scene for the simulator; the simulator generates the training data, which are cross-validated against the real measurements and then used to train the classifier.
Figure 2. Refraction situation for a single antenna position with an idealized point target buried in soil material with a flat surface.
Figure 3. Generation of a random object: (a) randomized curve interpreted as the radius as a function of the angle; (b) interpolation of the curve on a two-dimensional grid using a polar coordinate system; (c) two-dimensional noise; (d) the noise filtered with a Gaussian filter; (e) the final object.
Figure 4. Continuous deformation of a reference object.
Figure 5. Generation of randomized objects by deformation of a reference object: (a) deformation using a rectangular reference or a reference of nested squares; (b) mine models built from the references, where the numbers in the left-hand model denote (1) plastic container, (2) additional metal pieces, (3) detonator, (4) explosive material, (5) air gap, and (6) battery.
Figure 6. Implementation of the proposed simulation approach.
Figure 7. Simulated ground scenes with the corresponding normalized SAR images, including the contour of the mine: (a) reference image containing the mine model in a perfectly homogeneous material with all losses neglected; (b) mine model in an inhomogeneous soil material with a non-planar surface, considering dielectric losses; (c) mine model with additional dielectric scatterers inside the ground material; (d) complete scene with additional metal interferers and vegetation on the earth surface [39].
Figure 8. Simulation of randomly generated IEDs in a GPR environment: (a) an IED built from a square reference in a homogeneous and lossless soil material with a flat earth surface; (b) an IED built from a nested-square reference representing a bottle; (c,d) IEDs in a complex environment containing dielectric and metallic scatterers as well as plants on the earth surface.
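The random-object idea summarized in Figure 3 (a randomized radius-versus-angle curve combined with Gaussian-filtered two-dimensional noise) can be sketched roughly as follows. This is only a loose illustration under assumed parameters; the function `random_blob` and its defaults are not taken from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def random_blob(size=128, base_radius=0.3, wobble=0.1, noise_strength=0.15,
                noise_sigma=6.0, seed=None):
    """Generate a random binary object mask from a randomized radius-vs-angle
    curve perturbed by smoothed 2-D noise (illustrative parameters only)."""
    rng = np.random.default_rng(seed)

    # 1) Random, smooth radius as a function of angle (low-order harmonics).
    n_harmonics = 5
    amp = wobble * rng.standard_normal(n_harmonics) / np.arange(1, n_harmonics + 1)
    phase = rng.uniform(0, 2 * np.pi, n_harmonics)

    yy, xx = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
    r = np.hypot(xx, yy)
    theta = np.arctan2(yy, xx)
    radius = base_radius + sum(a * np.cos((k + 1) * theta + p)
                               for k, (a, p) in enumerate(zip(amp, phase)))

    # 2) Smoothed 2-D noise perturbs the boundary further (cf. panels c and d).
    noise = gaussian_filter(rng.standard_normal((size, size)), noise_sigma)
    noise = noise_strength * noise / (np.abs(noise).max() + 1e-12)

    # 3) Final object: pixels whose perturbed radius lies inside the curve.
    return (r + noise) < radius

mask = random_blob(seed=1)
print(mask.shape, mask.mean())   # fraction of pixels belonging to the object
```

Objects produced this way can then be assigned material properties and placed into a scene before the forward simulation of the radar echo.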
18 pages, 164375 KiB  
Article
Evaluating Urban Building Damage of 2023 Kahramanmaras, Turkey Earthquake Sequence Using SAR Change Detection
by Xiuhua Wang, Guangcai Feng, Lijia He, Qi An, Zhiqiang Xiong, Hao Lu, Wenxin Wang, Ning Li, Yinggang Zhao, Yuedong Wang and Yuexin Wang
Sensors 2023, 23(14), 6342; https://doi.org/10.3390/s23146342 - 12 Jul 2023
Cited by 15 | Viewed by 4431
Abstract
On February 6, 2023 (local time), two earthquakes (Mw7.8 and Mw7.7) struck central and southern Turkey, causing extensive damage to several cities and claiming a toll of 40,000 lives. In this study, we propose a method for seismic building damage assessment and analysis that combines SAR amplitude and phase coherence change detection. We determined building damage in five severely impacted urban areas and calculated the damage ratio by measuring the urban area and the damaged area. The largest damage ratio, 18.93%, is observed in Nurdagi, and the smallest, 7.59%, is found in Islahiye. We verified the results by comparing them with high-resolution optical images and AI recognition results from the Microsoft team. We also used pixel offset tracking (POT) and D-InSAR technology to obtain surface deformation from Sentinel-1A images and analyzed the relationship between surface deformation and post-earthquake urban building damage. The results show that Nurdagi has the largest urban average surface deformation of 0.48 m and Antakya has the smallest deformation of 0.09 m. We found that buildings in areas with steeper slopes or closer to the earthquake faults have a higher risk of collapse. We also discussed the influence of SAR image parameters on building change recognition. Image resolution and observation geometry have a great influence on the change detection results, and the resolution can be improved by various means to raise the recognition accuracy. Our research findings can guide earthquake disaster assessment and analysis and identify influential factors of earthquake damage.
(This article belongs to the Special Issue Sensing and Signal Analysis in Synthetic Aperture Radar Systems)
Figures

Figure 1. Flowchart used in this study.
Figure 2. Surface deformation of the five urban areas caused by the 2023 Mw7.7 and Mw7.8 Turkey earthquakes, obtained by POT. (a) Surface deformation of central southern Turkey: the blue box indicates the Sentinel-1 image coverage; the dark gray lines are active faults; the magenta lines are seismogenic faults; and the red pentagrams indicate the epicenters of the two main earthquakes and the Mw6.8 aftershock (epicenters and focal mechanisms from GCMT). The red dots mark the affected cities. The inset shows the regional seismotectonic background, with the study area (red box), the North Anatolian fault zone (NAFZ) to the north, and the East Anatolian fault zone (EAFZ) and Dead Sea fault zone (DSFZ) to the south. (b–f) Zoom-ins of the deformation maps of the five cities; the red polygons delineate the urban boundaries.
Figure 3. Urban building damage identification results with descending data. The red polygons delineate the urban boundaries; yellow-filled polygons are the identified damaged buildings.
Figure 4. Comparison of the building damage identification results with optical images in urban areas: pre-earthquake optical images (first column); post-earthquake optical images with superimposed damage-area contours (second column, cyan polygons); post-earthquake images with damage-area contours and recognition results (third column, yellow-filled polygons); and post-earthquake images with damage-area contours and coherence results (fourth column). (a–d) Turkoglu; (e–h) Islahiye; (i–p) Marash; (q–t) Nurdagi.
Figure 5. Comparison of the urban building damage identification results with the Microsoft artificial intelligence identification results: post-earthquake optical images (first column); images superimposed with the Microsoft team's AI results, where red blocks are damaged and blue blocks undamaged buildings (second column); and images superimposed with the identification results of this study (third column). (a) Marash; (b) Turkoglu; (c) Nurdagi; (d) Islahiye.
Figure 6. Building damage identification results; the numbers of damaged buildings come from the Microsoft team's artificial intelligence identification results.
Figure 7. Radar chart of the factors influencing building damage.
Figure 8. Comprehensive analysis chart of the influencing factors: the upper red box marks the range of the damage ratios of the five cities, and the lower red box marks the range of the superposed average slope and distance between the urban center and the seismogenic fault for the five cities.
Figure 9. Building echo examples. Gray-filled rectangles are buildings that receive the radar microwave signal, and the white-filled rectangle is a building that does not; black arrows are radar signals. (a) The microwave signal arrives from the left of the building; (b) from the right.
Figure 10. Urban building damage identification results with ascending data. The red polygons delineate the urban boundaries; yellow-filled polygons are the identified damaged buildings.
Figure A1. Surface deformation of the five urban areas obtained by D-InSAR. (a) Surface deformation of central southern Turkey, with the Sentinel-1 image coverage (blue box), active faults (dark gray), seismogenic faults (yellow), the epicenters of the two main earthquakes and the Mw6.8 aftershock (magenta pentagrams, from GCMT), and the affected cities (magenta circles with white outlines); the inset shows the unwrapped surface deformation results for central southern Turkey. (b–f) Zoom-ins of the deformation maps of the five cities; the red polygons delineate the urban boundaries.
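A minimal sketch of the kind of combined amplitude/coherence change detection described in the abstract is given below, together with the damage-ratio computation over an urban mask. The decision rule, the thresholds, and the variable names are assumptions chosen for illustration, not the workflow or values used by the authors.

```python
import numpy as np

def damage_map(amp_pre, amp_post, coh_pre, coh_post,
               amp_ratio_db=3.0, coh_drop=0.3):
    """Flag pixels as damaged when the backscatter amplitude changes strongly
    AND the interferometric coherence drops after the event.
    Thresholds (3 dB amplitude log-ratio, 0.3 coherence drop) are illustrative.
    """
    eps = 1e-10
    log_ratio = np.abs(20.0 * np.log10((amp_post + eps) / (amp_pre + eps)))
    coherence_loss = coh_pre - coh_post
    return (log_ratio > amp_ratio_db) & (coherence_loss > coh_drop)

def damage_ratio(damage, urban_mask):
    """Damaged-area fraction inside the urban boundary (cf. the per-city
    damage ratio reported in the paper)."""
    urban = urban_mask.astype(bool)
    return damage[urban].mean() if urban.any() else 0.0

# Toy usage with synthetic, co-registered pre-/post-event layers.
rng = np.random.default_rng(0)
shape = (512, 512)
amp_pre = rng.gamma(2.0, 1.0, shape)
amp_post = amp_pre * rng.uniform(0.3, 3.0, shape)   # simulated change
coh_pre = rng.uniform(0.6, 0.95, shape)
coh_post = coh_pre - rng.uniform(0.0, 0.6, shape)
urban = np.zeros(shape, dtype=bool)
urban[100:400, 100:400] = True                      # toy urban boundary

dmg = damage_map(amp_pre, amp_post, coh_pre, coh_post)
print(f"damage ratio inside urban mask: {100 * damage_ratio(dmg, urban):.2f}%")
```

In practice the pre- and post-event images must be co-registered, speckle-filtered, and masked to the urban boundary before thresholds are applied, and the result is validated against optical imagery, as the paper does.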

Review


18 pages, 4874 KiB  
Review
Review on Phase Synchronization Methods for Spaceborne Multistatic Synthetic Aperture Radar
by Qiang Lin, Shiqiang Li and Weidong Yu
Sensors 2024, 24(10), 3122; https://doi.org/10.3390/s24103122 - 14 May 2024
Viewed by 1005
Abstract
Multistatic synthetic aperture radar (SAR) is a special mode of SAR system in which the radar transmitter and receiver are located on different satellites, which brings many advantages, such as flexible baseline configuration, diverse receiving modes, and more detailed ground-object classification information. Multistatic SAR has been widely used in interferometry, moving target detection, three-dimensional imaging, and other fields. However, the frequency offset between the different oscillators causes a modulation phase error in the signal. Therefore, phase synchronization is one of the most critical problems to be addressed in distributed SAR systems. This article reviews phase synchronization techniques, which are mainly divided into two categories: synchronization via a direct microwave link and synchronization via data-based estimation algorithms. Furthermore, the future development of synchronization technology is anticipated.
(This article belongs to the Special Issue Sensing and Signal Analysis in Synthetic Aperture Radar Systems)
Figures

Figure 1. Types of multistatic SAR system: (a) fully active SAR system; (b) semi-active SAR system.
Figure 2. Schematic diagram of formation configurations: (a) TechSat-21 formation [7]; (b) cartwheel formation; (c) pendulum formation [5]; (d) dual-helix formation [3].
Figure 3. Synchronization in multistatic systems: (a) time synchronization; (b) phase synchronization; (c) beam synchronization.
Figure 4. Diagram of phase error generation in a multistatic SAR system.
Figure 5. Phase error power spectrum of monostatic and bistatic SAR echoes generated by oscillator phase noise; S_φ,osc(f) uses {h_4 = −95 dB, h_3 = −90 dB, h_2 = −200 dB, h_1 = −130 dB, h_0 = −155 dB}.
Figure 6. Block diagram of the phase synchronization and SAR processing flow.
Figure 7. Synchronization scheme proposed in [39].
Figure 8. (a) Configuration and orientation of the synchrohorn on TDX [46], where 1–6 denote the six synchronization antennas; (b) pulse-exchange synchronization of the TanDEM-X satellites [5]; (c) illustration of missing echo data.
Figure 9. Timing diagrams of non-interrupted synchronization pulse exchange in the LuTan-1 mission [50].
Figure 10. (a) Phase synchronization based on GNSS data [74]; (b) zero-baseline phase and frequency synchronization experiment [40]; (c) short-baseline phase and frequency synchronization experiments [40].
Figure 11. Potential synchronization links of Mirror SAR [18]: (a) a phase-preserving radar data link; (b) double-mirror synchronization.
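For orientation, the oscillator phase-noise coefficients quoted in the Figure 5 caption can be plugged into the commonly used power-law model S_φ(f) = Σ_i h_i f^(−i). The sketch below assumes that model and a simple dB-to-linear conversion; both are assumptions made here for illustration, not necessarily the exact definitions used in the review.

```python
import numpy as np

# h_i coefficients quoted in the Figure 5 caption, given in dB.
H_DB = {4: -95.0, 3: -90.0, 2: -200.0, 1: -130.0, 0: -155.0}

def oscillator_phase_psd(f, h_db=H_DB):
    """Phase-noise power spectral density of an oscillator, assuming the
    power-law model S_phi(f) = sum_i h_i * f**(-i) with h_i converted
    from dB to linear units (assumed convention)."""
    f = np.asarray(f, dtype=float)
    return sum(10.0 ** (db / 10.0) * f ** (-i) for i, db in h_db.items())

# Evaluate the PSD over a typical synchronization bandwidth: 1 mHz to 10 kHz.
f = np.logspace(-3, 4, 500)
s_phi_db = 10.0 * np.log10(oscillator_phase_psd(f))
print(f"S_phi at 1 Hz: {10 * np.log10(oscillator_phase_psd(1.0)):.1f} dB")
```

Such a spectrum, scaled by the carrier frequency and integration time, is what determines how much residual phase error a synchronization link or a data-based estimator must remove before bistatic focusing and interferometry become possible.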