Vessel Target Detection in Spaceborne–Airborne Collaborative SAR Images via Proposal and Polarization Fusion
<p>Process of spaceborne–airborne collaborative SAR observation and the associated vessel target detection task. The spaceborne platform performs global, large-area observation, while the airborne platform performs local, focused observation. For the local area of interest, the collaborative spaceborne SAR image <math display="inline"><semantics> <msub> <mi mathvariant="bold">I</mi> <mi>s</mi> </msub> </semantics></math> and airborne SAR image <math display="inline"><semantics> <msub> <mi mathvariant="bold">I</mi> <mi>a</mi> </msub> </semantics></math> are fused; the fused image is then thresholded for vessel target detection.</p>
<p>Flowchart of the proposed TPPIE image fusion method.</p>
<p>Experimental data acquired from the spaceborne–airborne SAR collaboration system. Subgraphs (<b>a</b>–<b>c</b>) are the spaceborne SAR images, subgraphs (<b>d</b>–<b>i</b>) are the dual-polarization airborne SAR images, and subgraphs (<b>j</b>–<b>l</b>) are the corresponding ground truths. Three scenes are considered in our experiments, which correspond to the groups (<b>a</b>,<b>d</b>,<b>g</b>,<b>j</b>), (<b>b</b>,<b>e</b>,<b>h</b>,<b>k</b>), and (<b>c</b>,<b>f</b>,<b>i</b>,<b>l</b>), respectively.</p>
<p>Illustrative example of TSPM generation: (<b>a</b>) Proposal matrix <math display="inline"><semantics> <msub> <mi mathvariant="bold">P</mi> <mi>s</mi> </msub> </semantics></math> extracted from the spaceborne SAR image <math display="inline"><semantics> <msub> <mi mathvariant="bold">I</mi> <mi>s</mi> </msub> </semantics></math>; (<b>b</b>) Proposal matrix <math display="inline"><semantics> <msub> <mi mathvariant="bold">P</mi> <mi>a</mi> </msub> </semantics></math> extracted from the airborne SAR image <math display="inline"><semantics> <msub> <mi mathvariant="bold">I</mi> <mi>a</mi> </msub> </semantics></math>; (<b>c</b>) TSPM <math display="inline"><semantics> <msub> <mi mathvariant="bold">T</mi> <mi>s</mi> </msub> </semantics></math> with respect to <math display="inline"><semantics> <msub> <mi mathvariant="bold">I</mi> <mi>s</mi> </msub> </semantics></math>; (<b>d</b>) TSPM <math display="inline"><semantics> <msub> <mi mathvariant="bold">T</mi> <mi>a</mi> </msub> </semantics></math> with respect to <math display="inline"><semantics> <msub> <mi mathvariant="bold">I</mi> <mi>a</mi> </msub> </semantics></math>; and (<b>e</b>) Intersected TSPM <math display="inline"><semantics> <msub> <mi mathvariant="bold">T</mi> <mi>f</mi> </msub> </semantics></math>. The ground truth boundaries of vessel targets are also plotted as red lines. By using the intersected TSPM <math display="inline"><semantics> <msub> <mi mathvariant="bold">T</mi> <mi>f</mi> </msub> </semantics></math>, we can significantly reduce the false alarms induced by the inshore interferences and sea clutter, while retaining the potential target regions in <math display="inline"><semantics> <msub> <mi mathvariant="bold">I</mi> <mi>s</mi> </msub> </semantics></math> and <math display="inline"><semantics> <msub> <mi mathvariant="bold">I</mi> <mi>a</mi> </msub> </semantics></math>.</p>
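The intersection step illustrated above can be sketched in a few lines. This toy example (hypothetical 3×3 binary proposal matrices, not the paper's data) shows only the elementwise intersection of two platform-specific masks, not the full TSPM construction:

```python
import numpy as np

# Hypothetical binary proposal matrices (1 = pixel lies inside a target proposal).
P_s = np.array([[1, 1, 0],
                [0, 1, 0],
                [1, 0, 0]])  # from the spaceborne image
P_a = np.array([[1, 0, 0],
                [0, 1, 1],
                [0, 0, 0]])  # from the airborne image

# Keep only the regions flagged by BOTH platforms; single-sensor false alarms
# (sea clutter, inshore interference seen by only one sensor) are suppressed.
T_f = np.logical_and(P_s, P_a).astype(int)
```

Only the pixels marked in both `P_s` and `P_a` survive in `T_f`, which mirrors how the intersected TSPM retains potential target regions while discarding single-image false alarms.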
<p>Illustrative example of APR feature exploitation: (<b>a</b>) Airborne SAR image with horizontal polarization, denoted as <math display="inline"><semantics> <msubsup> <mi mathvariant="bold">I</mi> <mi>a</mi> <mrow> <mi>H</mi> <mi>H</mi> </mrow> </msubsup> </semantics></math>; (<b>b</b>) Airborne SAR image with vertical polarization, denoted as <math display="inline"><semantics> <msubsup> <mi mathvariant="bold">I</mi> <mi>a</mi> <mrow> <mi>V</mi> <mi>V</mi> </mrow> </msubsup> </semantics></math>; (<b>c</b>) PR map with respect to <math display="inline"><semantics> <msubsup> <mi mathvariant="bold">I</mi> <mi>a</mi> <mrow> <mi>H</mi> <mi>H</mi> </mrow> </msubsup> </semantics></math> and <math display="inline"><semantics> <msubsup> <mi mathvariant="bold">I</mi> <mi>a</mi> <mrow> <mi>V</mi> <mi>V</mi> </mrow> </msubsup> </semantics></math>; (<b>d</b>) APR map with respect to <math display="inline"><semantics> <msubsup> <mi mathvariant="bold">I</mi> <mi>a</mi> <mrow> <mi>H</mi> <mi>H</mi> </mrow> </msubsup> </semantics></math> and <math display="inline"><semantics> <msubsup> <mi mathvariant="bold">I</mi> <mi>a</mi> <mrow> <mi>V</mi> <mi>V</mi> </mrow> </msubsup> </semantics></math>; and (<b>e</b>) Composite map exploiting the APR feature, as well as the original SAR images. This composite map achieves high contrast between vessel targets and sea clutter, thus improving the TCR performance.</p>
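A minimal sketch of the polarization-ratio idea behind the PR map follows. Note the assumptions: the paper's exact PR and APR definitions in (6)–(8) are not reproduced in this excerpt, so this uses one common elementwise-ratio formulation on hypothetical toy amplitudes:

```python
import numpy as np

eps = 1e-6  # guard against division by zero over dark sea pixels

# Toy HH and VV amplitude maps (hypothetical values, not the paper's data).
I_hh = np.array([[0.9, 0.1],
                 [0.8, 0.2]])
I_vv = np.array([[0.3, 0.4],
                 [0.2, 0.5]])

# Elementwise polarization ratio: vessels and sea clutter respond differently
# to HH vs. VV, so the ratio raises target-to-clutter contrast.
PR = I_hh / (I_vv + eps)
```

Pixels where the HH and VV responses differ strongly (typical of man-made scatterers such as vessels) stand out in `PR`, which is the contrast the APR map then exploits.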
<p>Experimental results for scene 1: (<b>a</b>) Spaceborne image; (<b>b</b>) airborne image; (<b>c</b>) additive fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>]; (<b>d</b>) multiplicative fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>]; (<b>e</b>) PCA fusion [<a href="#B12-remotesensing-13-03957" class="html-bibr">12</a>]; (<b>f</b>) DWT fusion [<a href="#B13-remotesensing-13-03957" class="html-bibr">13</a>]; (<b>g</b>) fuzzy fusion [<a href="#B16-remotesensing-13-03957" class="html-bibr">16</a>]; (<b>h</b>) AANN fusion [<a href="#B18-remotesensing-13-03957" class="html-bibr">18</a>]; (<b>i</b>) proposed ITSPM fusion; and (<b>j</b>) proposed TPPIE fusion. The proposed methods can achieve better fused images with enhanced target regions and suppressed background regions, compared to the other image fusion approaches.</p>
<p>Experimental results for scene 2: (<b>a</b>) Spaceborne image; (<b>b</b>) airborne image; (<b>c</b>) additive fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>]; (<b>d</b>) multiplicative fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>]; (<b>e</b>) PCA fusion [<a href="#B12-remotesensing-13-03957" class="html-bibr">12</a>]; (<b>f</b>) DWT fusion [<a href="#B13-remotesensing-13-03957" class="html-bibr">13</a>]; (<b>g</b>) fuzzy fusion [<a href="#B16-remotesensing-13-03957" class="html-bibr">16</a>]; (<b>h</b>) AANN fusion [<a href="#B18-remotesensing-13-03957" class="html-bibr">18</a>]; (<b>i</b>) proposed ITSPM fusion; and (<b>j</b>) proposed TPPIE fusion. The proposed methods can achieve better fused images with enhanced target regions and suppressed background regions, compared to the other image fusion approaches.</p>
<p>Experimental results for scene 3: (<b>a</b>) Spaceborne image; (<b>b</b>) airborne image; (<b>c</b>) additive fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>]; (<b>d</b>) multiplicative fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>]; (<b>e</b>) PCA fusion [<a href="#B12-remotesensing-13-03957" class="html-bibr">12</a>]; (<b>f</b>) DWT fusion [<a href="#B13-remotesensing-13-03957" class="html-bibr">13</a>]; (<b>g</b>) fuzzy fusion [<a href="#B16-remotesensing-13-03957" class="html-bibr">16</a>]; (<b>h</b>) AANN fusion [<a href="#B18-remotesensing-13-03957" class="html-bibr">18</a>]; (<b>i</b>) proposed ITSPM fusion; and (<b>j</b>) proposed TPPIE fusion. The proposed methods can achieve better fused images with enhanced target regions and suppressed background regions, compared to the other image fusion approaches.</p>
<p>Superpixel-based detection results of different methods for Experimental Scene 1 with respect to <math display="inline"><semantics> <mrow> <mi>PFA</mi> <mo>=</mo> <mn>0.03</mn> </mrow> </semantics></math>: (<b>a</b>) Spaceborne image processing (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.0804</mn> </mrow> </semantics></math>); (<b>b</b>) airborne image processing (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.4549</mn> </mrow> </semantics></math>); (<b>c</b>) additive fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.4191</mn> </mrow> </semantics></math>); (<b>d</b>) multiplicative fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.5464</mn> </mrow> </semantics></math>); (<b>e</b>) PCA fusion [<a href="#B12-remotesensing-13-03957" class="html-bibr">12</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.5356</mn> </mrow> </semantics></math>); (<b>f</b>) DWT fusion [<a href="#B13-remotesensing-13-03957" class="html-bibr">13</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.3249</mn> </mrow> </semantics></math>); (<b>g</b>) fuzzy fusion [<a href="#B16-remotesensing-13-03957" class="html-bibr">16</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.4314</mn> </mrow> </semantics></math>); (<b>h</b>) AANN fusion [<a href="#B18-remotesensing-13-03957" class="html-bibr">18</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.4704</mn> </mrow> </semantics></math>); (<b>i</b>) proposed ITSPM fusion (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mrow> <mn>0.8956</mn> </mrow> </mrow> </semantics></math>); and (<b>j</b>) proposed TPPIE fusion (PD = <b>0.9853</b>). The red lines denote the boundaries of target vessels. 
The proposed TPPIE fusion method had the best detection performance.</p>
<p>Superpixel-based detection results of different methods for Experimental Scene 2 with respect to <math display="inline"><semantics> <mrow> <mi>PFA</mi> <mo>=</mo> <mn>0.05</mn> </mrow> </semantics></math>: (<b>a</b>) Spaceborne image processing (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.2009</mn> </mrow> </semantics></math>); (<b>b</b>) airborne image processing (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.3922</mn> </mrow> </semantics></math>); (<b>c</b>) additive fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.3181</mn> </mrow> </semantics></math>); (<b>d</b>) multiplicative fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.6167</mn> </mrow> </semantics></math>); (<b>e</b>) PCA fusion [<a href="#B12-remotesensing-13-03957" class="html-bibr">12</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.3478</mn> </mrow> </semantics></math>); (<b>f</b>) DWT fusion [<a href="#B13-remotesensing-13-03957" class="html-bibr">13</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.2762</mn> </mrow> </semantics></math>); (<b>g</b>) fuzzy fusion [<a href="#B16-remotesensing-13-03957" class="html-bibr">16</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.3261</mn> </mrow> </semantics></math>); (<b>h</b>) AANN fusion [<a href="#B18-remotesensing-13-03957" class="html-bibr">18</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.4741</mn> </mrow> </semantics></math>); (<b>i</b>) proposed ITSPM fusion (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mrow> <mn>0.9492</mn> </mrow> </mrow> </semantics></math>); and (<b>j</b>) proposed TPPIE fusion (PD = <b>0.9909</b>). The red lines denote the boundaries of target vessels. 
The proposed TPPIE fusion method had the best detection performance.</p>
<p>Superpixel-based detection results of different methods for Experimental Scene 3 with respect to <math display="inline"><semantics> <mrow> <mi>PFA</mi> <mo>=</mo> <mn>0.1</mn> </mrow> </semantics></math>: (<b>a</b>) Spaceborne image processing (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.6133</mn> </mrow> </semantics></math>); (<b>b</b>) airborne image processing (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.3960</mn> </mrow> </semantics></math>); (<b>c</b>) additive fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.7253</mn> </mrow> </semantics></math>); (<b>d</b>) multiplicative fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.6850</mn> </mrow> </semantics></math>); (<b>e</b>) PCA fusion [<a href="#B12-remotesensing-13-03957" class="html-bibr">12</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.7420</mn> </mrow> </semantics></math>); (<b>f</b>) DWT fusion [<a href="#B13-remotesensing-13-03957" class="html-bibr">13</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.7243</mn> </mrow> </semantics></math>); (<b>g</b>) fuzzy fusion [<a href="#B16-remotesensing-13-03957" class="html-bibr">16</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.7528</mn> </mrow> </semantics></math>); (<b>h</b>) AANN fusion [<a href="#B18-remotesensing-13-03957" class="html-bibr">18</a>] (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mn>0.7305</mn> </mrow> </semantics></math>); (<b>i</b>) proposed ITSPM fusion (<math display="inline"><semantics> <mrow> <mi>PD</mi> <mo>=</mo> <mrow> <mn>0.8998</mn> </mrow> </mrow> </semantics></math>); and (<b>j</b>) proposed TPPIE fusion (PD = <b>0.9796</b>). The red lines denote the boundaries of target vessels. 
The proposed TPPIE fusion method had the best detection performance.</p>
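The PD figures above come from thresholding a fused image at a prescribed PFA. A simplified global-threshold stand-in for the superpixel-level CFAR detector can be sketched on synthetic data; everything here (Rayleigh clutter, the bright rectangular "vessel", the scale values) is a hypothetical construction, not the paper's detector or imagery:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fused image: Rayleigh-like sea clutter plus one bright vessel patch.
img = rng.rayleigh(scale=1.0, size=(64, 64))
truth = np.zeros((64, 64), dtype=bool)
truth[20:28, 30:44] = True  # ground-truth vessel region
img[truth] += 6.0           # vessel pixels are much brighter than clutter

# CFAR-style threshold: the (1 - PFA) quantile of the clutter distribution
# fixes the false-alarm rate, and PD is then measured on the target pixels.
pfa = 0.03
thr = np.quantile(img[~truth], 1.0 - pfa)
det = img > thr

pd = det[truth].mean()        # probability of detection (target pixels)
pfa_emp = det[~truth].mean()  # empirical false-alarm rate (clutter pixels)
```

With a well-separated target, PD approaches one while the empirical PFA stays near the prescribed value, which is the trade-off the figures above report per method.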
<p>ROC curves of different methods for Experimental Scene 1. When pursuing a relatively high PD (e.g., larger than 0.98), the PFAs of the proposed ITSPM and TPPIE fusion methods were far less than those of the other methods, including spaceborne image processing, airborne image processing, additive fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>], multiplicative fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>], PCA fusion [<a href="#B12-remotesensing-13-03957" class="html-bibr">12</a>], DWT fusion [<a href="#B13-remotesensing-13-03957" class="html-bibr">13</a>], fuzzy fusion [<a href="#B16-remotesensing-13-03957" class="html-bibr">16</a>], and AANN fusion [<a href="#B18-remotesensing-13-03957" class="html-bibr">18</a>].</p>
<p>ROC curves of different methods for Experimental Scene 2. When pursuing a relatively high PD (e.g., larger than 0.98), the PFAs of the proposed ITSPM and TPPIE fusion methods were far less than those of the other methods, including spaceborne image processing, airborne image processing, additive fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>], multiplicative fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>], PCA fusion [<a href="#B12-remotesensing-13-03957" class="html-bibr">12</a>], DWT fusion [<a href="#B13-remotesensing-13-03957" class="html-bibr">13</a>], fuzzy fusion [<a href="#B16-remotesensing-13-03957" class="html-bibr">16</a>], and AANN fusion [<a href="#B18-remotesensing-13-03957" class="html-bibr">18</a>].</p>
<p>ROC curves of different methods for Experimental Scene 3. When pursuing a relatively high PD (e.g., larger than 0.96), the PFAs of the proposed ITSPM and TPPIE fusion methods were far less than those of the other methods, including spaceborne image processing, airborne image processing, additive fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>], multiplicative fusion [<a href="#B10-remotesensing-13-03957" class="html-bibr">10</a>], PCA fusion [<a href="#B12-remotesensing-13-03957" class="html-bibr">12</a>], DWT fusion [<a href="#B13-remotesensing-13-03957" class="html-bibr">13</a>], fuzzy fusion [<a href="#B16-remotesensing-13-03957" class="html-bibr">16</a>], and AANN fusion [<a href="#B18-remotesensing-13-03957" class="html-bibr">18</a>].</p>
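An ROC curve of the kind compared above is traced by sweeping the detection threshold and recording one (PFA, PD) pair per threshold. The following sketch uses hypothetical Gaussian clutter/target score distributions; the paper's scores instead come from the fused SAR images:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy detection scores for clutter pixels and target pixels (assumed
# separations; purely illustrative).
clutter = rng.normal(0.0, 1.0, 1000)
targets = rng.normal(3.0, 1.0, 200)
scores = np.concatenate([clutter, targets])
labels = np.concatenate([np.zeros(1000, bool), np.ones(200, bool)])

# Sweep the threshold from high to low; each threshold yields one ROC point.
thresholds = np.sort(scores)[::-1]
pd = np.array([(scores[labels] > t).mean() for t in thresholds])
pfa = np.array([(scores[~labels] > t).mean() for t in thresholds])
# A better fusion method pushes this curve toward the top-left corner:
# high PD at small PFA.
```

Both `pd` and `pfa` grow monotonically as the threshold drops, which is why the comparison above is phrased as "PFA required to reach a given PD".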
Abstract
1. Introduction
1.1. Related Work
1.2. Main Contributions
1.3. Paper Outline
2. Vessel Detection through Spaceborne–Airborne Collaborative SAR Observations
3. Proposed Method
3.1. TSPM Generation
3.1.1. NG-Based TP Extraction
- SVM Classifier Training Step: Given training SAR images, the numbers of bounding boxes of vessel targets and of non-target backgrounds (e.g., sea clutter and inshore interferences) are specified. These bounding boxes are resized to a small common size. Then, for each resized bounding box, the NG value is calculated as [32]
- TP Extraction Step: Next, the TPs are extracted from the test spaceborne and airborne SAR images using the SVM classifier trained in the first step. In more detail, the test spaceborne or airborne SAR image is first resized to different pre-defined sizes and, for each resultant image, the corresponding NG map is calculated based on (2). Then, the resized TPs are extracted with the aid of a sliding window on each NG map and the trained SVM classifier. Finally, the TPs at the original scale are obtained by resizing all the NG maps back to the size of the original test image. To represent the regions occupied by the TPs in the spaceborne and airborne images, we introduce a proposal matrix for each image: an entry of the proposal matrix is equal to one if the corresponding pixel belongs to any TP; otherwise, it is equal to zero.
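The final rasterization step can be sketched as follows. The helper name `proposal_matrix` and the box coordinates are hypothetical; the sketch only shows how extracted TP bounding boxes become a binary proposal matrix, not the NG/SVM pipeline itself:

```python
import numpy as np

def proposal_matrix(shape, boxes):
    """Binary matrix whose entries are 1 inside any target proposal (TP).

    boxes: iterable of (row0, col0, row1, col1), half-open intervals,
    given in the coordinates of the original test image.
    """
    P = np.zeros(shape, dtype=np.uint8)
    for r0, c0, r1, c1 in boxes:
        P[r0:r1, c0:c1] = 1  # mark every pixel covered by this TP
    return P

# Hypothetical TPs returned by the trained classifier for a 10x10 test image.
P_s = proposal_matrix((10, 10), [(1, 1, 4, 5), (6, 7, 9, 10)])
```

Overlapping boxes simply re-mark the same pixels, so the matrix stays binary regardless of how many TPs cover a location.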
3.1.2. Construction of Edge-Based Morphological Candidate Map
3.2. APR Feature Exploitation
3.3. Generation of Fused Image
- First, we used 22 vessel target samples and 47 non-target samples to train the SVM classifier. These training samples were selected from the spaceborne SAR image corresponding to a global large-area observation, as shown in the left subgraph of Figure 1. The scattering-characteristic differences between target and non-target samples help to extract satisfactory TPs that cover most of the potential target regions and discriminate them from non-target regions, including inshore interferences and sea clutter. Moreover, the size of the resized bounding boxes in the training step and the size of the sliding window in the TP extraction step were both set to the value adopted in [31,32], which allows the algorithm to work well while maintaining low computational complexity;
- Then, we considered the shape and size of the structuring elements in (3). For our experiments, disk-shaped structuring elements were used in the morphological closing operation, as they coincide with the target boundaries better than other shapes. Moreover, the size of the structuring elements was set to be relatively large, comparable to the width of the vessel targets (i.e., about 12 m in our datasets). This setting ensures that as many candidate target regions as possible are extracted;
- Another consideration is the selection of the threshold parameter in (7) and the SF in (6). In general, the threshold parameter is set to a small positive number, which helps to discriminate the regions occupied by sea clutter from those occupied by vessel targets; for our experiments, a proper value was taken within an empirical range. The SF can be determined by histogram analysis of the PR map: it is set as the mean of the edge values of the highest (most populated) bin. For our experiments, the resulting SF values were generally within a small neighborhood around one.
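The histogram step for choosing the SF can be sketched directly. The toy PR values below are hypothetical, and the definitions of the SF in (6) and the threshold in (7) are not reproduced in this excerpt; only the "mean of the edge values of the highest bin" rule is shown:

```python
import numpy as np

# Toy PR map values (hypothetical); most sea pixels cluster near one.
pr_map = np.array([0.8, 0.9, 1.0, 1.0, 1.1, 1.1, 1.1, 1.2, 2.5])

counts, edges = np.histogram(pr_map, bins=10)
k = np.argmax(counts)                  # index of the most populated bin
sf = 0.5 * (edges[k] + edges[k + 1])   # mean of that bin's two edge values
```

Because sea clutter dominates the PR map, the highest bin sits near the clutter mode, so `sf` lands close to one, matching the empirical observation in the text.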
Algorithm 1. The proposed TPPIE method.
- Stage 1: TSPM Generation
4. Experimental Results
4.1. Dataset Description
4.2. Performance Measures
4.3. Illustrative Examples for Intersected TSPM in (5) and Composite Map Exploiting APR in (8)
4.4. Image Quality Evaluation
4.5. Detection Performance Evaluation
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Rosen, P.A.; Hensley, S.; Wheeler, K.; Sadowy, G.; Miller, T.; Shaffer, S.; Muellerschoen, R.; Jones, C.; Madsen, S.; Zebker, H. UAVSAR: New NASA airborne SAR system for research. IEEE Aerosp. Electron. Syst. Mag. 2007, 22, 21–28.
- Solimene, R.; Catapano, I.; Gennarelli, G.; Cuccaro, A.; Dell’Aversano, A.; Soldovieri, F. SAR imaging algorithms and some unconventional applications: A unified mathematical overview. IEEE Signal Process. Mag. 2014, 31, 90–98.
- Tan, Z.; Zhang, Z.; Xing, T.; Huang, X.; Gong, J.; Ma, J. Exploit Direction Information for Remote Ship Detection. Remote Sens. 2021, 13, 2155.
- Zhang, T.; Jiang, L.; Xiang, D.; Ban, Y.; Pei, L.; Xiong, H. Ship detection from PolSAR imagery using the ambiguity removal polarimetric notch filter. ISPRS J. Photogramm. Remote Sens. 2019, 157, 41–58.
- Rodger, M.; Guida, R. Classification-Aided SAR and AIS Data Fusion for Space-Based Maritime Surveillance. Remote Sens. 2021, 13, 104.
- Yang, Z.; Li, Z.; Zhu, J.; Wang, Y.; Wu, L. Use of SAR/InSAR in Mining Deformation Monitoring, Parameter Inversion, and Forward Predictions: A Review. IEEE Geosci. Remote Sens. Mag. 2020, 8, 71–90.
- Iancu, B.; Soloviev, V.; Zelioli, L.; Lilius, J. ABOships-An Inshore and Offshore Maritime Vessel Detection Dataset with Precise Annotations. Remote Sens. 2021, 13, 988.
- Pelich, R.; Longépé, N.; Mercier, G.; Hajduch, G.; Garello, R. Vessel refocusing and velocity estimation on SAR imagery using the fractional Fourier transform. IEEE Trans. Geosci. Remote Sens. 2015, 54, 1670–1684.
- Ao, D.; Datcu, M.; Schwarz, G.; Hu, C. Moving ship velocity estimation using TanDEM-X data based on subaperture decomposition. IEEE Geosci. Remote Sens. Lett. 2018, 15, 1560–1564.
- Salentinig, A.; Gamba, P. A general framework for urban area extraction exploiting multiresolution SAR data fusion. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2016, 9, 2009–2018.
- Gupta, S.; Singh, D.; Kumar, S. Fusion of texture and wavelet features of PALSAR image using LDA and PCA for land cover classification. Int. J. Image Data Fusion 2017, 8, 354–374.
- Wu, T.; Ren, Q.; Chen, X.; Niu, L.; Ruan, X. Highway bridge detection based on PCA fusion in airborne multiband high resolution SAR images. In Proceedings of the 2011 International Symposium on Image and Data Fusion, Tengchong, China, 9–11 August 2011; pp. 1–3.
- Yue, J.; Yang, R.; Huan, R. Pixel level fusion for multiple SAR images using PCA and wavelet transform. In Proceedings of the 2006 CIE International Conference on Radar, Shanghai, China, 16–19 October 2006; pp. 1–4.
- Chen, X.; Chen, W. Double-layer fuzzy fusion for multiview through-wall radar images. IEEE Geosci. Remote Sens. Lett. 2015, 12, 2075–2079.
- Seng, C.H.; Bouzerdoum, A.; Amin, M.G.; Phung, S.L. Probabilistic fuzzy image fusion approach for radar through wall sensing. IEEE Trans. Image Process. 2013, 22, 4938–4951.
- Filippidis, A.; Jain, L.C.; Martin, N. Fuzzy rule based fusion technique to automatically detect aircraft in SAR images. In Proceedings of the 1st International Conference on Conventional and Knowledge Based Intelligent Electronic Systems, Adelaide, SA, Australia, 21–23 May 1997; Volume 2, pp. 435–441.
- Yang, X.; Wang, J.; Zhu, R. Random walks for synthetic aperture radar image fusion in framelet domain. IEEE Trans. Image Process. 2017, 27, 851–865.
- Fasano, L.; Latini, D.; Machidon, A.; Clementini, C.; Schiavon, G.; Del Frate, F. SAR Data Fusion Using Nonlinear Principal Component Analysis. IEEE Geosci. Remote Sens. Lett. 2019, 17, 1543–1547.
- Ghasrodashti, E.K.; Karami, A.; Heylen, R.; Scheunders, P. Spatial resolution enhancement of hyperspectral images using spectral unmixing and bayesian sparse representation. Remote Sens. 2017, 9, 541.
- Mao, Q.; Peng, J.; Wang, Y. Resolution Enhancement of Remotely Sensed Land Surface Temperature: Current Status and Perspectives. Remote Sens. 2021, 13, 1306.
- Zare, M.; Helfroush, M.S.; Kazemi, K.; Scheunders, P. Hyperspectral and Multispectral Image Fusion Using Coupled Non-Negative Tucker Tensor Decomposition. Remote Sens. 2021, 13, 2930.
- Ghassemian, H. A review of remote sensing image fusion methods. Inf. Fusion 2016, 32, 75–89.
- Shuangao, W.; Padmanaban, R.; Mbanze, A.A.; Silva, J.; Shamsudeen, M.; Cabral, P.; Campos, F.S. Using satellite image fusion to evaluate the impact of land use changes on ecosystem services and their economic values. Remote Sens. 2021, 13, 851.
- Yu, Y.; Wang, B.; Zhang, L. Hebbian-based neural networks for bottom-up visual attention and its applications to ship detection in SAR images. Neurocomputing 2011, 74, 2008–2017.
- Zhai, L.; Li, Y.; Su, Y. Inshore ship detection via saliency and context information in high-resolution SAR images. IEEE Geosci. Remote Sens. Lett. 2016, 13, 1870–1874.
- Wang, S.; Wang, M.; Yang, S.; Jiao, L. New hierarchical saliency filtering for fast ship detection in high-resolution SAR images. IEEE Trans. Geosci. Remote Sens. 2017, 55, 351–362.
- Li, T.; Liu, Z.; Xie, R.; Ran, L. An improved superpixel-level CFAR detection method for ship targets in high-resolution SAR images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 184–194.
- An, Q.; Pan, Z.; Liu, L.; You, H. DRBox-v2: An improved detector with rotatable boxes for target detection in SAR images. IEEE Trans. Geosci. Remote Sens. 2019, 57, 8333–8349.
- Liu, Y.; Zhang, M.; Xu, P.; Guo, Z. SAR ship detection using sea-land segmentation-based convolutional neural network. In Proceedings of the 2017 International Workshop on Remote Sensing with Intelligent Processing (RSIP), Shanghai, China, 18–21 May 2017; pp. 1–4.
- Liu, N.; Cao, Z.; Cui, Z.; Pi, Y.; Dang, S. Multi-scale proposal generation for ship detection in SAR images. Remote Sens. 2019, 11, 526.
- Dai, H.; Du, L.; Wang, Y.; Wang, Z. A modified CFAR algorithm based on object proposals for ship target detection in SAR images. IEEE Geosci. Remote Sens. Lett. 2016, 13, 1925–1929.
- Cheng, M.M.; Zhang, Z.; Lin, W.Y.; Torr, P. BING: Binarized normed gradients for objectness estimation at 300fps. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 3286–3293.
- Seng, C.H.; Bouzerdoum, A.; Amin, M.G.; Phung, S.L. Two-stage fuzzy fusion with applications to through-the-wall radar imaging. IEEE Geosci. Remote Sens. Lett. 2013, 10, 687–691.
- Zhang, T.; Yang, Z.; Xiong, H. PolSAR Ship Detection Based on the Polarimetric Covariance Difference Matrix. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 3348–3359.
- Liu, T.; Yang, Z.; Marino, A.; Gao, G.; Yang, J. PolSAR Ship Detection Based on Neighborhood Polarimetric Covariance Matrix. IEEE Trans. Geosci. Remote Sens. 2021, 59, 4874–4887.
- Hong, Y.; Leng, C.; Zhang, X.; Pei, Z.; Cheng, I.; Basu, A. HOLBP: Remote Sensing Image Registration Based on Histogram of Oriented Local Binary Pattern Descriptor. Remote Sens. 2021, 13, 2328.
- Cao, S.Y.; Shen, H.L.; Chen, S.J.; Li, C. Boosting Structure Consistency for Multispectral and Multimodal Image Registration. IEEE Trans. Image Process. 2020, 29, 5147–5162.
- Zimmer, V.A.; Ballester, M.Á.G.; Piella, G. Multimodal image registration using Laplacian commutators. Inf. Fusion 2019, 49, 130–145.
- Li, Z.; Zhang, H.; Huang, Y. A Rotation-Invariant Optical and SAR Image Registration Algorithm Based on Deep and Gaussian Features. Remote Sens. 2021, 13, 2628.
- Reigber, A.; Scheiber, R.; Jager, M.; Prats-Iraola, P.; Hajnsek, I.; Jagdhuber, T.; Papathanassiou, K.P.; Nannini, M.; Aguilera, E.; Baumgartner, S.; et al. Very-high-resolution airborne synthetic aperture radar imaging: Signal processing and applications. Proc. IEEE 2013, 101, 759–783.
- Dodgson, N.A. Quadratic interpolation for image resampling. IEEE Trans. Image Process. 1997, 6, 1322–1326.
- Wang, X.; Li, G.; Zhang, X.P.; He, Y. Ship Detection in SAR Images via Local Contrast of Fisher Vectors. IEEE Trans. Geosci. Remote Sens. 2020, 58, 6467–6479.
- Yu, W.; Wang, Y.; Liu, H.; He, J. Superpixel-based CFAR target detection for high-resolution SAR images. IEEE Geosci. Remote Sens. Lett. 2016, 13, 730–734.
| Experimental Scene | Figure 3a–c | Figure 3d–i |
|---|---|---|
| Platform | Gaofen-3 (spaceborne platform) | UAV (airborne platform) |
| Frequency band | C | Ku |
| Center frequency | GHz | GHz |
| Polarization | HH | HH, VV |
| Incidence angle | – | |
| Center look angle | ∼ | |
| Imaging mode | Spotlight | Strip map |
| Resolution | | |
| Center imaging time | 27 September 2019 06:01:03 | 27 September 2019 06:02:56 |
| Observed area | Shanghai port | Shanghai port |
| Image size | Scene 1 (Figure 3a): ; Scene 2 (Figure 3b): ; Scene 3 (Figure 3c): | Scene 1 (Figure 3d,g): ; Scene 2 (Figure 3e,h): ; Scene 3 (Figure 3f,i): |
| Experimental Scene 1 | TCR [dB] | TIF [dB] |
|---|---|---|
| Spaceborne image only | − | |
| Airborne image only | − | |
| Additive fusion [10] | | |
| Multiplicative fusion [10] | | |
| PCA fusion [12] | | |
| DWT fusion [13] | | |
| Fuzzy fusion [16] | | |
| AANN fusion [18] | | |
| Proposed ITSPM | 7.51 | |
| Proposed TPPIE | 13.73 | |
| Experimental Scene 2 | TCR [dB] | TIF [dB] |
|---|---|---|
| Spaceborne image only | − | |
| Airborne image only | − | |
| Additive fusion [10] | | |
| Multiplicative fusion [10] | | |
| PCA fusion [12] | | |
| DWT fusion [13] | | |
| Fuzzy fusion [16] | | |
| AANN fusion [18] | | |
| Proposed ITSPM | 7.95 | |
| Proposed TPPIE | 12.02 | |
| Experimental Scene 3 | TCR [dB] | TIF [dB] |
|---|---|---|
| Spaceborne image only | − | |
| Airborne image only | − | |
| Additive fusion [10] | | |
| Multiplicative fusion [10] | | |
| PCA fusion [12] | | |
| DWT fusion [13] | | |
| Fuzzy fusion [16] | | |
| AANN fusion [18] | | |
| Proposed ITSPM | 6.25 | |
| Proposed TPPIE | 9.45 | |
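The TCR values in the tables above measure target-to-clutter contrast in dB. The paper's exact TCR/TIF definitions are not reproduced in this excerpt, so the sketch below uses one standard formulation (mean target power over mean clutter power) on a hypothetical toy image:

```python
import numpy as np

def tcr_db(img, target_mask):
    """Target-to-clutter ratio in dB (one standard definition):
    mean power over target pixels divided by mean power over clutter pixels.
    """
    target_power = np.mean(img[target_mask] ** 2)
    clutter_power = np.mean(img[~target_mask] ** 2)
    return 10.0 * np.log10(target_power / clutter_power)

# Toy 4x4 image: a 2x2 target patch of amplitude 10 over unit-amplitude clutter.
img = np.ones((4, 4))
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True
img[mask] = 10.0

tcr = tcr_db(img, mask)  # 10 * log10(100 / 1) = 20 dB
```

A higher TCR after fusion means the fused image separates vessels from sea clutter more cleanly, which is what the ITSPM and TPPIE rows report.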
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Zhu, D.; Wang, X.; Cheng, Y.; Li, G. Vessel Target Detection in Spaceborne–Airborne Collaborative SAR Images via Proposal and Polarization Fusion. Remote Sens. 2021, 13, 3957. https://doi.org/10.3390/rs13193957