Search Results (652)

Search Parameters:
Keywords = sea surface imaging

20 pages, 15268 KiB  
Article
Automatic Reading and Reporting Weather Information from Surface Fax Charts for Ships Sailing in Actual Northern Pacific and Atlantic Oceans
by Jun Jian, Yingxiang Zhang, Ke Xu and Peter J. Webster
J. Mar. Sci. Eng. 2024, 12(11), 2096; https://doi.org/10.3390/jmse12112096 - 19 Nov 2024
Viewed by 296
Abstract
This study aims to improve the intelligence, efficiency, and accuracy of ship safety and security systems by contributing to the development of marine weather forecasting. Accurate and prompt recognition of weather fax charts is essential for navigation safety. The study employed several artificial intelligence (AI) methods, including a vectorization approach and a target recognition algorithm, to automatically detect severe weather information in Japanese and US weather charts. This enabled an existing auto-response marine forecasting system to be extended to the North Pacific and Atlantic Oceans, enhancing decision-making capabilities and response measures for ships at sea. OpenCV image processing and the YOLOv5s/YOLOv8n algorithms were used to perform template matching and to locate warning symbols and weather reports on surface weather charts. After these improvements, the average accuracy of the model increased from 0.920 to 0.928, and the detection time for a single image was as low as 1.2 ms. Additionally, OCR was applied to extract text from weather reports and to highlight the marine areas where dense fog and strong winds are likely to occur. Finally, field tests confirmed that this automatic, intelligent system could assist navigators within 2–3 min and thus greatly enhance navigation safety in specific areas along sailing routes with minor text-based communication costs.
(This article belongs to the Special Issue Ship Performance in Actual Seas)
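The template-matching step the abstract describes can be illustrated with a plain normalized cross-correlation search. This is a minimal sketch, not the authors' implementation (the paper uses OpenCV and YOLO models); the toy chart, symbol, and `match_template` helper below are hypothetical.

```python
import numpy as np

def match_template(image: np.ndarray, template: np.ndarray) -> tuple:
    """Return (row, col) of the best normalized cross-correlation match."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Toy "chart" with a warning-symbol template embedded at row 5, column 7
chart = np.zeros((20, 20))
symbol = np.arange(9.0).reshape(3, 3)
chart[5:8, 7:10] = symbol
print(match_template(chart, symbol))  # → (5, 7)
```

In practice one would use `cv2.matchTemplate` for speed; the brute-force loop above only makes the scoring explicit.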
Figure 1
<p>(<b>a</b>) Weather report and warning symbols in a JMA surface weather fax chart retrieved from imocwx.com (accessed on 14 February 2022). (<b>b</b>) Warning symbols and wind barbs in a 48 h surface forecast chart issued by the US National Weather Service at 1758 UTC on 13 February 2022.</p>
Figure 2
<p>JMA (<b>a</b>) original surface fax weather chart, (<b>b</b>) averaged base chart, (<b>c</b>) after binarization, (<b>d</b>) difference between the original (<b>a</b>) and base chart (<b>c</b>), resulting in a pure weather map.</p>
Figure 3
<p>Flow chart of the auto-warning system for JMA charts.</p>
Figure 4
<p>YOLOv5s-CBAM(SE) network structure diagram; the parts related to CBAM(SE) are marked in gray.</p>
Figure 5
<p>YOLOv8n model structure diagram.</p>
Figure 6
<p>Comparison of weather briefing text recognition results.</p>
Figure 7
<p>Recognition of warning symbols “hPa”, “GW”, “SW”, and “FOG[W]” from the chart in <a href="#jmse-12-02096-f002" class="html-fig">Figure 2</a>d.</p>
Figure 8
<p>Comparison of detection results of wind barbs (interception).</p>
Figure 9
<p>Training process visualization: (<b>a</b>) train-loss and (<b>b</b>) mAP values for the original and improved YOLOv5s.</p>
Figure 10
<p>(<b>a</b>) JMA charts with warning symbols detected; (<b>b</b>) JMA charts with warning areas colored, red and yellow for wind speeds greater than 50 kts and 35–49 kts, green for visibility &lt; 0.3 nm; (<b>c</b>) US charts with wind levels colored.</p>
Figure 11
<p>Field tests of the auto-warning system: (<b>upper</b> and <b>middle</b>) US case and (<b>bottom</b>) JMA case.</p>
18 pages, 10136 KiB  
Article
The Combination Application of FY-4 Satellite Products on Typhoon Saola Forecast on the Sea
by Chun Yang, Bingying Shi and Jinzhong Min
Remote Sens. 2024, 16(21), 4105; https://doi.org/10.3390/rs16214105 - 2 Nov 2024
Viewed by 620
Abstract
Satellite data play an irreplaceable role in global observation systems, and the effective, comprehensive application of satellite products will inevitably improve numerical weather prediction. FengYun-4 (FY-4) series satellites provide not only radiance data but also retrieval data with high temporal and spatial resolutions. To evaluate the potential benefits of the combined application of FY-4 Advanced Geostationary Radiance Imager (AGRI) products for Typhoon Saola analysis and forecasting, two groups of experiments were set up with the Weather Research and Forecasting (WRF) model. Compared with the benchmark experiment, whose sea surface temperature (SST) comes from National Centers for Environmental Prediction (NCEP) reanalysis data, the SST replacement experiments with the FY-4A/B SST products significantly improve the track and precipitation forecasts, especially with the FY-4B SST product. Based on these results, AGRI clear-sky and all-sky assimilations with FY-4B SST were implemented with a self-constructed AGRI assimilation module. The results show that the AGRI all-sky assimilation experiment obtains better analyses and forecasts, and they demonstrate that the combined application of AGRI radiance and SST products is beneficial for typhoon prediction.
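The track errors reported in this kind of verification are typically the great-circle distance between the forecast and best-track center positions. A minimal sketch of that metric (the function name and sample coordinates are illustrative, not from the paper):

```python
import math

def track_error_km(lat_f, lon_f, lat_b, lon_b):
    """Haversine great-circle distance (km) between forecast and best-track centers."""
    r_earth = 6371.0  # mean Earth radius, km
    phi_f, phi_b = math.radians(lat_f), math.radians(lat_b)
    dphi = math.radians(lat_b - lat_f)
    dlam = math.radians(lon_b - lon_f)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi_f) * math.cos(phi_b) * math.sin(dlam / 2) ** 2
    return 2 * r_earth * math.asin(math.sqrt(a))

# One degree of longitude at 20°N spans roughly 104–105 km
print(round(track_error_km(20.0, 130.0, 20.0, 131.0), 1))  # → 104.5
```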
Figure 1
<p>The weighting functions of channels 9–14 of FY-4A AGRI with RTTOV and the U.S. standard atmospheric profile.</p>
Figure 2
<p>(<b>a</b>) The evolution of the best track, and (<b>b</b>) the central sea level pressure (units: hPa) and maximum wind (units: knot) for Typhoon Saola from 0000 UTC 22 August to 1200 UTC 3 September 2023.</p>
Figure 3
<p>Initial SST (units: K) from (<b>a</b>) <span class="html-italic">CON</span>, (<b>b</b>) SSTA, and (<b>c</b>) SSTB.</p>
Figure 4
<p>The predicted (<b>a</b>) track, (<b>b</b>) track errors (units: km), (<b>c</b>) CSLP errors (units: hPa), and (<b>d</b>) MW errors (units: knot) in <span class="html-italic">CON</span> (light blue lines), SSTA (red lines), and SSTB (light green lines) are compared to the JMA best track estimates (blue lines) from 0600 UTC 30 August to 1200 UTC 2 September 2023.</p>
Figure 5
<p>Time series of the U and V components of average steering flow (units: m/s) from 0600 UTC 30 August to 1200 UTC 2 September 2023.</p>
Figure 6
<p>The 24 h accumulated precipitation (units: mm) from 1200 UTC 1 September to 1200 UTC 2 September 2023 of (<b>a</b>) the Micaps observation; (<b>b</b>) the interpolated Micaps observation with a horizontal resolution of 0.5° × 0.5°; (<b>c</b>) <span class="html-italic">CON</span>; (<b>d</b>) SSTA; and (<b>e</b>) SSTB. The dots with different colors in (<b>a</b>) represent different accumulated precipitation, as shown in the color bar.</p>
Figure 7
<p>Performance diagram for the 24 h accumulated precipitation for the <span class="html-italic">CON</span> (light blue), SSTA (red), and SSTB (light green) with a threshold of (<b>a</b>) 0.01 mm; (<b>b</b>) 10 mm; (<b>c</b>) 25 mm; (<b>d</b>) 50 mm; and (<b>e</b>) 75 mm from 1200 UTC 1 September to 1200 UTC 2 September 2023.</p>
Figure 8
<p>(<b>a</b>,<b>b</b>) The AGRI observed brightness temperature (units: K) distributions at channel 9 after QC in (<b>a</b>) CLR and (<b>b</b>) ALL valid at 1500 UTC 30 August 2023. (<b>c</b>) The counts of assimilated AGRI observations at channel 9 in ALL and CLR with different cloud mask types every 3 h from 0900 UTC 30 August to 1500 UTC 30 August 2023.</p>
Figure 9
<p>The IPs (units: %) over <span class="html-italic">CON</span> of individual experiments every 3 h from 0900 UTC 30 August to 1500 UTC 30 August 2023 in (<b>a</b>) <span class="html-italic">CTTs</span> and (<b>b</b>) agreements on sky conditions.</p>
Figure 10
<p>The predicted (<b>a</b>) track, (<b>b</b>) track errors (units: km), (<b>c</b>) CSLP errors (units: hPa), and (<b>d</b>) MW errors (units: knot) in <span class="html-italic">CON</span> (light blue lines), SSTB (light green lines), CLR (light yellow lines), ALL (orange lines), CLR + SSTB (light red lines), and ALL + SSTB (brown lines) are compared to the JMA best track estimates (blue lines) from 1800 UTC 30 August to 1200 UTC 2 September 2023.</p>
Figure 11
<p>The (<b>a</b>) U and (<b>b</b>) V components of steering flows (units: m/s) from 700 to 200 hPa with an interval of 50 hPa in individual experiments at 0600 UTC 1 September 2023.</p>
Figure 12
<p>The same as <a href="#remotesensing-16-04105-f007" class="html-fig">Figure 7</a> but for (<b>a</b>) the interpolated Micaps observation with a horizontal resolution of 0.5° × 0.5°; (<b>b</b>) <span class="html-italic">CON</span>; (<b>c</b>) CLR; (<b>d</b>) ALL; (<b>e</b>) SSTB; (<b>f</b>) CLR + SSTB; and (<b>g</b>) ALL + SSTB.</p>
Figure 13
<p>The same as <a href="#remotesensing-16-04105-f006" class="html-fig">Figure 6</a> but for <span class="html-italic">CON</span> (light blue), SSTB (light green), CLR (light yellow), ALL (orange), CLR + SSTB (light red), and ALL + SSTB (brown) with a threshold of (<b>a</b>) 0.01 mm; (<b>b</b>) 10 mm; (<b>c</b>) 25 mm; (<b>d</b>) 50 mm; and (<b>e</b>) 75 mm.</p>
20 pages, 6032 KiB  
Article
Feasibility Analysis for Predicting Indian Ocean Bigeye Tuna (Thunnus obesus) Fishing Grounds Based on Temporal Characteristics of FY-3 Microwave Radiation Imager Data
by Yun Zhang, Jinglan Ye, Shuhu Yang, Yanling Han, Zhonghua Hong and Wanting Meng
J. Mar. Sci. Eng. 2024, 12(11), 1917; https://doi.org/10.3390/jmse12111917 - 27 Oct 2024
Viewed by 476
Abstract
Efficient and accurate fishery forecasting is of great significance for the efficiency of fishing operations. This paper proposes a fishery forecasting method using a brightness temperature (TB) time series spatial feature extraction and fusion model. Using Indian Ocean bigeye tuna fishery data from 2009 to 2021 as a reference, the paper discusses the feasibility of fishery forecasting with FY-3 Microwave Radiation Imager (MWRI) Level 1 TB data. We designed a deep learning network for radiometer TB time series feature extraction (TimeTB-FishNet) based on a Convolutional Neural Network (CNN), a Gated Recurrent Unit (GRU), and an Attention mechanism. After expanding the dimensions of the TB features, the model uses them together with spatiotemporal factors (year, month, longitude, and latitude) as inputs. Adding the GRU and Attention modules to the CNN establishes a CNN-GRU-Attention architecture that can extract deep time series spatial features from the data. In the model validation experiments, the TimeTB-FishNet model performed best, with a coefficient of determination (R2) of 0.6643. In the generalization experiments, the R2 reached 0.6261, with a root mean square error (RMSE) of 46.6031 kg/1000 hook. When sea surface height (SSH) was introduced, the R2 further reached 0.6463, with a lower RMSE of 45.1318 kg/1000 hook. These results show that the proposed method and model are feasible and effective: the model can directly use enhanced radiometer TB data, without relying on lagging ocean environmental product data, to perform deep temporal and spatial feature extraction for fishery forecasting. The method can serve as a reference for bigeye tuna fishing in the Indian Ocean.
(This article belongs to the Section Marine Aquaculture)
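The R2 and RMSE scores quoted in the abstract follow the usual definitions and can be computed as below. This is a generic metrics sketch; the arrays are made-up examples, not the paper's data.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error, in the units of the target (e.g., kg/1000 hook)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

print(rmse([1, 2, 3, 4], [1, 2, 3, 5]))  # → 0.5
print(r2([1, 2, 3, 4], [1, 2, 3, 5]))    # → 0.8
```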
Figure 1
<p>Heat maps of the bigeye tuna CPUE (kg/1000 hook) and 18V brightness temperature (TB) values for January 2009. (<b>a</b>) CPUE (kg/1000 hook) heat map; (<b>b</b>) 18V TB (K) heat map.</p>
Figure 2
<p>Bigeye tuna CPUE (kg/1000 hook) grid expansion before-and-after comparison for January 2009: (<b>a</b>) before grid expansion (gray indicates a missing CPUE value) and (<b>b</b>) after grid expansion.</p>
Figure 3
<p>Time series data construction.</p>
Figure 4
<p>Prediction model frame diagram.</p>
Figure 5
<p>TB temporal–spatial feature extraction and fusion model structure.</p>
Figure 6
<p>Distribution of the TimeFS_Fish dataset. (<b>a</b>) Distribution of the TimeFS_Fish training set data. (<b>b</b>) Distribution of the TimeFS_Fish validation set data. (<b>c</b>) Distribution of the TimeFS_Fish test set data.</p>
Figure 7
<p>RF results (<b>left</b>) and CNN results (<b>right</b>) in 2019.</p>
Figure 8
<p>CNN-GRU results (<b>left</b>) and TimeTB-FishNet results (<b>right</b>) in 2019.</p>
Figure 9
<p>Scatter plot of TimeTB-FishNet predicted results and actual CPUE in 2020–2021.</p>
Figure 10
<p>Structure of the model with SSH introduced.</p>
Figure 11
<p>TimeFH_Fish results in 2020–2021.</p>
18 pages, 7440 KiB  
Article
A Novel Method for the Estimation of Sea Surface Wind Speed from SAR Imagery
by Zahra Jafari, Pradeep Bobby, Ebrahim Karami and Rocky Taylor
J. Mar. Sci. Eng. 2024, 12(10), 1881; https://doi.org/10.3390/jmse12101881 - 20 Oct 2024
Viewed by 718
Abstract
Wind is an important environmental factor influencing marine target detection: it is the source of sea clutter and also affects target motion and drift. Accurate estimation of wind speed is therefore crucial for developing an efficient machine learning (ML) model for target detection; for example, high wind speeds make it more likely that clutter is mistakenly detected as a marine target. This paper presents a novel approach for estimating sea surface wind speed (SSWS) and direction from satellite imagery using innovative ML algorithms. Unlike existing methods, the proposed technique does not require wind direction information or normalized radar cross-section (NRCS) values and can therefore be applied to a wide range of satellite images when the initial calibrated data are not available. In the proposed method, we extract features from co-polarized (HH) and cross-polarized (HV) satellite images and then fuse advanced regression techniques for SSWS estimation. A comparison between the proposed model and three well-known C-band models (CMODs)—CMOD-IFR2, CMOD5N, and CMOD7—further indicates its superior performance. The proposed model achieved the lowest Root Mean Squared Error (RMSE) and Mean Absolute Error (MAE) on the RCM dataset, with values of 0.97 m/s and 0.62 m/s for calibrated images and 1.37 m/s and 0.97 m/s for uncalibrated images, respectively.
(This article belongs to the Special Issue Remote Sensing Applications in Marine Environmental Monitoring)
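The core idea of regressing wind speed from HH/HV image statistics can be sketched as follows. This is a toy illustration under invented assumptions: the feature set, the synthetic patches, and the linear least-squares fit are all placeholders for the paper's much richer features and regression models.

```python
import numpy as np

def patch_features(hh, hv):
    """Simple intensity statistics from co-polarized (HH) and cross-polarized (HV)
    patches; an illustrative feature set, far smaller than the paper's."""
    eps = 1e-12
    return np.array([hh.mean(), hh.std(), hv.mean(), hv.std(),
                     np.log10((hh.mean() + eps) / (hv.mean() + eps))])

rng = np.random.default_rng(0)
winds = rng.uniform(2.0, 20.0, size=200)          # made-up wind speeds (m/s)
# Synthetic patches whose backscatter intensity grows with wind speed
X = np.stack([patch_features(w * 0.010 * (1 + 0.05 * rng.standard_normal((8, 8))),
                             w * 0.002 * (1 + 0.05 * rng.standard_normal((8, 8))))
              for w in winds])
A = np.hstack([X, np.ones((len(X), 1))])          # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, winds, rcond=None)  # ordinary least-squares fit
rmse = float(np.sqrt(np.mean((A @ coef - winds) ** 2)))
print(rmse < 1.0)  # → True: the fit recovers the toy wind speeds closely
```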
Figure 1
<p>Distribution of wind direction and wind speed.</p>
Figure 2
<p>NRCS vs. incidence angle for different wind speeds and directions using the CMOD5N and CMOD7 functions.</p>
Figure 3
<p>Scatter plots of real versus calculated wind speed using the (<b>a</b>) CMOD5, (<b>b</b>) CMOD-IFR, and (<b>c</b>) CMOD7 models with HH polarization.</p>
Figure 4
<p>Scatter plots of real versus calculated wind speed using the (<b>a</b>) CMOD5, (<b>b</b>) CMOD-IFR, and (<b>c</b>) CMOD7 models after compensation for polarization.</p>
Figure 5
<p>Distribution of intensities for HH and HV polarizations at high and low wind speeds.</p>
Figure 6
<p>Block diagram of the proposed system.</p>
Figure 7
<p>Effect of the despeckling filter on an RCM image.</p>
Figure 8
<p>Histogram of the introduced feature extracted from calibrated data, with orange representing low wind, green representing mid wind, and purple representing high wind.</p>
Figure 9
<p>Histogram of the introduced feature extracted from uncalibrated data, with orange representing low wind, green representing mid wind, and purple representing high wind.</p>
Figure 10
<p>Comparisons of retrieved SSWS using concatenated models with different features from the calibrated RCM dataset.</p>
Figure 11
<p>Comparisons of retrieved SSWS using concatenated models with different features from the uncalibrated RCM dataset.</p>
Figure 12
<p>The closest region where both RCM data and buoy station data are available.</p>
Figure 13
<p>ERA5 vs. buoy wind speeds for the south of Greenland across all seasons in 2023.</p>
Figure 14
<p>Testing the proposed model in the south of Greenland using buoy wind speed data.</p>
15 pages, 13272 KiB  
Article
Polarization-Enhanced Underwater Laser Range-Gated Imaging for Subaquatic Applications
by Shuaibao Chen, Peng Liu, Wei He, Dong Luo, Yuguang Tan, Liangpei Chen, Jue Wang, Qi Zhao, Guohua Jiao and Wei Chen
Sensors 2024, 24(20), 6681; https://doi.org/10.3390/s24206681 - 17 Oct 2024
Viewed by 776
Abstract
Laser range-gated underwater imaging technology, by removing most of the backscattering noise, can effectively increase image contrast and extend the detection range. The optical signal captured by a range-gated imaging system primarily comprises reflected light from the object and backscattered light from the surrounding water. Consequently, surfaces with low reflectivity or highly turbid water substantially constrain the applicability of range-gated imaging systems. To enhance the detection capability of underwater laser range-gated imaging, this paper proposes incorporating underwater polarized light imaging as an enhancement method: based on polarization differences, backscattered light and light reflected from an object can be distinguished. Experimental results indicate that, compared with images obtained using a conventional range-gated laser imaging system, those captured with the polarization-enhanced system exhibit an increase of up to 47% in the corresponding Enhancement Measure Evaluation (EME) index. The proposed approach, which integrates polarization imaging with range-gated laser imaging, has the potential to broaden the applicability of underwater laser imaging to scenarios such as deep-sea exploration and military applications.
(This article belongs to the Section Sensing and Imaging)
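The polarization-difference idea can be illustrated with the classic two-image backscatter estimate. This is a simplified Schechner-style sketch, not the paper's algorithm; it assumes the backscatter is partially polarized with a known degree of polarization `p_scat` while the object light is unpolarized, and all names below are illustrative.

```python
import numpy as np

def remove_backscatter(i_max, i_min, p_scat):
    """Estimate scene radiance from best/worst-polarizer images, assuming
    backscatter with degree of polarization p_scat and unpolarized object light."""
    total = i_max + i_min                   # total captured intensity
    backscatter = (i_max - i_min) / p_scat  # polarized difference scaled by DoP
    return np.clip(total - backscatter, 0.0, None)

# Synthetic check: object signal L plus backscatter B with DoP 0.6
L = np.array([[0.2, 0.8], [0.5, 0.1]])
B, p = 0.4, 0.6
i_max = L / 2 + B * (1 + p) / 2
i_min = L / 2 + B * (1 - p) / 2
print(np.allclose(remove_backscatter(i_max, i_min, p), L))  # → True
```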
Figure 1
<p>Program workflow of the polarization enhancement algorithm.</p>
Figure 2
<p>Polarization camera used to validate the polarization enhancement algorithm.</p>
Figure 3
<p>Original image and algorithm-enhanced images. (<b>a</b>) Normal polarized image at <math display="inline"><semantics> <msup> <mn>0</mn> <mo>°</mo> </msup> </semantics></math>. (<b>b</b>) Normal polarized image at <math display="inline"><semantics> <msup> <mn>90</mn> <mo>°</mo> </msup> </semantics></math>. (<b>c</b>) Original image. (<b>d</b>) Reconstructed image by Schechner’s method. (<b>e</b>) Reconstructed image by CLAHE. (<b>f</b>) Reconstructed image by polynomial fitting. (<b>g</b>) Reconstructed image by Butterworth filter. (<b>h</b>) Reconstructed image by the method in this paper.</p>
Figure 4
<p>Scene of the image taken by a polarization camera.</p>
Figure 5
<p>Range-gated system.</p>
Figure 6
<p>Experiment imaging target.</p>
Figure 7
<p>Examination of the preservation of the polarization state of the image by adjusting the angle of the polarizer.</p>
Figure 8
<p>Experimental setup for polarization-enhanced laser range-gated imaging.</p>
Figure 9
<p>The coin used in the experiment.</p>
Figure 10
<p>Original images of the three materials, along with images obtained and enhanced using the polarization algorithm. (<b>a</b>) Range-gated image of target board. (<b>b</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>a</mi> <mi>x</mi> </mrow> </msub> </semantics></math>. (<b>c</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>i</mi> <mi>n</mi> </mrow> </msub> </semantics></math>. (<b>d</b>) Polarization-enhanced range-gated images of target board. (<b>e</b>) Range-gated image of metal sheet. (<b>f</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>a</mi> <mi>x</mi> </mrow> </msub> </semantics></math>. (<b>g</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>i</mi> <mi>n</mi> </mrow> </msub> </semantics></math>. (<b>h</b>) Polarization-enhanced range-gated images of metal sheet. (<b>i</b>) Range-gated image of diving suit. (<b>j</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>a</mi> <mi>x</mi> </mrow> </msub> </semantics></math>. (<b>k</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>i</mi> <mi>n</mi> </mrow> </msub> </semantics></math>. (<b>l</b>) Polarization-enhanced range-gated images of diving suit.</p>
Figure 11
<p>Original images of the target at different turbidities, along with images obtained and enhanced using the polarization algorithm. (<b>a</b>) Range-gated image. (<b>b</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>a</mi> <mi>x</mi> </mrow> </msub> </semantics></math>. (<b>c</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>i</mi> <mi>n</mi> </mrow> </msub> </semantics></math>. (<b>d</b>) Polarization-enhanced range-gated images of the target board under turbidity of 1.70 NTU. (<b>e</b>) Range-gated image. (<b>f</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>a</mi> <mi>x</mi> </mrow> </msub> </semantics></math>. (<b>g</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>i</mi> <mi>n</mi> </mrow> </msub> </semantics></math>. (<b>h</b>) Polarization-enhanced range-gated images of the target board under turbidity of 3.89 NTU. (<b>i</b>) Range-gated image. (<b>j</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>a</mi> <mi>x</mi> </mrow> </msub> </semantics></math>. (<b>k</b>) <math display="inline"><semantics> <msub> <mi>I</mi> <mrow> <mi>m</mi> <mi>i</mi> <mi>n</mi> </mrow> </msub> </semantics></math>. (<b>l</b>) Polarization-enhanced range-gated images of the target board under turbidity of 5.41 NTU.</p>
11 pages, 1631 KiB  
Article
A Balloon Mapping Approach to Forecast Increases in PM10 from the Shrinking Shoreline of the Salton Sea
by Ryan G. Sinclair, Josileide Gaio, Sahara D. Huazano, Seth A. Wiafe and William C. Porter
Geographies 2024, 4(4), 630-640; https://doi.org/10.3390/geographies4040034 - 17 Oct 2024
Viewed by 1338
Abstract
Shrinking shorelines and the exposed playa of saline lakes can pose public health and air quality risks for local communities. This study combines a community science method with models to forecast future shorelines and PM10 air quality impacts from the exposed playa of the Salton Sea, near the community of North Shore, CA, USA. The community science process assesses the rate of shoreline change from aerial images collected with a balloon mapping method. These images, captured from 2019 to 2021, are combined with additional satellite images of the shoreline dating back to 2002 and analyzed with the DSAS (Digital Shoreline Analysis System) in ArcGIS Desktop. The observed rate of change increased greatly during the period from 2017 to 2020: the average rate rose from 12.53 m/year between 2002 and 2017 to 38.44 m/year between 2017 and 2020. The shoreline is projected to retreat 150 m from its current position by 2030 and an additional 172 m by 2041. To assess potential air quality impacts, we use WRF-Chem, a regional chemical transport model, to predict increases in emissive dust from the newly exposed playa surface. The model output indicates that the forecasted 20-year increase in exposed playa will also increase the amount of suspended dust, which can then be transported into the surrounding communities. Together, these model projections suggest that, without mitigation, the expanding exposed playa around the Salton Sea will worsen pollutant exposure in local communities.
(This article belongs to the Special Issue Feature Papers of Geographies in 2024)
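The rate-of-change statistic at the heart of a DSAS-style analysis reduces to fitting shoreline position against time along each transect and extrapolating. A minimal sketch; the yearly positions below are made up for illustration and are not the study's transect data.

```python
import numpy as np

years = np.array([2017.0, 2018.0, 2019.0, 2020.0])
positions = np.array([0.0, 38.0, 77.0, 115.0])   # hypothetical landward retreat (m)

slope, intercept = np.polyfit(years, positions, 1)          # rate of change, m/year
additional_retreat_2030 = slope * 2030.0 + intercept - positions[-1]
print(round(slope, 2))                    # → 38.4
print(round(additional_retreat_2030, 1))  # → 384.1
```

The study's reported rates (12.53 m/year for 2002–2017, 38.44 m/year for 2017–2020) come from this kind of regression applied transect by transect.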
Figure 1
<p>Map of the North Shore area of the Salton Sea, CA, with coastline segments (transects) used during this study in two different regions (North and South Yacht Club).</p>
Figure 2
<p>A balloon mapping rig flying above the North Shore of the Salton Sea, shown with a picavet holding a GoPro7 and suspended by three Mylar sleeping-bag balloons.</p>
Figure 3
<p>An output from the DSAS analysis in ArcGIS showing an area of the North Shore of the Salton Sea with historical shoreline positions, which enabled the calculation of shoreline change statistics. The final shoreline data used for the DSAS were from 2021; the 2020 line is shown here for reference. The DSAS was used to project future shoreline positions, with uncertainty bands, for the 2031 and 2041 forecasts.</p>
Figure 4
<p>Boxplots of the projected increase in PM10 concentrations in 2041 from a WRF-Chem model that uses the increase in land area of a 2-square-kilometer area as calculated from the DSAS model.</p>
22 pages, 29294 KiB  
Article
Ghost Removal from Forward-Scan Sonar Views near the Sea Surface for Image Enhancement and 3-D Object Modeling
by Yuhan Liu and Shahriar Negahdaripour
Remote Sens. 2024, 16(20), 3814; https://doi.org/10.3390/rs16203814 - 14 Oct 2024
Viewed by 724
Abstract
Underwater sonar is the primary remote sensing and imaging modality in turbid environments with poor visibility. Two-dimensional (2-D) images of a target near the air–sea interface (or resting on a hard seabed), acquired by forward-scan sonar (FSS), are generally corrupted by ghost and sometimes mirror components, formed by the multipath propagation of the transmitted acoustic beams. When processing 2-D FSS views to generate an accurate three-dimensional (3-D) object model, the corrupted regions have to be discarded. The sonar tilt angle and the distance from the sea surface are two important parameters for accurately localizing the ghost and mirror components. We propose a unified optimization technique that improves both the measurements of these two parameters from inexpensive sensors and the accuracy of a 3-D object model built from 2-D FSS images at known poses. The solution is obtained by recursively updating the sonar parameters and the 3-D object model. Using the 3-D object model, we can enhance the original images and generate synthetic views for arbitrary sonar poses. We demonstrate the performance of our method in experiments with synthetic and real images of three targets: two dominantly convex coral rocks and a highly concave toy wood table.
(This article belongs to the Topic Computer Vision and Image Processing, 2nd Edition)
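The mirror component's geometry, a virtual object reflected across the air–sea interface, reduces to reflecting a 3-D point in the water-surface plane. A minimal sketch; the coordinate convention (z axis up, flat surface at height `surface_z`) is an assumption for illustration, not the paper's full multipath model.

```python
def mirror_point(point, surface_z):
    """Reflect a 3-D point (x, y, z) across the horizontal plane z = surface_z,
    giving the location of its virtual mirror image."""
    x, y, z = point
    return (x, y, 2.0 * surface_z - z)

# An object point 1.5 m below a surface at z = 0 mirrors to 1.5 m above it
print(mirror_point((2.0, 3.0, -1.5), 0.0))  # → (2.0, 3.0, 1.5)
```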
Graphical abstract
Figure 1
<p>Ghost component overlaps with and is indistinguishable from the object region in every view. The mirror component at the reference position (elevation axis pointing upward) overlaps with both the object and ghost regions (<b>a</b>). As the sonar rotates about the viewing direction (from 0° to 67.5° in increments of 22.5°, here), it separates from the object (<b>b</b>) and forms a distinct blob (<b>c</b>,<b>d</b>).</p>
Figure 2
<p>(<b>a</b>) For a sonar beam in <math display="inline"><semantics> <mi>θ</mi> </semantics></math> direction, image intensity <math display="inline"><semantics> <mrow> <mspace width="-0.166667em"/> <mi>I</mi> <mspace width="-0.166667em"/> </mrow> </semantics></math> of pixel <math display="inline"><semantics> <mrow> <mo>(</mo> <mi>x</mi> <mo>,</mo> <mi>y</mi> <mo>)</mo> </mrow> </semantics></math> depends on cumulative echoes from an unknown number of surface patches within volume <math display="inline"><semantics> <mrow> <mspace width="-0.166667em"/> <msub> <mi>V</mi> <mrow> <mspace width="-0.166667em"/> <mi>ϕ</mi> </mrow> </msub> <mspace width="-0.166667em"/> </mrow> </semantics></math> arriving at the sonar receiver simultaneously; <math display="inline"><semantics> <mrow> <mspace width="-0.166667em"/> <msub> <mi>V</mi> <mi>ϕ</mi> </msub> <mspace width="-0.166667em"/> </mrow> </semantics></math> covers elevation-angle interval <math display="inline"><semantics> <mrow> <mspace width="-0.166667em"/> <mo>[</mo> <mo>−</mo> <msub> <mi>W</mi> <mi>ϕ</mi> </msub> <mo>,</mo> <msub> <mi>W</mi> <mi>ϕ</mi> </msub> <mo>]</mo> </mrow> </semantics></math>, range interval [<span class="html-italic">ℜ</span>, <math display="inline"><semantics> <mrow> <mo>ℜ</mo> <mspace width="-0.166667em"/> <mo>+</mo> <mrow> <mi>δ</mi> <mo>ℜ</mo> </mrow> </mrow> </semantics></math>] along the beam covering azimuthal-angle interval [<math display="inline"><semantics> <mrow> <mi>θ</mi> <mo>,</mo> <mi>θ</mi> <mspace width="-0.166667em"/> <mo>+</mo> <mspace width="-0.166667em"/> <mrow> <mi>δ</mi> <mi>θ</mi> </mrow> </mrow> </semantics></math>]. (<b>b</b>) A coral rock with the voxelated volume and triangular surface mesh of the SC solution. (<b>c</b>) Virtual mirror object geometry: transmitted sound waves in direction <math display="inline"><semantics> <msub> <mi mathvariant="bold-italic">R</mi> <mn>1</mn> </msub> </semantics></math> are scattered by the surface at <math display="inline"><semantics> <msub> <mi>P</mi> <mi>s</mi> </msub> </semantics></math>. The reflected portion along “unique direction” <math display="inline"><semantics> <msub> <mi mathvariant="bold-italic">R</mi> <mn>2</mn> </msub> </semantics></math> towards <math display="inline"><semantics> <msub> <mi>P</mi> <mi>W</mi> </msub> </semantics></math> on the water surface (with surface normal <math display="inline"><semantics> <mi mathvariant="bold-italic">n</mi> </semantics></math>) is specularly reflected towards the sonar along <math display="inline"><semantics> <msub> <mi mathvariant="bold-italic">R</mi> <mn>3</mn> </msub> </semantics></math>, leading to the appearance of a virtual mirror object point at <math display="inline"><semantics> <msub> <mi>P</mi> <mi>m</mi> </msub> </semantics></math>. (<b>d</b>) Virtual ghost object geometry: considering the reverse direction of the mirror-point pathway, sound waves traveling along <math display="inline"><semantics> <mrow> <mo>−</mo> <msub> <mi mathvariant="bold-italic">R</mi> <mn>3</mn> </msub> </mrow> </semantics></math> are specularly reflected towards the object along <math display="inline"><semantics> <mrow> <mo>−</mo> <msub> <mi mathvariant="bold-italic">R</mi> <mn>2</mn> </msub> </mrow> </semantics></math>, and are scattered at <math display="inline"><semantics> <msub> <mi>P</mi> <mi>s</mi> </msub> </semantics></math>, of which the components along <math display="inline"><semantics> <mrow> <mo>−</mo> <msub> <mi mathvariant="bold-italic">R</mi> <mn>1</mn> </msub> </mrow> </semantics></math> are captured by the sonar. This leads to the appearance of ghost point <math display="inline"><semantics> <msub> <mi>P</mi> <mi>g</mi> </msub> </semantics></math> along the sonar beam directed at <math display="inline"><semantics> <msub> <mi>P</mi> <mi>s</mi> </msub> </semantics></math> (at a longer range <math display="inline"><semantics> <msub> <mi mathvariant="bold-italic">R</mi> <mi>g</mi> </msub> </semantics></math>).</p>
Full article ">Figure 3
<p>(<b>a</b>) Block diagram of entire algorithm; (<b>b</b>) steps in 3-D shape optimization by displacement of model vertices, computed from 3-D vertex motions that are estimated from the 2-D image motions aligning the object regions in the data and synthetic views.</p>
Full article ">Figure 4
<p>(<b>a</b>) The 2-D vectors <math display="inline"><semantics> <mrow> <mo>{</mo> <msubsup> <mrow> <mi mathvariant="bold">v</mi> </mrow> <mrow> <mi>m</mi> <mi>i</mi> </mrow> <mi>O</mi> </msubsup> <mo>,</mo> <msubsup> <mrow> <mi mathvariant="bold">v</mi> </mrow> <mrow> <mi>m</mi> <mi>j</mi> </mrow> <mi>M</mi> </msubsup> <mo>}</mo> </mrow> </semantics></math> align the frontal contours <math display="inline"><semantics> <mrow> <mo>{</mo> <msubsup> <mi mathvariant="script">C</mi> <mi>m</mi> <mi>O</mi> </msubsup> <mo>,</mo> <msubsup> <mi mathvariant="script">C</mi> <mi>m</mi> <mi>M</mi> </msubsup> <mo>}</mo> </mrow> </semantics></math> of the object and mirror regions in the real images with counterparts <math display="inline"><semantics> <mrow> <mo>{</mo> <msubsup> <mover accent="true"> <mi mathvariant="script">C</mi> <mo>˜</mo> </mover> <mi>m</mi> <mi>O</mi> </msubsup> <mo>,</mo> <msubsup> <mover accent="true"> <mi mathvariant="script">C</mi> <mo>˜</mo> </mover> <mi>m</mi> <mi>M</mi> </msubsup> <mo>}</mo> </mrow> </semantics></math> in the synthetic views; (<b>b</b>) magnified view of relevant regions.</p>
Full article ">Figure 5
<p>Processing steps in the decomposition of sonar data into object and ghost components. (<b>a</b>) Generation of synthetic object image from image formation model, and localizing the ghost and mirror components to identify regions overlapping with object image. (<b>b</b>) Segmentation of real and synthetic object regions into overlapping and non-overlapping parts, using non-overlapping region in generating the LUT for synthetic-to-real object transformation, and apply the LUT to reconstruct overlapping object region to complete the object image by fusing with non-overlapping part. (<b>c</b>) Segmentation of ghost area into overlapping and non-overlapping regions, producing the non-overlapping part. (<b>d</b>) Discounting for the object image within overlap area to generate the ghost component. (<b>e</b>) Generation of ghost image from overlapping and non-overlapping components.</p>
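The LUT-based synthetic-to-real object transformation described in (b) can be sketched as follows. This is an illustrative reconstruction only: the pixel pairing strategy, the intensity quantization, and the names (`build_lut`, `levels`) are assumptions, not the authors' implementation.

```python
import numpy as np

def build_lut(synth, real, levels=256):
    """Build a synthetic-to-real intensity LUT from corresponding pixels
    of the non-overlapping object region (simplified pairing).

    synth, real: 1-D integer intensity arrays in [0, levels).
    """
    lut = np.arange(levels, dtype=float)   # identity fallback
    for level in range(levels):
        mask = synth == level
        if mask.any():
            # Map each synthetic level to the mean observed real intensity
            lut[level] = real[mask].mean()
    return lut

# Apply the LUT to reconstruct the overlapping part of the object region
synth_overlap = np.array([10, 10, 20])
lut = build_lut(np.array([10, 10, 20, 20]), np.array([30, 34, 60, 62]))
reconstructed = lut[synth_overlap]
```

The reconstructed overlap would then be fused with the non-overlapping part to complete the object image, as in step (b).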
Full article ">Figure 6
<p>Three targets—two dominantly convex coral rocks with mild local concavities and a highly concave wood table—listed with their heights, maximum widths, and imaging conditions.</p>
Full article ">Figure 7
<p>Coral-one experiment—(<b>a</b>–<b>d</b>) synthetic and (<b>a’</b>–<b>d’</b>) real data. (<b>a</b>,<b>b</b>,<b>a’</b>,<b>b’</b>) Optimization of sonar depth and tilt parameters. (<b>c</b>,<b>c’</b>) Image <math display="inline"><semantics> <msub> <mi>E</mi> <mi>I</mi> </msub> </semantics></math> and volumetric <math display="inline"><semantics> <msub> <mi>E</mi> <mi>V</mi> </msub> </semantics></math> errors moving in tandem confirm 3-D model improvement with reduced image error. (<b>d</b>) Initialized SC solution (top) and optimized 3-D model (bottom), shown by blue surface mesh, are superimposed on Kinect model (black mesh); (<b>d’</b>) optimized SC (blue mesh) and Kinect (red mesh) models.</p>
Full article ">Figure 8
<p>Coral-two experiment—(<b>a</b>–<b>d</b>) synthetic and (<b>a’</b>–<b>d’</b>) real data. (<b>a</b>,<b>b</b>,<b>a’</b>,<b>b’</b>) Optimization of sonar depth and tilt parameters. (<b>c</b>,<b>c’</b>) Improving 3-D model leads to smaller volumetric <math display="inline"><semantics> <msub> <mi>E</mi> <mi>V</mi> </msub> </semantics></math> and image <math display="inline"><semantics> <msub> <mi>E</mi> <mi>I</mi> </msub> </semantics></math> errors. (<b>d</b>) Kinect model (black mesh) superimposed on initialized SC solution (top) and optimized 3-D model (bottom), shown by blue surface meshes. (<b>d’</b>) Optimized SC (blue mesh) and Kinect (red mesh) models.</p>
Full article ">Figure 9
<p>Wood table experiment—(<b>a</b>–<b>d</b>) synthetic and (<b>a’</b>–<b>d’</b>) real data. (<b>a</b>,<b>b</b>,<b>a’</b>,<b>b’</b>) Optimization of sonar depth and tilt parameters. (<b>c</b>,<b>c’</b>) Improving 3-D model reduces both volumetric <math display="inline"><semantics> <msub> <mi>E</mi> <mi>V</mi> </msub> </semantics></math> and image <math display="inline"><semantics> <msub> <mi>E</mi> <mi>I</mi> </msub> </semantics></math> errors. (<b>d</b>) Kinect model (black mesh) superimposed on initialized SC solution (top) and optimized 3-D model (bottom), shown by blue surface meshes. (<b>d’</b>) Optimized SC (blue mesh) and Kinect (red mesh) models.</p>
Full article ">Figure 10
<p>Coral-one experiment—(<b>a</b>) data; (<b>b</b>) data over image region only; (<b>c</b>) initial and (<b>d</b>) optimized synthetic view generated by the 3-D model.</p>
Full article ">Figure 11
<p>Coral-two experiment—(<b>a</b>) data; (<b>b</b>) data over image region only; (<b>c</b>) initial and (<b>d</b>) optimized synthetic view generated by the 3-D model.</p>
Full article ">Figure 12
<p>Wood table experiment—(<b>a</b>) data; (<b>b</b>) data within object region only; (<b>c</b>) initial and (<b>d</b>) optimized synthetic view generated by the 3-D model.</p>
Full article ">Figure 13
<p>Sets of images as in previous experiments for (<b>a1</b>–<b>d1</b>) coral-one and (<b>a2</b>–<b>d2</b>) coral-two views, in which object, ghost, and mirror components overlap (not used in the optimization). (<b>a1</b>,<b>a2</b>) data; (<b>b1</b>,<b>b2</b>) data within object region only; (<b>c1</b>,<b>c2</b>) initial and (<b>d1</b>,<b>d2</b>) optimized synthetic view generated by the 3-D model.</p>
Full article ">Figure 14
<p>Sets of wood table images as in previous figures for views in which object, ghost, and mirror components overlap (not used in optimization process). (<b>a</b>) data; (<b>b</b>) data within object region only; (<b>c</b>) initial and (<b>d</b>) optimized synthetic view generated by the 3-D model.</p>
Full article ">
19 pages, 36440 KiB  
Article
OptiShipNet: Efficient Ship Detection in Complex Marine Environments Using Optical Remote Sensing Images
by Yunfeng Lin, Jinxi Li, Shiqing Wei and Shanwei Liu
J. Mar. Sci. Eng. 2024, 12(10), 1786; https://doi.org/10.3390/jmse12101786 - 8 Oct 2024
Abstract
Ship detection faces significant challenges such as dense arrangements, varying dimensions, and interference from the sea surface background. Existing ship detection methods often fail to accurately identify ships in these complex marine environments. This paper presents OptiShipNet, an efficient network for detecting ships in complex marine environments using optical remote sensing images. First, to effectively capture ship features from complex environments, we designed a DFC-ConvNeXt module as the network’s backbone, where decoupled fully connected (DFC) attention captures long-distance information in both vertical and horizontal directions, thereby enhancing its expressive capabilities. Moreover, a simple, parameter-free attention module (SimAM) is integrated into the network’s neck to enhance focus on ships within challenging backgrounds. To achieve precise ship localization, we employ WIoU loss, enhancing the ship positioning accuracy in complex environments. Acknowledging the lack of suitable datasets for intricate backgrounds, we construct the HRSC-CB dataset, featuring high-resolution optical remote sensing images. This dataset contains 3786 images, each measuring 1000 × 600 pixels. Experiments demonstrate that the proposed model accurately detects ships in complex scenes, achieving an average precision (AP) of 94.1%, a 3.2% improvement over YOLOv5. Furthermore, the model’s frames per second (FPS) rate reaches 80.35, compared to 67.84 for YOLOv5, thus verifying the approach’s effectiveness. Full article
(This article belongs to the Section Ocean Engineering)
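The parameter-free SimAM module mentioned in the abstract weights each activation by a closed-form energy term derived from how much the activation deviates from its channel mean. A minimal NumPy sketch of that weighting (per-channel statistics assumed; `eps` plays the role of SimAM's regularizer λ):

```python
import numpy as np

def simam(x, eps=1e-4):
    """Parameter-free SimAM attention over a feature map x of shape (C, H, W).

    Activations far from their channel mean receive larger inverse-energy
    scores, which are passed through a sigmoid gate and used to reweight x.
    """
    mu = x.mean(axis=(1, 2), keepdims=True)          # per-channel mean
    d = (x - mu) ** 2                                # squared deviation
    n = x.shape[1] * x.shape[2] - 1
    v = d.sum(axis=(1, 2), keepdims=True) / n        # per-channel variance
    e_inv = d / (4.0 * (v + eps)) + 0.5              # inverse energy
    return x * (1.0 / (1.0 + np.exp(-e_inv)))        # sigmoid gating
```

This sketches the published SimAM formulation in general; how OptiShipNet embeds it in the neck is described in the paper itself.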
Figure 1
<p>Ships in different scenes in ORSI. (<b>a</b>) Ships are in a simple scenario and are easy to detect. (<b>b</b>) Ships under cloud interference are often prone to missed detections. (<b>c</b>) The color of the ships is very similar to the sea, making them prone to missed detections. (<b>d</b>) Ships are in a complex port environment, where many interfering factors are present, leading to false detections and missed detections. The yellow box indicates the actual object detected by the existing algorithm, while the red box denotes the ground truth.</p>
Full article ">Figure 2
<p>Our overview framework for ship detection in complex contexts. First, DFC-ConvNeXt is employed to capture essential features within complex backgrounds. Then, the neck utilizes a feature pyramid structure to extract contextual information and perform feature fusion. Finally, we determine the specific location of the ship.</p>
Full article ">Figure 3
<p>ConvNeXt network structure.</p>
Full article ">Figure 4
<p>(<b>a</b>) Improved ConvNeXt network structure; (<b>b</b>) DFC-ConvNeXt Block.</p>
Full article ">Figure 5
<p>Remote sensing ship datasets are used in this study. (<b>a</b>) The HRSC2016 dataset contains mainly a variety of large ships with a single scene. (<b>b</b>) A sample of our constructed dataset contains ships in complex harbors, ships under background interference at the sea surface, and ships under cloud and fog interference.</p>
Full article ">Figure 6
<p>Size distribution of ships on (<b>a</b>) HRSC2016 and (<b>b</b>) HRSC-CB.</p>
Full article ">Figure 7
<p>Detection results from comparison using different methods on the HRSC-CB dataset. The first row shows the detection results of ships in a simple ocean background. The second row shows the detection results of small ships. The third row shows the detection results of ships in a complex sea surface background. The fourth row shows the ship detection results under cloud and fog interference. The final row presents the detection result of ships within a complex port environment.</p>
Full article ">Figure 8
<p>Embedding positions of SimAM: orange represents position 1, yellow represents position 2, and green represents position 3. SimAM applied at positions 1 and 2 is used only for comparative experiments and is not part of the final network structure, which embeds SimAM at position 3 only.</p>
Full article ">Figure 9
<p>Visual comparison of the detection results from ablation experiments on the self-built dataset, where A represents the baseline YOLOv5.</p>
Full article ">
24 pages, 6042 KiB  
Article
A Methodology Based on Deep Learning for Contact Detection in Radar Images
by Rosa Gonzales Martínez, Valentín Moreno, Pedro Rotta Saavedra, César Chinguel Arrese and Anabel Fraga
Appl. Sci. 2024, 14(19), 8644; https://doi.org/10.3390/app14198644 - 25 Sep 2024
Abstract
Ship detection, a crucial task, relies on the traditional CFAR (Constant False Alarm Rate) algorithm. However, this algorithm is not without its limitations. Noise and clutter in radar images introduce significant variability, hampering the detection of objects on the sea surface. The algorithm’s theoretically constant false alarm rate is not upheld in practice, particularly when conditions change abruptly, such as with Beaufort wind strength. Moreover, the high computational cost of signal processing adversely affects the detection process’s efficiency. In previous work, a four-stage methodology was designed: the first preprocessing stage consisted of image enhancement by applying convolutions. Labeling and training were performed in the second stage using the Faster R-CNN architecture. In the third stage, model tuning was accomplished by adjusting the weight initialization and optimizer hyperparameters. Finally, object filtering was performed to retrieve only persistent objects. This work focuses on designing a specific methodology for ship detection off the Peruvian coast using commercial radar images. We introduce two key improvements: automatic cropping and a labeling interface. Using artificial intelligence techniques in automatic cropping leads to more precise edge extraction, improving the accuracy of object cropping. On the other hand, the developed labeling interface facilitates a comparative analysis of persistence in three consecutive rounds, significantly reducing the labeling times. These enhancements increase the labeling efficiency and enhance the learning of the detection model. A dataset consisting of 60 radar images is used for the experiments. Two classes of objects are considered, and cross-validation is applied in the training and validation models. The results yield a value of 0.0372 for the cost function, a recovery rate of 94.5%, and an accuracy rate of 95.1%. 
This work demonstrates that the proposed methodology can generate a high-performance model for contact detection in commercial radar images. Full article
(This article belongs to the Section Computing and Artificial Intelligence)
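For context, the traditional CFAR baseline that this paper improves on can be illustrated with a minimal cell-averaging (CA-)CFAR detector over a 1-D range profile. The window sizes and false-alarm rate below are illustrative choices, not the authors' exact variant.

```python
import numpy as np

def ca_cfar(signal, num_train=8, num_guard=2, rate_fa=1e-3):
    """Cell-averaging CFAR over a 1-D range profile.

    Each cell is compared against an adaptive threshold computed from the
    average power of the surrounding training cells (guard cells excluded).
    Returns the indices of detected cells.
    """
    n = len(signal)
    num_side = num_train // 2
    # Scaling factor that keeps the false-alarm rate nominally constant
    alpha = num_train * (rate_fa ** (-1.0 / num_train) - 1.0)
    hits = []
    for i in range(num_side + num_guard, n - num_side - num_guard):
        lead = signal[i - num_side - num_guard : i - num_guard]
        lag = signal[i + num_guard + 1 : i + num_guard + 1 + num_side]
        noise = np.mean(np.concatenate([lead, lag]))
        if signal[i] > alpha * noise:
            hits.append(i)
    return hits
```

A strong echo embedded in a flat noise floor is detected, while the noise floor itself stays below the adaptive threshold, which is the property the abstract notes breaks down when sea conditions change abruptly.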
Figure 1
<p>Detection system processes flow chart.</p>
Full article ">Figure 2
<p>Processes of plot extractor phase flow chart.</p>
Full article ">Figure 3
<p>Radar system architecture.</p>
Full article ">Figure 4
<p>Methodology flow chart.</p>
Full article ">Figure 5
<p>Preprocessing and enhancement phase flow chart.</p>
Full article ">Figure 6
<p>Radar image structure as a matrix. Each column is an azimuth, and each row corresponds to a distance value shared by all azimuths. Row indices start at 0.</p>
Full article ">Figure 7
<p>Representation of a convolution in 1 dimension. An asterisk (*) marks the output value of the corresponding vector element at that index after convolution; for example, x1* is the convolution result at the index originally occupied by x1.</p>
Full article ">Figure 8
<p>Gaussian distribution with different standard deviations.</p>
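The 1-D convolutions with Gaussian kernels used in the enhancement phase (Figures 7 and 8) can be sketched as below. The 3σ kernel-radius rule and the demo profile are illustrative choices, not the paper's exact parameters.

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """Discrete 1-D Gaussian kernel, normalized to sum to 1."""
    if radius is None:
        radius = int(3 * sigma)           # 3-sigma support (illustrative)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

# Smooth one range profile (a single azimuth column of the matrix)
profile = np.zeros(21)
profile[10] = 1.0                          # one bright cell
smoothed = np.convolve(profile, gaussian_kernel(2.0), mode="same")
```

A larger standard deviation spreads the bright cell over more neighboring range bins, which is the trade-off shown in the figure above.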
Full article ">Figure 9
<p>(<b>Left</b>) Sperry Marine radar raw image. (<b>Right</b>) Resultant normalized image after the preprocessing and enhancement phase.</p>
Full article ">Figure 10
<p>Automatic cropping flow chart.</p>
Full article ">Figure 11
<p>Images of cutouts of objects.</p>
Full article ">Figure 12
<p>Object labeling interface. The red box at the top marks a zoomed section of the radar image; this zoomed section is displayed at the bottom of the figure. The blue boxes at the bottom mark regions of the radar image where a plot may be found, and the red box at the bottom frames the current plot. On the right-hand side, this plot is shown framed in red, and, as a persistence criterion, the previous and next rounds are checked for the same plot at the same location and coordinates.</p>
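The persistence check across the previous and next rounds can be sketched as a simple coordinate match. The pixel tolerance `tol` and the tuple-of-rounds interface are assumptions for illustration, not the interface's actual logic.

```python
def persistent(detections_by_round, tol=2.0):
    """Keep detections that appear at (nearly) the same coordinates in
    three consecutive rounds.

    detections_by_round: (prev, curr, next) lists of (x, y) positions.
    A current detection is kept only if a detection within `tol` pixels
    exists in both the previous and the next round.
    """
    prev, curr, nxt = detections_by_round

    def near(p, pts):
        return any(abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol
                   for q in pts)

    return [p for p in curr if near(p, prev) and near(p, nxt)]
```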
Full article ">Figure 13
<p>Training process flow chart.</p>
Full article ">Figure 14
<p>Criteria filtering phase flow chart.</p>
Full article ">Figure 15
<p>Predictions of marine vessels in the port of Callao, Perú using the Faster R-CNN model (100,000 epochs).</p>
Full article ">Figure 16
<p>Predictions in bounding boxes with the label and the confidence score.</p>
Full article ">Figure 17
<p>Confidence scores for both object classes, “plot” and “no”.</p>
Full article ">
18 pages, 38471 KiB  
Article
Typhoon Intensity Change in the Vicinity of the Semi-Enclosed Sea of Japan
by Soo-Min Choi and Hyo Choi
J. Mar. Sci. Eng. 2024, 12(9), 1638; https://doi.org/10.3390/jmse12091638 - 13 Sep 2024
Abstract
The intensity change of Typhoon Songda (TY-0418) in the vicinity of the semi-enclosed Sea of Japan (SJ) was numerically investigated using 3D-WRF and UM-KMA models and GOES-IR satellite images from 4 to 8 September 2004. After the typhoon originated in the Western Pacific Ocean in August, it moved to the East China Sea. Following the north-eastward Kuroshio Warm Current, it developed with horizontally and vertically asymmetrical wind and moisture patterns until 5 September. On 7 September, approaching Kyushu Island, it was divided into three wind fields near the surface due to the increased friction from the surrounding lands and the shallower sea depth close to land, but it still maintained its circular shape above 1 km in height. As it passed the Korea Strait and entered the SJ, it became a smaller, deformed typhoon due to the SJ’s surrounding mountains, located between the East Korea and Tsushima Warm Currents inside the SJ. Its center matched a high equivalent potential temperature area, releasing significant latent heat through the condensation of water particles over warm currents. The latent heat converted to kinetic energy could be supplied into the typhoon circulation, causing its development. Moist flux and streamlines at 1.5 km in height clearly showed moisture transportation via the mutual interaction of the cyclonic circulation of the typhoon and the anti-cyclonic circulation of the North Pacific High from the typhoon’s tail toward both the center of the SJ and the Russian Sakhalin Island north of Japan, directly causing large clouds in its right quadrant. Simultaneously, the central pressure decrease with time could converge both the moist air transported by the typhoon itself and water particles evaporated from the sea, causing them to rise and resulting in the formation of large clouds and the rapid development of the typhoon circulation. 
The strong downslope winds from the surrounding mountains of the SJ to its center also produced a cyclonic vortex due to the Coriolis force to the right, enhancing the typhoon’s circulation. Full article
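For reference, equivalent potential temperature fields like those analyzed above can be approximated with the common textbook formula θe ≈ θ · exp(Lv·r/(cp·T)). This is a standard approximation, not the UM-KMA model's internal computation; the constants below are the usual values for dry air and vaporization.

```python
import math

def theta_e(T_k, p_hpa, r):
    """Approximate equivalent potential temperature (K).

    T_k: temperature (K); p_hpa: pressure (hPa); r: water-vapor mixing
    ratio (kg/kg). theta is the dry potential temperature referenced to
    1000 hPa, and the exponential term accounts for latent-heat release
    if all vapor were condensed.
    """
    Lv = 2.5e6     # latent heat of vaporization (J/kg)
    cp = 1004.0    # specific heat of dry air (J/(kg K))
    Rd = 287.0     # gas constant for dry air (J/(kg K))
    theta = T_k * (1000.0 / p_hpa) ** (Rd / cp)
    return theta * math.exp(Lv * r / (cp * T_k))
```

Moist air at 850 hPa with a mixing ratio of ~12 g/kg yields θe well above its dry θ, which is why the high-θe areas in Figures 7 and 11 mark the latent-heat source feeding the typhoon.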
Figure 1
<p>(<b>a</b>) Map of Northeast Asia with seas and ocean, (<b>b</b>) topography surrounding the Korean peninsula, China, and Japan, (<b>c</b>) track of TY Songda from 2 to 9 September, and (<b>d</b>) ocean currents surrounding Korea (modified from Lee [<a href="#B31-jmse-12-01638" class="html-bibr">31</a>]). In (<b>b</b>), Ho, Hs, and Ks denote Hokkaido, Honshu, and Kyushu Islands, and in (<b>d</b>), YE, EK, NK, Li, TS, TG, and SO denote the Yellow Sea Warm Current, the East Korea Warm Current, the North Korea Cold Current, the Liman Cold Current, the Tsushima Warm Current, the Tsugaru Warm Current, and the Soya Warm Current.</p>
Full article ">Figure 2
<p>(<b>a</b>) Surface weather map (hPa) supplied by the Korean Meteorological Administration (KMA) including a square area covered by the WRF course-mesh model domain at 09:00 LST, 5 September. (<b>b</b>) GOES-9 IR satellite image supplied by KMA at 09:01 LST, 2004.</p>
Full article ">Figure 3
<p>(<b>a</b>) Surface wind (m/s) at 10 m height at 09:00 LST, 5 September; (<b>b</b>) 3 km height; (<b>c</b>) vertical profile of horizontal winds (m/s) on a line A–B (typhoon eye; E) of <a href="#jmse-12-01638-f003" class="html-fig">Figure 3</a>a; (<b>d</b>) surface atmospheric pressure tendency (−16 hPa/day) causing typhoon development. As it approaches the lands, there are horizontal and vertical asymmetries of stronger winds to its right and increased sea bottom friction in the East China Sea as well as surface friction from surrounding lands.</p>
Full article ">Figure 4
<p>(<b>a</b>) Surface weather chart (hPa) at 09:00LST, 6 September, 2004; (<b>b</b>) GOES-9 IR satellite image before TY-Songda reached the Korea Strait and the west of Kyushu Island.</p>
Full article ">Figure 5
<p>(<b>a</b>) Surface wind (m/s) at 10 m height; (<b>b</b>) vertical profile of the horizontal wind (m/s) along the line A–B (across the typhoon eye; E) in (<b>a</b>) based on WRF model simulation at 09:00 LST, 6 September 2004, showing the asymmetrical deformation of the typhoon circulation.</p>
Full article ">Figure 6
<p>(<b>a</b>) Relative humidity (%) with wind speed (m/s) at 3 km height; (<b>b</b>) relative humidity (%) with wind (m/s) from 0 m to 13 km height along a line A–B in (<b>a</b>); refer to <a href="#jmse-12-01638-f005" class="html-fig">Figure 5</a>b.</p>
Full article ">Figure 7
<p>(<b>a</b>) Surface pressure tendency (<span class="html-italic">∂</span>p/<span class="html-italic">∂</span>t; hPa/day) for 12 hrs at 10 m height based on WRF model simulation; (<b>b</b>) equivalent potential temperature (EPT; K) at 1.5 km height (850 hPa) based on UM-KMA model simulation at 09:00 LST, 6 September 2004. The typhoon center corresponds to −12.9 hPa/day and a 367 K area.</p>
Full article ">Figure 8
<p>(<b>a</b>) Surface weather chart (hPa) at 09:00 LST; (<b>b</b>) GOES−9−IR satellite image at 09:01 LST, 7 September 2004.</p>
Full article ">Figure 9
<p>(<b>a</b>) Surface wind (m/s) at 10 m height at 09:00 LST, 7 September; (<b>b</b>) vertical profile of horizontal wind (m/s) along the line A–B (E; the typhoon eye) from the surface to 13 km height in (<b>a</b>) based on WRF model simulation; (<b>c</b>) wind (m/s) at 1 km height, still maintaining a circular shape.</p>
Full article ">Figure 10
<p>(<b>a</b>) Moist flux (0.1 m/s); (<b>b</b>) streamline and isotachs (wind speed &gt; 25 kt, green; &gt; 50 kt, yellow) at 850 hPa (approximately 1.5 km height) based on UM-KMA model simulation at 09:00 LST, 7 September 2004. H denotes the North Pacific High. The majority of moisture flux and streamlines occurred in the right quadrant of the typhoon center, from its tail toward the right of Kyushu Island into the SJ, pulling significant moisture via mutual interactions between the cyclonic typhoon and the anti-cyclonic H from 40°N to 20°N.</p>
Full article ">Figure 11
<p>(<b>a</b>) Surface pressure tendency (hPa/day) for 12 hours at a 10 m height based on WRF model simulation; (<b>b</b>) equivalent potential temperature (EPT; K) at 1.5 km height (850 hPa) based on UM-KMA model simulation at 09:00 LST, 7 September. The typhoon center corresponds to −8.1 hPa/day and a 355 K area.</p>
Full article ">Figure 12
<p>(<b>a</b>) Surface weather chart (hPa) at 21:00 LST; (<b>b</b>) GOES-9 IR satellite image at 21:00 LST, 7 September 2004. Red frame in (<b>b</b>) denotes a coarse domain of the model.</p>
Full article ">Figure 13
<p>(<b>a</b>) Wind velocity (m/s) at 10 m height; (<b>b</b>) vertical distribution of horizontal wind (m/s) on a black line in <a href="#jmse-12-01638-f013" class="html-fig">Figure 13</a>a at 21:00 LST, 7 September 2004. When TY-Songda moved to the semi-enclosed SJ, the typhoon reformed as a smaller typhoon, still maintaining a circular shape with asymmetric wind patterns horizontally and vertically.</p>
Full article ">Figure 14
<p>(<b>a</b>) Wind velocity (m/s) at 1.5 km; (<b>b</b>) 3 km heights at 21:00 LST on 7 September.</p>
Full article ">Figure 15
<p>(<b>a</b>) Horizontal relative humidity (%) at a 3 km height; (<b>b</b>) vertical relative humidity (%) with wind (m/s) from 0 m to 13 km height along a line A–B in (<b>a</b>) at 21:00 LST, 7 September, showing the moisture asymmetries corresponding to the wind structures in <a href="#jmse-12-01638-f013" class="html-fig">Figure 13</a>b and <a href="#jmse-12-01638-f014" class="html-fig">Figure 14</a>b.</p>
Full article ">Figure 16
<p>(<b>a</b>) Moist flux (0.1 m/s); (<b>b</b>) streamline and isotachs (wind speed &gt; 25 kt, green; &gt; 50 kt, yellow) at 850 hPa (about 1.5 km height) based on the UM-KMA model at 21:00 LST. Strong moisture flux between the cyclonic typhoon and the anti-cyclonic North Pacific High (H) occurred from south to north.</p>
Full article ">Figure 17
<p>(<b>a</b>) Surface pressure tendency (hPa/day) at 10 m height based on WRF model simulation; (<b>b</b>) equivalent potential temperature (EPT; K) at a 1.5 km height (850 hPa) based on UM-KMA model simulation at 21:00 LST. The typhoon center corresponds to the area of −17.6 hPa/day and 349 K.</p>
Full article ">Figure 18
<p>(<b>a</b>) Daily mean sea surface temperature (SST; °C) (GOES-9 MCSST) near the Korean peninsula on 4 September; (<b>b</b>) 7 September, and (<b>c</b>) weekly mean SST (1 to 7 September). The red square includes Jeju Island. The typhoon center corresponds to 24 °C area near the SJ center, closer to the Tsushima Warm Current in <a href="#jmse-12-01638-f018" class="html-fig">Figure 18</a>c. The northeastward Kuroshio Warm Current flows from the eastern seas of Philippine and Taiwan toward the southern and eastern seas of Japan Island (28 °C) in <a href="#jmse-12-01638-f018" class="html-fig">Figure 18</a>c.</p>
Full article ">
29 pages, 14739 KiB  
Article
Use of SLSTR Sea Surface Temperature Data in OSTIA as a Reference Sensor: Implementation and Validation
by Chongyuan Mao, Simon Good and Mark Worsfold
Remote Sens. 2024, 16(18), 3396; https://doi.org/10.3390/rs16183396 - 12 Sep 2024
Abstract
Sea surface temperature (SST) data from the Sea and Land Surface Temperature Radiometer (SLSTR) onboard the Sentinel-3 satellites have been used in the Met Office’s Operational Sea Surface Temperature and Sea Ice Analysis (OSTIA) since 2019 (Sentinel-3A SST data since March 2019 and Sentinel-3B data since December 2019). The impacts of using SLSTR SSTs and the SLSTR as the reference sensor for the bias correction of other satellite data have been assessed using independent Argo float data. Combining Sentinel-3A and -3B SLSTRs with two Visible Infrared Imaging Radiometer Suite (VIIRS) sensors (onboard the joint NASA/NOAA Suomi National Polar-orbiting Partnership and National Oceanic and Atmospheric Administration-20 satellites) in the reference dataset has also been investigated. The results indicate that when using the SLSTR as the only reference satellite sensor, the OSTIA system becomes warmer overall, although there are mixed impacts in different parts of the global ocean. Using both the VIIRS and the SLSTR in the reference dataset leads to moderate but more consistent improvements globally. Numerical weather prediction (NWP) results also indicate a better performance when using both the VIIRS and the SLSTR in the reference dataset compared to only using the SLSTR at night. Combining the VIIRS and the SLSTR with latitudinal weighting shows the best validation results against Argo, but further investigation is required to refine this method. Full article
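The combination of VIIRS and SLSTR reference data with latitudinal weighting, which the abstract notes still requires refinement, could in principle look like the following. The cosine-squared weight and the function name `blend_references` are purely illustrative placeholders, not the OSTIA scheme.

```python
import numpy as np

def blend_references(sst_viirs, sst_slstr, lats):
    """Blend two gridded reference SST fields with a latitude-dependent weight.

    Illustrative only: weight SLSTR more at low latitudes and VIIRS more
    at high latitudes via a cosine-squared ramp. Fields are (lat, lon)
    arrays; lats is the 1-D latitude axis in degrees.
    """
    w = np.cos(np.radians(lats)) ** 2     # 1 at the equator, 0 at the poles
    w = w[:, np.newaxis]                  # broadcast over longitude
    return w * sst_slstr + (1.0 - w) * sst_viirs
```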
Graphical abstract
Full article ">Figure 1
<p>An example of the relationship between wind speed, skin-to-depth conversion values and the number of SST data: (<b>a</b>) skin-to-depth conversions using constant Fix0.17 (red line), DonlonWS (blue line) and diurnal model (boxes with whiskers, with the central line showing the median, the boxes representing the higher and lower quartiles, and the whiskers indicating the maximum and minimum values) against wind speed during night-time; (<b>b</b>) same as (<b>a</b>) but for daytime data; (<b>c</b>) the number of SST data against wind speed during night-time; and (<b>d</b>) same as (<b>c</b>) but for daytime. The vertical cyan line in panel (<b>d</b>) indicates the 6 m/s threshold, below which daytime SST data are rejected.</p>
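The two skin-to-depth conversions compared in Figure 1 (the constant Fix0.17 offset and the wind-speed-dependent DonlonWS scheme) can be sketched as below. The sign convention (skin cooler than depth) and the DonlonWS coefficients here are illustrative assumptions, not the operational values.

```python
import math

def skin_to_depth(sst_skin, wind_speed, scheme="Fix0.17"):
    """Convert a satellite skin SST to a depth-equivalent SST (sketch).

    "Fix0.17" adds a constant 0.17 K offset, assuming the skin is cooler
    than the water just below. "DonlonWS" sketches a wind-speed-dependent
    offset whose coefficients (0.17, 5.0, 0.05) are hypothetical
    placeholders: the cool-skin effect weakens as wind mixing increases.
    """
    if scheme == "Fix0.17":
        return sst_skin + 0.17
    # Hypothetical wind-dependent form: offset decays at high wind speed
    return sst_skin + 0.17 * math.exp(-wind_speed / 5.0) + 0.05
```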
Full article ">Figure 2
<p>The number of Argo profiles in each of the 20° × 20° boxes during (<b>a</b>) July–September 2019 and (<b>b</b>) November 2019–January 2020.</p>
Full article ">Figure 3
<p>Average bias field for merged Sentinel-3A and -3B SLSTR dual-view observations against the reference dataset in: (<b>a</b>) July 2019 and (<b>b</b>) January 2020. Note, there is no land–sea mask applied to the bias fields in order to allow for the representation of large-scale biases.</p>
Full article ">Figure 4
<p>The coverage of the satellite reference data in the OSTIA configurations on an example day (31 January 2021): (<b>a</b>) Control, (<b>b</b>) S3Ref and (<b>c</b>) S3ntRef. The corresponding satellite SSTs in the three configurations are QL 5 night-time VIIRS SSTs, daytime and night-time dual-view SLSTR SSTs, and night-time dual-view SLSTR SSTs.</p>
Full article ">Figure 5
<p>The absolute change in OSTIA-minus-Argo mean differences and standard deviation change ratio of OSTIA-minus-Argo for experimental configurations against the control: (<b>a</b>) the change in absolute O-A mean difference for S3Ref during July–September 2019; (<b>b</b>) the standard deviation change ratio of O-A for S3Ref during July–September 2019; (<b>c</b>) the change in absolute O-A mean differences for S3ntRef during July–September 2019; (<b>d</b>) the standard deviation change ratio of O-A for S3ntRef during July–September 2019; (<b>e</b>) the change in absolute O-A mean difference for S3Ref during November 2019–January 2020; (<b>f</b>) the standard deviation change ratio of O-A for S3Ref during November 2019–January 2020; (<b>g</b>) the change in absolute O-A mean difference for S3ntRef during November 2019–January 2020; and (<b>h</b>) the standard deviation change ratio of O-A for S3ntRef during November 2019–January 2020. Blue indicates improvement in the experimental configurations against the control and red indicates degradation against the control.</p>
Full article ">Figure 6
<p>The averaged bias fields for MetOp-B AVHRR observations in (<b>a</b>) control run over July–September 2019; (<b>b</b>) control run over November 2019–January 2020; (<b>c</b>) S3Ref run over July–September 2019; (<b>d</b>) S3Ref run over November 2019–January 2020; (<b>e</b>) S3ntRef run over July–September 2019; and (<b>f</b>) S3ntRef run over November 2019–January 2020.</p>
Full article ">Figure 7
<p>The averaged bias fields for AMSR2 observations in (<b>a</b>) control run over July–September 2019; (<b>b</b>) control run over November 2019–January 2020; (<b>c</b>) S3Ref run over July–September 2019; (<b>d</b>) S3Ref run over November 2019–January 2020; (<b>e</b>) S3ntRef run over July–September 2019; and (<b>f</b>) S3ntRef run over November 2019–January 2020.</p>
Full article ">Figure 8
<p>The bias field of daytime SLSTR dual-view data against the reference field for S3ntRef run for the periods of (<b>a</b>) July–September 2019 and (<b>b</b>) November 2019–January 2020.</p>
Full article ">Figure 9
<p>The absolute change in OSTIA-minus-Argo mean differences and change ratio of OSTIA-minus-Argo standard deviation for experimental configurations against the control in July–September 2019: (<b>a</b>) the change in absolute O-A mean difference for S3VIIRSref; (<b>b</b>) the change ratio of O-A standard deviation for S3VIIRSref; (<b>c</b>) the change in absolute O-A mean difference for S3ntVIIRSref; (<b>d</b>) the change ratio of O-A standard deviation for S3ntVIIRSref; (<b>e</b>) the change in absolute O-A mean difference for S3VIIRSrefQC; (<b>f</b>) the change ratio of O-A standard deviation for S3VIIRSrefQC; (<b>g</b>) the change in absolute O-A mean difference for S3ntVIIRSrefQC; and (<b>h</b>) the change ratio of O-A standard deviation for S3ntVIIRSrefQC. Blue indicates improvement in the experimental configurations against the control and red indicates degradation against the control.</p>
Full article ">Figure 10
<p>The change in absolute OSTIA-minus-Argo mean differences and change ratio of OSTIA-minus-Argo standard deviation for experimental configurations against the control in November 2019–January 2020: (<b>a</b>) the change in absolute O-A mean difference for S3VIIRSref; (<b>b</b>) the change ratio of O-A standard deviation for S3VIIRSref; (<b>c</b>) the change in absolute O-A mean difference for S3ntVIIRSref; (<b>d</b>) the change ratio of O-A standard deviation for S3ntVIIRSref; (<b>e</b>) the change in absolute O-A mean difference for S3VIIRSrefQC; (<b>f</b>) the change ratio of O-A standard deviation for S3VIIRSrefQC; (<b>g</b>) the change in absolute O-A mean difference for S3ntVIIRSrefQC; and (<b>h</b>) the change ratio of O-A standard deviation for S3ntVIIRSrefQC. Blue indicates improvement in the experimental configurations against the control and red indicates degradation against the control.</p>
Full article ">Figure 11
<p>The averaged bias fields for MetOp-B AVHRR observations in (<b>a</b>) control run over July–September 2019; (<b>b</b>) control run over November 2019–January 2020; (<b>c</b>) S3VIIRSref run over July–September 2019; (<b>d</b>) S3VIIRSref run over November 2019–January 2020; (<b>e</b>) S3ntVIIRSrefQC run over July–September 2019; and (<b>f</b>) S3ntVIIRSrefQC run over November 2019–January 2020.</p>
Full article ">Figure 12
<p>The averaged bias fields for AMSR2 observations in: (<b>a</b>) control run over July–September 2019; (<b>b</b>) control run over November 2019–January 2020; (<b>c</b>) S3VIIRSref run over July–September 2019; (<b>d</b>) S3VIIRSref run over November 2019–January 2020; (<b>e</b>) S3ntVIIRSrefQC run over July–September 2019; and (<b>f</b>) S3ntVIIRSrefQC run over November 2019–January 2020.</p>
Full article ">
25 pages, 11298 KiB  
Article
A Synthetic Aperture Radar Imaging Simulation Method for Sea Surface Scenes Combined with Electromagnetic Scattering Characteristics
by Yao He, Le Xu, Jincong Huo, Huaji Zhou and Xiaowei Shi
Remote Sens. 2024, 16(17), 3335; https://doi.org/10.3390/rs16173335 - 8 Sep 2024
Viewed by 778
Abstract
Synthetic aperture radar (SAR) simulation is a vital tool for planning SAR missions, interpreting SAR images, and extracting valuable information. SAR imaging is essential for analyzing sea scenes, and the accuracy of sea surface and scattering models is crucial for effective SAR simulations. [...] Read more.
Synthetic aperture radar (SAR) simulation is a vital tool for planning SAR missions, interpreting SAR images, and extracting valuable information. SAR imaging is essential for analyzing sea scenes, and the accuracy of sea surface and scattering models is crucial for effective SAR simulations. Traditional methods typically employ empirical formulas to fit sea surface scattering, which are not closely aligned with the principles of electromagnetic scattering. This paper introduces a novel approach by constructing multiple sea surface models based on the Pierson–Moskowitz (P-M) sea spectrum, integrated with the stereo wave observation projection (SWOP) expansion function to thoroughly account for the influence of wave fluctuation characteristics on radar scattering. Utilizing the shooting and bouncing ray-physical optics (SBR-PO) method, which adheres to the principles of electromagnetic scattering, this study not only analyzes sea surface scattering characteristics under various sea conditions but also facilitates the computation of scattering coupling between multiple targets. By constructing detailed scattering distribution data, the method achieves high-precision SAR simulation results. The scattering model developed using the SBR-PO method provides a more nuanced description of sea surface scenes compared to traditional methods, achieving an optimal balance between efficiency and accuracy, thus significantly enhancing sea surface SAR imaging simulations. Full article
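The Pierson–Moskowitz spectrum that the abstract builds its sea surface models on has a standard empirical form; the sketch below uses the textbook constants (α = 8.1 × 10⁻³, β = 0.74, with wind speed referenced to 19.5 m height), which are an assumption here since the paper does not list its exact parameterization.

```python
import numpy as np

def pm_spectrum(omega, wind_speed, g=9.81):
    """One-sided Pierson-Moskowitz spectrum S(omega) in m^2*s.

    omega      : angular frequency array (rad/s)
    wind_speed : wind speed U at 19.5 m height (m/s), assumed convention
    Uses the standard empirical constants alpha = 8.1e-3, beta = 0.74.
    """
    alpha, beta = 8.1e-3, 0.74
    omega = np.asarray(omega, dtype=float)
    # Power-law prefactor times the exponential wind-speed cutoff.
    return (alpha * g**2 / omega**5) * np.exp(-beta * (g / (wind_speed * omega))**4)
```

Consistent with Figure 1 of the paper, the spectrum at a fixed frequency grows with wind speed, reflecting rougher seas.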
Show Figures

Figure 1

<p>P-M spectrum at different wind speeds.</p>
Full article ">Figure 2
<p>Three-dimensional sea surface simulation diagram.</p>
Full article ">Figure 3
<p>Schematic diagram of sea surface simulation at different wind speeds and directions. (<b>a</b>) the wind speed is 5 m/s, and the wind direction is 30°; (<b>b</b>) the wind speed is 7 m/s, and the wind direction is 30°; (<b>c</b>) the wind speed is 10 m/s, and the wind direction is 30°; (<b>d</b>) the wind speed is 5 m/s, and the wind direction is 90°; (<b>e</b>) the wind speed is 7 m/s, and the wind direction is 90°; (<b>f</b>) the wind speed is 10 m/s, and the wind direction is 90°.</p>
Full article ">Figure 4
<p>Schematic diagram of the scattering path of composite targets on the sea surface.</p>
Full article ">Figure 5
<p>Sea surface–single ship scattering model. (<b>a</b>) Schematic diagram of the sea surface–single ship composite scenario model. (<b>b</b>) Ship size.</p>
Full article ">Figure 6
<p>Plate–cube combined scattering model. (<b>a</b>) Schematic diagram of the flat plate–cube composite scene model. (<b>b</b>) Composite scattering field of the combined model under different incident wave angles <math display="inline"><semantics> <mi>θ</mi> </semantics></math>.</p>
Full article ">Figure 7
<p>Composite scattering field of the combined model under different incident wave angles <math display="inline"><semantics> <mi>θ</mi> </semantics></math>. (<b>a</b>) Schematic diagram of the scattering corner. (<b>b</b>) The incident wave is divided into two parts. (<b>c</b>) The incident wave is divided into three parts.</p>
Full article ">Figure 8
<p>Scattering field of the composite scene at different incident wave azimuths. (<b>a</b>) The azimuth is 0°. (<b>b</b>) The azimuth is 90°.</p>
Full article ">Figure 9
<p>Scattering field of the composite scene under different sea surface wind speeds. (<b>a</b>) The wind speed is 5 m/s. (<b>b</b>) The wind speed is 7 m/s. (<b>c</b>) The wind speed is 10 m/s. (<b>d</b>) The ship–sea coupled scattering.</p>
Full article ">Figure 10
<p>Scattering field of the composite scene under different sea surface wind angles. (<b>a</b>) Total field. (<b>b</b>) Scattering of a ship. (<b>c</b>) Scattering of the sea surface. (<b>d</b>) Scattering of coupling.</p>
Full article ">Figure 11
<p>Scattering field of the composite scene under different incident wave polarization modes. (<b>a</b>) The azimuth is 0°. (<b>b</b>) The azimuth is 90°.</p>
Full article ">Figure 12
<p>Scattering field of the composite scene with different incident wave frequencies.</p>
Full article ">Figure 13
<p>Multiple ships scattering model. (<b>a</b>) Schematic diagram of sea surface–multi-ship composite scene model. (<b>b</b>) Schematic diagram of the relative positions of two ships.</p>
Full article ">Figure 14
<p>Monostatic scattering of ships at different left and right distances in the sea surface–two-ship composite scene. (<b>a</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>X</mi> <mo>=</mo> <mn>0</mn> <mspace width="4.pt"/> <mi mathvariant="normal">m</mi> </mrow> </semantics></math>, <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>Y</mi> <mo>=</mo> <mn>80</mn> <mspace width="4.pt"/> <mi mathvariant="normal">m</mi> </mrow> </semantics></math>. (<b>b</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>X</mi> <mo>=</mo> <mn>100</mn> <mspace width="4.pt"/> <mi mathvariant="normal">m</mi> </mrow> </semantics></math>, <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>Y</mi> <mo>=</mo> <mrow> <mn>80</mn> <mspace width="4.pt"/> <mi mathvariant="normal">m</mi> </mrow> </mrow> </semantics></math>. (<b>c</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>X</mi> <mo>=</mo> <mn>0</mn> <mspace width="4.pt"/> <mi mathvariant="normal">m</mi> </mrow> </semantics></math>, <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>Y</mi> <mo>=</mo> <mn>120</mn> <mspace width="4.pt"/> <mi mathvariant="normal">m</mi> </mrow> </semantics></math>. (<b>d</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>X</mi> <mo>=</mo> <mn>100</mn> <mspace width="4.pt"/> <mi mathvariant="normal">m</mi> </mrow> </semantics></math>, <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>Y</mi> <mo>=</mo> <mn>80</mn> <mspace width="4.pt"/> <mi mathvariant="normal">m</mi> </mrow> </semantics></math>.</p>
Full article ">Figure 15
<p>Ship–ship coupled scattering in the composite scene of the sea surface and two ships. (<b>a</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>X</mi> <mo>=</mo> </mrow> </semantics></math> 0 m. (<b>b</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>X</mi> <mo>=</mo> </mrow> </semantics></math> 100 m.</p>
Full article ">Figure 16
<p>Sea surface–ship-interference model. (<b>a</b>) Schematic diagram of the sea surface–ship-interference composite scenario model. (<b>b</b>) Ship size.</p>
Full article ">Figure 17
<p>Composite scene scattering distribution when the incident wave azimuth angles are 0°, 90°, and 180°. (<b>a</b>) The azimuth is 0°. (<b>b</b>) The azimuth is 90°. (<b>c</b>) The azimuth is 180°.</p>
Full article ">Figure 18
<p>Composite scene scattering distribution when the incident wave azimuth angles are 45° and 225°. (<b>a</b>) The azimuth is 45°. (<b>b</b>) The azimuth is 225°.</p>
Full article ">Figure 19
<p>Composite model SAR Imaging. (<b>a</b>) Schematic diagram of the sea surface–frigate composite scenario model. (<b>b</b>) Schematic diagram of radar wave incident angle in the sea surface–ship composite scenario model.</p>
Full article ">Figure 20
<p>Comparison of sea surface–ship imaging results at different incident wave angles. (<b>a</b>) The pitch is 30°, and the azimuth is 30°. (<b>b</b>) The pitch is 60°, and the azimuth is 30°. (<b>c</b>) The pitch is 30°, and the azimuth is 60°. (<b>d</b>) The pitch is 60°, and the azimuth is 60°.</p>
Full article ">Figure 21
<p>Comparison of sea surface–ship imaging results at different sea surface wind speeds. (<b>a</b>) The wind speed is 3 m/s. (<b>b</b>) The wind speed is 5 m/s. (<b>c</b>) The wind speed is 7 m/s. (<b>d</b>) The wind speed is 10 m/s.</p>
Full article ">Figure 22
<p>Comparison of sea surface–ship imaging results under different sea surface wind angles, with <math display="inline"><semantics> <mrow> <mi>U</mi> <mo>=</mo> <mn>10</mn> </mrow> </semantics></math> m/s. (<b>a</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>ϕ</mi> <mo>=</mo> <mo>−</mo> </mrow> </semantics></math> 45°. (<b>b</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>ϕ</mi> <mo>=</mo> </mrow> </semantics></math> 0°. (<b>c</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>ϕ</mi> <mo>=</mo> </mrow> </semantics></math> 45°.</p>
Full article ">Figure 23
<p>Comparison of sea surface–ship imaging results under different sea surface wind angles, with <math display="inline"><semantics> <mrow> <mi>U</mi> <mo>=</mo> <mn>5</mn> </mrow> </semantics></math> m/s. (<b>a</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>ϕ</mi> <mo>=</mo> <mo>−</mo> </mrow> </semantics></math> 45°. (<b>b</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>ϕ</mi> <mo>=</mo> </mrow> </semantics></math> 0°. (<b>c</b>) <math display="inline"><semantics> <mrow> <mo>Δ</mo> <mi>ϕ</mi> <mo>=</mo> </mrow> </semantics></math> 45°.</p>
Full article ">Figure 24
<p>SAR imaging results of the sea surface–ship interference composite scene. (<b>a</b>) <math display="inline"><semantics> <mrow> <mi>ϕ</mi> <mo>=</mo> </mrow> </semantics></math>0°. (<b>b</b>) <math display="inline"><semantics> <mrow> <mi>ϕ</mi> <mo>=</mo> </mrow> </semantics></math>48°. (<b>c</b>) <math display="inline"><semantics> <mrow> <mi>ϕ</mi> <mo>=</mo> </mrow> </semantics></math>270°. (<b>d</b>) <math display="inline"><semantics> <mrow> <mi>ϕ</mi> <mo>=</mo> </mrow> </semantics></math>105°.</p>
Full article ">
24 pages, 6993 KiB  
Article
Advancing Volcanic Activity Monitoring: A Near-Real-Time Approach with Remote Sensing Data Fusion for Radiative Power Estimation
by Giovanni Salvatore Di Bella, Claudia Corradino, Simona Cariello, Federica Torrisi and Ciro Del Negro
Remote Sens. 2024, 16(16), 2879; https://doi.org/10.3390/rs16162879 - 7 Aug 2024
Viewed by 1561
Abstract
The global, near-real-time monitoring of volcano thermal activity has become feasible through thermal infrared sensors on various satellite platforms, which enable accurate estimations of volcanic emissions. Specifically, these sensors facilitate reliable estimation of Volcanic Radiative Power (VRP), representing the heat radiated during volcanic [...] Read more.
The global, near-real-time monitoring of volcano thermal activity has become feasible through thermal infrared sensors on various satellite platforms, which enable accurate estimations of volcanic emissions. Specifically, these sensors facilitate reliable estimation of Volcanic Radiative Power (VRP), representing the heat radiated during volcanic activity. A critical factor influencing VRP estimates is the identification of hotspots in satellite imagery, typically based on intensity. Different satellite sensors employ unique algorithms due to their distinct characteristics. Integrating data from multiple satellite sources, each with different spatial and spectral resolutions, offers a more comprehensive analysis than using individual data sources alone. We introduce an innovative Remote Sensing Data Fusion (RSDF) algorithm, developed within a Cloud Computing environment that provides scalable, on-demand computing resources and services via the internet, to monitor VRP locally using data from various multispectral satellite sensors: the polar-orbiting Moderate Resolution Imaging Spectroradiometer (MODIS), the Sea and Land Surface Temperature Radiometer (SLSTR), and the Visible Infrared Imaging Radiometer Suite (VIIRS), along with the geostationary Spinning Enhanced Visible and InfraRed Imager (SEVIRI). We describe and demonstrate the operation of this algorithm through the analysis of recent eruptive activities at the Etna and Stromboli volcanoes. The RSDF algorithm, leveraging both spatial and intensity features, demonstrates heightened sensitivity in detecting high-temperature volcanic features, thereby improving VRP monitoring compared to conventional pre-processed products available online. The overall accuracy increased significantly, with the omission rate dropping from 75.5% to 3.7% and the false detection rate decreasing from 11.0% to 4.3%. The proposed multi-sensor approach markedly enhances the ability to monitor and analyze volcanic activity. Full article
(This article belongs to the Special Issue Application of Remote Sensing Approaches in Geohazard Risk)
Show Figures

Graphical abstract

Full article ">Figure 1
<p>(<b>1</b>) Derivation of the Normalized Thermal Index (NTI) obtained by combining the radiance of the MIR and the radiance of the TIR. (<b>2</b>) Application of the Spatial Standard Deviation (SSD) filter to each pixel in the image. (<b>3</b>) Definition of two statistical masks, Mask1 and Mask2, to identify “potential” and “true” hotspots, applied on the SSD and NTI of the volcanic area (VA). (<b>4</b>) Application of Gabor filter to extract the significant features of the image, resulting in a matrix called Gabor Weighted NTI (G-NTI). (<b>5</b>) Highlighting hotspots in the crater area and defining the Spatial Gabor Weighted NTI (SG-NTI). (<b>6</b>) Application of a statistical mask to the previously extracted matrix. (<b>7</b>) Calculation of the final VRP.</p>
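Step (1) above combines MIR and TIR radiances into a Normalized Thermal Index. In the hotspot-detection literature the NTI is commonly defined as a normalized difference of the two bands; the sketch below assumes that definition, which may differ from the exact combination used by the RSDF algorithm.

```python
import numpy as np

def normalized_thermal_index(mir, tir):
    """NTI per pixel from MIR and TIR radiances (same radiometric units).

    Assumed common definition: (MIR - TIR) / (MIR + TIR). Hot volcanic
    pixels raise MIR radiance faster than TIR, pushing NTI upward.
    """
    mir = np.asarray(mir, dtype=float)
    tir = np.asarray(tir, dtype=float)
    return (mir - tir) / (mir + tir)
```

A pixel whose MIR radiance exceeds its TIR radiance yields a positive NTI, which is why thresholding NTI (steps 2–3 of the workflow) can separate hotspot candidates from the background.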
Full article ">Figure 2
<p>Workflow image of the RSDF algorithm. Study cases: (<b>a</b>) Etna on 2 December 2023 at 01:10 UTC, MODIS sensor; (<b>b</b>) Etna on 15 January 2023 at 20:46 UTC, SLSTR sensor; (<b>c</b>) Stromboli on 3 October 2023 at 13:10 UTC, MODIS sensor; (<b>d</b>) Stromboli on 23 October 2023 at 09:08 UTC, SLSTR sensor.</p>
Full article ">Figure 3
<p>Time series of the Etna volcano. The panels show VRP calculated, respectively, from the RSDF Algorithm SLSTR (blue triangles), SLSTR Level 2 (red triangles), the RSDF Algorithm MODIS (blue triangles), and MODIS Level 2 (red triangles). (<b>a</b>,<b>c</b>) show data from January 2021 to April 2022; (<b>b</b>,<b>d</b>) from April 2022 to June 2023.</p>
Full article ">Figure 4
<p>Histograms (<b>a</b>,<b>c</b>) and probability plots (<b>b</b>,<b>d</b>) for Etna datasets. (<b>a</b>,<b>c</b>) Histograms display data distribution related to VRP (and FRP) in logarithmic scale; (<b>a</b>) blue bars represent the distribution of SLSTR–RSDF algorithm processed data, and red bars represent SLSTR Level 2 product data; (<b>c</b>) blue bars represent the distribution of MODIS–RSDF algorithm processed data, and red bars represent MODIS active fire products. (<b>b</b>,<b>d</b>) Probability plots for normal distribution of RSDF algorithm processed data (blue), and Level 2 product data (red). The dashed grey lines represent the reference lines of the theoretical distributions, and the black dashed line in (<b>b</b>) corresponds to the slope change associated with the transition between regimes of background and high thermal activity.</p>
Full article ">Figure 5
<p>Stacked time series of VRPw (weekly mean) retrieved for the SLSTR–RSDF algorithm processed data (blue), SLSTR Level 2 product data (red), MODIS–RSDF algorithm processed data (green), and MODIS Level 2 product data (black) at the Etna volcano, displayed on a logarithmic scale.</p>
Full article ">Figure 6
<p>VRP time series of the Stromboli volcano. The panels show VRP calculated, respectively, from the RSDF Algorithm SLSTR (blue triangles), SLSTR Level 2 (red triangles), the RSDF Algorithm MODIS (blue triangles), and MODIS Level 2 (red triangles). (<b>a</b>,<b>c</b>) show data from January 2021 to April 2022; (<b>b</b>,<b>d</b>) from April 2022 to June 2023.</p>
Full article ">Figure 7
<p>Histograms (<b>a</b>,<b>c</b>) and probability plots (<b>b</b>,<b>d</b>) for Stromboli datasets. (<b>a</b>,<b>c</b>) Histograms display data distribution related to VRP (and FRP) in logarithmic scale; (<b>a</b>) blue bars represent the distribution of SLSTR–RSDF algorithm processed data, and red bars represent SLSTR Level 2 product data; (<b>c</b>) blue bars represent the distribution of MODIS–RSDF algorithm processed data, and red bars represent MODIS active fire products. (<b>b</b>,<b>d</b>) Probability plots for normal distribution of RSDF algorithm processed data (blue), and Level 2 product data and MODIS active fire products (red). The dashed grey lines represent the reference lines of the theoretical distributions, and the black dashed line in (<b>b</b>) corresponds to the slope change associated with the transition between regimes of background and high thermal activity for Stromboli.</p>
Full article ">Figure 8
<p>Stacked time series of VRPw (weekly mean) retrieved for SLSTR–RSDF algorithm processed data (blue), SLSTR Level 2 product data (red), MODIS–RSDF algorithm processed data (green), and MODIS active fire products (black) at the Stromboli volcano, displayed on a logarithmic scale.</p>
Full article ">Figure 9
<p>Cumulative Volcanic Radiative Energy (VRE) calculated from VRP (and FRP) using the trapezoidal rule for integration. The blue line represents VRE<sub>SLSTR</sub>, the red dashed line FRE<sub>SLSTR</sub>, the green dashed line VRE<sub>MODIS</sub>, and the black dashed line FRE<sub>MODIS</sub>. Panels (<b>a</b>,<b>c</b>) show data for Etna; panels (<b>b</b>,<b>d</b>) show data for Stromboli.</p>
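The cumulative energy in the caption above follows from trapezoidal integration of the radiative power time series; a minimal sketch (irregular sampling times in seconds and VRP in watts are assumed units, not taken from the paper):

```python
import numpy as np

def cumulative_vre(times_s, vrp_w):
    """Cumulative radiative energy (J) by trapezoidal integration of VRP (W).

    times_s : sample times in seconds (monotonically increasing)
    vrp_w   : radiative power at each sample time, in watts
    Returns an array of the same length, starting at 0 J.
    """
    times_s = np.asarray(times_s, dtype=float)
    vrp_w = np.asarray(vrp_w, dtype=float)
    dt = np.diff(times_s)
    # Trapezoid area for each interval, then running sum.
    increments = 0.5 * (vrp_w[1:] + vrp_w[:-1]) * dt
    return np.concatenate(([0.0], np.cumsum(increments)))
```

Irregular satellite revisit intervals are handled naturally because each trapezoid uses its own time step.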
Full article ">Figure 10
<p>Radiative power time series from SLSTR– and MODIS–RSDF algorithm data with intensity limits categorized as low, moderate, high, and extreme. (<b>a</b>) Etna, (<b>b</b>) Stromboli.</p>
Full article ">Figure 11
<p>Temporal trend of VRP values derived from the RSDF algorithm for SEVIRI, SLSTR, MODIS, and VIIRS over two periods at Mt. Etna: (<b>a</b>) 1 February 2021–30 April 2021, and (<b>b</b>) 27 September 2023–10 October 2023.</p>
Full article ">Figure 12
<p>TADR and lava flow volume flux during the effusive event at Etna from 14 May 2022 to 16 June 2022. TADR_max, TADR_mean, and TADR_min are represented by blue, red, and green points, respectively. The total volume_max, volume_mean, and volume_min are represented by blue, red, and green lines, respectively.</p>
Full article ">Figure 13
<p>TADR and lava flow volume flux during the effusive event at Stromboli from 27 September 2023 to 10 October 2023. TADR_max, TADR_mean, and TADR_min are represented by blue, red, and green points, respectively. The total volume_max, volume_mean, and volume_min are represented by blue, red, and green lines, respectively.</p>
Full article ">
13 pages, 7781 KiB  
Article
Operational Mapping of Submarine Groundwater Discharge into Coral Reefs: Application to West Hawai‘i Island
by Gregory P. Asner, Nicholas R. Vaughn and Joseph Heckler
Oceans 2024, 5(3), 547-559; https://doi.org/10.3390/oceans5030031 - 5 Aug 2024
Viewed by 1301
Abstract
Submarine groundwater discharge (SGD) is a recognized contributor to the hydrological and biogeochemical functioning of coral reef ecosystems located along coastlines. However, the distribution, size, and thermal properties of SGD remain poorly understood at most land–reef margins. We developed, deployed, and demonstrated an [...] Read more.
Submarine groundwater discharge (SGD) is a recognized contributor to the hydrological and biogeochemical functioning of coral reef ecosystems located along coastlines. However, the distribution, size, and thermal properties of SGD remain poorly understood at most land–reef margins. We developed, deployed, and demonstrated an operational method for airborne detection and mapping of SGD using the 200 km coastline of western Hawai‘i Island as a testing and analysis environment. Airborne high spatial resolution (1 m) thermal imaging produced relative sea surface temperature (SST) maps that aligned geospatially with boat-based transects of SGD presence–absence. Boat-based SST anomaly measurements were highly correlated with airborne SST anomaly measurements (R2 = 0.85; RMSE = 0.04 °C). Resulting maps of the relative difference in SST inside and outside of SGD plumes, called delta-SST, revealed 749 SGD plumes in 200 km of coastline, with nearly half of the SGD plumes smaller than 0.1 ha in size. Only 9% of SGD plumes were ≥1 ha in size, and just 1% were larger than 10 ha. Our findings indicate that small SGD is omnipresent in the nearshore environment. Furthermore, we found that the infrequent, large SGD plumes (>10 ha) displayed the weakest delta-SST values, suggesting that large discharge plumes are not likely to provide cooling refugia to warming coral reefs. Our operational approach can be applied frequently over time to generate SGD information relative to terrestrial substrate, topography, and pollutants. This operational approach will yield new insights into the role that land-to-reef interactions have on the composition and condition of coral reefs along coastlines. Full article
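The delta-SST metric in the abstract, the mean SST inside a mapped plume minus the mean SST just outside it, can be sketched as follows. The boolean plume/reference-ring masks and the use of NaN-aware means (e.g., to skip land pixels) are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def delta_sst(sst, plume_mask, ring_mask):
    """Mean SST inside an SGD plume minus mean SST in a reference area outside it.

    sst        : 2D array of sea surface temperatures (degC); NaN for land/invalid
    plume_mask : boolean array marking pixels inside the mapped plume
    ring_mask  : boolean array marking reference pixels outside the plume
    """
    sst = np.asarray(sst, dtype=float)
    inside = np.nanmean(sst[plume_mask])
    outside = np.nanmean(sst[ring_mask])
    return inside - outside
```

A negative value, as reported for most plumes in the study, indicates cooler groundwater discharge relative to the surrounding sea surface.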
(This article belongs to the Topic Conservation and Management of Marine Ecosystems)
Show Figures

Figure 1

<p>Map of the west Hawai‘i Island study area. The eight flight coverage regions along the mapped coastline are shown in different colors. Inset shows the location of Hawai‘i Island west of the continental United States. The red box in the upper right corner indicates the global location of Hawai‘i Island. White arrow indicates true north.</p>
Full article ">Figure 2
<p>Example image mosaics of sea surface temperature (<b>a</b>) before and (<b>b</b>) after flight line temperature offset computations for improved blending of flight line temperatures. Background imagery provided by Google Earth™. White arrow indicates true north.</p>
Full article ">Figure 3
<p>Demonstration of the standardizing effect of conversion from raw surface temperature maps (<b>a</b>) to relative temperature maps (<b>b</b>). The latter were used to visually outline SGD plumes across the state. Background imagery provided by Google Earth™. White arrow indicates true north.</p>
Full article ">Figure 4
<p>Distribution of field validation sites (<span class="html-italic">n</span> = 52) at (<b>a</b>) region, (<b>b</b>) cluster, and (<b>c</b>) SGD plume scales. Panel (<b>c</b>) also shows seven of the mapped SGD plumes in red, with boat-based transects shown as sequential dots. Values in panel (<b>a</b>) indicate number of SGD plumes in each sub-region.</p>
Full article ">Figure 5
<p>Spatial density of submarine groundwater discharge (SGD) sites along the west Hawai‘i Island coastline. Refer to <a href="#oceans-05-00031-f001" class="html-fig">Figure 1</a> for geographic context. Because very large SGD plumes are rare, the density maps were calculated from the number of discharge sites in two size classes (≤1 ha and &gt;1 ha) per km of coastline.</p>
Full article ">Figure 6
<p>Frequency distribution of submarine groundwater discharge (SGD) plumes along the west Hawai‘i coastline.</p>
Full article ">Figure 7
<p>Frequency distribution of remotely sensed delta-SST in each submarine groundwater discharge (SGD) plume. Delta-SST was calculated as the difference in sea surface temperature inside relative to outside of each mapped SGD plume (see Methods).</p>
Full article ">Figure 8
<p>(<b>a</b>) Example boat transect into submarine groundwater discharge (SGD) plume (in red) at Honokōhau Harbor. (<b>b</b>) Transect results corresponding to panel (<b>a</b>). Dots in both panels indicate each sea surface temperature (SST) measurement.</p>
Full article ">Figure 9
<p>Relationship between boat-based estimates of delta-SST inside and outside of each SGD plume and those derived from airborne thermal imaging (n = 47). Open circles indicate each SGD site and dashed line indicates quadratic fit line.</p>
Full article ">Figure 10
<p>Relationship between remotely sensed submarine groundwater discharge (SGD) plume area and delta-SST values. Open circles indicate each SGD site.</p>
Full article ">
21 pages, 6928 KiB  
Article
Quality Assessment of Operational Sea Surface Temperature Product from FY-4B/AGRI with In Situ and OSTIA Data
by Quanjun He, Peng Cui and Yanwei Chen
Remote Sens. 2024, 16(15), 2769; https://doi.org/10.3390/rs16152769 - 29 Jul 2024
Cited by 1 | Viewed by 781
Abstract
The Fengyun-4B (FY-4B) satellite is currently the primary operational geostationary meteorological satellite in China, replacing the previous FY-4A satellite. The advanced geostationary radiation imager (AGRI) aboard the FY-4B satellite provides an operational sea surface temperature (SST) product with a high observation frequency of [...] Read more.
The Fengyun-4B (FY-4B) satellite is currently the primary operational geostationary meteorological satellite in China, replacing the previous FY-4A satellite. The advanced geostationary radiation imager (AGRI) aboard the FY-4B satellite provides an operational sea surface temperature (SST) product with a high observation frequency of 15 min. This paper conducts the first data quality assessment of operational SST products from the FY-4B/AGRI using quality-controlled measured SSTs from the in situ SST quality monitor dataset and foundation SSTs produced by the operational sea surface temperature and sea ice analysis (OSTIA) system from July 2023 to January 2024. The FY-4B/AGRI SST product provides a data quality level flag on a pixel-by-pixel basis. Accuracy evaluations are conducted on the FY-4B/AGRI SST product with different data quality levels. The results indicate that the FY-4B/AGRI operational SST generally has a negative mean bias compared to in situ SST and OSTIA SST, and that the accuracy of the FY-4B/AGRI SST, with an excellent quality level, can meet the needs of practical applications. The FY-4B/AGRI SST with an excellent quality level demonstrates a strong correlation with in situ SST and OSTIA SST, with a correlation coefficient R exceeding 0.99. Compared with in situ SST, the bias, root mean square error (RMSE), and unbiased RMSE (ubRMSE) of the FY-4B/AGRI SST with an excellent quality level are −0.19, 0.66, and 0.63 °C in daytime, and −0.15, 0.70, and 0.68 °C at night, respectively. Compared with OSTIA SST, the bias, RMSE, and ubRMSE of the FY-4B/AGRI SST with an excellent data quality level are −0.10, 0.64, and 0.63 °C in daytime, and −0.13, 0.68, and 0.67 °C at night. The FY-4B/AGRI SST tends to underestimate the sea water temperature in mid–low-latitude regions, while it tends to overestimate sea water temperature in high-latitude regions and near the edges of the full disk. 
The time-varying validation of FY-4B/AGRI SST accuracy shows weak fluctuations with a period of 3–4 months. Hourly accuracy verification shows that the difference between the FY-4B/AGRI SST and OSTIA SST reflects a diurnal effect. However, FY-4B/AGRI SST products need to be used with caution around midnight to avoid an abnormal accuracy. This paper also discusses the relationships between the FY-4B/AGRI SST and satellite zenith angle, water vapor content, wind speed, and in situ SST, which have an undeniable impact on the underestimation of the FY-4B/AGRI operational SST. The accuracy of the FY-4B/AGRI operational SST retrieval algorithm still needs to be further improved in the future. Full article
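The validation statistics quoted throughout the abstract (bias, RMSE, and unbiased RMSE of satellite-minus-reference SST differences) follow standard definitions; a minimal sketch, where ubRMSE is simply the standard deviation of the differences:

```python
import numpy as np

def sst_stats(sat_sst, ref_sst):
    """Bias, RMSE, and unbiased RMSE of satellite-minus-reference SST (degC).

    sat_sst : satellite-retrieved SSTs (e.g., FY-4B/AGRI)
    ref_sst : collocated reference SSTs (e.g., in situ or OSTIA)
    """
    d = np.asarray(sat_sst, dtype=float) - np.asarray(ref_sst, dtype=float)
    bias = d.mean()
    rmse = np.sqrt(np.mean(d**2))
    # Removing the mean bias leaves the standard deviation of the differences.
    ubrmse = np.sqrt(np.mean((d - bias)**2))
    return bias, rmse, ubrmse
```

Reporting ubRMSE alongside RMSE, as the paper does, separates random scatter from any systematic offset such as the negative mean bias found for FY-4B/AGRI.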
Show Figures

Figure 1

<p>Flowchart of the research methods.</p>
Full article ">Figure 2
<p>Density scatter plots between in situ SST and FY-4B/AGRI SST for daytime (<b>top</b>) and nighttime (<b>bottom</b>). DQ represents the FY-4B/AGRI SST data quality level (0, 1, or 2); the gray dashed line is the diagonal.</p>
Full article ">Figure 3
<p>Histograms of FY-4B/AGRI SST minus in situ SST for daytime (<b>top</b>) and nighttime (<b>bottom</b>). Red dashed lines show the Gaussian distributions.</p>
Full article ">Figure 4
<p>Density scatter plots between OSTIA SST and FY-4B/AGRI SST for daytime (<b>top</b>) and nighttime (<b>bottom</b>).</p>
Full article ">Figure 5
<p>Histograms of FY-4B/AGRI SST minus OSTIA SST for daytime (<b>top</b>) and nighttime (<b>bottom</b>). Red dashed lines show the Gaussian distributions.</p>
Full article ">Figure 6
<p>Target diagram for bias and ubRMSE of FY-4B/AGRI SST with different quality levels: (<b>a</b>) comparison to in situ SST, and (<b>b</b>) comparison to OSTIA SST. Where the red edge represents in situ SST, the blue edge represents OSTIA SST, the white fill color represents daytime, and the black fill color represents nighttime.</p>
Full article ">Figure 7
<p>Map of bias, RMSE, R, and sample number N between FY-4B/AGRI SST and in situ SST with 2° × 2° grid for daytime (<b>top</b>) and nighttime (<b>bottom</b>).</p>
Full article ">Figure 8
<p>Monthly variation of FY-4B/AGRI SST minus in situ SST from July 2023 to January 2024. The blue solid line and the green dashed line inside the box are the median and mean value, respectively.</p>
Full article ">Figure 9
<p>Hourly variations of error: (<b>a</b>) FY-4B/AGRI SST minus in situ SST; (<b>b</b>) FY-4B/AGRI SST minus OSTIA SST; and (<b>c</b>) in situ SST minus OSTIA SST. Color represents the number of samples, and the blue solid line and green dot inside the box are the median and mean values, respectively.</p>
Full article ">Figure 10
<p>Relationship between satellite zenith and FY-4B/AGRI SST minus in situ SST. Colorful density scatter plot shows the number of samples in each 1° bin, the black error bar represents the bias and SD in each 5° bin.</p>
Full article ">Figure 11
<p>Relationship between water vapor and FY-4B/AGRI SST minus in situ SST. Colorful scatter plots show the sample number with a 1 kg/m<sup>2</sup> bin, the black error bar represents the bias, and the SD with the 5 kg/m<sup>2</sup> bin.</p>
Full article ">Figure 12
<p>Relationship between wind speed and FY-4B/AGRI SST minus in situ SST. Colorful scatter plots show the sample number with a 1 m/s bin, the black error bar represents the bias, and the SD with a 1 m/s bin.</p>
Full article ">Figure 13
<p>Relationship between in situ SST and FY-4B/AGRI SST minus in situ SST. Colorful density scatter plot shows the sample number with a 1 °C bin, the black error bar represents the bias, and the SD with a 1 °C bin.</p>
Full article ">
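The per-cell maps and binned error bars described in the captions above rest on the same aggregation step: sorting matchup differences into fixed-width geographic bins and computing a per-bin mean. A minimal sketch of such 2° × 2° gridding (illustrative names, not the paper's code):

```python
import numpy as np

def gridded_bias(lon, lat, diff, cell=2.0):
    """Bin satellite-minus-reference SST differences onto a global
    cell x cell degree grid; return per-cell mean bias and sample count."""
    lon = np.asarray(lon, dtype=float)
    lat = np.asarray(lat, dtype=float)
    diff = np.asarray(diff, dtype=float)
    ncols = int(round(360 / cell))
    nrows = int(round(180 / cell))
    # Map longitude [-180, 180) and latitude [-90, 90) to grid indices
    ix = np.clip(np.floor((lon + 180.0) / cell).astype(int), 0, ncols - 1)
    iy = np.clip(np.floor((lat + 90.0) / cell).astype(int), 0, nrows - 1)
    total = np.zeros((nrows, ncols))
    count = np.zeros((nrows, ncols), dtype=int)
    np.add.at(total, (iy, ix), diff)   # unbuffered accumulation per cell
    np.add.at(count, (iy, ix), 1)      # matchup count per cell
    bias = np.where(count > 0, total / np.maximum(count, 1), np.nan)
    return bias, count

# Three synthetic matchups: the first two fall in the same 2-degree cell
b, n = gridded_bias([120.5, 121.0, 10.0], [30.1, 30.9, -5.0], [-0.2, -0.4, 0.1])
```

`np.add.at` is used instead of plain fancy-indexed addition so that repeated matchups in one cell all accumulate; cells with no matchups are left as NaN.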