Search Results (53)

Search Parameters:
Keywords = bio-radar

26 pages, 39396 KiB  
Article
Using a Neural Network to Model the Incidence Angle Dependency of Backscatter to Produce Seamless, Analysis-Ready Backscatter Composites over Land
by Claudio Navacchi, Felix Reuß and Wolfgang Wagner
Remote Sens. 2025, 17(3), 361; https://doi.org/10.3390/rs17030361 - 22 Jan 2025
Viewed by 656
Abstract
In order to improve the current standard of analysis-ready Synthetic Aperture Radar (SAR) backscatter data, we introduce a machine-learning-based approach to estimate the slope of the backscatter–incidence angle relationship from several backscatter statistics. The method requires information from radiometric terrain-corrected gamma nought time series and overcomes the constraints of limited orbital coverage, as exemplified with the Sentinel-1 constellation. The derived slope estimates contain valuable information on the scattering characteristics of different land cover types, allowing for the correction of strong forward-scattering effects over water bodies and wetlands, as well as moderate surface-scattering effects over bare soil and sparsely vegetated areas. Comparison of the estimated and computed slope values in areas with adequate orbital coverage shows good overall agreement, with an average RMSE of 0.1 dB/° and an MAE of 0.05 dB/°. The discrepancy between RMSE and MAE indicates outliers in the computed slope, attributed to speckle and backscatter fluctuations over time; the estimated slope, in contrast, is spatially smooth. After normalising backscatter values to a common reference incidence angle, orbital artefacts are significantly reduced, with differences of up to 5 dB when aggregating the normalised measurements over certain time periods to create spatially seamless radar backscatter composites. Unaffected by systematic differences in the illumination and physical properties of the terrain, these composites constitute a valuable foundation for land cover and land use mapping, as well as bio-geophysical parameter retrieval.
(This article belongs to the Special Issue Calibration and Validation of SAR Data and Derived Products)
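The normalisation step described in the abstract, shifting gamma nought to a 38° reference incidence angle using a per-pixel slope estimate, reduces in its linear form to a one-line correction. The sketch below illustrates that linear model only; the function name and numeric values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def normalise_backscatter(gamma0_db, inc_angle_deg, slope_db_per_deg,
                          ref_angle_deg=38.0):
    """Shift gamma nought (dB) to a reference incidence angle using a
    per-pixel linear slope estimate (dB/degree)."""
    return gamma0_db - slope_db_per_deg * (inc_angle_deg - ref_angle_deg)

# A pixel observed at 42 degrees with an estimated slope of -0.2 dB/degree:
print(normalise_backscatter(-11.0, 42.0, -0.2))  # -> -10.2
```

At the reference angle itself the correction vanishes, so observations from different orbits agree after normalisation up to residual noise.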
Figure 1: Number of relative orbits derived from Wagner et al. [41] (top) and ESA WorldCover 2021 [40] (bottom). Regions of interest appear as black markers, scaled by average terrain slope in the lower panel; the crosses in the upper panel are kept constant so as not to overlay the orbital patterns.
Figure 2: RTC gamma nought (γ⁰_T) pre-processing workflow implemented in wizsard; grey boxes are processing steps, with input layers in the upper corner and output layers in the lower corner (details in [23]).
Figure 3: Orbit-direction difference of normalised γ⁰_T(38°) (top left) and of the slope β_VV computed from VV-polarised γ⁰_T (bottom left), with optical imagery for context (second column).
Figure 4: Slope distributions for β_VV (top left) and β_VH (top right), their difference (bottom left), and the ascending-minus-descending difference β_VV,A − β_VV,D (bottom right) for the land cover classes tree cover, cropland, built-up, water bodies, and wetland.
Figure 5: Two-dimensional distributions of terrain slope versus computed slope β for VV (top) and VH (bottom) polarisation.
Figure 6: Input parameters for estimating β per polarisation: temporal mean γ̄⁰_T (VV, VH, CR) and percentile difference γ̃⁰_T (VV, VH, CR), acquired from a descending orbit around the village of Marghita, Romania; optical imagery and ESA WorldCover 2021 [40] provide context.
Figure 7: Workflow summarising all key components of the study; it is executed separately for each polarisation and orbit direction.
Figure 8: Two-dimensional distributions of the reference slope β_r against each input parameter (γ̄⁰_T and γ̃⁰_T per polarisation and CR) and against the estimated slope β_e; contours show the 50% kernel-density level for tree cover, water bodies, wetland, and shrubland.
Figure 9: Reference slope β_r (top row) and estimated slope β_e (bottom row) for VV (left) and VH (right) polarisation over the same region as Figure 6; note the varying colour-bar scales.
Figure 10: Non-normalised (γ⁰_T) versus normalised (γ⁰_T(38°)) backscatter distributions per polarisation for selected land cover types and regions of interest, named after the Equi7Grid tiling scheme [64].
Figure 11: Radar backscatter composites from non-normalised (γ⁰_T) and normalised (γ⁰_T(38°)) backscatter for VV and VH polarisation over southern Australia, Iraq, eastern China, northern Canada, and southern Chile, with insets showing observation counts and locations.
Figure 12: Non-normalised, LCA-weighted non-normalised, normalised, and LCA-weighted normalised twelve-day composites (15–27 October 2020) of VV (top) and VH (bottom) backscatter over Lake Näsijärvi, Finland, with insets for location and observation counts.
17 pages, 8908 KiB  
Article
Detection of Random Body Movements Using Clustering-Based Methods in Bioradar Systems
by André Rouco, Filipe Silva, Beatriz Soares, Daniel Albuquerque, Carolina Gouveia, Susana Brás and Pedro Pinho
Information 2024, 15(10), 584; https://doi.org/10.3390/info15100584 - 25 Sep 2024
Viewed by 797
Abstract
Bioradar systems refer, in general, to radar systems used for the detection of vital signs. They hold significant importance across various sectors, particularly healthcare and surveillance, because they provide contactless solutions for monitoring physiological functions. In these applications, the primary challenge lies in random body movements (BMs), which can significantly hinder the accurate detection of vital signs. To compensate for the affected signal in a timely manner, portions of BM must be correctly identified. To address this challenge, this work proposes a solution based on the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm to detect the occurrence of BM in radar signals. The main idea is to cluster the radar samples so as to differentiate between segments in which the subject is stable and segments in which the subject is moving. Using a dataset involving eight subjects, the proposed method successfully detects three types of body movements: chest movement, body rotation, and arm movement. The results are promising, with F1 scores of 0.83, 0.73, and 0.80, respectively, for the detection of each movement type.
(This article belongs to the Special Issue Signal Processing in Radio Systems)
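The clustering idea in the abstract can be sketched with scikit-learn's DBSCAN applied to I/Q samples in the complex plane: a stable subject traces a tight arc, while random body movement scatters samples widely and gets labelled as noise. The signal model, eps, and min_samples below are illustrative toy values, not the authors' data or parameters.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)

# Simulated complex radar samples (illustrative only): a tight arc in
# the I/Q plane while the subject is stable, plus widely scattered
# samples while the subject performs a random body movement.
stable = 0.9 * np.exp(1j * rng.normal(0.5, 0.05, 200))
moving = rng.normal(0.0, 1.0, 50) + 1j * rng.normal(0.0, 1.0, 50)
iq = np.concatenate([stable, moving])
X = np.column_stack([iq.real, iq.imag])

# Dense "stable" samples form a cluster; sparse "moving" samples are
# labelled -1 (noise) and flagged as body-movement candidates.
labels = DBSCAN(eps=0.1, min_samples=10).fit_predict(X)
bm_flag = labels == -1
print(int(bm_flag[:200].sum()), int(bm_flag[200:].sum()))
```

In a real pipeline the flagged segments would then be excluded or compensated before vital-sign extraction.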
Graphical abstract
Figure 1: Illustration of the movements included in the protocol.
Figure 2: System setup and experiment environment.
Figure 3: Raw signal acquired by the bio-radar.
Figure 4: Parameter optimisation methods.
Figure 5: Results of the K-means implementation.
Figure 6: Elbow method results.
Figure 7: Results of the DBSCAN implementation.
Figure 8: Radar signal in the complex domain with and without random motion.
Figure 9: Block diagram of the algorithm.
Figure 10: DBSCAN example for MinPts = 3.
Figure 11: Optimal ε for Subject 7 with MinPts = 13.
Figure 12: DBSCAN results in the time domain.
Figure 13: Polar-plot DBSCAN results.
Figure 14: Variation of the average F1 score with MinOnesPercentage.
Figure 15: Polar-plot results after noise segmentation.
Figure 16: Time-domain results after noise segmentation.
16 pages, 1457 KiB  
Article
A Deep Learning Method for Human Sleeping Pose Estimation with Millimeter Wave Radar
by Zisheng Li, Ken Chen and Yaoqin Xie
Sensors 2024, 24(18), 5900; https://doi.org/10.3390/s24185900 - 11 Sep 2024
Viewed by 1298
Abstract
Recognizing sleep posture is crucial for monitoring people with sleeping disorders. Existing contact-based systems might interfere with sleep, while camera-based systems may raise privacy concerns. In contrast, radar-based sensors offer a promising solution with high penetration ability and the capability to detect vital bio-signals. This study proposes a deep learning method for human sleep pose recognition from signals acquired by a single-antenna Frequency-Modulated Continuous Wave (FMCW) radar device. To capture both frequency and sequential features, we introduce ResTCN, an effective architecture combining residual blocks and a Temporal Convolution Network (TCN), to recognize different sleeping postures from augmented statistical motion features of the radar time series. We rigorously evaluated our method on an experimentally acquired dataset containing sleeping radar sequences from 16 volunteers and report an average classification accuracy of 82.74%, which outperforms state-of-the-art methods.
(This article belongs to the Section Radar Sensors)
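As a rough illustration of the first step in an FMCW pipeline like this one, a range FFT over each chirp's fast-time samples concentrates a target's energy in its range bin, from which motion-feature images can then be built. All parameters below are toy values, not the paper's radar configuration.

```python
import numpy as np

# Toy FMCW parameters -- not the paper's radar configuration.
n_samples, n_chirps, target_bin = 64, 32, 10

rng = np.random.default_rng(1)
t = np.arange(n_samples)
# Beat signal of each chirp: a complex sinusoid whose frequency encodes
# the target's range; per-chirp phase jitter stands in for chest motion.
chirps = np.stack([
    np.exp(1j * (2 * np.pi * target_bin * t / n_samples + rng.normal(0, 0.1)))
    for _ in range(n_chirps)
])

# Range FFT along fast time: energy concentrates in the target's range bin.
profile = np.abs(np.fft.fft(chirps, axis=1)).mean(axis=0)
print(int(profile.argmax()))  # -> 10
```

Stacking such per-frame range profiles over slow time yields the kind of time-series feature maps a classifier like ResTCN consumes.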
Figure 1: Experimental bedroom environment with an FMCW radar device mounted on the wall.
Figure 2: The four postures: (a) supine, (b) left-side lying, (c) right-side lying, and (d) prone.
Figure 3: Typical signals of the four postures: chirp time sequences, range-FFT images (1 frame, 32 chirps), and statistical motion feature images (6 s).
Figure 4: Typical augmented samples: original feature image, range shift (10-pixel offset), time shift (1 s offset), and a mix-up of two feature images (λ = 0.5).
Figure 5: Architecture of ResTCN.
Figure 6: Confusion matrices of the proposed method and state-of-the-art baselines: (a) SVM, (b) ShuffleNet, (c) DenseNet, (d) ViT, (e) Swin Transformer V2, (f) ResTCN (proposed).
22 pages, 5916 KiB  
Article
Penetrating Barriers: Noncontact Measurement of Vital Bio Signs Using Radio Frequency Technology
by Kobi Aflalo and Zeev Zalevsky
Sensors 2024, 24(17), 5784; https://doi.org/10.3390/s24175784 - 5 Sep 2024
Viewed by 1927
Abstract
The noninvasive measurement and sensing of vital bio signs, such as respiration and cardiopulmonary parameters, has become an essential part of evaluating a patient's physiological condition, and the demand for technologies that enable remote, noninvasive measurements continues to grow. While previous research has made strides in the continuous monitoring of vital bio signs using lasers, this paper introduces a novel technique for remote noncontact measurements based on radio frequencies. Unlike laser-based methods, this approach can penetrate walls and tissues, enabling the measurement of respiration and heart rate. Diverging from traditional radar systems, our method introduces a unique sensing concept that detects micro-movements in all directions, including those parallel to the antenna surface. The main goal of this work is to present a novel, simple, and cost-effective measurement tool capable of indicating changes in a subject's condition. By leveraging the unique properties of radio frequencies, the technique allows noninvasive monitoring of vital bio signs without physical contact or invasive procedures. Moreover, the ability to penetrate barriers such as walls and tissues opens new possibilities for remote monitoring in various settings, including home healthcare, hospital environments, and search and rescue operations. To validate the technique, a series of experiments was conducted using a prototype device. The results demonstrated the feasibility of accurately measuring respiration patterns and heart rate remotely, showcasing the potential for real-time monitoring of a patient's physiological parameters. Furthermore, the simplicity and low cost of the proposed measurement tool make it accessible to a wide range of users, including healthcare professionals, caregivers, and individuals seeking to monitor their own health.
(This article belongs to the Section Radar Sensors)
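As a simple illustration of how a respiration rate can be read from a demodulated RF signal (a spectral peak search, not the authors' wavelet-based pipeline), the sketch below finds the dominant frequency in the physiological respiration band. The sample rate, breathing rate, and noise level are assumed toy values.

```python
import numpy as np

fs = 100.0                        # assumed sample rate in Hz
t = np.arange(0, 60, 1 / fs)      # one minute of demodulated signal
resp_hz = 0.25                    # simulated rate: 15 breaths per minute

rng = np.random.default_rng(2)
# Toy received signal: slow chest-motion component plus measurement noise.
signal = np.sin(2 * np.pi * resp_hz * t) + 0.3 * rng.normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
# Search only the physiological respiration band (0.1-0.5 Hz).
band = (freqs >= 0.1) & (freqs <= 0.5)
rate_bpm = 60.0 * freqs[band][spectrum[band].argmax()]
print(rate_bpm)  # -> 15.0
```

Heart rate can be extracted the same way from a higher band (roughly 0.8 to 3 Hz), provided the respiration harmonics are first suppressed.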
Figure 1: Real (a) and imaginary (b) parts of the complex dielectric constant and conductivity (c) versus microwave frequency for 23 tissues in the abdominal and upper-chest regions, with 2.4 GHz marked by a black vertical line.
Figure 2: Spectrogram (a) and its cross-section (b) at the point of interest; the windowed Fourier transform has limited resolution at low frequencies, with several frequency components visible as horizontal lines.
Figure 3: Scalogram (a) of the time-dependent frequency components of the ECG signal's pulses, with a marked point examined in the cross-section (b).
Figure 4: RF trajectory and subject position, with both antennas aimed at the same point; a piezoelectric sensor on the torso provides a chest-movement reference.
Figure 5: Separation of close targets depends on beam width and distance from the antenna: (a) the stronger target obscures the slower, weaker one; (b) a narrow beam aimed at the slower target resolves it.
Figure 6: Linearly polarised rectangular-aperture antenna with its horizontal (H) and vertical (E) planes (a) and the beam pattern in each plane (b).
Figure 7: Respiration experiment setup: antennas aimed at the subject's chest in an uncontrolled environment (a) and in an anechoic chamber (b); a piezoelectric crystal on the chest (c) serves as an additional analog reference; full system connections in (d).
Figure 8: Respiration over time as recorded by two antennas (a) and the scalogram of regular respiration (b), showing low-frequency variation that follows the respiration rate.
Figure 9: RF respiratory signal against the piezoelectric reference (a) and a histogram of their differences (b); green circles mark points where the sensor's output depends strongly on torso placement and pressure.
Figure 10: Heart-rate detection with wavelet denoising (a) and the scalogram of measured data (b), showing a constant 60 bpm after high-frequency background noise is filtered out.
Figure 11: RF signals reflected off a speaker membrane and captured by two receiving antennas (a); the frequency response (b) keeps the SNR above zero up to about 850 Hz, beyond which the signal approaches the noise level.
Figure 12: Variation in SNR with source power level for several antenna-to-subject distances.
Figure 13: Pendulum motion over time, perpendicular (a) and parallel (b) to the antenna plane, from initial time t0 to a later time t2.
Figure 14: Experimental setup in a controlled anechoic chamber (a) and an uncontrolled office environment with surrounding electronics (b).
Figure 15: Scalograms of pendulum motion perpendicular (a) and parallel (b) to the antenna plane, showing damping as the frequency components decrease over time.
Figure 16: Participants in parallel and vertical orientations relative to the antenna (a) and their breathing patterns as amplitude versus time (b), highlighting differences in chest movement.
17 pages, 1703 KiB  
Article
Reduction in Chemical Fertilizer Rates by Applying Bio-Organic Fertilizer for Optimization Yield and Quality of Hemerocallis citrina Baroni
by Songhai Wu, Zhou Li, Yanfei Yang, Jin Sun, Dongmei Lian, Zhengfeng Lai and Jianji Hong
Agronomy 2024, 14(8), 1627; https://doi.org/10.3390/agronomy14081627 - 25 Jul 2024
Viewed by 1155
Abstract
In this study, we investigated whether reducing the amount of chemical fertilizer by combining it with organic fertilizer in Hemerocallis citrina Baroni (H. citrina) cultivation could improve plant growth and photosynthetic capacity and, consequently, increase yield and quality. A continuous two-year field experiment was conducted at a research farm in Zhangzhou City, China, during 2021–2022, testing six fertilization levels with two locally grown H. citrina cultivars, "Taidong 6" and "Shibage". The results showed that 100% of the recommended dose of chemical fertilizer (RDF) combined with bio-organic fertilizer promoted both vegetative and reproductive growth better than RDF alone. However, reducing the chemical fertilizer rate, especially by more than 40%, significantly reduced certain agronomic traits such as plant width, leaf width, and scape length. Compared to RDF alone, 100% or 80% RDF combined with bio-organic fertilizer significantly increased chlorophyll content, net photosynthetic rate, transpiration rate, and yield, while excessive reductions in the chemical fertilizer rate produced the opposite trend. The co-application of chemical and bio-organic fertilizer increased soluble sugar content and lowered total acidity, whereas excessive chemical fertilizer reduction decreased vitamin C, total flavonoid, and soluble protein levels. A comprehensive radar-chart assessment of yield and quality indicates that bio-organic fertilizer with 80% RDF could be a better field fertilization regime for H. citrina cultivation.
(This article belongs to the Section Agroecology Innovation: Achieving System Resilience)
Figure 1: Mean monthly precipitation and temperature at the experimental site from 2021 to 2022.
Figure 2: Effects of chemical fertilizer reductions combined with bio-organic fertilizer on leaf SPAD value of H. citrina. T0–T5 denote the treatments (see Table 1); vertical bars show standard deviations, and within each cultivar, bars marked with different letters differ significantly at p < 0.05 (LSD test).
Figure 3: Effects of the treatments on leaf photosynthetic characteristics at the full-flowering stage (Pn, net photosynthetic rate; Tr, transpiration rate; WUE, water use efficiency); treatments and statistics as in Figure 2.
Figure 4: Pearson correlation coefficients between growth and yield parameters (PH, plant height; PW, plant width; LL, leaf length; LW, leaf width; SL, scape length; SD, scape diameter; FBL, floral bud length; FBD, floral bud diameter; SPAD, chlorophyll content; SN, scape number; FBN, floral bud number; FBW, floral bud weight; Y, yield); ** and * denote p < 0.01 and p < 0.05, respectively.
Full article ">Figure 5
<p>Radar chart showing the overall performance of <span class="html-italic">H. citrina</span> yield and quality under different chemical fertilizer reductions combined with bio-organic fertilizer. All variables were normalized using division by their maximum value. SN: scape number; FBN: floral bud number per scape; FBW: floral bud weight; Y: yield of fresh floral buds.</p>
Full article ">
16 pages, 3250 KiB  
Article
Iterative Adaptive Based Multi-Polarimetric SAR Tomography of the Forested Areas
by Shuang Jin, Hui Bi, Qian Guo, Jingjing Zhang and Wen Hong
Remote Sens. 2024, 16(9), 1605; https://doi.org/10.3390/rs16091605 - 30 Apr 2024
Cited by 2 | Viewed by 1489
Abstract
Synthetic aperture radar tomography (TomoSAR) is an extension of synthetic aperture radar (SAR) imaging. It introduces the synthetic aperture principle into the elevation direction to achieve three-dimensional (3-D) reconstruction of the observed target. Compressive sensing (CS) is a favorable technology for sparse elevation recovery. However, for the non-sparse elevation distribution of forested areas, if CS is selected for reconstruction, some orthogonal basis must first be used to represent the elevation reflectivity sparsely. The iterative adaptive approach (IAA) is a non-parametric algorithm that enables super-resolution reconstruction with minimal snapshots, eliminates the need for hyperparameter optimization, and requires fewer iterations. This paper introduces IAA to tomographic inversion of forested areas and proposes a novel multi-polarimetric-channel joint 3-D imaging method. The proposed method relies on the consistent support of the elevation distribution across different polarimetric channels and uses the L2-norm to constrain the IAA-based 3-D reconstruction of each polarimetric channel. Compared with typical spectral estimation (SE)-based algorithms, the proposed method suppresses the elevation sidelobes and ambiguity and, hence, improves the quality of the recovered 3-D image. Compared with the wavelet-based CS algorithm, it reduces computational cost and avoids the influence of orthogonal basis selection. In addition, in comparison to the IAA, it demonstrates greater accuracy in identifying the support of the elevation distribution in forested areas. Experimental results based on BioSAR 2008 data are used to validate the proposed method. Full article
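The core IAA recursion is compact. The sketch below is a single-snapshot, single-channel illustration in NumPy with an assumed DFT-like elevation steering matrix; it is not the authors' multi-polarimetric, L2-norm-constrained implementation, only a minimal rendering of the underlying IAA power-spectrum update:

```python
import numpy as np

def iaa_spectrum(y, A, n_iter=10):
    """Iterative Adaptive Approach (IAA): non-parametric estimate of the
    reflectivity power along elevation from one snapshot y (N baselines)
    and a steering matrix A (N x K elevation bins)."""
    N, K = A.shape
    # initialise with matched-filter (beamforming) powers
    p = np.abs(A.conj().T @ y) ** 2 / N ** 2
    for _ in range(n_iter):
        # model covariance R = A diag(p) A^H, lightly regularised for inversion
        R = (A * p) @ A.conj().T
        Rinv = np.linalg.inv(R + 1e-9 * np.eye(N))
        for k in range(K):
            ak = A[:, k]
            # MVDR-style amplitude estimate for bin k, then its power
            amp = (ak.conj() @ Rinv @ y) / (ak.conj() @ Rinv @ ak)
            p[k] = np.abs(amp) ** 2
    return p
```

In the paper's setting, one such elevation spectrum per polarimetric channel would then be coupled through the channels' shared elevation support.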
(This article belongs to the Special Issue Advances in Synthetic Aperture Radar Data Processing and Application)
Figure 1
<p>TomoSAR imaging geometry.</p>
Full article ">Figure 2
<p>Scattering of a forested area. (<b>a</b>) Scattering mechanism; (<b>b</b>) scattering distribution.</p>
Full article ">Figure 3
<p>Schematic diagram of backscattering coefficients of HH, HV, and VV polarimetric channels.</p>
Full article ">Figure 4
<p>Elevation aperture position in the BioSAR 2008 dataset.</p>
Full article ">Figure 5
<p>Implementation process of the TomoSAR 3-D imaging of the forested areas based on the proposed method.</p>
Full article ">Figure 6
<p>Polarimetric SAR image of the surveillance area (the yellow areas numbered 1 and 2 represent the two slices selected for the experiment).</p>
Full article ">Figure 7
<p>Amplitude and phase results after data preprocessing for the (<b>a</b>) HH, (<b>b</b>) HV, and (<b>c</b>) VV polarimetric channels.</p>
Full article ">Figure 8
<p>The incoherent sum of the results for all polarization channels (Slice 1). (<b>a</b>) BF. (<b>b</b>) Capon. (<b>c</b>) MUSIC. (<b>d</b>) Wavelet-based <math display="inline"><semantics> <msub> <mi>L</mi> <mn>1</mn> </msub> </semantics></math>. (<b>e</b>) IAA. (<b>f</b>) The proposed method. The white line represents the LiDAR DSM.</p>
Full article ">Figure 9
<p>The incoherent sum of the results for all polarization channels (Slice 2). (<b>a</b>) BF. (<b>b</b>) Capon. (<b>c</b>) MUSIC. (<b>d</b>) Wavelet-based <math display="inline"><semantics> <msub> <mi>L</mi> <mn>1</mn> </msub> </semantics></math>. (<b>e</b>) IAA. (<b>f</b>) The proposed method. The white line represents the LiDAR DSM.</p>
Full article ">Figure 10
<p>3-D point cloud map of the entire surveillance region reconstructed by the proposed method.</p>
Full article ">
19 pages, 3471 KiB  
Article
Remote Emotion Recognition Using Continuous-Wave Bio-Radar System
by Carolina Gouveia, Beatriz Soares, Daniel Albuquerque, Filipa Barros, Sandra C. Soares, Pedro Pinho, José Vieira and Susana Brás
Sensors 2024, 24(5), 1420; https://doi.org/10.3390/s24051420 - 22 Feb 2024
Cited by 2 | Viewed by 1800
Abstract
The Bio-Radar is herein presented as a non-contact radar system able to capture vital signs remotely, without requiring any physical contact with the subject. In this work, the ability to use the proposed system for emotion recognition is verified by comparing its performance in identifying fear, happiness and a neutral condition with that of certified measuring equipment. For this purpose, machine learning algorithms were applied to the respiratory and cardiac signals captured simultaneously by the radar and the reference contact-based system. Following a multiclass identification strategy, one could conclude that both systems present a comparable performance, where the radar might even outperform the contact-based system under specific conditions. Emotion recognition is possible using a radar system, with an accuracy equal to 99.7% and an F1-score of 99.9%. Thus, we demonstrated that it is perfectly possible to use the Bio-Radar system for this purpose; since it can be operated remotely, subjects are unaware of being monitored, which elicits more authentic reactions. Full article
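The comparison in this work rests on standard time-domain HRV quantities (SDNN, RMSSD and pNN50, as in Figure 5). A minimal sketch of how these are conventionally computed from an inter-beat-interval series follows; it illustrates the textbook definitions, not the authors' exact implementation or windowing:

```python
import numpy as np

def hrv_time_features(ibi_ms):
    """Time-domain HRV features from an inter-beat-interval series (ms)."""
    ibi = np.asarray(ibi_ms, dtype=float)
    diff = np.diff(ibi)
    sdnn = ibi.std(ddof=1)                        # overall IBI variability
    rmssd = np.sqrt(np.mean(diff ** 2))           # beat-to-beat variability
    pnn50 = 100.0 * np.mean(np.abs(diff) > 50.0)  # % successive diffs > 50 ms
    return sdnn, rmssd, pnn50
```

Feature vectors of this kind, extracted over sliding windows, are what the multiclass emotion classifiers would then consume.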
(This article belongs to the Section Radar Sensors)
Figure 1
<p>Schematics of the monitoring scenario.</p>
Full article ">Figure 2
<p>Block diagram of the digital signal-processing algorithm applied to bR and bP signals.</p>
Full article ">Figure 3
<p>Comparison of the extracted bR signal with the corresponding bP signal. (<b>a</b>) Respiratory signal; (<b>b</b>) cardiac signal.</p>
Full article ">Figure 4
<p>Illustration of the IBI computation: (<b>a</b>) through the conventional method; (<b>b</b>) through the sliding window.</p>
Full article ">Figure 5
<p>Comparison of the time domain HRV parameters computed using the conventional method on radar signal, the sliding window method and the original ECG result. (<b>a</b>) SDNN for ID01; (<b>b</b>) RMSSD for ID01; (<b>c</b>) pNN50 for ID01; (<b>d</b>) SDNN for ID06; (<b>e</b>) RMSSD for ID06; (<b>f</b>) pNN50 for ID06; (<b>g</b>) SDNN for ID10; (<b>h</b>) RMSSD for ID10; (<b>i</b>) pNN50 for ID10.</p>
Full article ">Figure 6
<p>Illustration of how HRV features are computed and assigned to each observation.</p>
Full article ">Figure 7
<p>Workflow of the statistical study for features selection.</p>
Full article ">Figure 8
<p>Correlation matrix for feature selection for bR: (<b>a</b>) after the pairwise t-test; (<b>b</b>) after removing redundant features.</p>
Full article ">
20 pages, 6006 KiB  
Article
Impact and Classification of Body Stature and Physiological Variability in the Acquisition of Vital Signs Using Continuous Wave Radar
by Beatriz Soares, Carolina Gouveia, Daniel Albuquerque and Pedro Pinho
Appl. Sci. 2024, 14(2), 921; https://doi.org/10.3390/app14020921 - 22 Jan 2024
Cited by 1 | Viewed by 1399
Abstract
The Bio-Radar system, useful for monitoring patients with infectious diseases and detecting driver drowsiness, has gained popularity in the literature. However, its efficiency across diverse populations considering physiological and body stature variations needs further exploration. This work addresses this gap by applying machine learning (ML) algorithms—Support Vector Machine (SVM), K-Nearest Neighbors (KNN), and Random Forest—to classify subjects based on gender, age, Body Mass Index (BMI), and Chest Wall Perimeter (CWP). Vital signs were collected from 92 subjects using a Continuous Wave (CW) radar operating at 5.8 GHz. The results showed that the Random Forest algorithm was the most accurate, achieving accuracies of 76.66% for gender, 71.13% for age, 72.52% for BMI, and 74.61% for CWP. This study underscores the importance of considering individual variations when using Bio-Radar, enhancing its efficiency and expanding its potential applications. Full article
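The abstract reports classification into BMI groups but does not state the class boundaries used. As a hedged illustration only, the standard WHO cut-offs below are an assumption, not the paper's; assigning a BMI label to a subject might look like:

```python
def bmi_category(weight_kg, height_m):
    """BMI label for one subject. The 18.5/25/30 boundaries are the
    common WHO cut-offs -- an assumption for illustration, not
    necessarily the grouping used in the study."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"
    if bmi < 30.0:
        return "overweight"
    return "obese"
```

Such labels, together with gender, age and CWP groups, would form the targets that the SVM, KNN and Random Forest models are trained to predict from the radar-derived vital signs.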
Figure 1
<p>Block diagram of the Bio-Radar system, adapted from [<a href="#B33-applsci-14-00921" class="html-bibr">33</a>].</p>
Full article ">Figure 2
<p>Block diagram of the Bio-Radar system including DSP module, adapted from [<a href="#B35-applsci-14-00921" class="html-bibr">35</a>].</p>
Full article ">Figure 3
<p>Sequential illustration of DC offset removal process in radar signal using the Optimized Cost Method. (<b>a</b>) Original radar signal with DC offset shown by ‘Received signal’ and the guide for optimization ‘Fictitious circumference’ with ‘Signal median’ as its center. (<b>b</b>) Identification of ‘Lowest cost point’ indicating the optimal DC offset values, with ‘Arc center’ marking the new median post-correction. (<b>c</b>) Signal post-DC offset removal, where the median aligns with the origin. (<b>d</b>) The signal after offset removal and a rotation to align for further analysis.</p>
Full article ">Figure 4
<p>Example of a demodulated signal.</p>
Full article ">Figure 5
<p>Overview of the measurement setup and the system configuration.</p>
Full article ">Figure 6
<p>Sliding window of the feature extraction.</p>
Full article ">Figure 7
<p>Normalized PSD curve of respiratory signals for a subject.</p>
Full article ">Figure 8
<p>Overview of the statistical results.</p>
Full article ">Figure 9
<p>Statistical analysis diagram.</p>
Full article ">Figure 10
<p>Correlation matrices for the gender test.</p>
Full article ">Figure 11
<p>Data splitting into train, test, and new data.</p>
Full article ">Figure 12
<p>Classification workflow, modified from [<a href="#B55-applsci-14-00921" class="html-bibr">55</a>].</p>
Full article ">
19 pages, 833 KiB  
Article
Radar-Based Invisible Biometric Authentication
by Maria Louro da Silva, Carolina Gouveia, Daniel Filipe Albuquerque and Hugo Plácido da Silva
Information 2024, 15(1), 44; https://doi.org/10.3390/info15010044 - 12 Jan 2024
Cited by 2 | Viewed by 3268
Abstract
Bio-Radar (BR) systems have shown great promise for biometric applications. Conventional methods can be forged or fooled. Even alternative methods intrinsic to the user, such as the Electrocardiogram (ECG), present drawbacks, as they require contact with the sensor. Therefore, research has turned towards alternative methods, such as the BR. In this work, a BR dataset with 20 subjects exposed to different emotion-eliciting stimuli (happiness, fearfulness, and neutrality) on different dates was explored. The spectral distributions of the BR signal were studied as the biometric template. Furthermore, this study included the analysis of respiratory and cardiac signals separately, as well as their fusion. The main test devised was authentication, where a system seeks to validate an individual's claimed identity. With this test, it was possible to infer the feasibility of this type of system, obtaining an Equal Error Rate (EER) of 3.48% if the training and testing data are from the same day and the same emotional stimuli. In addition, the dependency of the results on time and emotional state is fully analysed. Complementary tests, such as sensitivity to the number of users, were also performed. Overall, it was possible to evaluate the potential of BR systems for biometrics. Full article
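Authentication performance here is summarised by the Equal Error Rate. A minimal sketch of how an EER is typically computed from genuine and impostor similarity scores follows; the paper's actual matcher and score distributions are not given here, so this only illustrates the metric itself:

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """EER from similarity scores: sweep an acceptance threshold and
    report the point where the False Rejection Rate (FRR) and the
    False Acceptance Rate (FAR) are closest."""
    g = np.asarray(genuine, dtype=float)
    i = np.asarray(impostor, dtype=float)
    best_gap, eer = np.inf, None
    for t in np.unique(np.concatenate([g, i])):
        frr = np.mean(g < t)   # genuine attempts rejected
        far = np.mean(i >= t)  # impostor attempts accepted
        if abs(frr - far) < best_gap:
            best_gap, eer = abs(frr - far), (frr + far) / 2
    return eer
```

A ROC curve, as in Figure 11, is simply the (FAR, 1-FRR) pairs traced over the same threshold sweep.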
Figure 1
<p>Basic model of a radar system applied to the measurement of vital signs. (<b>a</b>) represents the ideal scenario, where RX is the receiving antenna and TX the transmitting one; (<b>b</b>) represents the signal for an ideal scenario in the complex plane. Adapted from [<a href="#B9-information-15-00044" class="html-bibr">9</a>].</p>
Full article ">Figure 2
<p>Effect of the surrounding objects: (<b>a</b>) reflections schematics on the environment; (<b>b</b>) equivalent projection of the received signal on the complex plane. Adapted from [<a href="#B9-information-15-00044" class="html-bibr">9</a>].</p>
Full article ">Figure 3
<p>Implemented biometric identification system’s architecture.</p>
Full article ">Figure 4
<p>Data acquisition setup used.</p>
Full article ">Figure 5
<p>Example signals of the different modalities: (<b>a</b>) Cardiac signals extracted using the BR (in yellow), and the ECG signals (in dark grey); (<b>b</b>) respiratory signals extracted with the BR (in green), and with the BIOPAC (in light grey). The “minimum-maximum” normalisation was applied to these signals to obtain an amplitude between 0 and 1, so as to provide a better comparison. All signals were acquired from Subject 1, in neutral conditions.</p>
Full article ">Figure 6
<p>Spectral profile of the different signal sources. In darker colour the mean waveform is showcased for the segmented signal: the respiratory BR signal in yellow, the cardiac BR signal in green, and the ECG signal in red; the standard deviation is the area surrounding it. The signals shown were retrieved from Subject 1, in neutral conditions. Once more, the “minimum-maximum” normalisation was applied for comparison purposes. (<b>a</b>) represents a magnification into the first 3.0 Hz of the FFT signals where most BR frequencies are represented, whereas (<b>b</b>) represents the whole spectra.</p>
Full article ">Figure 7
<p>Spectral profile of different subjects. In colour the mean waveform is showcased for the segmented signal: yellow for Subject 8, green for Subject 9, and finally red is used to represent Subject 11. These signals were retrieved, in neutral conditions, after normalising, which is explained afterwards. In these pictures, the respiratory (<b>a</b>) and cardiac (<b>b</b>) BR signals are shown for comparison.</p>
Full article ">Figure 8
<p>Spectral profile of different subjects in different conditions. In the coloured line, the mean waveform is showcased for the segmented signal: yellow for Subject 13, green for Subject 18, and finally red is used to represent Subject 20; the standard deviation is portrayed and filled with colour. These signals were retrieved from the respiratory BR signal, in (<b>a</b>) neutral, (<b>b</b>) happy and (<b>c</b>) fear conditions.</p>
Full article ">Figure 9
<p>Train and Test split done for evaluation Scenario S1.</p>
Full article ">Figure 10
<p>Diagram illustrating the steps taken: (<b>1</b>) represents the extraction of the raw data; (<b>2</b>) the feature extraction, and (<b>3</b>) the classifier used.</p>
Full article ">Figure 11
<p>ROC curve with the best results for the three scenarios: (<b>a</b>) S1, (<b>b</b>) S2, and (<b>c</b>) S3. In it, the 1-FRR and FAR for the ECG as well as the BR signals are plotted in a coloured line: yellow for the ECG signal, green for the fusion source, red, and blue for the respiratory and cardiac BR signals, respectively.</p>
Full article ">Figure 12
<p>Best (<b>a</b>) and worst (<b>b</b>) case scenarios using the fusion signal source for different numbers of subjects.</p>
Full article ">
17 pages, 11061 KiB  
Article
Synthesis and Application of Modified Lignin Polyurea Binder for Manufacturing a Controlled-Release Potassium Fertilizer
by Mingyang Li, Gaoyang E, Conghui Wang, Ruolin Shi, Junxi Wang, Shuo Wang, Yu Wang, Qi Chen, Zeli Li and Zhiguang Liu
Agronomy 2023, 13(10), 2641; https://doi.org/10.3390/agronomy13102641 - 19 Oct 2023
Cited by 1 | Viewed by 1894
Abstract
Conventional potassium chloride granules have inefficient applications in agricultural production due to particle irregularity and low fluidity. The application of controlled-release potassium chloride could increase the potassium-use efficiency and alleviate the shortage of potassium ore resources. In this study, a well-rounded potassium chloride fertilizer core was prepared, using the graft modification of polyurea to enhance the coating rate and release performance. The adhesive and tensile characteristics of the modified polyurea binder, as well as the granule properties of modified polyurea binder potassium chloride, were studied to determine the ideal lignin-grafted ratio. The effect of the modified polyurea binder with potassium chloride on the properties of coated fertilizer was investigated. The findings, shown by radar maps of the binder’s properties, demonstrated that the ideal mass ratio of the modified lignin polyurea binder to urea is 1:2. The Fourier-transform infrared spectroscopy results demonstrated that the amino functional groups of lignin were enhanced, improving the product’s interfacial compatibility with the polyurea matrix. Compared to humic acid (HA; 12%) and bentonite (Ben; 30%) treatments, the granule intensity of the 9.9%—1:2 treatment considerably increased by 139.10% and 38.86%, respectively, while the static angle of the granules reduced by 16.67% and 3.81%. The 28-day cumulative release rate of the modified polyurea (9.9%—2:1) with a 5% coating thickness was the lowest (28%), 42% lower than that of the lowest conventional treatment. In summary, the creation of a bio-lignin polyurea binder under the optimum conditions reduced the need for petrochemical-based materials, allowed the preparation of fertilizer with granules of increased fluidity, and enabled the successful coating of a high-salt potassium fertilizer, offering a novel technique for the high-value application of potash fertilizer coating. Full article
Figure 1
<p>Synthesis process of urea grafted lignin (<b>a</b>) and the FTIR spectra of lignin, urea, and urea-grafted lignin (GL1:1, 1:2, and 2:1) (<b>b</b>).</p>
Full article ">Figure 2
<p>Synthesis process of lignin-based polyurea binder (<b>a</b>) and the FTIR spectra of D2000, IPDI, polyurea, urea-grafted lignin, and urea-grafted lignin polyurea (PGL1:1, 1:2, and 2:1) (<b>b</b>).</p>
Full article ">Figure 3
<p>Characterization diagram of lignin-based polyurea binder: viscosity at 25 °C (<b>a</b>), viscosity at 120 °C (<b>b</b>), dry bond strength (<b>c</b>), tensile strength (<b>d</b>), elongation at break (<b>e</b>), and elastic modulus (<b>f</b>). PGL represents polyurea binder containing grafted lignin; PGLS represents polyurea binder containing grafted lignin and disulfide; * represents the multiplication sign.</p>
Full article ">Figure 4
<p>Comparison of granule strength of different binder fertilizers (<b>a</b>), comparison of fertilizer granule strength with different proportions of modified polyurea binder (<b>b</b>). Note: same letters for the bars indicate that means of fertilizer granule strength were not significantly different among treatments at 5% level.</p>
Full article ">Figure 5
<p>Sliding angle of granules of potassium chloride fertilizer (<b>a</b>), granule resting angle of potassium chloride fertilizer (<b>b</b>) (Sm stands for commercially available potash). Note: same letters for the boxes indicate that means of sliding angle and resting angle of granules were not significantly different among treatments at 5% level.</p>
Full article ">Figure 6
<p>Radar diagram of synthetic properties of lignin-based polyurea binders. Sliding angle CA represents the supplementary angle of the sliding angle.</p>
Full article ">Figure 7
<p>Granule size distribution of uncoated potassium chloride fertilizer (<b>a</b>) and 4% OBPU-coated potassium chloride fertilizer (<b>b</b>).</p>
Full article ">Figure 8
<p>Moisture absorption rate of potassium chloride fertilizer with different binders. Note: same letters for the bars indicate that means of moisture absorption rate were not significantly different among treatments at 5% level.</p>
Full article ">Figure 9
<p>Cumulative release rate of coated potassium chloride fertilizer with different thickness: 4% coating fluid (<b>a</b>) and 5% coating fluid (<b>b</b>).</p>
Full article ">Figure 10
<p>FTIR spectra of 4% soybean-oil-based polyurethanes (PGL1:1, 1:2, 2:1, HN, and Ben) outside (<b>a</b>) and inside (<b>b</b>) the controlled-release KCl membrane shell.</p>
Full article ">Figure 11
<p>SEM images of the membrane shell and fertilizer of (<b>a</b>) PGL2:1-4%-OBPU, (<b>b</b>) PGL1:1-4%-OBPU, (<b>c</b>) PGL1:2-4%-OBPU, (<b>d</b>) Ben-4%-OBPU, and (<b>e</b>) HN-4%-OBPU. Panels 1 (<b>a1</b>–<b>e1</b>), 2 (<b>a2</b>–<b>e2</b>), 3 (<b>a3</b>–<b>e3</b>), the membrane surface after 50 times, 100 times, and 500 times image amplification, respectively, and panels 4 (<b>a4</b>–<b>e4</b>) and 5 (<b>a5</b>–<b>e5</b>) after 100 times and 5000 times image amplification.</p>
Full article ">
20 pages, 7670 KiB  
Article
A Novel Non-Contact Detection and Identification Method for the Post-Disaster Compression State of Injured Individuals Using UWB Bio-Radar
by Ding Shi, Fulai Liang, Jiahao Qiao, Yaru Wang, Yidan Zhu, Hao Lv, Xiao Yu, Teng Jiao, Fuyuan Liao, Keding Yan, Jianqi Wang and Yang Zhang
Bioengineering 2023, 10(8), 905; https://doi.org/10.3390/bioengineering10080905 - 30 Jul 2023
Cited by 6 | Viewed by 2348
Abstract
Building collapse leads to mechanical injury, which is the main cause of injury and death, with crush syndrome as its most common complication. During the post-disaster search and rescue phase, if rescue personnel hastily remove heavy objects covering the bodies of injured individuals and fail to provide targeted medical care, ischemia-reperfusion injury may be triggered, leading to rhabdomyolysis. This may result in disseminated intravascular coagulation or acute respiratory distress syndrome, further leading to multiple organ failure, which ultimately leads to shock and death. Using bio-radar to detect vital signs and identify compression states can effectively reduce casualties during the search for missing persons behind obstacles. A time-domain ultra-wideband (UWB) bio-radar was applied for the non-contact detection of human vital sign signals behind obstacles. An echo denoising algorithm based on PSO-VMD and permutation entropy was proposed to suppress environmental noise, along with a compression-state recognition network based on radar life signals. Based on training and testing using over 3000 data sets from 10 subjects in different compression states, the proposed multiscale convolutional network achieved a 92.63% identification accuracy, outperforming SVM and 1D-CNN models by 5.30% and 6.12%, respectively, and improving the success rate and precision of post-disaster casualty rescue. Full article
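Permutation entropy, which the proposed denoising step uses alongside PSO-VMD, is straightforward to compute: it is the entropy of the distribution of ordinal patterns in a time series. The sketch below shows the standard definition; the embedding `order` and its use for ranking decomposed modes are illustrative assumptions, not the paper's exact settings:

```python
import math

def permutation_entropy(x, order=3, normalize=True):
    """Permutation entropy of a 1-D series: Shannon entropy of the
    ordinal patterns of length `order`. Low values indicate regular
    (e.g. respiration-like) structure; high values indicate noise."""
    counts = {}
    for i in range(len(x) - order + 1):
        # ordinal pattern: the ranking of the `order` consecutive samples
        pattern = tuple(sorted(range(order), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order)) if normalize else h
```

In a VMD-based scheme, a criterion of this kind can be evaluated per decomposed mode to decide which modes carry structured vital-sign content and which are dominated by noise.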
(This article belongs to the Special Issue Contactless Technologies for Human Vital Signs Monitoring)
Figure 1
<p>Structure diagram of a UWB bio-radar system and schematic diagram of non-contact cardiopulmonary activity detection.</p>
Full article ">Figure 2
<p>Original UWB radar data map: (<b>a</b>) two−dimensional color map; (<b>b</b>) three−dimensional color map.</p>
Full article ">Figure 3
<p>Preprocessed UWB radar data map: (<b>a</b>) two−dimensional color map; (<b>b</b>) three−dimensional color map; (<b>c</b>) radar echo signal at a human body’s position.</p>
Full article ">Figure 4
<p>Overall process block diagram.</p>
Full article ">Figure 5
<p>PSO diagram: the distributions of the MCC under different combinations of α and K. The optimization parameters: MEE = 19.76, α = 2062, and K = 4.</p>
Full article ">Figure 6
<p>Algorithm flow chart of PSO-VMD based on permutation entropy.</p>
Full article ">Figure 7
<p>Wavelet time-frequency diagram of the simulated signal: (<b>a</b>) Haar wavelet; (<b>b</b>) Db4 wavelet; (<b>c</b>) Sym4 wavelet; (<b>d</b>) Morlet wavelet.</p>
Full article ">Figure 8
<p>Compression state recognition network structure.</p>
Full article ">Figure 9
<p>Flow chart of the human compression state recognition process.</p>
Full article ">Figure 10
<p>Experimental scenarios for extracting contact and non-contact human life signals under indoor free space conditions.</p>
Full article ">Figure 11
<p>Schematic diagram of comparison between radar life and reference signals: (<b>a</b>) time domain comparison of respiratory signals; (<b>b</b>) radar respiratory signal spectrum; (<b>c</b>) time domain comparison of cardiac signals; (<b>d</b>) radar heartbeat signal spectrum.</p>
Full article ">Figure 12
<p>Experimental outdoor obstructed detection scenario for extracting contact and non-contact human life signals.</p>
Full article ">Figure 13
<p>Average error rate between the radar life signals versus reference signals: (<b>a</b>) respiratory rate measurement error rate; (<b>b</b>) heart rate measurement error rate.</p>
Full article ">Figure 14
<p>Four types of compression state identification experimental scenarios.</p>
Full article ">Figure 15
<p>Accuracy and loss curves of the network training process.</p>
Full article ">Figure 16
<p>Confusion matrix of the compression state recognition.</p>
Full article ">Figure 17
<p>Multi-model cross-validation results.</p>
Full article ">
21 pages, 5199 KiB  
Article
Automatic Life Detection Based on Efficient Features of Ground-Penetrating Rescue Radar Signals
by Di Shi, Gunnar Gidion, Leonhard M. Reindl and Stefan J. Rupitsch
Sensors 2023, 23(15), 6771; https://doi.org/10.3390/s23156771 - 28 Jul 2023
Cited by 3 | Viewed by 1694
Abstract
Good feature engineering is a prerequisite for accurate classification, especially in challenging scenarios such as detecting the breathing of living persons trapped under building rubble using bioradar. Unlike monitoring patients' breathing through the air, the measuring conditions of a rescue bioradar are very complex. The ultimate goal of search and rescue is to determine the presence of a living person, which requires extracting representative features that can distinguish between measurements with and without a person present. To address this challenge, we conducted a bioradar test scenario under laboratory conditions and decomposed the radar signal into different range intervals to derive multiple virtual scenes from the real one. We then extracted physical and statistical quantitative features that represent a measurement, aiming to find those features that are robust to the complexity of rescue-radar measuring conditions, including different rubble sites, breathing rates, signal strengths, and short-duration disturbances. To this end, we utilized two methods, Analysis of Variance (ANOVA) and Minimum Redundancy Maximum Relevance (MRMR), to analyze the significance of the extracted features. We then trained the classification model using a linear-kernel support vector machine (SVM). As the main result of this work, we identified an optimal feature set of four features based on the feature ranking and the improvement in the classification accuracy of the SVM model. These four features are related to four different physical quantities and are independent of different rubble sites. Full article
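The ANOVA part of the feature ranking reduces, per feature, to a one-way F-statistic across the classes. A minimal sketch follows, assuming a simple two-class setting ("person present" vs. "empty scene") rather than the paper's full pipeline; a higher F indicates a feature whose between-class variation dominates its within-class variation:

```python
import numpy as np

def anova_f(groups):
    """One-way ANOVA F-statistic for one feature, given its values
    grouped by class. F = mean square between / mean square within."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    all_x = np.concatenate(groups)
    grand_mean = all_x.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = all_x.size - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)
```

Ranking features by such a score, then checking how each added feature changes the SVM's cross-validated accuracy, mirrors the selection loop described in the abstract.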
20 pages, 16640 KiB  
Article
Frequency Comb-Based Ground-Penetrating Bioradar: System Implementation and Signal Processing
by Di Shi, Gunnar Gidion, Taimur Aftab, Leonhard M. Reindl and Stefan J. Rupitsch
Sensors 2023, 23(3), 1335; https://doi.org/10.3390/s23031335 - 25 Jan 2023
Cited by 2 | Viewed by 2632
Abstract
Radars can be used as sensors to detect the breathing of victims trapped under layers of building materials in catastrophes such as earthquakes or gas explosions. In this contribution, we present the implementation of a novel frequency comb continuous wave (FCCW) bioradar module using a commercial software-defined radio (SDR). The FCCW radar transmits multiple equally spaced frequency components simultaneously. Data acquisition of the received combs is frequency-domain based; hence, it does not require synchronization between the transmit and receive channels, as time-domain broadband radars, such as ultra-wideband (UWB) pulse radar and frequency-modulated CW (FMCW) radar, do. Since a frequency comb has an instantaneously wide bandwidth, the effective scan rate is much higher than that of a step-frequency CW (SFCW) radar. This makes FCCW radar particularly suitable for small-motion detection. Using the inverse fast Fourier transform (IFFT), we can decompose the received frequency comb into different range intervals and remove ghost signals and interference from farther range intervals. The frequency comb we use in this report has a bandwidth of only 60 MHz, resulting in a coarse range resolution of 2.5 m, much larger than respiration-induced chest-wall motions. However, we demonstrate that centimeter-range motions can be detected and evaluated by processing the received comb signals. We aim to integrate the bioradar into an unmanned aircraft system for fast and safe search and rescue operations. As a trade-off between ground penetrability and the size and weight of the antenna and the radar module, we use 1.3 GHz as the center frequency. Field measurements show that the proposed FCCW bioradar can detect a living person through different nonmetallic building materials. Full article
(This article belongs to the Special Issue RADAR Sensors and Digital Signal Processing)
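A minimal numerical sketch of the range decomposition described above: an IFFT across the received comb tones yields range bins of width c/(2B) = 2.5 m, and the slow-time phase of one bin reveals the breathing frequency. The tone count, frame rate, target range, and 5 mm chest motion below are illustrative assumptions, not the system’s actual parameters:

```python
import numpy as np

c = 3e8
M = 32                          # comb tones (assumed count)
B = 60e6                        # comb bandwidth -> range resolution c/(2*B) = 2.5 m
f = 1.3e9 + np.arange(M) * (B / M)   # tones around the 1.3 GHz carrier
fs_slow = 10.0                  # frames per second (slow time)
t = np.arange(0, 60, 1 / fs_slow)

f_breath = 0.25                 # 15 breaths per minute
R = 1.5 + 0.005 * np.sin(2 * np.pi * f_breath * t)  # chest at ~1.5 m, 5 mm motion

# frequency-domain response of a single point target, one row per frame
H = np.exp(-2j * np.pi * np.outer(2 * R / c, f))
profiles = np.fft.ifft(H, axis=1)    # IFFT over the tones -> range profiles
bin0 = profiles[:, 0]                # target lies in the first 2.5 m interval

# breathing appears as a small slow-time phase modulation of that range bin
phase = np.angle(bin0 * np.conj(bin0.mean()))
phase -= phase.mean()
spec = np.abs(np.fft.rfft(phase))
freqs = np.fft.rfftfreq(len(phase), 1 / fs_slow)
f_est = freqs[np.argmax(spec[1:]) + 1]   # skip the DC bin
print(f_est)                             # recovers ~0.25 Hz
```

Restricting the analysis to early range bins is what discards ghost signals and interference from farther range intervals, as the abstract notes.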
24 pages, 11126 KiB  
Article
An Improved Forest Height Model Using L-Band Single-Baseline Polarimetric InSAR Data for Various Forest Densities
by Ao Sui, Opelele Omeno Michel, Yu Mao and Wenyi Fan
Remote Sens. 2023, 15(1), 81; https://doi.org/10.3390/rs15010081 - 23 Dec 2022
Cited by 1 | Viewed by 1833
Abstract
Forest density affects the inversion of forest height by influencing the penetration and attenuation of synthetic aperture radar (SAR) signals. Traditional forest height inversion methods often fail in low-density forest areas. Based on L-band single-baseline polarimetric SAR interferometry (PolInSAR) simulation data and the BioSAR 2008 data, we proposed a forest height optimization model at the stand scale suitable for various forest densities. This optimization model addresses shortcomings of the three-stage inversion method and the SINC inversion method by employing height errors to represent the mean penetration depth. The relationships between forest density and the extinction coefficient, penetration depth, phase, and magnitude were also discussed. In the simulated data, the inversion height from the optimization method was 17.35 m with an RMSE of 3.01 m at a forest density of 100 stems/ha, addressing the drawbacks of the conventional techniques, which fail at low forest density. In the real data, the maximum RMSE of the optimization method was 2.17 m as the stand density increased from 628.66 stems/ha to 1330.54 stems/ha, showing the effectiveness and robustness of the optimization model in overcoming the influence of stand density on the inversion process in realistic scenarios. This study overcame the stand-density restriction on L-band single-baseline PolInSAR data for forest height estimation and offers a reference for algorithm selection and optimization. The technique is expected to be extended from the stand scale to larger areas for forest ecosystem monitoring and management. Full article
(This article belongs to the Special Issue Monitoring Forest Carbon Sequestration with Remote Sensing)
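The SINC inversion mentioned above rests on the zero-extinction random-volume result that the coherence magnitude falls off as |γ| = sinc(k_z·h_v/2), so height follows by inverting this monotonic curve. A sketch under that assumption, with an illustrative k_z and the 18 m reference height of the simulated stands:

```python
import numpy as np

def sinc_inversion(gamma_mag, kz):
    """Invert |gamma| = sinc(kz*hv/2) for forest height hv (zero-extinction
    random-volume model), via a lookup on the monotonic branch [0, 2*pi/kz]."""
    hv_grid = np.linspace(0.0, 2 * np.pi / kz, 4000)
    # np.sinc(x) = sin(pi*x)/(pi*x), so sinc(kz*hv/2) = np.sinc(kz*hv/(2*pi))
    coh = np.sinc(kz * hv_grid / (2 * np.pi))
    # coherence decreases from 1 to 0 on this branch; np.interp needs it increasing
    return np.interp(gamma_mag, coh[::-1], hv_grid[::-1])

kz = 0.1          # vertical wavenumber in rad/m (illustrative value)
hv_true = 18.0    # reference height of the simulated stands
gamma = np.sin(kz * hv_true / 2) / (kz * hv_true / 2)
hv_est = sinc_inversion(gamma, kz)
print(hv_est)     # ~18 m
```

In sparse stands, ground scattering also lowers |γ|, which biases a pure sinc inversion; the paper’s optimization model augments it with height errors representing the mean penetration depth to stay robust across stand densities.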
18 pages, 5287 KiB  
Article
A Semi-Empirical Retrieval Method of Above-Ground Live Forest Fuel Loads by Combining SAR and Optical Data
by Yanxi Li and Binbin He
Remote Sens. 2023, 15(1), 5; https://doi.org/10.3390/rs15010005 - 20 Dec 2022
Cited by 8 | Viewed by 3159
Abstract
Forest fuel load is the key factor for fire risk assessment, firefighting, and carbon emissions estimation. Remote sensing technology has distinct advantages in fuel load estimation due to its sensitivity to biomass and its adequate spatiotemporal observations at large scales. Many related works applied empirical methods to individual satellite observations to estimate fuel load, which is highly conditioned on local data and limited by saturation problems. Here, we combined optical data (i.e., Landsat 7 ETM+) and spaceborne Synthetic Aperture Radar (SAR) data (i.e., ALOS PALSAR) in a proposed semi-empirical retrieval model to estimate above-ground live forest fuel loads (FLAGL). Specifically, optical data were introduced into the water cloud model (WCM) to compensate for vegetation coverage information. For comparison, we also evaluated the performance of single spaceborne L-band SAR data (i.e., ALOS PALSAR) in fuel load estimation with the common WCM. Both comparison experiments were validated against field measurements (i.e., BioSAR-2008) using the leave-one-out cross-validation (LOOCV) method. WCM with single SAR data achieved reasonable performance (R2 = 0.64 or higher and RMSEr = 35.3% or lower) but suffered from an underestimation problem, especially in dense forests. The proposed method performed better, with R2 increasing by 0.05–0.13 and RMSEr decreasing by 5.8–12.9%. We also found that the underestimation (i.e., saturation) problem was alleviated even when vegetation coverage reached 65% or the total FLAGL reached about 183 Tons/ha. We demonstrated our FLAGL estimation method by validation on an open-access dataset in Sweden. Full article
(This article belongs to the Special Issue In Situ Data in the Interplay of Remote Sensing)
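The water cloud model (WCM) mentioned above balances a vegetation term against a canopy-attenuated ground term, and in its simplest form it inverts in closed form. The coefficients and the exact parameterization below are illustrative assumptions (published WCM variants differ, and the paper additionally modulates the model with optical vegetation-coverage information):

```python
import numpy as np

def wcm_forward(V, A, B, sigma_ground):
    """Simplified water cloud model in linear power units: vegetation
    contribution plus the ground contribution attenuated by the canopy."""
    T = np.exp(-B * V)               # two-way canopy transmissivity
    return A * (1.0 - T) + sigma_ground * T

def wcm_invert(sigma, A, B, sigma_ground):
    """Closed-form inversion: solve A*(1-T) + sigma_ground*T = sigma for T."""
    T = (A - sigma) / (A - sigma_ground)
    return -np.log(T) / B

A, B, sig_g = 0.25, 0.012, 0.04      # illustrative coefficients, not fitted values
V_true = 120.0                       # fuel-load proxy, e.g. Tons/ha
sigma = wcm_forward(V_true, A, B, sig_g)
V_est = wcm_invert(sigma, A, B, sig_g)
print(V_est)                         # recovers 120.0

# saturation: backscatter approaches A as V grows, so dense stands become
# indistinguishable -- the problem the combined SAR + optical model targets
print(wcm_forward(400.0, A, B, sig_g))
```

The flattening of the forward curve at high V is the saturation problem the abstract describes; injecting vegetation coverage from optical data gives the inversion extra information where SAR backscatter alone has stopped responding.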