Search Results (413)

Search Parameters:
Keywords = smartphone sensing

13 pages, 4002 KiB  
Article
A Ratiometric Fluorescence Probe for Visualized Detection of Heavy Metal Cadmium and Application in Water Samples and Living Cells
by Qijiang Xu, Wen Qin, Yanfei Qin, Guiying Hu, Zhiyong Xing and Yatong Liu
Molecules 2024, 29(22), 5331; https://doi.org/10.3390/molecules29225331 - 13 Nov 2024
Viewed by 338
Abstract
Heavy metal cadmium(II) residues inflict severe damage on human health and ecosystems, so devising straightforward, highly selective sensing methods for detecting Cd²⁺ has become imperative. In this work, a ratiometric benzothiazole-based fluorescence probe (BQFA) was readily synthesized and characterized using standard optical techniques for the visual detection of Cd²⁺, with a color change from blue to green and a large Stokes shift. The 1:1 binding ratio of BQFA to Cd²⁺ was established by Job's plot and further confirmed by FT-IR and ¹H NMR titrations. The ratiometric fluorescence response via the ICT mechanism was supported by DFT calculations, and the limit of detection for Cd²⁺ was determined to be 68 nM. Notably, BQFA performed well in real water samples, on paper strips, in smartphone colorimetric identification, and in cell imaging.
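As a rough sketch of how such a ratiometric readout is calibrated (the article reports the 496 nm/465 nm intensity ratio and a 68 nM detection limit), the snippet below fits a linear calibration and applies the common 3σ/slope convention for the LOD; all numeric values are hypothetical placeholders, not data from the article.

```python
import numpy as np

# Hypothetical titration data: Cd2+ concentration (uM) vs. emission
# intensities at 496 nm and 465 nm (placeholder values, not from the paper).
conc = np.array([0.0, 2.0, 5.0, 10.0, 20.0, 50.0])
i496 = np.array([0.20, 0.28, 0.41, 0.62, 1.05, 2.30])
i465 = np.array([1.00, 0.98, 0.95, 0.90, 0.82, 0.70])

ratio = i496 / i465                            # ratiometric signal I496/I465
slope, intercept = np.polyfit(conc, ratio, 1)  # linear calibration fit

# Limit of detection by the common 3-sigma/slope convention,
# using the standard deviation of repeated blank measurements.
blank_ratios = np.array([0.199, 0.201, 0.202, 0.198, 0.200])
lod = 3 * blank_ratios.std(ddof=1) / slope
print(f"calibration: ratio = {slope:.4f}*[Cd2+] + {intercept:.4f}")
print(f"estimated LOD = {lod * 1000:.1f} nM")
```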
(This article belongs to the Section Analytical Chemistry)
Graphical abstract
Figure 1. (a) Fluorescence spectra of BQFA (10 μM) in the presence of the tested analytes in DMF/H₂O/EtOH (v/v/v, 1/1/98) solution (λex = 350 nm). (b) Solution color change before and after the addition of Cd²⁺ (5 eq.).
Figure 2. (a) Competition fluorescence spectra of BQFA (10 μM) detecting Cd²⁺ (5.0 eq.) in the presence of other metal ions (5.0 eq.), using the ratio of fluorescence intensity at 496 nm and 465 nm. (b) Fluorescence spectra of BQFA (10 μM) detecting Cd²⁺ (5.0 eq.) in the presence of other metal ions (5.0 eq.).
Figure 3. (a) Fluorescence titration of BQFA (10 μM) with the addition of Cd²⁺ (0–50 μM) in solution. (b) UV–Vis spectral titration of BQFA (10 μM) with the addition of Cd²⁺ (0–50 μM) in solution.
Figure 4. (a) Job's plot of BQFA and Cd²⁺. (b) Benesi–Hildebrand plot of BQFA with Cd²⁺.
Figure 5. (a) Energy-optimized structures of the probe BQFA and BQFA + Cd²⁺. (b) Calculated molecular orbitals and HOMO–LUMO gaps of BQFA and BQFA + Cd²⁺ in the ground and excited states.
Figure 6. Schematic representation of an RGB analysis of images produced via a color recognition application.
Figure 7. (a) Fluorescence microscopy images of A549 cells interacting with BQFA in the presence of Cd²⁺: (a1–a3) cells imaged after treatment with PBS buffer solution; (b1–b3) cells after 1 h of incubation with 10 µM BQFA; (c1–c3) cells exposed to 10 µM Cd²⁺ for 1 h, followed by 1 h of incubation with 10 µM BQFA for imaging. (b) Fluorescence microscopy images of SiHa cells interacting with BQFA in the absence and presence of Cd²⁺: (a1–a3, b1–b3, and c1–c3) imaging was conducted in the blue channel, with λex = 405 nm and λem = 500–550 nm. Scale bar: 20 μm.
Scheme 1. Synthetic route to BQFA.
Scheme 2. Probable interaction mechanism of BQFA and Cd²⁺.
18 pages, 12901 KiB  
Article
Evaluating Bicycle Path Roughness: A Comparative Study Using Smartphone and Smart Bicycle Light Sensors
by Tufail Ahmed, Ali Pirdavani, Geert Wets and Davy Janssens
Sensors 2024, 24(22), 7210; https://doi.org/10.3390/s24227210 - 11 Nov 2024
Viewed by 440
Abstract
The quality of bicycle path surfaces strongly influences cyclists' comfort. This study evaluates the effectiveness of smartphone sensor data and smart bicycle light data in assessing the roughness of bicycle paths. The research was conducted in Hasselt, Belgium, where bicycle path pavement types such as asphalt, cobblestone, concrete, and paving tiles were analyzed across selected streets. A smartphone application (Physics Toolbox Sensor Suite) and SEE.SENSE smart bicycle lights were used to collect GPS and vertical acceleration data on the bicycle paths. The Dynamic Comfort Index (DCI) and Root Mean Square (RMS) values were calculated from the Physics Toolbox data to quantify the vibrational comfort experienced by cyclists, and the SEE.SENSE vibration data, DCI, and RMS results were categorized for statistical comparison. The statistical tests revealed no significant difference in comfort assessment among DCI, RMS, and SEE.SENSE. The study highlights the potential of integrating smartphone sensors and smart bicycle lights for efficient, large-scale assessments of bicycle infrastructure, contributing to more informed urban planning and improved cycling conditions. It also provides city authorities with a low-cost solution for continuously assessing and monitoring the quality of their cycling paths.
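The RMS metric used here is straightforward to compute from a vertical-acceleration trace; a minimal sketch follows, with a simulated signal in place of real ride data (the DCI computation and any ISO 2631-style frequency weighting are omitted for brevity).

```python
import numpy as np

def segment_rms(accel_z, fs, window_s=1.0):
    """RMS of vertical acceleration per window, a standard roughness proxy."""
    n = int(fs * window_s)
    usable = len(accel_z) - len(accel_z) % n
    windows = accel_z[:usable].reshape(-1, n)   # one row per window
    return np.sqrt((windows ** 2).mean(axis=1))

# Hypothetical 100 Hz trace: smooth asphalt followed by rougher cobblestone.
fs = 100
smooth = np.random.normal(0.0, 0.3, 30 * fs)
rough = np.random.normal(0.0, 1.2, 30 * fs)
trace = np.concatenate([smooth, rough])
rms = segment_rms(trace, fs)
print(rms[:5], rms[-5:])   # low values on asphalt, high on cobblestone
```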
Figure 1. Unprocessed data from the Physics Toolbox Sensor Suite application.
Figure 2. Vibration data from the SEE.SENSE smart bicycle lights.
Figure 3. Acceleration on bicycle streets with different surface pavements (asphalt-paved and cobblestone-paved).
Figure 4. Acceleration on asphalt-paved bicycle streets.
Figure 5. Acceleration on cobblestone-paved bicycle streets.
Figure 6. DCI of study area bicycle streets.
Figure 7. RMS of study area bicycle streets.
Figure 8. SEE.SENSE vibration values of study area bicycle streets.
Figure 9. RMS, DCI, and SEE.SENSE of study area bicycle streets.
11 pages, 1855 KiB  
Article
Smartphone-Based Leaf Colorimetric Analysis of Grapevine (Vitis vinifera L.) Genotypes
by Péter Bodor-Pesti, Dóra Taranyi, Gábor Vértes, István Fazekas, Diána Ágnes Nyitrainé Sárdy, Tamás Deák, Zsuzsanna Varga and László Baranyai
Horticulturae 2024, 10(11), 1179; https://doi.org/10.3390/horticulturae10111179 - 7 Nov 2024
Viewed by 566
Abstract
Leaf chlorophyll content is a key indicator of plant physiological status in viticulture; therefore, regular evaluation to inform nutrient supply and canopy management is of vital importance. Pigmentation is most frequently measured with hand-held instruments, destructive off-site spectrophotometry, or remote sensing, but smartphone-based applications also offer a promising way to collect colorimetric information that may correlate with pigmentation. In this study, four grapevine genotypes were investigated using smartphone-based RGB (Red, Green, Blue) and CIE-L*a*b* colorimetry and a portable chlorophyll meter. The objective was to evaluate the correlation between leaf chlorophyll concentration and RGB- or CIE-L*a*b*-based color indices, and to find an appropriate model for discriminating between the genotypes by leaf coloration. For these purposes, fully developed leaves of 'Chardonnay', 'Sauvignon blanc', and 'Pinot noir' clones 666 and 777 were investigated with the Color Grab smartphone application to obtain RGB and CIE-L*a*b* values, from which chroma, hue, and a further 31 color indices were calculated. Chlorophyll concentrations were determined using an Apogee MC100 device, and the values were correlated with the color values and indices. The results showed that chlorophyll concentration and color indices differed significantly between the genotypes. Certain color indices also showed opposite directions in their relationship with leaf pigmentation for different genotypes: the same index correlated positively with leaf chlorophyll concentration for one variety and negatively for another, suggesting that the relationship is genotype-specific rather than uniform within the species. In light of this result, further study of the species specificity of commonly used vegetation indices is warranted. Support Vector Machine (SVM) classification of the samples based on color properties achieved 71.63% accuracy, showing that coloration is an important ampelographic feature for identification and assessment of true-to-typeness.
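Chroma and hue follow directly from the CIELAB coordinates via the standard formulas C*ab = sqrt(a*² + b*²) and h_ab = atan2(b*, a*); a minimal sketch with a hypothetical leaf reading:

```python
import math

def chroma_hue(a_star: float, b_star: float) -> tuple[float, float]:
    """Standard CIELAB chroma C*ab and hue angle h_ab (degrees, 0-360)."""
    chroma = math.hypot(a_star, b_star)                 # sqrt(a*^2 + b*^2)
    hue = math.degrees(math.atan2(b_star, a_star)) % 360.0
    return chroma, hue

# Example: a typical green leaf reading (hypothetical a*, b* values,
# as an app like Color Grab would report them).
print(chroma_hue(-18.5, 32.0))   # hue lands between 90 and 180 degrees
```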
(This article belongs to the Section Viticulture)
Graphical abstract
Figure 1. Mean leaf chlorophyll concentration (a), chroma (b), and hue (c) of the investigated grapevine genotypes. Horizontal lines in each box indicate the median. Different letters indicate significant differences between the genotypes (p < 0.05) (n = 100).
Figure 2. Pearson's correlation between leaf chlorophyll content and color indices. Green links show positive and blue links negative correlations between the chlorophyll content and the index. Only correlations with p < 0.01 were included; deeper colors indicate stronger correlation, lighter colors weaker correlation.
16 pages, 3806 KiB  
Article
2SpamH: A Two-Stage Pre-Processing Algorithm for Passively Sensed mHealth Data
by Hongzhe Zhang, Jihui L. Diaz, Soohyun Kim, Zilong Yu, Yiyuan Wu, Emily Carter and Samprit Banerjee
Sensors 2024, 24(21), 7053; https://doi.org/10.3390/s24217053 - 31 Oct 2024
Viewed by 560
Abstract
Recent advances in mobile health (mHealth) technology and the ubiquity of wearable devices and smartphones have expanded the digital health market and made these devices innovative tools for collecting data on individualized behavior. Heterogeneous levels of device usage across users, and across days within a single user, can produce different degrees of underestimation in passive sensing data, introducing bias if the data are analyzed without addressing this issue. In this work, we propose 2SpamH, an unsupervised two-stage pre-processing algorithm for passively sensed mHealth data that uses device usage variables to infer the quality of passive sensing data from mobile devices. A series of simulation studies shows the utility of the proposed algorithm compared with existing methods, and an application to a real clinical dataset is also illustrated.
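A schematic re-implementation of the two-stage idea as described (stage 1 picks clear "missing"/"non-missing" prototype days in a feature space built from phone-usage variables and upload counts; stage 2 propagates labels to the remaining days by k-NN) might look as follows. The prototype cut-offs and data are hypothetical, and this is not the authors' code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def two_stage_quality_labels(usage, n_uploads, q=0.2, k=5):
    """Sketch of a 2SpamH-style labeling (hypothetical thresholds).
    Stage 1: build a 2D feature space (PC1 of phone-usage variables vs.
    normalized upload count) and take the extremes as prototypes.
    Stage 2: propagate prototype labels to the remaining days with k-NN."""
    pc1 = PCA(n_components=1).fit_transform(usage).ravel()
    uploads = (n_uploads - n_uploads.min()) / (np.ptp(n_uploads) + 1e-9)
    X = np.column_stack([pc1, uploads])
    score = X.sum(axis=1)                      # low = likely poor quality
    lo, hi = np.quantile(score, [q, 1 - q])    # hypothetical prototype cuts
    proto = (score <= lo) | (score >= hi)
    labels = (score >= hi).astype(int)         # 1 = "non-missing" prototype
    knn = KNeighborsClassifier(n_neighbors=k).fit(X[proto], labels[proto])
    return knn.predict(X)                      # 0 = treat the day as missing

# Hypothetical example: 60 days, 3 phone-usage variables, daily upload counts.
rng = np.random.default_rng(0)
usage = rng.normal(size=(60, 3))
n_uploads = rng.poisson(40, size=60)
print(two_stage_quality_labels(usage, n_uploads)[:10])
```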
(This article belongs to the Special Issue Wearable Sensing Technologies for Human Health Monitoring)
Figure 1. A simulated example of the downward bias introduced by a duty-cycling algorithm. The vertical black and red lines represent the captured and uncaptured uploads of the true raw sensor data, respectively. Uncaptured uploads represent activity missed by the sensor either because the sensor is off (due to duty-cycling) or because the device is not being used or worn. Each point, a daily aggregated step count, is the sum of all captured uploads from that day (black vertical lines), while the daily hypothetical upload is the sum of both captured (black) and uncaptured (red) uploads. The upper dashed trajectory represents the true step count uploads; the lower solid trajectory is the observed trajectory of captured uploads, an underestimate of the ground truth.
Figure 2. Step-by-step illustration of the 2SpamH algorithm, where the size of each data point in the constructed feature space represents a daily observation of step count: (a) feature-space construction with the first principal component of phone usage measures (x-axis) and the normalized number of step count uploads (y-axis); (b) prototype selection (red = "missing", blue = "non-missing"); (c) k-nearest neighbors algorithm.
Figure 3. Daily uploads of step counts from the device of a single user over a three-week period, colored by day of the week.
Figure 4. Sensitivity and specificity of all three algorithms. Each block in the grids represents algorithm performance at a specific activity level (L_A) and phone-usage level (L_C).
Figure 5. Trajectory of step count data before and after applying 2SpamH and imputation. Data points are color-coded by type: black dots (non-missing after 2SpamH, observed) are step counts of good-quality observations identified by 2SpamH; turquoise dots (missing after 2SpamH and imputed) are step counts of poor-quality observations identified by 2SpamH and then imputed using missForest; orange dots (missing from technical causes and imputed) are points missing in the original data and imputed using missForest; gray dots (missing after 2SpamH, observed) are the observed step counts of poor-quality observations identified by 2SpamH. Each gray dot is connected to a turquoise dot by a dashed line, since these points are identified as missing by 2SpamH and imputed. The blue curve shows the trend after applying 2SpamH and imputing missing data; the red curve shows the trend before. The figure demonstrates the effectiveness of 2SpamH in addressing underestimation in passive measures.
Figure 6. 2SpamH algorithm for three users. Each point represents a daily observation of step count; the x-axis is the first principal component of phone usage measures and the y-axis the normalized number of uploads. Red-shaded areas (lower left) and blue-shaded areas (upper right) of each subplot represent prototypes with missing and non-missing labels, respectively. Dot size corresponds to step count, with larger dots indicating higher counts.
15 pages, 2501 KiB  
Article
LIG-Based High-Sensitivity Multiplexed Sensing System for Simultaneous Monitoring of Metabolites and Electrolytes
by Sang Hyun Park and James Jungho Pak
Sensors 2024, 24(21), 6945; https://doi.org/10.3390/s24216945 - 29 Oct 2024
Viewed by 483
Abstract
With improvements in medical environments and the widespread use of smartphones, interest in wearable biosensors for continuous body monitoring is growing. We developed a wearable multiplexed bio-sensing system that non-invasively monitors body fluids and integrates with a smartphone application. The system includes sensors, readout circuits, and a microcontroller unit (MCU) for signal processing and wireless communication. Potentiometric and amperometric measurement methods were used, with calibration capabilities added to ensure accurate readings of analyte concentrations and temperature. Laser-induced graphene (LIG)-based sensors for glucose, lactate, Na⁺, K⁺, and temperature were developed for fast, cost-effective production. The LIG electrode's 3D porous structure provided an active surface area 16 times larger than its apparent area, enhancing sensor performance. The glucose and lactate sensors exhibited high sensitivities (168.15 and 872.08 μA mM⁻¹ cm⁻², respectively) and low detection limits (0.191 and 0.167 μM, respectively). The Na⁺ and K⁺ sensors showed sensitivities of 65.26 and 62.19 mV dec⁻¹, respectively, over a concentration range of 0.01–100 mM. The temperature sensor showed an average resistance change of 0.25%/°C within 20–40 °C, providing accurate body temperature monitoring.
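Converting raw readings into concentrations from the reported sensitivities is a one-line calculation per channel; the sketch below assumes a hypothetical electrode area and a near-Nernstian log-linear response for the ion channels, so it illustrates the arithmetic rather than the authors' firmware.

```python
ELECTRODE_AREA_CM2 = 0.071   # hypothetical working-electrode area, not from the paper

def amperometric_conc_mM(current_uA, sensitivity_uA_mM_cm2, area=ELECTRODE_AREA_CM2):
    """Concentration from an amperometric current, using a reported
    sensitivity such as 168.15 uA mM^-1 cm^-2 for glucose."""
    return current_uA / (sensitivity_uA_mM_cm2 * area)

def potentiometric_conc_mM(delta_mV, slope_mV_dec, c_ref_mM=1.0):
    """Concentration from a potential shift, assuming the log-linear
    (near-Nernstian) response reported for the Na+/K+ sensors."""
    return c_ref_mM * 10 ** (delta_mV / slope_mV_dec)

print(amperometric_conc_mM(3.0, 168.15))     # glucose channel, 3 uA reading
print(potentiometric_conc_mM(65.26, 65.26))  # Na+: one decade above 1 mM -> 10 mM
```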
(This article belongs to the Section Physical Sensors)
Figure 1. (a) Left: multisensor array; right: passivation layer. (b) Laser irradiation process for LIG-based electrodes. (c) Photograph of the multisensor array. (d) Structure of the multisensor array. (e) Photograph of the multisensor and multiplexed sensing system board. (f) Block diagram of the multisensor and multiplexed sensing system board.
Figure 2. FE-SEM images of the LIG electrode: (a) top view; (b) cross-sectional view.
Figure 3. (a) Raman spectra of PI film and LIG. (b) Raw CV plot and (c) peak current plot of the LIG electrode versus scan speed. (d) Potential difference of the Ag/AgCl RE with and without PVB coating vs. a commercial Ag/AgCl electrode at various NaCl concentrations.
Figure 4. Amperometric current of the (a) glucose and (b) lactate sensors versus glucose and lactate concentration. (c) Lactate sensor sensitivity in the 0.1–10 mM concentration range. (d) Lactate sensor amperometric current in the 0–0.5 mM range.
Figure 5. Sensitivity and LOD comparison of enzymatic electrochemical (a) glucose and (b) lactate sensors. Potential of the (c) Na⁺ and (d) K⁺ sensors versus Na⁺ and K⁺ concentration.
Figure 6. Selectivity of the (a) glucose, (b) lactate, (c) Na⁺, and (d) K⁺ sensors.
22 pages, 1654 KiB  
Article
A New Scene Sensing Model Based on Multi-Source Data from Smartphones
by Zhenke Ding, Zhongliang Deng, Enwen Hu, Bingxun Liu, Zhichao Zhang and Mingyang Ma
Sensors 2024, 24(20), 6669; https://doi.org/10.3390/s24206669 - 16 Oct 2024
Viewed by 528
Abstract
Smartphones with integrated sensors play an important role in people's lives, and in advanced multi-sensor fusion navigation systems the use of individual sensor information is crucial. Because environments differ, the weights assigned to the sensors differ as well, which affects both the method and the results of multi-source fusion positioning. Based on multi-source data from smartphone sensors, this study explores five types of information—Global Navigation Satellite System (GNSS), inertial measurement units (IMUs), cellular networks, optical sensors, and Wi-Fi sensors—characterizing the temporal, spatial, and statistical features of the data, and constructs a multi-scale, multi-window, context-connected scene sensing model that accurately detects indoor, semi-indoor, outdoor, and semi-outdoor environments, thereby providing an environmental basis for multi-sensor fusion positioning in a navigation system. The model comprises four main parts: multi-sensor data mining, a multi-scale convolutional neural network (CNN), a bidirectional long short-term memory (BiLSTM) network combined with contextual information, and a meta-heuristic optimization algorithm.
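A minimal sketch of the multi-scale CNN + BiLSTM backbone named in the abstract is shown below in PyTorch; the layer widths, kernel sizes, and feature count are hypothetical, and the meta-heuristic (WOA) hyperparameter search is omitted.

```python
import torch
import torch.nn as nn

class SceneNet(nn.Module):
    """Sketch of a multi-scale CNN + BiLSTM scene classifier
    (hypothetical sizes; the paper's WOA tuning step is omitted)."""
    def __init__(self, n_features=16, n_classes=4, hidden=64):
        super().__init__()
        # Two parallel 1D conv branches with different receptive fields.
        self.branch3 = nn.Sequential(nn.Conv1d(n_features, 32, 3, padding=1), nn.ReLU())
        self.branch7 = nn.Sequential(nn.Conv1d(n_features, 32, 7, padding=3), nn.ReLU())
        self.bilstm = nn.LSTM(64, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):              # x: (batch, time, features)
        x = x.transpose(1, 2)          # -> (batch, features, time)
        x = torch.cat([self.branch3(x), self.branch7(x)], dim=1)
        x = x.transpose(1, 2)          # -> (batch, time, 64)
        out, _ = self.bilstm(x)
        return self.head(out[:, -1])   # logits over {outdoor, semi-outdoor, semi-indoor, indoor}

logits = SceneNet()(torch.randn(8, 30, 16))   # 8 windows, 30 time steps each
print(logits.shape)                           # torch.Size([8, 4])
```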
(This article belongs to the Special Issue Smart Sensor Systems for Positioning and Navigation)
Figure 1. Four scene classes: (a) outdoor, (b) semi-outdoor, (c) semi-indoor, and (d) indoor.
Figure 2. Satellite zenith view: (a) west indoor neighboring window, (b) south indoor neighboring window, (c) indoor, and (d) open outdoor neighboring window.
Figure 3. DOP change: (a) outdoor, (b) indoor.
Figure 4. Visible satellites: (a) variation in the number of visible satellites; (b) variation in the rate of change of visible satellites in different windows.
Figure 5. Satellite signal quality: (a) CNR variation and (b) DCNR variation.
Figure 6. State of motion versus acceleration.
Figure 7. Wi-Fi channel spectrum scan: (a) indoor, (b) outdoor.
Figure 8. Visible AP distribution of Wi-Fi: (a) number distribution, (b) signal strength distribution.
Figure 9. Light and cellular network sensors: (a) indoor and outdoor light intensity over 24 h, (b) number of base stations with received signals.
Figure 10. Algorithmic model for classifying complex indoor and outdoor scenes based on spatio-temporal features.
Figure 11. Pearson correlation feature map.
Figure 12. Schematic of the two-scale convolutional neural network.
Figure 13. BiLSTM network structure diagram.
Figure 14. Structure of the ablation experiment.
Figure 15. Confusion matrices (a) before and (b) after WOA optimization.
Figure 16. Accuracy comparison of different models.
Figure 17. Accuracy comparison in different scenarios.
14 pages, 247 KiB  
Review
Technological Interventions to Implement Prevention and Health Promotion in Cardiovascular Patients
by Ayisha Z. Bashir, Anji Yetman and Melissa Wehrmann
Healthcare 2024, 12(20), 2055; https://doi.org/10.3390/healthcare12202055 - 16 Oct 2024
Viewed by 929
Abstract
Background/Objectives: This narrative review identifies information on the impact of technological interventions (such as telehealth and mobile health) on health promotion in cardiac patients from diverse populations. Methods: The online databases of PubMed and the Cochrane Library were searched for English-language articles on technological interventions for health promotion in cardiac patients, and a methodological quality control process was conducted. Exclusion was based first on reading the abstract; the full manuscript was then scanned to confirm whether the content related to cardiac patients and technological interventions. Results: In all, 11 studies were included in this review after quality control analysis, with sample sizes ranging from 12 to 1424 subjects. Eight studies used mobile phones, smartphones, and apps as mHealth interventions with tracking and texting components; two studies used videoconferencing as a digital intervention program; and three studies focused on physical activity trackers. Conclusions: This review highlights positive aspects of patient satisfaction with the technological interventions, including but not limited to accessibility to health care providers, sense of security, and well-being. The digital divide is apparent in the reviewed articles, as individuals with limited eHealth literacy or technological knowledge are not motivated or able to participate in these interventions. Finding methods to overcome these barriers is important; they can be addressed in part by providing the technology along with technical support.
(This article belongs to the Special Issue Policy Interventions to Promote Health and Prevent Disease)
14 pages, 28439 KiB  
Article
A Multi-Channel Urine Sensing Detection System Based on Creatinine, Uric Acid, and pH
by Qiya Gao, Jie Fu, Fangying Xiong, Jiawang Wang, Ziyue Qin and Shuang Li
Biosensors 2024, 14(10), 473; https://doi.org/10.3390/bios14100473 - 2 Oct 2024
Viewed by 817
Abstract
Urine analysis is a crucial diagnostic technique in clinical laboratories. Creatinine and uric acid in urine are essential biomarkers widely used in clinical analysis, and research has linked deviations from their normal physiological concentrations to increased risk of hypertension, cardiovascular disease, and kidney disease; urine pH likewise reflects the body's metabolic processes and homeostatic balance. In this study, an integrated multi-channel electrochemical sensing system was developed, combining electrochemical analysis techniques, microelectronic design, and nanomaterials, together with an intelligent medical detection architecture and an interactive smartphone interface. Multi-channel selective electrodes were designed for creatinine, uric acid, and pH, with detection ranges of 10 nM to 100 μM for creatinine, 100 μM to 500 μM for uric acid, and 4 to 9 for pH; interference experiments verified the specificity of the sensors. Multi-channel double-sided sensing electrodes and function-integrated hardware were then designed, with the standard equations of the target analytes stored in the system's read-only memory. A WeChat mini-program platform was developed for smartphone interaction, enabling off-body detection and real-time display of the target analytes. Finally, the electrochemical detection electrodes were integrated with the smart sensing system and wirelessly interfaced with smartphones, allowing intelligent real-time detection in primary healthcare and individual household settings.
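On the software side, mapping each channel's raw reading to a concentration is an inverse-calibration step against the stored standard equations; below is a sketch with hypothetical placeholder coefficients, not the paper's ROM values, assuming a log-linear DPV calibration for the analytes and a linear OCP-vs-pH line.

```python
# Hypothetical stored calibration coefficients (placeholders, not the
# paper's ROM values): DPV peak current vs. log10 concentration for
# creatinine/uric acid, and a linear OCP-vs-pH line for the pH channel.
CAL = {
    "creatinine": {"slope": 2.1, "intercept": 18.0},    # uA per log10(M)
    "uric_acid":  {"slope": 3.4, "intercept": 26.0},
    "pH":         {"slope": -55.0, "intercept": 400.0}  # mV per pH unit
}

def concentration_M(channel, peak_current_uA):
    """Invert a log-linear calibration: current -> molar concentration."""
    c = CAL[channel]
    return 10 ** ((peak_current_uA - c["intercept"]) / c["slope"])

def ph_from_ocp(ocp_mV):
    """Invert the linear potentiometric calibration for the pH channel."""
    c = CAL["pH"]
    return (ocp_mV - c["intercept"]) / c["slope"]

print(f'{concentration_M("creatinine", 8.0):.2e} M')
print(f'pH = {ph_from_ocp(95.0):.2f}')
```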
(This article belongs to the Special Issue State-of-the-Art Biosensors in China (2nd Edition))
Figure 1. Multi-channel urine sensing system: (a) modification process of the dual-sided sensing electrodes; (b) system structure; (c) printed circuit board and the functions of its parts.
Figure 2. Characterization of the multi-channel sensing electrodes: (a) SEM of the bare electrode; (b) TEM of N-Gr; (c) SEM of the AuNPs/N-Gr electrode; electrochemical characterization of (d) the creatinine, (e) the uric acid, and (f) the pH sensing electrodes.
Figure 3. Detection results of the multi-channel sensors: (a) creatinine, detection range 10 nM to 100 μM; (b) uric acid, detection range 100 μM to 500 μM; (c) pH, forward and reverse detection in solutions from pH 4 to 9. Standard fitting curves for (d) creatinine, (e) uric acid, and (f) pH.
Figure 4. Interference tests of the multi-channel sensors: DPV curves of the (a) creatinine and (b) uric acid sensors for different interferents; (c) OCP curve of the pH sensor for different interferents. Bar graphs of the interference test signals for the (d) creatinine and (e) uric acid sensors.
Figure 5. (a) WeChat mini-program interface. (b) Multi-channel sensing results displayed in the WeChat mini-program. Detection results for (c) creatinine, (d) uric acid, and (e) pH in artificial urine.
23 pages, 4087 KiB  
Article
SWiLoc: Fusing Smartphone Sensors and WiFi CSI for Accurate Indoor Localization
by Khairul Mottakin, Kiran Davuluri, Mark Allison and Zheng Song
Sensors 2024, 24(19), 6327; https://doi.org/10.3390/s24196327 - 30 Sep 2024
Viewed by 716
Abstract
Dead reckoning is a promising yet often overlooked smartphone-based indoor localization technology that relies on phone-mounted sensors to count steps and estimate walking direction, without extensive sensor or landmark deployment. However, misalignment between the phone's direction and the user's actual movement direction can lead to unreliable direction estimates and inaccurate location tracking. To address this issue, this paper introduces SWiLoc (Smartphone and WiFi-based Localization), an enhanced direction-correction system that integrates passive WiFi sensing with smartphone-based sensing to form Correction Zones. Our two-phase approach accurately measures the user's walking direction when passing through a Correction Zone and further refines successive direction estimates outside the zones, enabling continuous and reliable tracking. Beyond direction correction, SWiLoc incorporates a localization technique that leverages the corrected directions to achieve precise user localization, significantly broadening its applicability to high-accuracy localization tasks. Additionally, our Fresnel zone-based approach, which utilizes unique hardware configurations and a fundamental geometric model, ensures accurate and robust direction estimation even in scenarios with unreliable walking directions. We evaluate SWiLoc in two real-world environments, assessing its performance under varying conditions such as environmental changes, phone orientations, walking directions, and distances. Our experiments show that SWiLoc achieves an average 75th-percentile error of 8.89 degrees in walking-direction estimation and an 80th-percentile error of 1.12 m in location estimation, reductions of 64% and 49%, respectively, over existing state-of-the-art approaches.
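The dead-reckoning core that SWiLoc corrects can be written in a few lines; the sketch below advances one step per heading estimate and blends a Correction-Zone heading with the sensor heading using a hypothetical trust weight (the step length and weight are assumptions, not values from the paper).

```python
import math

def dead_reckon(x, y, headings, step_len_m=0.7):
    """Basic pedestrian dead reckoning: advance one step per heading.
    Headings are radians clockwise from north; step length is assumed."""
    for heading in headings:
        x += step_len_m * math.sin(heading)
        y += step_len_m * math.cos(heading)
    return x, y

def fuse_heading(pdr_heading, zone_heading, w=0.8):
    """Blend the sensor heading with a Correction-Zone estimate
    (w is a hypothetical trust weight). Averaging is done on the
    unit circle to avoid wrap-around at +/- pi."""
    sx = w * math.sin(zone_heading) + (1 - w) * math.sin(pdr_heading)
    cy = w * math.cos(zone_heading) + (1 - w) * math.cos(pdr_heading)
    return math.atan2(sx, cy)

h = fuse_heading(math.radians(40), math.radians(25))
print(dead_reckon(0.0, 0.0, [h] * 10))   # ten steps at the corrected heading
```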
(This article belongs to the Special Issue Advanced Wireless Positioning and Sensing Technologies)
Figure 1. Geometry of Fresnel zones.
Figure 2. Illustration of the SWiLoc system.
Figure 3. Towards solving the unreliable-direction problem: (a) unreliable direction using 2 receivers; (b) SWiLoc resolves the unreliable direction using 4 receivers.
Figure 4. System workflow of SWiLoc.
Figure 5. Geometric derivation of SWiLoc using a triangle.
Figure 6. The mth and nth Fresnel zones for Rx2 and Rx4.
Figure 7. Mapping phone orientation to user direction.
Figure 8. LoS crossing detection.
Figure 9. Before and after CSI smoothing.
Figure 10. Testbed setup.
Figure 11. Phase 1 accuracy in two different environments.
Figure 12. SWiLoc performance on continuous traces.
Figure 13. Localization error for continuous traces 1 and 2.
Figure 14. CDF for different phone-holding positions, comparing SWiLoc with Humaine and the Android API.
Figure 15. Localization error for different state-of-the-art methods (phone held in the palm).
Figure 16. Paths with different LoS crossing locations.
Figure 17. Impact of LoS crossing locations and distances.
Figure 18. Impact of direction error on localization accuracy.
18 pages, 3143 KiB  
Article
Estimating Rainfall Intensity Using an Image-Based Convolutional Neural Network Inversion Technique for Potential Crowdsourcing Applications in Urban Areas
by Youssef Shalaby, Mohammed I. I. Alkhatib, Amin Talei, Tak Kwin Chang, Ming Fai Chow and Valentijn R. N. Pauwels
Big Data Cogn. Comput. 2024, 8(10), 126; https://doi.org/10.3390/bdcc8100126 - 29 Sep 2024
Viewed by 745
Abstract
High-quality rainfall data are essential in many water management problems, including stormwater and water resources management. Due to high spatial–temporal variability, rainfall measurement can be challenging and costly, especially in urban areas, and even more so in tropical regions with their typically short-duration, high-intensity rainfall events, as some undeveloped or developing countries in those regions lack dense rain gauge networks and have limited resources for radar and satellite observations. Exploring alternative rainfall estimation methods can therefore help cover these shortcomings. A few recent studies have examined citizen science methods for collecting rainfall data to complement existing rain gauge networks, but these attempts are at an early stage and little has been published on improving the quality of such data. This study therefore focuses on image-based rainfall estimation with potential use in citizen science. A novel convolutional neural network (CNN) model is developed to predict rainfall intensity by processing images captured by citizens (e.g., with smartphones or security cameras) in an urban area. The model is intended as a complementary sensing tool (e.g., for better spatial coverage) for an existing rain gauge network, not a replacement. This study also presents one of the most extensive rain image datasets published in the literature. Rainfall estimates from the proposed CNN model for images captured by surveillance cameras and smartphone cameras, compared against observations from a weather station, exhibit strong R² values of 0.955 and 0.840, respectively.
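The pre-processing chain illustrated in Figure 7 (sharpen, greyscale, Otsu threshold) can be sketched with OpenCV as below; the sharpening kernel and file name are assumptions for illustration, not the paper's exact settings.

```python
import cv2
import numpy as np

def preprocess_rain_image(path):
    """Sharpen -> greyscale -> Otsu threshold, mirroring the pre-processing
    steps illustrated in Figure 7 (kernel choice is an assumption)."""
    img = cv2.imread(path)
    sharpen = np.array([[0, -1, 0],
                        [-1, 5, -1],
                        [0, -1, 0]], dtype=np.float32)
    img = cv2.filter2D(img, -1, sharpen)          # emphasize rain streaks
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return cv2.bitwise_and(gray, mask)            # keep bright streak pixels

out = preprocess_rain_image("rain_sample.jpg")    # hypothetical file name
cv2.imwrite("rain_sample_otsu.png", out)
```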
Figure 1. Locations of rainfall images captured using surveillance cameras and smartphones during rain events near the Monash University campus in Malaysia.
Figure 2. Sample images captured by smartphone on campus during rainfall events.
Figure 3. Schematic representation of the CNN architecture in this study.
Figure 4. The regression CNN workflow.
Figure 5. Schematic structure of the CNN model: image import, thresholding, partitioning, training, and testing (deep learning) for rainfall intensity prediction.
Figure 6. Rainfall intensity distribution for (a) surveillance camera and (b) smartphone camera images.
Figure 7. (a) Raw rain image; (b) sharpened image; (c) greyscale image showing pixel intensity; (d) outcome of applying Otsu's thresholding method; (e) thresholding combined with image processing to merge two images.
Figure 8. Samples of pre-processed images using Otsu's method under different rainfall conditions: (a) no or low rain, (b) moderate rain, and (c) heavy rain.
Figure 9. Observed vs. predicted rainfall intensity by CNN Model 4 using images captured by a surveillance camera; the blue dotted line shows the fitted line corresponding to the R².
Figure 10. Observed vs. simulated rainfall intensities by CNN Model 4 using Approach 2 on the smartphone testing dataset.
13 pages, 2703 KiB  
Article
Portable Electrochemical System and Platform with Point-of-Care Determination of Urine Albumin-to-Creatinine Ratio to Evaluate Chronic Kidney Disease and Cardiorenal Syndrome
by Shuenn-Yuh Lee, Ding-Siang Ciou, Hao-Yun Lee, Ju-Yi Chen, Yi-Chieh Wei and Meng-Dar Shieh
Biosensors 2024, 14(10), 463; https://doi.org/10.3390/bios14100463 - 27 Sep 2024
Viewed by 784
Abstract
The urine albumin (Alb)-to-creatinine (Crn) ratio (UACR) is a sensitive early indicator of chronic kidney disease (CKD) and cardiorenal syndrome. This study developed a portable, wireless electrochemical sensing platform for the sensitive and accurate determination of UACR. The platform consists of a carbon nanotube (CNT)-2,2′-azino-bis(3-ethylbenzothiazoline-6-sulphonic acid) (ABTS)-modified UACR sensor, a miniaturised potentiostat, a cup holder embedded with a magnetic stirrer, and a smartphone app. The UACR sensing electrode comprises two screen-printed carbon working electrodes, one screen-printed carbon counter electrode, and a screen-printed AgCl reference electrode. The miniaturised potentiostat, controlled by the developed app, performs cyclic voltammetry and amperometry to detect Alb and Crn, respectively. Clinical trials of the proposed system using spot urine samples from 30 diabetic patients indicate that it can accurately classify all three CKD risk statuses within 30 min, in satisfactory agreement with the commercial biochemical analyser TBA-25FR (Y = 0.999X, R² = 0.995). The proposed UACR sensing system offers a convenient, reliable, and affordable solution for personal mobile health monitoring and point-of-care urinalysis.
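The ratio itself and the three risk statuses are simple to compute once the Alb and Crn concentrations are known; the sketch below uses the commonly cited KDIGO albuminuria categories (A1 < 30, A2 30–300, A3 > 300 mg/g), since the abstract does not state the paper's exact cut-offs, and example concentrations drawn from the calibration ranges in Figures 5 and 6.

```python
def uacr_mg_per_g(albumin_mg_dL: float, creatinine_mg_dL: float) -> float:
    """UACR = urine albumin / urine creatinine, conventionally in mg/g."""
    return (albumin_mg_dL * 1000.0) / creatinine_mg_dL

def ckd_risk_category(uacr: float) -> str:
    """KDIGO albuminuria categories (commonly used thresholds; the
    abstract does not state the paper's exact cut-offs)."""
    if uacr < 30:
        return "A1 (normal to mildly increased)"
    if uacr <= 300:
        return "A2 (moderately increased)"
    return "A3 (severely increased)"

r = uacr_mg_per_g(albumin_mg_dL=5.14, creatinine_mg_dL=58.62)
print(f"UACR = {r:.0f} mg/g -> {ckd_risk_category(r)}")
```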
(This article belongs to the Special Issue Electrochemical Biosensors for Disease Detection)
Graphical abstract
Full article ">Figure 1
<p>(<b>a</b>) Layered graph of the customised electrode. (<b>b</b>) Proposed electrochemical UACR dual working sensor.</p>
Full article ">Figure 2
<p>Proposed electrochemical detection system.</p>
Full article ">Figure 3
<p>Design of the potentiostat for performing the CA and CV programmes.</p>
Full article ">Figure 4
<p>(<b>a</b>) Proposed portable UACR electrochemical system and platform. (<b>b</b>) UACR examination process of the proposed portable electrochemical system and platform.</p>
Full article ">Figure 5
<p>(<b>a</b>) CV responses of the SPE(W<sub>1</sub>)|CNT-ABTS<sub>(CV)</sub> to Alb of various concentrations. (<b>b</b>) Calibration curve of the proposed albumin sensor in urine with known concentrations of 0.08, 0.1, 0.18, 0.54, 0.8, 0.94, 0.98, 5.14, 6.1, and 19.2 mg/dL, respectively. (<b>c</b>) Regression analysis of C<sub>Alb</sub> in five urine samples determined using TBA-25FR and the SPE(W<sub>1</sub>)|CNT-ABTS<sub>(CV)</sub> electrode is shown in red; The deviation percentage in the C<sub>Alb</sub> values determined by TBA™-25FR and SPE(W<sub>1</sub>)|CNT-ABTS<sub>(CV)</sub> electrode is shown in blue.</p>
Full article ">Figure 6
<p>(<b>a</b>) CA responses of the SPE(W<sub>2</sub>)|CNT-ABTS|Nafion electrode to various Crn concentrations. (<b>b</b>) Calibration curve of the proposed Crn sensor in urine with known concentrations of 1.67, 11.55, 20.10, 31.22, 41.80 and 58.62 mg/dL, respectively. (<b>c</b>) Regression analysis of C<sub>Crn</sub> in five urine samples determined by TBA-25FR and SPE(W<sub>2</sub>)|CNT-ABTS|Nafion is shown in red; The deviation percentage in the C<sub>Crn</sub> values determined by TBA™-25FR and SPE(W<sub>2</sub>)|CNT-ABTS|Nafion electrode is shown in blue.</p>
Full article ">Figure 7
<p>(<b>a</b>) Regression analysis of the C<sub>Alb</sub> of 30 spot urine samples determined using TBA-25FR and the SPE(W<sub>1</sub>)|CNT-ABTS<sub>(CV)</sub> electrode (<b>b</b>) Regression analysis of the C<sub>Crn</sub> of 30 spot urine samples determined using TBA-25FR and the SPE(W<sub>2</sub>)|CNT-ABTS|Nafion electrode. (<b>c</b>) Regression analysis of the UACR value of 30 spot urine samples calculated using the Alb and Crn results of TBA-25FR and the proposed UACR sensing system and platform.</p>
Full article ">
22 pages, 2788 KiB  
Article
Comparative Assessment of Multimodal Sensor Data Quality Collected Using Android and iOS Smartphones in Real-World Settings
by Ramzi Halabi, Rahavi Selvarajan, Zixiong Lin, Calvin Herd, Xueying Li, Jana Kabrit, Meghasyam Tummalacherla, Elias Chaibub Neto and Abhishek Pratap
Sensors 2024, 24(19), 6246; https://doi.org/10.3390/s24196246 - 26 Sep 2024
Viewed by 775
Abstract
Healthcare researchers increasingly utilize smartphone sensor data as a scalable, cost-effective approach to studying individualized health-related behaviors in real-world settings. However, to develop reliable and robust digital behavioral signatures that may help in the early prediction of individualized disease trajectories and future prognosis, there is a critical need to quantify the variability that may be present in the underlying sensor data due to variations in smartphone hardware and software across a large population. Using sensor data collected in real-world settings from 3000 participants' smartphones for up to 84 days, we compared differences in the completeness, correctness, and consistency of the three most common smartphone sensors—the accelerometer, gyroscope, and GPS—within and across Android and iOS devices. Our findings show considerable variation in sensor data quality within and across Android and iOS devices. Sensor data from iOS devices showed significantly lower anomalous point density (APD) than Android across all sensors (p < 1 × 10⁻⁴), and iOS devices showed a considerably lower missing data ratio (MDR) for the accelerometer than for GPS data (p < 1 × 10⁻⁴). Notably, quality features derived from raw sensor data alone could predict the device type (Android vs. iOS) with up to 0.98 accuracy (95% CI [0.977, 0.982]). Such significant differences in sensor data quantity and quality gathered from iOS and Android platforms could lead to considerable variation in health-related inferences derived from heterogeneous consumer-owned smartphones. Our research highlights the importance of assessing, measuring, and adjusting for such critical differences in smartphone sensor-based assessments. Understanding the factors contributing to variation in sensor data based on daily device usage will help develop reliable, standardized, inclusive, and practically applicable digital behavioral patterns that may be linked to health outcomes in real-world settings.
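Simple versions of two of the quality metrics named above (MDR and SNR) can be written as below; these are plain-reading definitions for illustration, and the authors' exact formulas may differ.

```python
import numpy as np

def missing_data_ratio(timestamps_s, expected_hz):
    """Fraction of expected samples not observed in a record
    (a simple reading of 'MDR'; the authors' formula may differ)."""
    duration = timestamps_s[-1] - timestamps_s[0]
    expected = duration * expected_hz
    return max(0.0, 1.0 - len(timestamps_s) / expected)

def snr_db(signal):
    """SNR estimate in decibels: smoothed component vs. residual."""
    kernel = np.ones(25) / 25
    smooth = np.convolve(signal, kernel, mode="same")
    noise = signal - smooth
    return 10 * np.log10(np.mean(smooth ** 2) / np.mean(noise ** 2))

# Hypothetical record: ~50 Hz jittered sampling of a noisy 1 Hz signal,
# evaluated against a nominal 100 Hz rate.
t = np.cumsum(np.random.exponential(1 / 50, 5000))
x = np.sin(2 * np.pi * 1.0 * t) + np.random.normal(0, 0.5, t.size)
print(missing_data_ratio(t, expected_hz=100), snr_db(x))
```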
Figure 1. Schematic representation of potential noise sources in collecting smartphone sensor data in real-world settings across three data quality dimensions: completeness, correctness, and consistency. The illustration depicts data quality metrics (DQMs) for a single observation ("record"); the same DQMs can be applied to multiple sensor records to assess temporal variation in sensor data quality within and across study participants.
Figure 2. Density histograms comparing selected DQMs across the accelerometer, gyroscope, and GPS, stratified by device (Android in green, iOS in pink); device-specific medians are shown as vertical dashed lines. Data point missingness differed significantly (p < 1 × 10⁻⁴): (a) accelerometer (iOS median 0.13 vs. Android 0.34), (b) gyroscope (0.13 vs. 0.39), (c) GPS (0.99 vs. 0.42). Anomaly levels differed significantly (p < 1 × 10⁻³): (d) accelerometer (iOS 0.01 vs. Android 0.02), (e) gyroscope (0.01 vs. 0.02), (f) GPS (0.01 vs. 0.02). SNR varied significantly (p < 1 × 10⁻⁴), with iOS lower: (g) accelerometer (−3.88 dB vs. −2.85 dB), (h) gyroscope (−13.2 dB vs. −7.63 dB), (i) GPS (−4.5 dB vs. −2.35 dB).
Figure 3. Overall and sensor-specific device-type classification precision–recall curves (PRCs): median PRC bounded by 95% CI over 1000 stratified 12-fold cross-validation permutations using (a) overall, (b) accelerometer-specific, (c) gyroscope-specific, and (d) GPS-specific DQMs.
Figure 4. DQM feature importance scores and impact plots for combined and sensor-specific device-type classification: (a–d) mean DQM feature importance for the combined and sensor-specific models (APD: anomalous point density; SNR: signal-to-noise ratio; MDR: missing data ratio; VRC: value range consistency; IRLR: interpretable record length ratio; RLC: record length consistency; SRC: sampling rate consistency; SCR: sensor channel ratio); (e–h) bidirectional DQM SHAP impact levels on the combined and sensor-specific model outputs.
Figure 5. Density histograms for selected DQMs across device types, stratified by sensor (accelerometer in green, GPS in orange, gyroscope in blue); sensor-specific medians are shown as vertical dashed lines. (a,b) Noise levels differed significantly (p < 1 × 10⁻⁴), with the gyroscope scoring the lowest SNR on both Android (median −7.6 dB) and iOS (−13.1 dB). (c,d) Anomaly levels differed significantly (p < 1 × 10⁻⁴), with GPS showing the lowest APD on both Android (0.013) and iOS (0.010). (e,f) Missingness differed significantly (p < 1 × 10⁻⁴): GPS showed the highest MDR, most markedly on iOS (0.99), while accelerometer data showed lower MDR on Android (0.34) and iOS (0.13).
21 pages, 11115 KiB  
Review
Mobile Devices in Forest Mensuration: A Review of Technologies and Methods in Single Tree Measurements
by Robert Magnuson, Yousef Erfanifard, Maksymilian Kulicki, Torana Arya Gasica, Elvis Tangwa, Miłosz Mielcarek and Krzysztof Stereńczak
Remote Sens. 2024, 16(19), 3570; https://doi.org/10.3390/rs16193570 - 25 Sep 2024
Viewed by 1208
Abstract
Mobile devices such as smartphones and tablets are becoming increasingly important as measurement devices in forestry thanks to their advanced sensors, including RGB cameras and LiDAR systems. This review examines the current state of mobile-device applications for measuring biometric characteristics of individual trees and presents the technologies, applications, measurement accuracy, and implementation barriers. Passive sensors such as RGB cameras have proven their potential for 3D reconstruction and point cloud analysis, improving information collection at the single-tree level. Active sensing with LiDAR-equipped smartphones provides precise quantitative measurements but is limited by specific hardware requirements, and the combination of passive and active sensing techniques has shown significant potential for comprehensive data collection. The data collection methods, both physical and digital, significantly affect the accuracy and reproducibility of measurements. Applications such as ForestScanner and TRESTIMA™ have automated the measurement of tree characteristics and simplified data collection; however, environmental conditions and sensor limitations pose challenges, and there are computational obstacles, as many methods require significant post-processing. The review highlights advances in mobile device-based forestry applications and emphasizes the need for standardized protocols and cross-device benchmarking. Future research should focus on developing robust algorithms and cost-effective solutions to improve measurement accuracy and accessibility. While mobile devices offer significant potential for forest surveying, overcoming the above-mentioned challenges is critical to optimizing their application in forest management and protection.
Figure 1. Word cloud of 40 keywords representing the main topics and concepts of the reviewed articles; the selected keywords occurred at least twice.
Figure 2. Overview of the initial search and screening process for this study in three databases: Google Scholar, Scopus, and Web of Science. The search terms (bolded in each section) and inclusion criteria are detailed for each database, with the number of entries excluded after the initial screening also noted.
Figure 3. Devices, auxiliary equipment, and applications discussed in the 34 reviewed articles, labeled with their reference numbers.
Figure 4. Data collection methods for mobile-device assessment of tree attributes: indirect vs. direct.
Figure 5. RGB images of a beaver-damaged oak (Quercus robur) in Leśnictwo Zielony Dwór Forest, Gajewo, Poland (1 April 2024), taken with an iPhone 15 Pro Max (a), and the results of the commonly used applications Polycam (b), ForestScanner (c), 3D Scanner App (d), and PIX4Dcatch (e), installed on the iPhone 15 Pro Max to scan the tree and process its point clouds. Blue boxes in (e) show the locations of photos taken automatically by the phone.
20 pages, 36563 KiB  
Article
grARffiti: The Reconstruction and Deployment of Augmented Reality (AR) Graffiti
by Naai-Jung Shih and Ching-Hsuan Kung
Technologies 2024, 12(9), 169; https://doi.org/10.3390/technologies12090169 - 17 Sep 2024
Viewed by 1616
Abstract
Graffiti relies on social instrumentation for its creation on spatial structures. It is questioned whether different mechanisms exist to transfer social and spatial hierarchies under a new model for better engagement, management, and governance. This research aims to replace physical graffiti using augmented [...] Read more.
Graffiti relies on social instrumentation for its creation on spatial structures. This raises the question of whether different mechanisms can transfer social and spatial hierarchies under a new model offering better engagement, management, and governance. This research aims to replace physical graffiti with smartphone-based augmented reality (AR). Contact-free AR graffiti starts with the creation of 3D graffiti, followed by an upload to an AR cloud platform, quick response (QR) code access, and site deployment, leading to a secondary reconstruction of the field scene from smartphone screenshots. The working structure was built on a first 3D reconstruction of graffiti details as AR models and a second 3D reconstruction of field graffiti on different backgrounds using a photogrammetry method. The 3D graffiti can be geotagged as a personal map and 3D printed for collections. This culture-engaged AR creates a two-way method of interacting with spatial structures, in which the results are collected as a self-governed form of social media. The reinterpreted context is represented by a virtual 3D sticker or a symbolized name card shared on the cloud. The hidden or social hierarchy is reinterpreted through a sense of ritual without altering any physical space. The application of digital stickers in AR redefines the spatial order, typology, and governance of graffiti. Full article
(This article belongs to the Special Issue Immersive Technologies and Applications on Arts, Culture and Tourism)
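As a rough illustration of the QR-code step in this workflow, the snippet below generates a code that resolves to a cloud-hosted model URL. It is a minimal sketch rather than the authors' implementation: the URL and filename are hypothetical, and the third-party Python qrcode package (pip install qrcode[pil]) is assumed.

```python
# Sketch (assumed workflow, not the paper's code): generate a QR code that
# links a physical site to a cloud-hosted AR graffiti model.
import qrcode

# Hypothetical URL of an uploaded 3D graffiti asset on an AR cloud platform.
model_url = "https://example-ar-cloud.com/models/graffiti-0042"

img = qrcode.make(model_url)      # returns a PIL image of the QR code
img.save("graffiti_site_qr.png")  # print and attach at the deployment site
```

Scanning the printed code at the site opens the AR model on a smartphone, so the wall itself is never painted or altered, which is the contact-free premise of the approach.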
Figures

Figure 1: Observations of graffiti (painters unknown, Harvard, Boston, 2011; Berkeley, San Francisco, 2004).
Figure 2: (a) Research flowchart; (b) tasks, tools, and platforms; and (c) three-dimensional reconstruction process.
Figure 3: (a) Seven official sites and a number of observed occurrences around Taipei Metro; (b) three-dimensional documented field graffiti, indicated as no. 14 in the red circled area of (a).
Figure 4: Physical and first 3D reconstructed models: (a) 3D physical model and printed replicate in a gray color; (b) 3D scanned model; (c) computer model; and (d) 3D scanned model. Models (b–d) are presented in AR form.
Figure 5: Second reconstruction: (a) model created using 3D software; (b) screenshots of field composition; (c) second reconstruction process: blue lines are drawn to mask and remove the screenshot background; and (d) model created.
Figure 6: Evolved fabric exemplified from historical maps of (a) 1916 [48], (b) 1945 [48], and (c) 2024 [48] around the circled area of the graffiti site.
Figure 7: The evolving fabric around the Jingmei graffiti area (red circled area): (a) northern Taiwan; (b) 3D urban model; (c) 1947 aerial image [48]; (d) 1963 aerial image [48]; (e) 2003 Google® Maps image, where a new riverfront recreation area and bike routes have been added since 2011; (f) field scenes; and (g) aligning the 3D point cloud model and QGIS® (Quantum Geographic Information System) within the circled area.
Figure 8: The images of the three-dimensional model of public graffiti sites on the river embankment: (a) in different scales; and (b) at different locations.
Figure 9: Field deployments of first reconstructed models: (a) graffiti on a utility box using last-name characters; (b) foreground and background differentiation using 3D-scanned wood texture; (c) graffiti with the moon in the Mid-Autumn Festival; and (d) offsite and field deployment using seats and frames of different textures and forms to join the conversation made with existing graffiti or the environment.
Figure 10: Three-dimensional AR graffiti models: (a) benches for "Have a seat!" and picture frames; (b) semi-transparent overlay; and (c) graffiti source image sets.
Figure 11: Second reconstructed models combined scenes and stools of different textures and forms.
Figure 12: (a) Model of 3D graffiti with the background; (b) model of 3D graffiti with sticker board on the background; (c) 3D printing interface; and (d) 3D-printed results.
Figure 13: The infill of field images: (a) instructP2P (1912504687, 2797104547); (b) inpaint, instructP2P; and (c) inpaint.
Figure 14: Spatial structure reinterpreted via identity propagation of graffiti: (a) cross-wall composition; (b) prelude or postlude: at least three personal icons deployed outside the main graffiti; (c) bulletin board takeover in front and back; and (d) the same graffiti used to declare or replot personal territory.
Figure 15: Graffiti stickers: (a) purchased collections; (b) deployment outside and inside an elevator; and (c) deployment on a bulletin board.
Figure 16: Assessment of model configuration: (a) comparison by checking the alignment statistics; (b) AR graffiti model and field reconstructed 3D models; and (c) global and manual registrations with untrimmed and trimmed boundaries.
29 pages, 6780 KiB  
Article
Phenological and Biophysical Mediterranean Orchard Assessment Using Ground-Based Methods and Sentinel 2 Data
by Pierre Rouault, Dominique Courault, Guillaume Pouget, Fabrice Flamain, Papa-Khaly Diop, Véronique Desfonds, Claude Doussan, André Chanzy, Marta Debolini, Matthew McCabe and Raul Lopez-Lozano
Remote Sens. 2024, 16(18), 3393; https://doi.org/10.3390/rs16183393 - 12 Sep 2024
Viewed by 974
Abstract
A range of remote sensing platforms provide high spatial and temporal resolution insights that are useful for monitoring vegetation growth. Very few studies have focused on fruit orchards, largely due to the inherent complexity of their structure: fruit trees are mixed with inter-rows that can be grassed or non-grassed, and there are no standard protocols for ground measurements suitable for the range of crops. The assessment of biophysical variables (BVs) for fruit orchards from optical satellites therefore remains a significant challenge. The objectives of this study are as follows: (1) to address the challenges of extracting and better interpreting biophysical variables from optical data by proposing new ground measurement protocols tailored to various orchards with differing inter-row management practices, (2) to quantify the impact of the inter-row at the Sentinel pixel scale, and (3) to evaluate the potential of Sentinel 2-derived BVs for monitoring orchard development and detecting key phenological stages, such as flowering and fruit set. Several orchards in two pedo-climatic zones in southeast France were monitored for three years: four apricot and nectarine orchards under different management systems and nine cherry orchards with differing tree densities and inter-row surfaces. We provide the first comparison of three established ground-based methods of assessing BVs in orchards: (1) hemispherical photographs, (2) a ceptometer, and (3) the Viticanopy smartphone app. The major phenological stages, from budburst to fruit growth, were also determined by in situ annotations on the same fields monitored using Viticanopy. In parallel, Sentinel 2 images from the two study sites were processed using a Biophysical Variable Neural Network (BVNET) model to extract the main BVs, including the leaf area index (LAI), the fraction of absorbed photosynthetically active radiation (FAPAR), and the fraction of green vegetation cover (FCOVER). The temporal dynamics of the normalised FAPAR were analysed, enabling the detection of the fruit set stage. A new aggregative model was applied to data from hemispherical photographs taken under trees and within inter-rows, enabling us to quantify the impact of the inter-row at the Sentinel 2 pixel scale. The resulting values, compared to BVs computed from Sentinel 2, gave statistically significant correlations (0.57 for FCOVER and 0.45 for FAPAR, with respective RMSE values of 0.12 and 0.11). Viticanopy appears promising for assessing the PAI (plant area index) and FCOVER of orchards with grassed inter-rows, showing significant correlations with the Sentinel 2 LAI (R² of 0.72, RMSE 0.41) and FCOVER (R² of 0.66, RMSE 0.08). Overall, our results suggest that Sentinel 2 imagery can support orchard monitoring via indicators of development and inter-row management, offering data that are useful to quantify production and enhance resource management. Full article
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)
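The component proportions quoted in the Figure 7 caption below (100·FCOVER_t/FCOVER_c for the trees and 100·((1−FCOVER_t)·FCOVER_g)/FCOVER_c for the inter-row) are consistent with a gap-weighted aggregation of the two layers. The sketch below implements that form; the exact Equation (1) is defined in the full paper, and the function and variable names here are assumptions.

```python
# Sketch of a gap-weighted aggregation at the Sentinel 2 pixel scale,
# consistent with the component proportions quoted in the Figure 7 caption
# (the exact Equation (1) is given in the full paper).
def aggregate_fcover(fcover_tree: float, fcover_grass: float) -> float:
    """Combine tree and inter-row cover fractions into a pixel-scale FCOVER.

    fcover_tree:  fraction of the pixel covered by tree crowns (0-1)
    fcover_grass: green cover fraction of the inter-row surface (0-1)
    """
    # Whatever the crowns do not cover is inter-row; only its green
    # fraction contributes to the composite cover.
    return fcover_tree + (1.0 - fcover_tree) * fcover_grass

# Example: 35% crown cover over a grassed inter-row that is 60% green.
fc = aggregate_fcover(0.35, 0.60)
print(f"composite FCOVER: {fc:.2f}")                 # 0.74
# Tree share of the composite, as plotted in Figure 7: 100*FCOVER_t/FCOVER_c
print(f"tree contribution: {100 * 0.35 / fc:.0f}%")  # ~47%
```

A composite of this form makes explicit why the inter-row can dominate the satellite signal in grassed orchards: with modest crown cover, most of the pixel-scale FCOVER comes from the grass term.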
Figures

Figure 1: Schematic of the three approaches used to monitor orchard development at different spatial scales throughout the year (from tree level for phenological observations to watershed level using Sentinel 2 data).
Figure 2: (a) Locations of the monitored orchards in the Ouvèze–Ventoux watershed (green points at right) and in the La Crau area (yellow points at left); (b) pictures of 2 cherry orchards (13 September and 22 July 2022): top, non-grassed orchard drip-irrigated by two rows of drippers, and bottom, grassed orchard drip-irrigated in summer; (c) pictures of 2 orchards in La Crau (top, nectarine tree in spring, 22 March 2023, and bottom, in summer, 26 June 2022).
Figure 3: (a) Main steps in processing the hemispherical photographs. (b) The three methods of data acquisition around the central tree. (c) Protocol used with hemispherical photographs. (d) Protocol used with the Viticanopy application, with 3 trees monitored in the four directions (blue arrows). (e) Protocols used with the ceptometer: P1 measured in the shadow of the trees (blue) and P2 in the inter-rows (black).
Figure 4: Protocol for the monitoring of the phenological stages of cherry trees. (a) Phenology of cherry trees according to BBCH; (b) at plot scale, in an orchard, three trees in red monitored by observations (BBCH scale); (c) at tree scale, two locations are selected to classify the flowering stage in the tree; and (d) flowering stage of a cherry tree in April 2022.
Figure 5: Comparison of temporal profiles of the interpolated Sentinel 2 LAI (black line) and PAI obtained from the ceptometer (blue line, P2 protocol) and Viticanopy (green line) for three orchards: (a) 3099 (cherry, grassed, Ouvèze), (b) 183 (cherry, non-grassed, Ouvèze), and (c) 4 (nectarine, La Crau) at the beginning of 2023.
Figure 6: Comparison between Sentinel 2 LAI and PAI from (a) ceptometer measurements taken at all orchards of the two areas (La Crau and Ouvèze), (b) Viticanopy measurements at all orchards, and (c) Viticanopy measurements excluding 2 non-grassed orchards (183, 259). The black line represents the optimal 1:1 correlation; the red line represents the results from linear regression.
Figure 7: (a, top graphs) Proportion of tree (orange, 100·FCOVER_t/FCOVER_c, see Equation (1)) and inter-row (green, 100·((1−FCOVER_t)·FCOVER_g)/FCOVER_c) components computed from hemispherical photographs used to estimate FCOVER for two dates, 22 March 2022 (doy 81) and 21 June 2022 (doy 172), for all the monitored fields. (b, bottom graphs) For two plots, left, field 183.2 and right, field 3099.1, temporal variations in the proportion of tree and inter-row components for the different observation dates in 2022.
Figure 8: (a) Averaged percentage of grass contribution to FAPAR computed from hemispherical photographs according to Equation (1) for all grassed orchard plots in 2022. Examples of Sentinel 2 FAPAR dynamics (black lines) for plots at (b) non-grassed site 183 and (c) grassed site 1418. Initial values of FAPAR, as computed from BVNET, are provided in black. The green line represents adjusted FAPAR after subtracting the grass contribution (percentage obtained from hemispherical photographs); it corresponds to the FAPAR of the trees only. The percentage of grass contribution is in red.
Figure 9: Correlation between (a) FCOVER obtained from hemispherical photographs (from Equation (1)) for all orchards of the two studied areas and FCOVER from Sentinel 2 computed with BVNET; (b) FAPAR from hemispherical photographs and FAPAR from Sentinel 2 for all orchards and for the 3 years; (c) FCOVER from Viticanopy and from Sentinel 2 for all orchards of the two areas, except 183 and 259; and (d) FCOVER from upward-aimed hemispherical photographs and from Viticanopy for all plots.
Figure 10: (a) LAI temporal profiles obtained from BVNET applied to Sentinel 2 data averaged at plot and field scales (field 3099) for the year 2022 and (b) soil water stock (in mm, in blue) computed at 0–50 cm using capacitive sensors (described in Section 2.1), with rainfall recorded at the Carpentras station (see Supplementary Part S1 and Table S1).
Figure 11: Time series of FCOVER (mean value at field scale) for the cherry trees in field 3099 in the Ouvèze area from 2016 to 2023.
Figure 12: Sentinel 2 FAPAR evolution in 2022 for two cherry tree fields, with the date of flowering observation (in green) and the date of fruit set observation (in red) for (a) plot 183 (non-grassed cherry trees) and (b) plot 3099 (grassed cherry trees).
Figure 13: Variability in dates for the phenological stages of a cherry tree orchard (plot 3099) observed in 2022.
Figure 14: (a) Normalised FAPAR computed for all observed cherry trees relative to observation dates for BBCH stages in the Ouvèze area in 2021 for five plots. (b) Map of dates distinguishing between flowering and fruit set stages for 2021, obtained by thresholding FAPAR images.