Search Results (443)

Search Parameters:
Keywords = smart phone

21 pages, 4402 KiB  
Article
Smart Compression Sock for Early Detection of Diabetic Foot Ulcers
by Julia Billings, Julia Gee, Zinah Ghulam and Hussein A. Abdullah
Sensors 2024, 24(21), 6928; https://doi.org/10.3390/s24216928 - 29 Oct 2024
Viewed by 771
Abstract
The prevention of diabetic foot ulcers remains a critical challenge. This study evaluates a smart compression sock designed to address this issue by integrating temperature, plantar pressure, and blood oxygen sensors and monitoring data recorded by these sensors. The smart sock, developed with input from a certified Pedorthist, was tested on 20 healthy adult participants aged 16 to 53. It includes four temperature sensors and pressure sensors at common ulcer sites (first and fifth metatarsal heads, calcaneus, and hallux), and a blood oxygen sensor on the hallux. The sensors are monitored, and their transduced data are collected and stored through an app installed on a personal cell phone. The mobile app interface is user-friendly, providing intuitive navigation and easy access to the sensors’ data. Using repeated measures ANOVA and post hoc tests, we analyzed the impact of various physical activities on physiological changes in the foot. The device effectively detected significant variations in blood oxygen, temperature, and pressure across six activities. Statistical analyses revealed significant differences based on activity type and sensor location. These results highlight the smart sock’s sensitivity and accuracy, suggesting its potential to prevent diabetic foot ulcers. Further clinical trials are needed to evaluate its efficacy in a larger, more diverse population. Full article
(This article belongs to the Section Biomedical Sensors)
Figures:
Figure 1: Final design of the device: (a) 3D overview and (b) actual prototype built for data collection.
Figure 2: Mobile app interface developed for the sock to provide the user with data [29].
Figure 3: Flow diagram of the device.
Figure 4: Mean temperatures across all participants by foot sensor location for each activity.
Figure 5: Foot temperature boxplot with pairwise comparisons.
Figure 6: Mean force across all participants for each activity.
Figure 7: Pairwise comparisons of activities.
Figure 8: Pairwise comparisons of foot sensor locations.
Figure 9: Mean FSR by activity state across all participants.
Figure 10: Mean SpO₂ by activity state across all participants.
23 pages, 2454 KiB  
Article
CO-TSM: A Flexible Model for Secure Embedded Device Ownership and Management
by Konstantinos Markantonakis, Ghada Arfaoui, Sarah Abu Ghazalah, Carlton Shepherd, Raja Naeem Akram and Damien Sauveron
Smart Cities 2024, 7(5), 2887-2909; https://doi.org/10.3390/smartcities7050112 - 8 Oct 2024
Viewed by 1029
Abstract
The Consumer-Oriented Trusted Service Manager (CO-TSM) model has been recognised as a significant advancement in managing applications on Near Field Communication (NFC)-enabled mobile devices and multi-application smart cards. Traditional Trusted Service Manager (TSM) models, while useful, often result in market fragmentation and limit widespread adoption due to their centralised control mechanisms. The CO-TSM model addresses these issues by decentralising management and offering greater flexibility and scalability, making it more adaptable to the evolving needs of embedded systems, particularly in the context of the Internet of Things (IoT) and Radio Frequency Identification (RFID) technologies. This paper provides a comprehensive analysis of the CO-TSM model, highlighting its application in various technological domains such as smart cards, HCE-based NFC mobile phones, TEE-enabled smart home IoT devices, and RFID-based smart supply chains. By evaluating the CO-TSM model’s architecture, implementation challenges, and practical deployment scenarios, this paper demonstrates how CO-TSM can overcome the limitations of traditional TSM approaches. The case studies presented offer practical insights into the model’s adaptability and effectiveness in real-world scenarios. Through this examination, the paper aims to underscore the CO-TSM model’s role in enhancing scalability, flexibility, and user autonomy in secure embedded device management, while also identifying areas for future research and development. Full article
Figures:
Figure 1: Generic TSM deployment architecture.
Figure 2: The TSM deployment models proposed by GlobalPlatform.
Figure 3: GSMA's TSM Proposal: Mode 1.
Figure 4: GSMA's TSM Proposal: Mode 2.
Figure 5: GSMA's TSM Proposal: Mode 3.
Figure 6: Overview of the Consumer-Oriented Trusted Service Manager (CO-TSM) model.
Figure 7: NFC-enabled device using SE- and HCE-based card emulation (Source: Smart Card Alliance [13]).
Figure 8: GlobalPlatform TEE hardware architecture (Source: GlobalPlatform Specification [15]). Trusted components are shown in blue; untrusted units are uncoloured.
Figure 9: GlobalPlatform TEE software architecture (Source: GlobalPlatform Specification [16]).
Figure 10: Ecosystem of an HCE-TEE enabled device.
Figure 11: Generic smart home architecture.
Figure 12: Illustrative example of a smart supply chain with integrated RFID technology (Source: Gupta et al. [32]).
Figure 13: Sequence diagram illustrating the ownership transfer process in a CO-TSM-based smart supply chain.
18 pages, 2511 KiB  
Article
Smart City Aquaculture: AI-Driven Fry Sorting and Identification Model
by Chang-Yi Kao and I-Chih Chen
Appl. Sci. 2024, 14(19), 8803; https://doi.org/10.3390/app14198803 - 30 Sep 2024
Viewed by 655
Abstract
The development of smart agriculture has become a critical issue for the future of smart cities, with large-scale management of aquaculture posing numerous challenges. Particularly in the fish farming industry, producing single-sex fingerlings (especially male fingerlings) is crucial for enhancing rearing efficiency and could even provide key support in addressing future global food demands. However, traditional methods of manually selecting the gender of broodfish rely heavily on experienced technicians, are labor-intensive and time-consuming, and present significant bottlenecks in improving production efficiency, thus limiting the capacity and sustainable development potential of fish farms. In response to this situation, this study has developed an intelligent identification system based on the You Only Look Once (YOLO) artificial intelligence (AI) model, specifically designed for analyzing secondary sexual characteristics and gender screening in farmed fish. Through this system, farmers can quickly photograph the fish’s cloaca using a mobile phone, and AI technology is then used to perform real-time gender identification. The study involved two phases of training with different sample sets: in the first phase, the AI model was trained on a single batch of images with varying parameter conditions. In the second phase, additional sample data were introduced to improve generalization. The results of the study show that the system achieved an identification accuracy of over 95% even in complex farming environments, significantly reducing the labor costs and physical strain associated with traditional screening operations and greatly improving the production efficiency of breeding facilities. This research not only has the potential to overcome existing technological bottlenecks but also may become an essential tool for smart aquaculture. As the system continues to be refined, it is expected to be applicable across the entire life cycle management of fish, including gender screening during the growth phase, thereby enabling a more efficient production and management model. This not only provides an opportunity for technological upgrades in the aquaculture industry but also promotes the sustainable development of aquaculture. The smart aquaculture solution proposed in this study demonstrates the immense potential of applying AI technology to the aquaculture industry and offers strong support for global food security and the construction of smart cities. Full article
(This article belongs to the Special Issue IoT in Smart Cities and Homes, 2nd Edition)
Figures:
Figure 1: Research methodology framework.
Figure 2: Results for Sample A with and without fish fin annotation: (a) Sample A male fish test, no fin annotation; (b) Sample A female fish test, no fin annotation; (c) Sample A male fish test, with fin annotation; (d) Sample A female fish test, with fin annotation.
Figure 3: Comparison results for Sample A male fish with no fin annotation, angle rotation, and random scaling: (a) random 180° rotation; (b) random scaling; (c) no rotation; (d) fixed size.
Figure 4: Test results for Sample B imaging: (a) male fish test with fixed image size; (b) female fish test with fixed image size; (c) male fish test with random image scaling; (d) female fish test with random image scaling.
Figure 5: Overall sample recognition accuracy: (a) female fish recognition accuracy, 96.57%; (b) male fish recognition accuracy, 95.30%.
Figure 6: Female fish recognition rate distribution charts: (a) high-recognition-rate distribution; (b) low-recognition-rate distribution.
Figure 7: Male fish recognition rate distribution charts: (a) high-recognition-rate distribution; (b) low-recognition-rate distribution.
21 pages, 796 KiB  
Article
High-Performance Grape Disease Detection Method Using Multimodal Data and Parallel Activation Functions
by Ruiheng Li, Jiarui Liu, Binqin Shi, Hanyi Zhao, Yan Li, Xinran Zheng, Chao Peng and Chunli Lv
Plants 2024, 13(19), 2720; https://doi.org/10.3390/plants13192720 - 28 Sep 2024
Cited by 1 | Viewed by 665
Abstract
This paper introduces a novel deep learning model for grape disease detection that integrates multimodal data and parallel heterogeneous activation functions, significantly enhancing detection accuracy and robustness. Through experiments, the model demonstrated excellent performance in grape disease detection, achieving an accuracy of 91%, a precision of 93%, a recall of 90%, a mean average precision (mAP) of 91%, and 56 frames per second (FPS), outperforming traditional deep learning models such as YOLOv3, YOLOv5, DEtection TRansformer (DETR), TinySegformer, and Tranvolution-GAN. To meet the demands of rapid on-site detection, this study also developed a lightweight model for mobile devices, successfully deployed on the iPhone 15. Techniques such as structural pruning, quantization, and depthwise separable convolution were used to significantly reduce the model’s computational complexity and resource consumption, ensuring efficient operation and real-time performance. These achievements not only advance the development of smart agricultural technologies but also provide new technical solutions and practical tools for disease detection. Full article
Figures:
Figure 1: Dataset samples: (a) powdery mildew; (b) anthracnose; (c) black rot; (d) gray mold; (e) white rot.
Figure 2: Overall architecture: input from different data sources is processed through independent feature extraction paths, integrated and optimized by the parallel heterogeneous activation function module and the multimodal fusion module, and finally output as disease detection results.
Figure 3: Structural schematic of the parallel heterogeneous activation function module, in which activation functions such as ReLU, LeakyReLU, and PReLU work in parallel in the same network layer to enhance the model's ability to process diverse data features.
Figure 4: Structural schematic of the multimodal fusion module, which processes data from different sensors with specific convolution operations and residual connections, using shuffle and shift techniques to merge features and improve detection accuracy and robustness.
47 pages, 2588 KiB  
Article
Observations and Considerations for Implementing Vibration Signals as an Input Technique for Mobile Devices
by Thomas Hrast, David Ahlström and Martin Hitz
Multimodal Technol. Interact. 2024, 8(9), 76; https://doi.org/10.3390/mti8090076 - 2 Sep 2024
Viewed by 843
Abstract
This work examines swipe-based interactions on smart devices, like smartphones and smartwatches, that detect vibration signals through defined swipe surfaces. We investigate how these devices, held in users’ hands or worn on their wrists, process vibration signals from swipe interactions and ambient noise using a support vector machine (SVM). The work details the signal processing workflow involving filters, sliding windows, feature vectors, SVM kernels, and ambient noise management. It includes how we separate the vibration signal from a potential swipe surface and ambient noise. We explore both software and human factors influencing the signals: the former includes the computational techniques mentioned, while the latter encompasses swipe orientation, contact, and movement. Our findings show that the SVM classifies swipe surface signals with an accuracy of 69.61% when both devices are used, 97.59% with only the smartphone, and 99.79% with only the smartwatch. However, the classification accuracy drops to about 50% in field user studies simulating real-world conditions such as phone calls, typing, walking, and other undirected movements throughout the day. The decline in performance under these conditions suggests challenges in ambient noise discrimination, which this work discusses, along with potential strategies for improvement in future research. Full article
Figures:
Figure 1: Picture (a) shows the swipe path on the different surfaces in this work, whereas picture (b) points out the relationship between bumps and spaces on these surfaces.
Figure 2: The flow diagram illustrates the process of capturing the vibration signal and classifying it into a feature vector using the SVM, and shows the implementation of the Java library libSVM and its corresponding software components; it covers the software-determined aspects of the process. Section 5.1.1 explains the stages shown in the flow diagram in detail, focusing on the creation of the sliding window w_i as depicted in Figure 3. The feature elements of the feature vector Υ are listed in Table 3, Table 4 presents the values for ω_c and the kernels applied for the SVM, and Table 2 displays the svm_parameter settings for the SVM.
Figure 3: Stages one to five (①–⑤) illustrate the process of performing a swipe gesture over a textured surface. From this time series s, sliding windows w are extracted. Two windows from each swipe movement are used to train the SVM on the swipe surface; one window is used to classify the swipe surface.
Figure 4: Vibration signals when a participant swipes over the swipe surface while the hand is moving. The first plot again highlights the five stages of a swipe gesture. The swipe contact is the nail and the swipe orientation is horizontal; these two conditions are fixed during the recorded time samples.
Figure 5: Swiping with different movement behaviors on the surface.
Figure 6: Confusion matrices (multiclass SVMs) and bar charts (one-class SVMs) for the best conditions from Table 5, indicated by the column names; the row names represent the user studies.
Figure 7: Confusion matrices (multiclass SVMs) and bar charts (one-class SVMs) for the worst conditions from Table 5, indicated by the column names; the row names represent the user studies.
Figure 8: Confusion matrices (multiclass SVM) and bar charts (one-class SVM) illustrating the results under the best conditions during the in-field user study. The column names give the best SVM conditions from Table 5; the row names correspond to the hand poses depicted in Figure 6.
Figure 9: Confusion matrices (multiclass SVM) and bar charts (one-class SVM) illustrating the results under the worst conditions during the in-field user study. The column names give the worst SVM conditions from Table 5; the row names correspond to the hand poses depicted in Figure 6.
Figure 10: Ambient noise signals misclassified as swipe gestures. The best and worst conditions are taken from Table 5.
Figure A1: The vibration signal shape while participants swipe over their closed fingers.
Figure A2: Feature vector shapes for swipe surfaces. The best and worst conditions are taken from Table 5.
Figure A3: Feature vector shapes for ambient noise signals. The best and worst conditions are taken from Table 5.
Figure A4: Schematic of the best and worst feature vector conditions for high classification accuracy of the different swipe surfaces in Figure 1.
13 pages, 373 KiB  
Article
Ambient Backscatter-Based User Cooperation for mmWave Wireless-Powered Communication Networks with Lens Antenna Arrays
by Rongbin Guo, Rui Yin, Guan Wang, Congyuan Xu and Jiantao Yuan
Electronics 2024, 13(17), 3485; https://doi.org/10.3390/electronics13173485 - 2 Sep 2024
Viewed by 508
Abstract
With the rapid consumer adoption of mobile devices such as tablets and smart phones, tele-traffic has experienced a tremendous growth, making low-power technologies highly desirable for future communication networks. In this paper, we consider an ambient backscatter (AB)-based user cooperation (UC) scheme for mmWave wireless-powered communication networks (WPCNs) with lens antenna arrays. Firstly, we formulate an optimization problem to maximize the minimum rate of two users by jointly designing power and time allocation. Then, we introduce auxiliary variables and transform the original problem into a convex form. Finally, we propose an efficient algorithm to solve the transformed problem. Simulation results demonstrate that the proposed AB-based UC scheme outperforms the competing schemes, thus improving the fairness performance of throughput in WPCNs. Full article
Figures:
Figure 1: System model and transmission protocol for UC.
Figures 2–5: Performance of different schemes.
15 pages, 1407 KiB  
Study Protocol
Digital Platform for the Prevention of Suicidal Behaviour and Non-Suicidal Self-Injuries in Adolescents: The SmartCrisis-Teen Study Protocol
by Sofía Abascal-Peiró, Inmaculada Peñuelas-Calvo, Adrian Alacreu-Crespo, Pilar Alejandra Sáiz, Alejandro De la Torre-Luque, Miguel Ruiz-Veguilla, María Luisa Barrigón, Philippe Courtet, Jorge López-Castroman, Enrique Baca-García and Alejandro Porras-Segovia
Behav. Sci. 2024, 14(9), 740; https://doi.org/10.3390/bs14090740 - 25 Aug 2024
Viewed by 848
Abstract
Suicidal behavior and Non-Suicidal Self-Injuries (NSSIs) are a major health problem in the adolescent population. New technologies can contribute to the development of innovative interventions in suicide prevention. Here, we present the SmartCrisis-Teen study protocol. The study consists of a randomized clinical trial which aims to evaluate the effectiveness of a digital safety plan to prevent suicidal behavior and NSSIs in adolescents. This is a multicentric study which will be conducted among the adolescent population, both in clinical and student settings, with a target sample of 1080 participants. The intervention group will receive an Ecological Momentary Intervention (EMI) consisting of a digital safety plan on their mobile phone. All participants will receive their Treatment As Usual (TAU). Participants will be followed for six months, with weekly and monthly telephone visits and face-to-face visits at three and six months. Participants will be assessed using traditional questionnaires as well as Ecological Momentary Assessment (EMA) and Implicit Association Tests (IATs). With this intervention, we expect a reduction in NSSIs through the acquisition of coping strategies and a decrease in suicidal behavior over the course of follow-up. This study provides a novel, scalable digital intervention for preventing suicidal behavior and NSSIs in adolescents, which could contribute to improving adolescent mental health outcomes globally. Full article
(This article belongs to the Section Psychiatric, Emotional and Behavioral Disorders)
Figures:
Figure 1: SmartCrisis-Teen study design.
Figure 2: Example of customization of the "Reasons for living" tab.
Figure 3: Tabs of the safety plan.
35 pages, 4465 KiB  
Review
A Review of Gas Sensors for CO2 Based on Copper Oxides and Their Derivatives
by Christian Maier, Larissa Egger, Anton Köck and Klaus Reichmann
Sensors 2024, 24(17), 5469; https://doi.org/10.3390/s24175469 - 23 Aug 2024
Viewed by 908
Abstract
Buildings worldwide are becoming more thermally insulated, and air circulation is being reduced to a minimum. As a result, measuring indoor air quality is important to prevent harmful concentrations of various gases that can lead to safety risks and health problems. To measure such gases, it is necessary to produce low-cost and low-power-consuming sensors. Researchers have been focusing on semiconducting metal oxide (SMOx) gas sensors that can be combined with intelligent technologies such as smart homes, smart phones or smart watches to enable gas sensing anywhere and at any time. As a type of SMOx, p-type gas sensors are promising candidates and have attracted more interest in recent years due to their excellent electrical properties and stability. This review paper gives a short overview of the main development of sensors based on copper oxides and their composites, highlighting their potential for detecting CO2 and the factors influencing their performance. Full article
(This article belongs to the Special Issue Gas Sensors: Materials, Mechanism and Applications)
Figures:
Figure 1: Hole accumulation layer of a p-type semiconductor [66].
Figure 2: Band model with band bending for a p-type semiconductor: (a) inert conditions, (b) upward band bending, and (c) decrease of band bending. E_VAC, energy level of electrons in vacuum; E_C, conduction band; E_F, Fermi level; E_A, acceptor state; E_V, valence band; q, electron charge; qV_S, potential barrier [69].
Figure 3: Different types of thin film produced by spray pyrolysis: (a) porous films, (b) dense films, and (c) powdery films [95].
Figure 4: Schematic representation of the magnetron sputtering process [99].
Figure 5: Schematic representation of the thermal evaporation process [102].
Figure 6: Steps of the sol–gel method for fabrication of metal oxides [115].
Figure 7: Surface loading of metal oxide with additives.
Figure 8: (a) Chemical and (b) electronic sensitization [123].
Figure 9: Different metal atoms (blue and red) with (a) bulk integration, (b) randomly arranged bulk integration, and (c) multilayer arranged bulk integration [120].
Figure 10: Work function change of CuO NPs when CO₂ levels are increased from 400 to 4000 ppm between room temperature and 110 °C, at 45% (green) and 60% (blue) humidity [130].
Figure 11: (a) Resistance measurement of pristine and functionalized CuO at 300 °C and 25, 50, and 100% r.h.; (b) CO₂ pulses of 250, 500, 1000, 1500, and 2000 ppm [179].
21 pages, 1560 KiB  
Article
WMLinks: Wearable Smart Devices and Mobile Phones Linking through Bluetooth Low Energy (BLE) and WiFi Signals
by Naixuan Guo, Zhaofeng Chen, Heyang Xu, Yu Liu, Zhechun Zhao and Sen Xu
Electronics 2024, 13(16), 3268; https://doi.org/10.3390/electronics13163268 - 17 Aug 2024
Viewed by 631
Abstract
Wearable smart devices have gradually become indispensable in people’s lives. Their security and privacy have drawn increasing public concern because these devices monitor and record various aspects of users’ daily activities and health data. They maintain a wireless connection with mobile phones through periodic signal transmissions, which can be intercepted and analyzed by external observers. While these signal packets contain valuable information about the device owner, the identity of the actual user remains unknown. In this study, we propose two approaches to link wearable smart devices with users’ mobile phones, which serve as electronic identities, enabling novel applications such as multi-device authentication and user-device graph construction for targeted advertising: a passive-sniffing-based linking approach and an active-interference-based linking approach, which solve the problem of sniffing Bluetooth Low Energy broadcast packets in two stages of Bluetooth Low Energy communication. Through experiments conducted across three scenarios, we demonstrate that seven wearable devices can be successfully linked with an accuracy rate exceeding 80%, approaching 100% when a device is recorded more than 11 times. Additionally, we find that four wearable devices can be linked via the active-interference-based linking approach with an accuracy rate exceeding 70%. Our results highlight the potential of wearable devices and mobile phones as a means of establishing user identities and enabling the development of more sophisticated applications in the field of wearable technology. Full article
(This article belongs to the Special Issue Wearable Device Design and Its Latest Applications)
Figures:
Figure 1: The workflow of the passive-sniffing-based linking approach.
Figure 2: The packet headers of the WiFi signal and BLE broadcast signal.
Figure 3: Comparison of RSS raw data and filtered data.
Figure 4: The workflow of the active-interference-based linking approach.
Figure 5: The device deployment of the active-interference-based linking approach.
Figure 6: The virtual marks of the corridor.
Figure 7: The procedure of searching for the nearest point.
Figure 8: Calculating the actual coordinates of the pedestrian.
Figure 9: The condition of turning on the jammer.
Figure 10: The condition of turning off the jammer.
Figure 11: Tests of the effective sniffing range.
Figure 12: Interference test results of two users at different distances.
Figure 13: Moving trajectories in the indoor hall scenario.
Figure 14: Moving trajectories in the outdoor scenario.
18 pages, 11425 KiB  
Article
SmartVR Pointer: Using Smartphones and Gaze Orientation for Selection and Navigation in Virtual Reality
by Brianna McDonald, Qingyu Zhang, Aiur Nanzatov, Lourdes Peña-Castillo and Oscar Meruvia-Pastor
Sensors 2024, 24(16), 5168; https://doi.org/10.3390/s24165168 - 10 Aug 2024
Viewed by 892
Abstract
Some of the barriers preventing virtual reality (VR) from being widely adopted are the cost and unfamiliarity of VR systems. Here, we propose that in many cases, the specialized controllers shipped with most VR head-mounted displays can be replaced by a regular smartphone, cutting the cost of the system, and allowing users to interact in VR using a device they are already familiar with. To achieve this, we developed SmartVR Pointer, an approach that uses smartphones to replace the specialized controllers for two essential operations in VR: selection and navigation by teleporting. In SmartVR Pointer, a camera mounted on the head-mounted display (HMD) is tilted downwards so that it points to where the user will naturally be holding their phone in front of them. SmartVR Pointer supports three selection modalities: tracker based, gaze based, and combined/hybrid. In the tracker-based SmartVR Pointer selection, we use image-based tracking to track a QR code displayed on the phone screen and then map the phone’s position to a pointer shown within the field of view of the camera in the virtual environment. In the gaze-based selection modality, the user controls the pointer using their gaze and taps on the phone for selection. The combined technique is a hybrid between gaze-based interaction in VR and tracker-based Augmented Reality. It allows the user to control a VR pointer that looks and behaves like a mouse pointer by moving their smartphone to select objects within the virtual environment, and to interact with the selected objects using the smartphone’s touch screen. The touchscreen is used for selection and dragging. The SmartVR Pointer is simple and requires no calibration and no complex hardware assembly or disassembly. We demonstrate successful interactive applications of SmartVR Pointer in a VR environment with a demo where the user navigates in the virtual environment using teleportation points on the floor and then solves a Tetris-style key-and-lock challenge. Full article
(This article belongs to the Section Sensing and Imaging)
Figures:
Figure 1: Illustration of SmartVR Pointer applications in a game-style scenario. On the left, the navigation task within the Viking Village: footprints are shown in bright green when available for selection and turn purple to indicate they will be used for teleporting there upon clicking. On the right, the Tetris-style task: the Tetris shapes are shown in bright colors, the rotation regions in green with black circles; the user drags and drops shapes into the appropriate location and orientation, which is highlighted with the darker shade.
Figure 2: Setup for SmartVR Pointer with the camera mounted on top of the HTC Vive Cosmos Elite HMD.
Figure 3: Comparison of the baseline condition (left) to the usage of SmartVR Pointer (right) with the camera mounted on top of the HTC Vive Pro headset.
Figure 4: Setup for SmartVR Pointer with the camera mounted at the bottom of the Meta Quest 2 HMD.
Figure 5: Using the Vuforia Engine to display a red cube on top of the tracked QR code image displayed on the smartphone.
Figure 6: Teleporting application enabled by this technique, where the user selects footprint-shaped teleport points to navigate the VR environment by placing the pointer on one of the footprints.
Figure 7: Illustration of how the ray-casting mechanism works. (a) The view the player sees while wearing the HMD, with a pointer highlighting the Tetris block in the bottom left for selection. (b) The developer view, showing a black ray cast from the player towards the pointer and through the Tetris block.
Figure 8: Screenshot of the Tetris-like task view shown in Unity from a third-person perspective, illustrating the semi-transparent VR canvas (the rectangle on the right side, darkened here for emphasis). From the user's perspective (the white camera in the middle), the panel appears fixed in front of the viewer.
Figure 9: Teleporting task completion times. (Top) Distribution of completion time per condition; the horizontal line inside each box indicates the median and the box height the interquartile range (IQR). (Bottom) Pairwise mean completion-time difference between conditions; the gray dotted line marks a difference of zero.
Figure 10: Tetris-like task completion times, presented as in Figure 9.
Figure 11: Click success rate. (Top) Distribution of click success rate per condition (median and IQR as in Figure 9). (Bottom) Pairwise mean click-success-rate difference between conditions.
Figure 12: Participants' scores for perceived ease of use per condition (boxplots; 1 = "Least preferred", 7 = "Most preferred").
Figure 13: Participants' scores for condition preference (boxplots; 1 = "Least preferred", 7 = "Most preferred").
Figure 14: Participants' agreement that one of the three SmartVR Pointer modes was easier to use for the teleporting task than the VR controller (1 = "Not agree at all", 7 = "Completely agree").
Figure 15: Participants' agreement that one of the three SmartVR Pointer modes was easier to use for the Tetris-like task than the VR controller (1 = "Not agree at all", 7 = "Completely agree").
17 pages, 3841 KiB  
Article
An Image-Based User Interface Testing Method for Flutter Programming Learning Assistant System
by Soe Thandar Aung, Nobuo Funabiki, Lynn Htet Aung, Safira Adine Kinari, Khaing Hsu Wai and Mustika Mentari
Information 2024, 15(8), 464; https://doi.org/10.3390/info15080464 - 3 Aug 2024
Viewed by 1176
Abstract
Flutter has become popular for providing a uniform development environment for user interfaces (UIs) on smart phones, web browsers, and desktop applications. We have developed the Flutter programming learning assistant system (FPLAS) to assist its novice students’ self-study. We implemented the Docker-based Flutter environment with Visual Studio Code and three introductory exercise projects. However, the correctness of students’ answers is manually checked, although automatic checking is necessary to reduce teachers’ workload and provide quick responses to students. This paper presents an image-based user interface (UI) testing method to automate UI testing by the answer code using the Flask framework. This method produces the UI image by running the answer code and compares it with the image made by the model code for the assignment using ORB and SIFT algorithms in the OpenCV library. One notable aspect is the necessity to capture multiple UI screenshots through page transitions by user input actions for the accurate detection of changes in UI elements. For evaluations, we assigned five Flutter exercise projects to fourth-year bachelor and first-year master engineering students at Okayama University, Japan, and applied the proposed method to their answers. The results confirm the effectiveness of the proposal. Full article
Figures:
Figure 1: Overview of the Docker-based Flutter development environment.
Figure 2: Overview of the image-based UI testing method.
Figure 3: Highlighting differences with the similarity result in UI testing.
Figure 4: To-do list project in Exercise-5.
Figure 5: Similarity scores for all images of the five exercises.
Figure 6: Results for Exercise-1 and Exercise-2.
Figure 7: Results for Exercise-3.
Figure 8: Results for Exercise-4.
Figure 9: Results for Exercise-5.
13 pages, 1058 KiB  
Article
Subjective and Objective Day-to-Day Performance Measures of People with Essential Tremor
by Navit Roth, Adham Salih and Sara Rosenblum
Sensors 2024, 24(15), 4854; https://doi.org/10.3390/s24154854 - 26 Jul 2024
Viewed by 583
Abstract
This paper aims to map the daily functional characteristics of people diagnosed with essential tremor (ET) based on their subjective self-reports. In addition, we provide objective measurements of a cup-drinking task. This study involved 20 participants diagnosed with ET who completed the Columbia University Assessment of Disability in Essential Tremor (CADET) questionnaire that included five additional tasks related to digital equipment operation we wrote. Participants also described task-performance modifications they implemented. To create objective personal performance profiles, they performed a cup-drinking task while being monitored using a sensor measurement system. The CADET’s subjective self-report results indicate that the most prevalent tasks participants reported as having difficulty with or requiring modifications were writing, threading a needle, carrying a cup, using a spoon, pouring, and taking a photo or video on a mobile phone. Analysis of participants’ modifications revealed that holding the object with two hands or with one hand supporting the other were the most prevalent types. No significant correlation was found between the CADET total scores and the cup drinking objective measures. Capturing patients’ perspectives on their functional disability, alongside objective performance measures, is envisioned to contribute to the development of custom-tailored interventions aligned with individual profiles, i.e., patient-based/smart healthcare. Full article
(This article belongs to the Special Issue Innovative Sensors and IoT for AI-Enabled Smart Healthcare)
Figures:
Figure 1: Cup accelerometer axes layout.
Figure 2: Cup position states: State 1, on the table (start position); State 2, mid-duration between States 1 and 3; State 3, near the mouth, before drinking; State 4, near the mouth, after drinking (start of returning the cup to the table); State 5, mid-duration between States 4 and 6; State 6, contact with the table (end of returning movement).
Figure 3: MSE calculation.
Figure 4: MSE-T 5–6 measure vs. CADET score.
18 pages, 3407 KiB  
Article
GenTrajRec: A Graph-Enhanced Trajectory Recovery Model Based on Signaling Data
by Hongyao Huang, Haozhi Xie, Zihang Xu, Mingzhe Liu, Yi Xu and Tongyu Zhu
Appl. Sci. 2024, 14(13), 5934; https://doi.org/10.3390/app14135934 - 8 Jul 2024
Viewed by 659
Abstract
Signaling data are records of the interactions of users’ mobile phones with their nearest cellular stations, which could provide long-term and continuous-time location data of large-scale citizens, and therefore have great potential in intelligent transportation, smart cities, and urban sensing. However, utilizing the raw signaling data often suffers from two problems: (1) Low positioning accuracy. Since the signaling data only describes the interaction between the user and the mobile base station, they can only restore users’ approximate geographical location. (2) Poor data quality. Due to the limitations of mobile signals, user signaling may be missing and drifting. To address the above issues, we propose a graph-enhanced trajectory recovery network, GenTrajRec, to recover precise trajectories from signaling data. GenTrajRec encodes signaling data through spatiotemporal encoders and enhances the traveling semantics by constructing a signaling transition graph. In fusing the spatiotemporal information as well as the deep traveling semantics, GenTrajRec can well tackle the challenge of poor data quality, and recover precise trajectory from raw signaling data. Extensive experiments have been conducted on two real-world datasets from Mobile Signaling and Geolife, and the results confirm the effectiveness of our approach, and the positioning accuracy can be improved from 315 m per point to 82 m per point for signaling data using our network. Full article
(This article belongs to the Special Issue Advances in Image Recognition and Processing Technologies)
Figures:
Figure 1: Signaling data. (a) Base stations cannot represent the real positions of users; they remain some distance from the real trajectory. (b) Trajectories may contain redundant segments.
Figure 2: An overview of the model.
Figure 3: The structure of the feature fusion part, applicable to any two related features; the right panel shows the MLP structure used in this work.
Figure 4: Results for different distances between the matched cell towers and the real positions.
Figure 5: Detailed ablation study on the MSD dataset.
Figure 6: Recovery cases for some of the moving results.
Figure 7: Recovery cases for some of the staying results.
Figure 8: Results for long sequences.
33 pages, 2403 KiB  
Article
The Development Trends of Computer Numerical Control (CNC) Machine Tool Technology
by Kai-Chao Yao, Dyi-Cheng Chen, Chih-Hsuan Pan and Cheng-Lung Lin
Mathematics 2024, 12(13), 1923; https://doi.org/10.3390/math12131923 - 21 Jun 2024
Viewed by 2325
Abstract
In the industrial era, production equipment serves as an essential mother machine. In the global manufacturing industry, components such as laptop computers, mobile phones, and automotive parts all strive for aesthetic appearance. Taiwan’s machine tool industry plays a significant role globally. Faced with the constantly changing market environment, the development and competitive advantage of CNC machines are crucial topics for manufacturers. Domestic manufacturers of computer numerical control machines should move towards the integration of automated equipment to accommodate various advanced parts processing procedures. Smart manufacturing will become the trend of the industry in the future. This study invited experts from academia, industry, and research institutions to conduct expert interviews. Their opinions were compiled and analyzed, supplemented by fuzzy Delphi analysis to establish the development trends of various modules. The feasibility and demand of the product’s functional technology for industrial development were analyzed under three research dimensions and eight technical items. A total of 26 key sub-technical items were identified, achieving an expert consensus level of over 80. Furthermore, the importance ranking was analyzed using the fuzzy analytic hierarchy process, and the consistency tests were passed with C.I. < 0.1 and C.R. < 0.1. Finally, the obtained importance ranking of the hierarchical structure was used to predict the future development of computer numerical control machines through a technology roadmap, helping manufacturers use it as a reference model for future development trends to enhance market competitiveness. Full article
(This article belongs to the Special Issue Fuzzy Applications in Industrial Engineering, 3rd Edition)
Figures:
Figure 1: Double triangular fuzzy function diagram.
Figure 2: Process diagram for planning expert interviews.
Figure 3: Research flow chart of the fuzzy Delphi method.
Figure 4: Process diagram for fuzzy AHP research.
Figure 5: The hierarchical structure for evaluating the development trends of CNC machine tools.
Figure 6: Diagram of the research dimension structure.
Figure 7: Technology roadmap of development trends in computer numerical control (CNC) machine tools.
21 pages, 9698 KiB  
Article
Soft Electrohydraulic Bending Actuators for Untethered Underwater Robots
by Hao Lin, Yihui Chen and Wei Tang
Actuators 2024, 13(6), 214; https://doi.org/10.3390/act13060214 - 8 Jun 2024
Viewed by 1080
Abstract
Traditional underwater rigid robots have some shortcomings that limit their applications in the ocean. In contrast, because of their inherent flexibility, soft robots, which have gained popularity recently, offer greater adaptability, efficiency, and safety than rigid robots. Among them, the soft actuator is the core component to power the soft robot. Here, we propose a class of soft electrohydraulic bending actuators suitable for underwater robots, which realize the bending motion of the actuator by squeezing the working liquid with an electric field. The actuator consists of a silicone rubber film, polydimethylsiloxane (PDMS) films, soft electrodes, silicone oils, an acrylic frame, and a soft flipper. When a square wave voltage is applied, the actuator can generate continuous flapping motions. By mimicking Haliclystus auricula, we designed an underwater robot based on six soft electrohydraulic bending actuators and constructed a mechanical model of the robot. Additionally, a high-voltage square wave circuit board was created to achieve the robot’s untethered motions and remote control using a smart phone via WiFi. The test results show that 1 Hz was the robot’s ideal driving frequency, and the maximum horizontal swimming speed of the robot was 7.3 mm/s. Full article
(This article belongs to the Special Issue Soft Robotics: Actuation, Control, and Application)
Figures:
Figure 1: Soft electrohydraulic bending actuators: (A) structure of the actuator and (B) exploded view of the actuator.
Figure 2: Working principle of the soft electrohydraulic bending actuators. V1, V2, and V3 represent different voltages, where V1 < V2 < V3.
Figure 3: Simulation of the soft electrohydraulic bending actuators: (A) electric potential of the model surface; (B) equipotential lines; (C) electric field lines; (D) superposition of electric field lines and potential lines.
Figure 4: Schematic diagram of the experimental setup: (A) device for measuring actuator bending angle and (B) device for measuring actuator output force.
Figure 5: Graphs for each group (green, red, and blue correspond to silicone oil volumes of 0.7 mL, 0.8 mL, and 0.9 mL, respectively): (A) actuator bending angle vs. actuation voltage and (B) actuator output force vs. actuation voltage.
Figure 6: Maximum bending angle of the soft electrohydraulic bending actuators.
Figure 7: Schematic diagrams and photographs of pressing molds.
Figure 8: Fabrication process of the soft electrode chamber and soft electrode.
Figure 9: Fabrication process of the silicone oil chamber: (A) pressing of the silicone oil chamber; (B) injection of silicone oil.
Figure 10: Photograph of the soft electrohydraulic bending actuator.
Figure 11: Model of the underwater robot.
Figure 12: Force analysis of the underwater robot: (A) static force condition and (B) dynamic force condition.
Figure 13: Actuator physical model: (A) tensile strength of the silicone oil chamber and (B) liquid propulsion.
Figure 14: Composition of the control circuit: (A) principle of the control circuit and (B) high-voltage circuit board.
Figure 15: Photograph of the untethered underwater robot.
Figure 16: Swimming setup of the untethered underwater robot.
Figure 17: Swimming state of the underwater robot at different moments in time; T1–T9 represent the time series.
Figure 18: Input signal frequency versus average robot speed.