Abstract
Human-machine interaction has emerged as a revolutionary and transformative technology, bridging the gap between humans and machines. Gesture recognition, capitalizing on the inherent dexterity of human hands, plays a crucial role in human-machine interaction. However, existing systems often struggle to meet user expectations in terms of comfort, wearability, and seamless daily integration. Here, we propose a handwriting recognition technology utilizing an intelligent hybrid-fabric wristband system. This system integrates spun-film sensors into textiles to form a smart fabric, enabling intelligent functionalities. A thermal encapsulation process is proposed to bond multiple spun-films without additional materials, preserving the lightness, breathability, and stretchability of the spun-film sensors. Furthermore, recognition algorithms facilitate precise handwriting recognition of letters, with an accuracy of 96.63%. This system represents a significant step forward in the development of ergonomic and user-friendly wearable devices for enhanced human-machine interaction, particularly in the virtual world.
Introduction
The convergence of artificial intelligence (AI) and humanoid robotics has significantly accelerated the advancement of human-machine interaction (HMI) technology. HMI now plays a pivotal role in diverse fields, including industrial control and automation1,2, health monitoring3,4,5, virtual and augmented reality (VR/AR)6,7,8, and robotics9. The inherent flexibility and high precision of human hands allow a wide range of tasks to be performed rapidly, making gesture recognition technology an essential component of HMI. Compared with capturing movements from other body parts as the HMI input method, gesture-based interaction offers superior intuitiveness and convenience10,11. Users can execute various commands through simple hand motions, enhancing operational efficiency and user experience. Current gesture recognition systems primarily fall into two categories: non-contact devices (e.g., cameras12 and wireless devices13) and contact devices (e.g., those detecting hand mechanics14,15,16 or surface electromyography17). While non-contact approaches offer ease of use, they are susceptible to environmental interference and require complex setups. Existing contact devices often involve cumbersome accessories attached to the hand or forearm14,17,18,19, compromising user comfort. These factors hinder the user's immersive experience and prevent accurate capture of fine motor actions, such as handwritten-letter recognition. The wrist provides comprehensive information on hand movement, so wristbands offer a compelling solution owing to their strategic location and inherent structure: they can capture comprehensive hand-movement data while reducing the burden on the user. With superior fusion capabilities and ergonomic design principles, wristband wearable devices have proven highly effective in gesture-based HMI20,21,22,23,24.
With technological advancements, expectations for handwriting recognition devices have risen correspondingly25. For virtual handwriting recognition applications18, the paramount qualities are accuracy, comfort, and wearability. Current handwriting recognition technology can be categorized into online25,26,27 and offline recognition28,29,30. Online recognition involves real-time tracking of pen-tip movements, typically on pen-based computer screens or specialized drawing tools; these devices can display handwritten strokes as they are being made. Conversely, offline recognition focuses on capturing and interpreting written text from physical documents or photographs, often employing optical character recognition or intelligent word recognition. However, these methods are often constrained by existing equipment or recognition mechanisms, hindering their application in virtual handwriting scenarios. Leveraging the gesture recognition capability of a wearable wristband to develop a virtual handwriting recognition technology can overcome these shortcomings of current handwriting recognition in virtual environments.
In this work, we propose an intelligent hybrid-fabric wristband system (IHFWs) for virtual handwriting recognition, which also has applications in directional control and object recognition (Fig. 1a). The system leverages a thermal encapsulation process to fabricate a TPU-based sensor directly integrated into the wristband. This composite textile wristband comprises a fabric bandage as a support and an all-spun-film capacitive sensor, both featuring excellent skin conformability (Fig. 1c). The sensor utilizes biocompatible single-spun thermoplastic polyurethane (TPU) as the dielectric layer and a silver nanowire (AgNWs)/TPU film prepared by a blended-spun process as the electrode layer. The heat-assisted packaging process efficiently combines these multi-layered components (Fig. 1e). The resulting sensor boasts key features including lightness, breathability, and stretchability (Fig. 1d, Supplementary Note 1 and Supplementary Figs. 1–3). Furthermore, a 3D-printed enclosure houses the detection circuitry and wireless module, ensuring user comfort and system integrity (Fig. 1b). We have also developed corresponding neural-network recognition algorithms, enabling seamless interaction with applications through wireless transmission of sensor-collected signals (Fig. 1f). This system achieves a remarkable virtual handwriting recognition accuracy of 96.63% across all 26 letters.
a Schematic of the intelligent hybrid-fabric wristband system. b Schematic of the capacitance detection module, including a control circuit and a battery. The control circuit consists of a power module, Bluetooth low energy (BLE) module, and microcontroller unit (MCU). c Photograph of the intelligent hybrid-fabric wristband system (IHFWs) worn on the wrist. d Photograph of a six-channel capacitive pressure sensor composed of all-spun-film. e Cross-link interface morphologies of two films packaged together by thermal encapsulation. The experiment was conducted six times with similar results. f Data flow chart of the wristband human-machine interaction interface system.
Results
Thermal encapsulation process
Integrating functional electro-spun films into textiles to create intelligent fabrics has been a growing area of research for decades31,32. These fabrics are lightweight, breathable, and highly comfortable against human skin, making them suitable for wearable sensors33,34,35,36. Nevertheless, traditional encapsulation techniques that rely on tape, adhesives, or directly covering flexible devices with other materials have been shown to compromise sensor sensitivity and diminish the breathability and lightness of the films9,37,38. To address these limitations, we developed a unique heat-assisted encapsulation method that utilizes the inherent melting properties of polymers to bond multiple electro-spun films without additional materials (Fig. 2a). This method significantly enhances the stability and robustness of the sensor system while maintaining its breathability and lightweight properties.
a Schematic of the thermal encapsulation process. b Scanning electron microscopy (SEM) images of the electrospun film at different heat-encapsulation temperatures (pressure: 100 kPa; mold size: 1 mm × 10 mm). The experiment was conducted four times with similar results. c Schematic of the stainless steel mold and encapsulation process. d Schematic of the sensor array composed of thermoplastic polyurethane (TPU) film and silver nanowires-thermoplastic polyurethane (AgNWs/TPU) film. e–f SEM images of TPU film and AgNWs/TPU film. The experiment was conducted six times with similar results. g Sensitivity curves of three different dielectric layer types. h Signal diagram of device response time. i Response signal of the device at different pressures. j Mechanical durability test. The proposed sensor can perform up to 1500 consecutive press-release cycles.
We conducted differential scanning calorimetry (DSC) analysis to identify the optimum encapsulation temperature for the prepared TPU film. The glass transition temperature of the TPU film is 70 °C, its melting temperature is 109 °C, and the peak heat-absorption temperature during melting is 163 °C (Supplementary Fig. 4). Based on these results, we tested four temperatures around the melting point of the polymer (75 °C, 95 °C, 115 °C, and 135 °C) to evaluate their packaging effectiveness. Figure 2b illustrates the microstructure of the electro-spun film observed at the different temperatures, at a pressure of 100 kPa, showing that the spun film melts during hot pressing at 95–135 °C. Images at other temperatures and pressures are shown in Supplementary Fig. 5. Mechanical tests of the bonding effect show that the optimal hot-pressing result is achieved at 115 °C and 100 kPa (Supplementary Fig. 6). At lower temperatures, the TPU film does not reach a molten state, thus failing to establish an adhesive effect between the multiple layers. Conversely, excessively high temperatures cause uncontrolled liquefaction of the spun film in the heated area, leading to insufficient bonding between the multi-layers (Supplementary Note 2). The encapsulation process utilizes a specially designed mold, which is first heated to 115 °C and then hot-pressed onto a pre-assembled multilayer fabric film (Supplementary Fig. 7). This heat-encapsulation technique boosts the stability and robustness of multi-layer all-fabric capacitive sensor arrays while streamlining the manufacturing process.
Device characteristics
A capacitive sensor array was fabricated utilizing the thermal packaging technology, as illustrated in Fig. 2c. The sensor array consists of four discrete films: two TPU-fiber films functioning as dielectric layers, and upper and lower electrode layers composed of silver nanowires/polyurethane (AgNWs/TPU) composite films (Fig. 2d). Supplementary Fig. 8 illustrates the detailed device fabrication process. The morphologies of the pure TPU film and the AgNWs/TPU composite electrodes are shown in Fig. 2e–f, respectively. Recently, research methods for combining silver nanowires with spun films have emerged39,40. We used in-situ electrospinning and electro-spraying techniques to prepare AgNWs/TPU composite fiber membranes (Supplementary Fig. 9). The upper electrode is patterned using laser cutting. The AgNWs are uniformly interspersed within the TPU fibers, endowing the fiber electrodes with superior electrical properties and durability (Supplementary Fig. 10). To achieve accurate detection of subtle hand-wrist movements, we further explored the effect of the capacitive sensor's structural parameters on its sensing performance. As illustrated in Fig. 2g, the dielectric layer structure plays a critical role in determining the sensitivity of capacitive pressure sensors. Thinner spun dielectric layers tend to enhance sensitivity but may limit the high-sensitivity detection range or even risk electrical contact between the upper and lower electrodes (Supplementary Fig. 11). Conversely, thicker dielectric layers offer a wider detection range but at the cost of sensitivity. Utilizing dual thin TPU films as dielectrics, we optimized sensor performance by balancing high sensitivity with an expanded detection range. The capacitive pressure sensor, featuring a dual 20 μm dielectric layer, exhibited exceptional sensitivity over a wide range.
Within the 0–4 kPa pressure range, it delivers a high sensitivity of 4.3 kPa⁻¹, which decreases to 0.40 kPa⁻¹ in the 4–20 kPa range. Furthermore, a theoretical analysis elaborating on this behavior is provided in the Methods section.
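The two-regime sensitivity quoted above is the local slope of the relative capacitance change versus pressure. A minimal sketch of how such slopes are extracted, using synthetic data shaped like the reported curve (the breakpoint and values are illustrative, not the measured dataset):

```python
import numpy as np

# Hypothetical pressure-response curve mimicking the reported two-regime
# behaviour (illustrative values, not measured data).
pressure = np.linspace(0, 20, 201)  # kPa
rel_dC = np.where(pressure <= 4,
                  4.3 * pressure,                    # steep low-pressure regime
                  4.3 * 4 + 0.40 * (pressure - 4))   # flatter high-pressure regime

def sensitivity(p, dcc0, lo, hi):
    """Slope of the relative capacitance change over the [lo, hi] kPa window."""
    mask = (p >= lo) & (p <= hi)
    slope, _ = np.polyfit(p[mask], dcc0[mask], 1)
    return slope

s_low = sensitivity(pressure, rel_dC, 0, 4)    # ~4.3 kPa^-1
s_high = sensitivity(pressure, rel_dC, 4, 20)  # ~0.40 kPa^-1
```

In practice the same linear fit would be applied piecewise to the measured pressure-capacitance data to report the sensitivity in each regime.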
Due to the high elasticity of the TPU material, the capacitive sensor has fast response and recovery times of 12 ms and 18 ms, respectively, when responding to pressure changes (Fig. 2h). Additionally, the sensor demonstrates consistent performance under varying frequencies (0.5–4 Hz) and a minimum detectable weight of 0.6 g, highlighting its ability to capture subtle signals (Supplementary Fig. 12). Figure 2i showcases the consistent output response of the sensor across a range of pressures (0.2–18 kPa), simulating its ability to detect hand-wrist movements of varying degrees. To further validate the stability of the device, we conducted a durability test: the sensor exhibited minimal variation in output after 1500 pressure cycles at 10 kPa (Fig. 2j). The combination of high sensitivity, rapid response time, and exceptional durability enables the wristband system to accurately recognize diverse fine movements, laying the foundation for subsequent applications.
Intelligent hybrid-fabric wristband system
The ergonomically designed IHFWs provides an effective pathway for future comfortable HMI. The core element is a full-fiber, ultra-thin capacitive sensor array that seamlessly conforms to the human wrist joint, enabling real-time, high-precision monitoring of intricate wrist movements, mainly including wrist tendon motion (Supplementary Fig. 13 and Supplementary Note 3). Figure 3a illustrates the spatial correlation between the individual elements within the sensor array and their corresponding locations on the wrist. Channels 1, 5, and 6 are strategically designed to detect wrist joint movement, while the smaller channels 2, 3, and 4 aim at capturing the subtle muscle movements associated with finger activities. To validate the system’s ability to differentiate between finger and wrist movements, we compared signal collection capabilities during these actions. As shown in Fig. 3b, finger bending resulted in distinct waveform changes in channel 2, while wrist bending produced significant fluctuations in channels 1, 5, and 6. These experimental results confirm the proficiency of IHFWs in accurately distinguishing between finger and wrist movements.
a Diagram of sensor channel placement. b Sensor signals generated by finger bending and wrist bending actions. Scale bar, 12 pF. c Signal diagram for the same action at different speeds. Scale bar, 20 pF. d–f Signal waveform obtained by repeating the action. They are repeated index finger circles, fist-clenching and palm extensions, and index finger bending. Scale bar, 15 pF. g Schematic of capacitance array hardware detection, including power, microcontroller unit (MCU), analog switches, and NE555 chip. h Structure diagram of deep learning neural network. i Signal diagram when hand grasping ball. Scale bar, 15 pF. j Cluster results of object recognition signal from CNN ‘softmax’ layer. k The system controls the up, down, left, and right movements of the snake in the Snake game. Scale bar, 20 pF.
The heat packaging process plays a critical role in ensuring exceptional sensor performance. By creating effective bonding between the capacitive sensor layers, it provides excellent resistance to mechanical mismatch and allows for rapid response to external deformations. Consequently, the collected signal waveforms exhibit remarkable consistency even when performing circular motions at varying speeds (Fig. 3c). As demonstrated in Fig. 3d–f and Supplementary Fig. 14, each data channel responds uniquely to different hand gestures. More importantly, the outputs from all channels exhibit high consistency when the same gesture is repeated, highlighting the system’s reliability. Our device demonstrates excellent anti-interference capabilities for everyday wear, long-term placement, and washing (Supplementary Fig. 15 and Supplementary Note 4).
In terms of the intelligent wristband system, the hardware circuit responsible for processing the collected signals is also crucial. Conventional capacitive detection circuits, which convert capacitive signals into voltage or current signals, often have stringent circuit design requirements and exhibit a relatively weak resistance to interference24,41. Figure 3g shows our design principle that integrates a capacitance-tuning oscillator and a microcontroller unit (MCU) for detecting capacitance. This approach enhances the system interference resistance by converting capacitance signals into frequency information. Additionally, it offers a broad range of capacitance measurements from a few picofarads to several hundred picofarads. It also offers the advantages of compact size and easy integration, rendering it well-suited for wearable devices. The connection of sensor and detection circuit is shown in Supplementary Fig. 16. After being processed by the MCU, the signals are transmitted via Bluetooth module to the host computer (Supplementary Fig. 17), where subsequent algorithms can access and analyze the collected data.
Neural networks and gesture recognition applications
We implemented a 1D convolutional neural network (1D-CNN) classification algorithm on the host computer to extract features from the Bluetooth-transmitted signals. Unlike recurrent neural networks42 and long short-term memory networks43,44, a 1D-CNN offers a simpler and more efficient solution for classifying and recognizing time-series signals without requiring massive datasets or complex deep learning architectures45,46. It operates by sliding a filter over the input signal, focusing on specific local patterns within a window. This approach is highly effective in recognizing signal patterns. Furthermore, the algorithm is translation-invariant: once trained, the network can accurately identify a newly acquired signal regardless of where the signal pattern occurs in the time sequence. These advantages are particularly beneficial for recognizing the hand-motion time-series signals obtained from the sensor array.
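The sliding-filter behaviour described above can be illustrated with a plain cross-correlation: a filter matched to a motif responds with the same peak height wherever the motif occurs in the sequence, which is the translation-invariant property a 1D-CNN exploits. A minimal sketch (motif and signals invented for illustration):

```python
import numpy as np

def conv1d_valid(signal, kernel):
    """Slide the kernel over the signal (valid mode, cross-correlation)."""
    n = len(signal) - len(kernel) + 1
    return np.array([signal[i:i + len(kernel)] @ kernel for i in range(n)])

# A short motif embedded at two different positions in otherwise flat signals.
motif = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
sig_a = np.zeros(50)
sig_a[10:15] = motif
sig_b = np.zeros(50)
sig_b[30:35] = motif

kernel = motif - motif.mean()  # zero-mean matched filter for the motif
resp_a = conv1d_valid(sig_a, kernel)
resp_b = conv1d_valid(sig_b, kernel)

# The peak response follows the motif wherever it occurs, with identical height.
```

A trained 1D-CNN layers many such learned filters and pools their responses, so a gesture signature is detected even when it starts earlier or later in the recording window.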
Figure 3h depicts the CNN structure, which receives the six-channel gesture signals as input. To optimize the CNN model for efficient clustering recognition, we fine-tuned the convolution kernel size and the number of filters. Through experimentation, we found that the model achieves the best recognition precision with a kernel size of 5 and 64 filters (Supplementary Fig. 18). The IHFWs can be applied to grasped-object recognition (Supplementary Movie 1). Figure 3i and Supplementary Fig. 19 show signal diagrams of different grasped objects; the signals have low correlation (Supplementary Fig. 20). With as few as 15 data samples per object, our trained model achieves nearly 100% recognition for eight objects. The processed signal passes through the 'softmax' layer of the trained neural network, and t-distributed stochastic neighbor embedding (t-SNE) dimensionality reduction is then applied to visualize the feature distribution of 20 signals, as shown in Fig. 3j. This visualization demonstrates that the features for each object are well separated, with low correlation between the acquired signals. Finally, Fig. 3k illustrates the application of the IHFWs for controlling movement in four directions (up, down, left, right) within a Snake game (Supplementary Movie 2). These demonstrations showcase the potential of the IHFWs to revolutionize human-computer interaction.
Virtual handwriting recognition
Although gesture recognition plays an essential role in HMI, existing gesture recognition systems are often confronted with the challenges of bulkiness and sensitivity to environmental variations. Future HMI modalities are certain to evolve toward greater comfort, convenience, and accuracy. Our IHFWs represents a significant step forward in the development of ergonomic and user-friendly wearable devices for enhanced HMI. The hybrid-fabric wristband device possesses rapid response, high sensitivity, and excellent compatibility with the skin, maintaining effective contact even during hand movements. These superior properties give our system a significant advantage in the recognition of fine movements (Supplementary Table 1). Therefore, we can leverage this system to address the shortcomings of existing handwriting recognition devices in virtual handwriting, achieving effective virtual handwriting recognition (Supplementary Movie 3).
The wristband system accurately captures each handwritten stroke, as illustrated in Fig. 4a–c. Figure 4a demonstrates the discrete movements involved in handwriting the letter 'U'. Notably, our sensing system provides reliable acquisition data, as exemplified by writing the letters 'U', 'N', and 'W' (Fig. 4b). When writing the letter 'W', characterized by two identical strokes (1 and 2), the collected signal waveforms display the same patterns. The system can also accurately capture each stroke while writing the letters 'U' and 'N'. Figure 4c illustrates the comprehensive signal diagram of handwritten letters from A to Z, highlighting the distinct signal waveforms for each letter. Moreover, the exceptional response speed and stability of our hybrid-fiber capacitive sensor array are evident when handwriting letters at different speeds. Whether 'HELLO WORLD' is handwritten quickly or slowly, the signal exhibits the same characteristics, differing only in time scale (Fig. 4d, e). This outstanding sensor performance significantly reduces the amount of data collection required for recognition-network training. To further demonstrate the effectiveness of our sensor in capturing the nuances of each handwritten letter, correlation processing was conducted on the collected signals (Fig. 4f). The correlation coefficient between the letters 'E' and 'F' is 0.67, reflecting their similar strokes. Strongly correlated pairs (coefficient greater than 0.6) account for only 3.69% of the total, indicating that our sensor has excellent discrimination ability (Fig. 4g). At the same time, the recognition results (Fig. 4h) show that 'F' and 'K' have relatively lower accuracy than the other handwritten letters. Nonetheless, the overall recognition accuracy reaches 96.63%. Detailed signal waveforms for each handwritten letter are shown in Supplementary Fig. 21, and further correlation coefficients are described in Supplementary Fig. 22.
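The correlation analysis described here amounts to computing a Pearson correlation matrix across the per-letter signal vectors and counting the strongly correlated pairs. A sketch with synthetic stand-in waveforms (random data plus one deliberately similar pair mimicking 'E'/'F'; not the measured signals):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in waveforms for a few letters (random here; the real analysis uses
# the six-channel wristband signals, flattened per letter).
letters = {ch: rng.normal(size=228 * 6) for ch in "ABCDE"}
# Make 'F' resemble 'E' to mimic the similar-stroke pair reported above.
letters["F"] = letters["E"] + 0.9 * rng.normal(size=228 * 6)

names = sorted(letters)
corr = np.corrcoef([letters[n] for n in names])

# Fraction of distinct letter pairs with strong correlation (> 0.6).
iu = np.triu_indices(len(names), k=1)
strong_fraction = float(np.mean(corr[iu] > 0.6))
```

With the full 26-letter dataset the same upper-triangle count yields the reported 3.69% of strongly correlated pairs.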
a Photograph of handwriting letters, showing the letter 'U'. b Signal diagram of handwriting 'U', 'N', and 'W'. Actual strokes are marked in black and virtual strokes in yellow. Scale bar, 15 pF. c Signal diagram output from the 26 handwritten letters. Scale bar, 25 pF. d, e Signals when writing the words 'HELLO' and 'WORLD' at two different speeds, respectively. Scale bar, 20 pF. f Correlation coefficients of the 26 handwritten letters. g Correlation coefficient distribution curve of the 26 handwritten letters. h Recognition accuracy of each handwritten letter. Data are presented as mean values ± SD; sample size (n) is six.
The 1D-CNN network can perform feature extraction from the raw sensor signals. To visualize the feature extraction and clustering capabilities of the neural networks, we extracted the feature results of the ‘softmax’ layer in the neural networks and performed the t-SNE dimensionality reduction on the high-dimensional features. The resulting t-SNE clustering diagram in Supplementary Fig. 23 demonstrates the excellent classification performance of our 1D-CNN model, designed for the wearable HMI system. The boundaries are clear and the overlap is minimal between these 26 letter classes. This proves that our system is fully competent in handwriting recognition for 26 letters.
Discussion
We have developed an intelligent hybrid-fabric wristband system (IHFWs) designed to revolutionize HMI. The system leverages several key advancements: a heat-assisted TPU encapsulation process that enhances durability while maintaining breathability, smart fabric technology for a comfortable fit, and an ergonomic design for seamless wrist integration. The combination of these technological advancements enables the IHFWs to support a wide range of HMI applications, including object recognition, directional control in virtual environments, and high-accuracy virtual handwriting recognition. By offering a comfortable, convenient, and precise interaction experience, the IHFWs has the potential to blur the lines between the physical and virtual worlds, opening possibilities for immersive and interactive VR experiences. This technology paves the way for a more natural, intuitive, and enjoyable way of interacting with machines.
Methods
Software and system design of wireless measuring module
The IHFWs comprises a breathable, flexible textile wristband and a box fabricated by 3D printing. We opted for a breathable, soft, and highly elastic textile band for comfortable wear on the wrist. Integrated into this band is an all-spun-film sensor array capable of detecting hand movements. The customized box houses the system's processing and communication hardware: a multi-channel capacitance detection circuit, MCU computational circuitry, low-power BLE data transmission circuitry, a 400 mAh lithium battery, and the corresponding charging circuitry. This highly integrated design allows the fabric sensors to capture hand movements, while the circuits within the box process and transmit the resulting capacitance signals as digital data streams. MATLAB's App Designer is configured with corresponding data-reception algorithms to process the received data through the neural networks.
Capacitance detection circuit
The NE555 chip is configured as an oscillator whose frequency directly correlates with the value of the capacitance being measured. In this application, the NE555 operates in astable mode. During operation, the sensed capacitance is charged through a resistor and discharged through the discharge pin inside the NE555, generating a square-wave signal. When different capacitances are connected to the input, the output frequency of the square wave varies accordingly. A timer of the MCU is configured in capture mode to measure the time interval, i.e., the period, between two rising or falling edges. Finally, the capacitance value is calculated from the measured period and the known resistance values.
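Assuming the standard NE555 astable relation T = ln(2)·(R1 + 2·R2)·C, the MCU-side conversion from measured period to capacitance can be sketched as follows (the resistor values are illustrative assumptions, not the actual circuit values):

```python
import math

# Assumed timing resistors of the NE555 astable stage (illustrative values).
R1 = 10e3   # ohms
R2 = 100e3  # ohms

def capacitance_from_period(period_s):
    """Recover the sensed capacitance from the measured square-wave period,
    using the standard astable relation T = ln(2) * (R1 + 2*R2) * C."""
    return period_s / (math.log(2) * (R1 + 2 * R2))

# Forward model: a 30 pF sensor capacitance gives a period of a few
# microseconds with these resistors; measuring that period recovers C.
T = math.log(2) * (R1 + 2 * R2) * 30e-12
C = capacitance_from_period(T)
```

Because the period scales linearly with C, this scheme covers the broad picofarad range mentioned in the main text simply by measuring longer or shorter periods.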
Fabrication of TPU films
TPU films were fabricated by electrospinning. TPU granules (EME-75A, Evermore) and ethanol solution (E111965, Aladdin) were stirred for 5 h to prepare a 5 wt% TPU solution. The prepared solution (5 ml) was dispensed from a syringe through a No. 26 needle under an electric field of 8 kV for electrospinning. The film thickness was precisely controlled by adjusting the spinning time.
Fabrication of AgNWs/TPU films
AgNWs solution (XFJ95 7440-22-4, XFNANO) and ethanol solution were stirred for 10 min to prepare a 5 wt% AgNWs solution. The AgNWs parameters are as follows: preparation method, polyol reduction; average diameter, 30 nm; length, 20 μm; dispersant, ethanol. First, the 5 wt% TPU solution was spun for about 0.5 h to produce a thin layer of pure TPU film. Then, in-situ electrospinning and electro-spraying techniques were used to prepare AgNWs/TPU composite fabric membranes over 2.5 h.
Fabrication of all-fabric sensor
The sensor assembly consists of the lower electrode, two layers of TPU as the dielectric layer, the upper electrode, and a TPU supporting layer, arranged sequentially from bottom to top. The multi-layer film is hot-pressed with a stainless steel mold heated to 115 °C, resulting in a fully fabricated spun-film capacitive pressure sensor array (Supplementary Fig. 8). Regarding the interface, we first use conductive silver paste as a coupler to connect a copper wire to the AgNWs/TPU electrode. The copper wire is connected to PCB1, and an FPCB then connects PCB1 to the detection circuit (PCB2) to transmit the sensor signal (Supplementary Fig. 16).
Principle of capacitive sensitivity
The capacitance change of the fully spun capacitive pressure sensor under pressure can be written as

$$\frac{\Delta C}{C_0}=\frac{\varepsilon_{TPU}\,\Delta d_{Air}+\varepsilon_{Air}\,\Delta d_{TPU}}{\varepsilon_{TPU}\,d'_{Air}+\varepsilon_{Air}\,d'_{TPU}}\qquad(1)$$

where \(\varepsilon_{Air}\) and \(\varepsilon_{TPU}\) are the dielectric constants of air and TPU, respectively, \(\Delta d_{Air}\) and \(\Delta d_{TPU}\) are the thickness changes of the air gap and the TPU films, respectively, and \(d'_{Air}\) and \(d'_{TPU}\) are the final thicknesses of air and TPU under the applied pressure, respectively; Eq. 1 gives the sensitivity. The Young's modulus of TPU is typically 1–10 MPa, so under a small pressure \(\varepsilon_{TPU}\Delta d_{Air}\gg \varepsilon_{Air}\Delta d_{TPU}\). Because TPU is the main material of the dielectric layer, \(d'_{TPU}\varepsilon_{Air}\gg d'_{Air}\varepsilon_{TPU}\). Therefore, we obtain the capacitance change equation of conventional pressure sensors:

$$\frac{\Delta C}{C_0}\approx\frac{\varepsilon_{TPU}\,\Delta d_{Air}}{\varepsilon_{Air}\,d'_{TPU}}\qquad(2)$$
According to Eq. 2, the larger the change \(\Delta d_{Air}\) or the smaller the final thickness \(d'_{TPU}\), the higher the sensitivity of the sensor. Supplementary Fig. 24 illustrates the theoretical sensitivity of sensors using 20 μm, 40 μm, and dual-layer 20 μm TPU films as the dielectric layer. Further analysis of the theoretical and experimental data is shown in Supplementary Fig. 25 and Supplementary Note 5.
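A quick numeric check of the simplification from the full expression to the conventional form, with illustrative parameter values chosen so that the stated inequalities hold (a thin air gap relative to the TPU dielectric; these are not the device's measured dimensions):

```python
# Illustrative values: eps_air = 1; eps_tpu ~ 7 is an assumed value for the
# TPU dielectric; a 0.5 um air gap that halves under pressure; a dual-layer
# 40 um TPU dielectric that barely deforms.
eps_air, eps_tpu = 1.0, 7.0
d_air, d_tpu = 0.5e-6, 40e-6       # initial thicknesses (m)
dd_air, dd_tpu = 0.25e-6, 0.01e-6  # thickness changes under pressure (m)
d_air_f = d_air - dd_air           # final air-gap thickness
d_tpu_f = d_tpu - dd_tpu           # final TPU thickness

# Full expression for dC/C0 and the simplified conventional form.
full = (eps_tpu * dd_air + eps_air * dd_tpu) / (eps_tpu * d_air_f + eps_air * d_tpu_f)
approx = (eps_tpu * dd_air) / (eps_air * d_tpu_f)
```

Under these assumptions the simplified form overestimates the full expression by only a few percent, confirming that the dropped terms are negligible when the air gap is thin compared with the TPU layer.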
Method of signal clustering
t-distributed stochastic neighbor embedding (t-SNE) is a dimensionality-reduction technique used to visualize high-dimensional data in a lower-dimensional space, typically 2D or 3D. In our study, we employ a Euclidean-type data projection algorithm to reduce the dimensionality of the features extracted by the CNN from the 'softmax' layer. We then use clustering techniques to display the dimensionality-reduced signal features, enabling a more intuitive understanding of the underlying data structure.
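The reduction step can be sketched with scikit-learn's t-SNE applied to synthetic softmax-layer features (three well-separated stand-in classes of 20 samples each; the real analysis uses the 26-letter features):

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)

# Stand-in 'softmax'-layer outputs: 3 well-separated classes, 20 samples
# each, in a 26-dimensional space (synthetic data for illustration).
n_per, n_cls, dim = 20, 3, 26
centers = np.eye(n_cls, dim) * 10.0
features = np.vstack([c + rng.normal(scale=0.3, size=(n_per, dim)) for c in centers])

# Project the high-dimensional features into 2D for cluster visualisation.
embedding = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(features)
```

The 2D `embedding` can then be scatter-plotted per class to produce the clustering diagrams shown in the figures.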
Training neural networks
This study was reviewed and approved by the Medical Ethics Committee of Tsinghua University (project No. 20220227). We collected right-handed data from two healthy participants (one male and one female) between the ages of 24 and 35. The original data were collected, processed, and used with the consent of the participants. The input format of the neural network data is set as [228 6], and the mini-batch size is 30. Supplementary Fig. 26 presents the training process and recognition results for the object recognition and virtual handwriting applications. We evaluated the use of our device with the two subjects; the device needs to be calibrated for each use and for each subject. The accuracy of object-grasp recognition for each subject is 100%, and the accuracy of handwriting recognition is 96.63% and 95.71%, respectively. More detailed information regarding the CNN networks is provided in Supplementary Table 2.
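A minimal 1D-CNN matching the stated configuration (six input channels, sequence length 228, kernel size 5, 64 filters, 26 letter classes, mini-batch 30) can be sketched in PyTorch; the layer depth and pooling here are assumptions for illustration, not the exact published network:

```python
import torch
import torch.nn as nn

# Sketch of a 1D-CNN for the [228, 6] wristband signals: kernel size 5 and
# 64 filters as stated; the single conv layer and global pooling are assumed.
model = nn.Sequential(
    nn.Conv1d(in_channels=6, out_channels=64, kernel_size=5, padding=2),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),  # pool over the time axis
    nn.Flatten(),
    nn.Linear(64, 26),        # one output per letter
    nn.Softmax(dim=1),
)

batch = torch.randn(30, 6, 228)  # mini-batch of 30 six-channel sequences
probs = model(batch)             # class probabilities, one row per sample
```

Training would then minimize a cross-entropy loss over the labeled gesture recordings; in PyTorch one would normally drop the final `Softmax` and use `nn.CrossEntropyLoss` on the raw logits instead.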
Reporting summary
Further information on research design is available in the Nature Portfolio Reporting Summary linked to this article.
Data availability
All experimental data generated in this study are available and presented in the paper and the Supplementary Information. The collected dataset for classification is provided via Zenodo (https://doi.org/10.5281/zenodo.14334787). Source data are provided with this paper. Other study findings are available from T.-L.R. on reasonable request and in accordance with the Institutional Review Board at Tsinghua University.
Code availability
The codes that support the findings of this study are available via GitHub (https://github.com/chengab12/IHFWs).
References
Beker, L. et al. A bioinspired stretchable membrane-based compliance sensor. PNAS 117, 11314–11320 (2020).
Yu, Y. et al. All-printed soft human-machine interface for robotic physicochemical sensing. Sci. Robot. 7, eabn0495 (2022).
Liu, D. et al. Active‐matrix sensing array assisted with machine‐learning approach for lumbar degenerative disease diagnosis and postoperative assessment. Adv. Funct. Mater. 32, 2113008 (2022).
Boutry, C. M. et al. A stretchable and biodegradable strain and pressure sensor for orthopaedic application. Nat. Electron. 1, 314–321 (2018).
Chung, H. U. et al. Binodal, wireless epidermal electronic systems with in-sensor analytics for neonatal intensive care. Science 363, 947 (2019).
Qu, X. et al. Artificial tactile perception smart finger for material identification based on triboelectric sensing. Sci. Adv. 8, eabq2521 (2022).
Liu, Y. et al. Electronic skin as wireless human-machine interfaces for robotic VR. Sci. Adv. 8, eabl6700 (2022).
Yu, X. et al. Skin-integrated wireless haptic interfaces for virtual and augmented reality. Nature 575, 473–479 (2019).
Shi, J. et al. Embedment of sensing elements for robust, highly sensitive, and cross-talk-free iontronic skins for robotics applications. Sci. Adv. 9, eadf8831 (2023).
Sundaram, S. et al. Learning the signatures of the human grasp using a scalable tactile glove. Nature 569, 698–702 (2019).
Pyun, K. R. et al. Machine-learned wearable sensors for real-time hand-motion recognition: toward practical applications. Natl Sci. Rev. 11, nwad198 (2024).
Venkatnarayan, R. H. & Shahzad, M. Gesture recognition using ambient light. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2, 1–28 (2018).
Tian, Y. et al. RF-based fall monitoring using convolutional neural networks. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2, 137 (2018).
Wen, F. et al. AI enabled sign language recognition and VR space bidirectional communication using triboelectric smart glove. Nat. Commun. 12, 5378 (2021).
Kim, K. K. et al. A substrate-less nanomesh receptor with meta-learning for rapid hand task recognition. Nat. Electron. 6, 64–75 (2023).
Zhou, Z. et al. Sign-to-speech translation using machine-learning-assisted stretchable sensor arrays. Nat. Electron. 3, 571–578 (2020).
Moin, A. et al. A wearable biosensing system with in-sensor adaptive machine learning for hand gesture recognition. Nat. Electron. 4, 54–63 (2020).
Sun, Z. et al. Augmented tactile-perception and haptic-feedback rings as human-machine interfaces aiming for immersive interactions. Nat. Commun. 13, 5224 (2022).
Tashakori, A. et al. Capturing complex hand movements and object interactions using machine learning-powered stretchable smart textile gloves. Nat. Mach. Intell. 6, 106–118 (2024).
Tan, P. et al. Self-powered gesture recognition wristband enabled by machine learning for full keyboard and multicommand input. Adv. Mater. 34, 2200793 (2022).
Liu, Y. et al. Ultralight smart patch with reduced sensing array based on reduced graphene oxide for hand gesture recognition. Adv. Intell. Syst. 4, 2200193 (2022).
An, S. et al. Noncontact human-machine interaction based on hand-responsive infrared structural color. Nat. Commun. 13, 1446 (2022).
Kim, K. K. et al. A deep-learned skin sensor decoding the epicentral human motions. Nat. Commun. 11, 2149 (2020).
Wang, T., Zhao, Y. & Wang, Q. A flexible iontronic capacitive sensing array for hand gesture recognition using deep convolutional neural networks. Soft Robot. 10, 443–453 (2022).
Cho, H. et al. Real-time finger motion recognition using skin-conformable electronics. Nat. Electron. 6, 619–629 (2023).
Chen, T. et al. Triboelectric self-powered wearable flexible patch as 3D motion control interface for robotic manipulator. ACS Nano 12, 11561–11571 (2018).
Xu, R. et al. Skin-friendly and wearable iontronic touch panel for virtual-real handwriting interaction. ACS Nano 17, 8293–8302 (2023).
Najam, R. & Faizullah, S. Analysis of recent deep learning techniques for arabic handwritten-text OCR and post-OCR correction. Appl. Sci. 13, 7568 (2023).
Alghyaline, S. Arabic optical character recognition: a review. CMES-Comput. Model. Eng. Sci. 135, 1825–1861 (2023).
Khorsheed, M. S. Off-line Arabic character recognition - a review. Pattern Anal. Appl. 5, 31–45 (2002).
Lee, S. et al. Nanomesh pressure sensor for monitoring finger manipulation without sensory interference. Science 370, 966–970 (2020).
Lu, C. et al. High-performance fibre battery with polymer gel electrolyte. Nature 629, 86–91 (2024).
Yu, P. et al. All-fabric ultrathin capacitive sensor with high pressure sensitivity and broad detection range for electronic skin. ACS Appl. Mater. Interfaces 13, 24062–24069 (2021).
Zhao, S. et al. 3D dielectric layer enabled highly sensitive capacitive pressure sensors for wearable electronics. ACS Appl. Mater. Interfaces 12, 32023–32030 (2020).
Sharma, S. et al. Hydrogen-bond-triggered hybrid nanofibrous membrane-based wearable pressure sensor with ultrahigh sensitivity over a broad pressure range. ACS Nano 15, 4380–4393 (2021).
Li, R. et al. Supercapacitive iontronic nanofabric sensing. Adv. Mater. 29, 1700253 (2017).
Feng, C. P. et al. 3D Printable, form stable, flexible phase-change-based electronic packaging materials for thermal management. Addit. Manuf. 71, 103586 (2023).
Valavan, A. et al. Vacuum thermoforming for packaging flexible electronics and sensors in E-Textiles. IEEE Trans. Compon. Packaging Manuf. Technol. 13, 715–723 (2023).
Yoon, H. J. et al. Adaptive epidermal bioelectronics by highly breathable and stretchable metal nanowire bioelectrodes on electrospun nanofiber membrane. Adv. Funct. Mater. 34, 2313504 (2024).
Jiang, Z. et al. Highly stretchable metallic nanowire networks reinforced by the underlying randomly distributed elastic polymer nanofibers via interfacial adhesion improvement. Adv. Mater. 31, 1903446 (2019).
An, B. W. et al. Transparent and flexible fingerprint sensor array with multiplexed detection of tactile pressure and skin temperature. Nat. Commun. 9, 2458 (2018).
Lu, Y. et al. Decoding lip language using triboelectric sensors with deep learning. Nat. Commun. 13, 1401 (2022).
Gers, F. A., Schmidhuber, J. & Cummins, F. Learning to forget: continual prediction with LSTM. Neural Comput. 12, 2451–2471 (2000).
Hochreiter, S. & Schmidhuber, J. Long short-term memory. Neural Comput. 9, 1735–1780 (1997).
Kim, T. et al. Ultrathin crystalline-silicon-based strain gauges with deep learning algorithms for silent speech interfaces. Nat. Commun. 13, 5815 (2022).
Lee, J. H. et al. Heterogeneous structure omnidirectional strain sensor arrays with cognitively learned neural networks. Adv. Mater. 35, 2208184 (2023).
Acknowledgements
This work was partially supported by the National Key R&D Program (Grant No. 2022YFB3204100 to T.L.R., No. 2021YFC3002200 to Y.Y.), the National Natural Science Foundation of China (Grant No. U20A20168 to T.L.R., No. 51861145202 to Y.Y., No. 52303319 to T.L.R., and No. 62274101 to Y.Y.), and the China Postdoctoral Science Foundation (Grant No. 2023M731865 to X.L.).
Author information
Contributions
T.L.R., Y.Y., and H.F.L. guided and supervised the whole project. A.B.C., X.L., and H.J.X. planned and performed the experiments. A.B.C., D.L., Z.Y.T., and Z.K.C. wrote the deep learning algorithms and applications. W.C.S., X.Y.L., and J.M.J. took all the photos shown in figures. A.B.C., L.Q.T., and T.R.C. contributed to the data analysis. H.F.L., A.B.C., X.L., and Z.R.D. drafted the manuscript. All authors reviewed the manuscript.
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Communications thanks the anonymous reviewers for their contribution to the peer review of this work. A peer review file is available.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Cheng, A., Li, X., Li, D. et al. An intelligent hybrid-fabric wristband system enabled by thermal encapsulation for ergonomic human-machine interaction. Nat Commun 16, 591 (2025). https://doi.org/10.1038/s41467-024-55649-1