NC State Researchers Use Machine Learning To Create a Fabric-based Touch Sensor
Integrated into clothing, the device can control mobile apps, enter passwords and play video games.
A new study from NC State University combines three-dimensional embroidery techniques with machine learning to create a fabric-based sensor that can control electronic devices through touch.
As the field of wearable electronics gains more interest and new functions are added to clothing, an embroidery-based sensor or “button” capable of controlling those functions becomes increasingly important. Integrated into the fabric of a piece of clothing, the sensor can activate and control electronic devices and applications, such as mobile apps, entirely by touch.
The device is made up of two parts: the embroidered pressure sensor itself and a microchip that processes and distributes the data collected by that sensor. The sensor is triboelectric, which means that it powers itself using the electric charge generated by the friction between its multiple layers. It is made from yarns of two triboelectric materials, one carrying a positive electric charge and the other a negative charge, which were integrated into conventional textile fabrics using embroidery machines.
Rong Yin, corresponding author of the study, said that the three-dimensional structure of the sensor was important to get right.
“Because the pressure sensor is triboelectric, it needed to have two layers with a gap in between them. That gap was one of the difficult parts in the process, because we are using embroidery which is usually two-dimensional. It’s a technique for decorating fabric,” he said. “It’s challenging to make a three-dimensional structure that way. By using a spacer, we were able to control the gap between the two layers which lets us control the sensor’s output.”
Data from the pressure sensor is then sent to the microchip, which is responsible for turning that raw input into specific instructions for any connected devices. Machine learning algorithms are key to making sure this runs smoothly, Yin said. The device needs to be able to tell the difference between gestures assigned to different functions, as well as to disregard any unintentional inputs that might come from the fabric's normal movement.
“Sometimes the data that the sensor acquires is not very accurate, and this can happen for all kinds of reasons,” Yin said. “Sometimes the data will be affected by environmental factors like temperature or humidity, or the sensor touches something by mistake. By using machine learning, we can train the device to recognize those kinds of things.
“Machine learning also allows this very small device to achieve many different tasks, because it can recognize different kinds of inputs.”
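The paper does not spell out which learning algorithm or features the team used, so the sketch below is only illustrative: it trains an off-the-shelf random-forest classifier on a few hand-crafted features of synthetic pressure waveforms to show, in rough terms, how a small model can separate gesture classes. The gesture names, synthetic traces, and feature choices are all assumptions, not the study's actual pipeline.

```python
# Illustrative sketch only: a minimal gesture classifier for 1-D pressure
# waveforms. The study does not specify its model or features; the synthetic
# data, feature choices, and RandomForestClassifier here are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

GESTURES = ["play_pause", "next", "previous", "vol_up", "vol_down", "mute"]

def synth_waveform(label: int, n_samples: int = 200) -> np.ndarray:
    """Fake triboelectric trace: a noisy pulse whose timing, width and height vary by gesture."""
    rng = np.random.default_rng()
    t = np.linspace(0.0, 1.0, n_samples)
    center, width = 0.3 + 0.08 * label, 0.05 + 0.02 * label
    pulse = (1.0 + 0.3 * label) * np.exp(-((t - center) ** 2) / (2 * width ** 2))
    return pulse + rng.normal(0.0, 0.05, n_samples)

def features(x: np.ndarray) -> np.ndarray:
    """Simple hand-crafted features: peak height, signal energy, peak position, pulse duration."""
    return np.array([x.max(), np.sum(x ** 2), np.argmax(x) / len(x), np.mean(x > 0.5 * x.max())])

# Build a toy labeled dataset: 100 synthetic examples per gesture class.
X = np.array([features(synth_waveform(g)) for g in range(len(GESTURES)) for _ in range(100)])
y = np.array([g for g in range(len(GESTURES)) for _ in range(100)])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"toy accuracy: {clf.score(X_test, y_test):.2f}")
```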
The researchers demonstrated this input recognition by developing a simple music-playing mobile app that connected to the sensor via Bluetooth. They designed six functions for the app: play/pause, next song, previous song, volume up, volume down and mute, each controlled by a different gesture on the sensor. The researchers were also able to use the device for several other functions, including setting and entering passwords and controlling video games.
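In a setup like this, a recognized gesture ultimately becomes a playback command sent to the app over Bluetooth. The article does not describe the app's protocol, so the dispatch layer below is a hypothetical sketch: `send_over_bluetooth`, the command names, and the confidence threshold are placeholders, with the threshold shown as one simple way to ignore the unintentional touches mentioned above.

```python
# Hypothetical dispatch layer: map a predicted gesture class to a media-player
# command. The real app's Bluetooth protocol is not described in the article,
# so send_over_bluetooth() is a stand-in for whatever transport it uses.
from typing import Dict

def send_over_bluetooth(command: str) -> None:
    # Placeholder: a real system would write to a BLE characteristic or socket.
    print(f"-> sending '{command}' to the music app")

COMMANDS: Dict[str, str] = {
    "play_pause": "PLAY_PAUSE",
    "next": "NEXT_TRACK",
    "previous": "PREV_TRACK",
    "vol_up": "VOLUME_UP",
    "vol_down": "VOLUME_DOWN",
    "mute": "MUTE",
}

def handle_gesture(label: str, confidence: float, threshold: float = 0.8) -> None:
    """Forward a recognized gesture only when the classifier is confident,
    so accidental brushes against the fabric are not acted on."""
    if confidence >= threshold and label in COMMANDS:
        send_over_bluetooth(COMMANDS[label])

handle_gesture("vol_up", confidence=0.93)  # sent
handle_gesture("mute", confidence=0.41)    # ignored as an unintentional touch
```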
The idea is still in its early stages, Yin said, as existing embroidery technology cannot easily handle the types of materials used to create the sensor. Still, the new sensor represents another piece of the developing wearable electronics puzzle, a field that is sure to keep attracting interest in the near future.
The paper, “A clickable embroidered triboelectric sensor for smart fabric,” is published in Device.
Note to Editors: The study abstract follows.
“A clickable embroidered triboelectric sensor for smart fabric”
Authors: Yu Chen, Yali Ling, Yiduo Yang, Zihao Wang, Yang Liu, Wei Gao, Bao Yang, Xiaoming Tao, Rong Yin
Published: April 12, 2024
DOI: 10.1016/j.device.2024.100355
Abstract: Textile-based human-machine interfaces need to seamlessly integrate electronics with conventional fabrics. Here, we present an embroidery-based device that transforms conventional fabric into a “clickable” button. The device is realized through the integration of dual triboelectric yarns using an embroidery pattern that enables a 3D structure. The design can be customized and optimized by adjusting the gaps between the triboelectric yarns for the needed triboelectric output and other performance metrics, such as consistent contact and separation for the clicking mechanism. Machine learning algorithms are used for signal identification of a diverse range of pressing and swiping gestures on the embroidered device.
Acknowledgements: This work was supported by the Wilson College Strategic Collaborative Research & Innovation Fund (SCRIF) at NCSU. Y. Ling acknowledges financial support from the VF Graduate Student Impact Award. Y.Y. acknowledges financial support from the Provost’s Doctoral Fellowship and Goodnight Doctoral Fellowship at NCSU. We thank Shubham Kakirde for assistance with developing the music player app on Android.
This post was originally published in NC State News.