Human skin can sense subtle patterns of pressure, timing, and movement, but most digital devices only register simple taps and swipes. This difference has led researchers to explore new touch-based technologies.
They have tested sensor-filled gloves, wearable bands that track small pressure changes, and thin surfaces that create precise vibrations. While these ideas show promise, many are still too rigid, limited in the gestures they detect, or unable to provide meaningful, nuanced feedback.
One big challenge is that digital text relies on ASCII, a standardized code of 128 characters that includes letters, numbers, punctuation, and control symbols. Translating this full range into tactile signals is difficult because each character must be represented in a way that can be accurately felt, distinguished, and interpreted through touch alone, without relying on sight or sound.
New wearable converts touch into full digital text
Recent advances in soft materials and AI are opening up new ways to interact with technology. Stretchable circuits can move with the skin, gel-based sensors can detect tiny forces, and small motors can produce distinct vibration patterns. AI algorithms can quickly interpret complex, changing signals. Together, these innovations create a vision where the skin becomes more than a touchpoint – it becomes a channel for two-way information flow.
Building on this concept, a study in Advanced Functional Materials introduces a soft, skin-like patch that turns touch into text and delivers text feedback through the skin. The device combines iontronic sensors, flexible circuits, compact vibration modules, and an AI model trained to recognize pressing patterns. This creates a full two-way loop capable of representing all 128 ASCII characters purely through touch, Nanowerk reports.
The patch uses a stretchable copper circuit on polyimide that bends, twists, and stretches without breaking. A soft silicone layer keeps it flexible, while its skin-like stiffness of 435 kPa and a silicone adhesive make it comfortable to wear and remove. Its main sensor is an iontronic array in which a gel-coated rice paper layer changes capacitance when pressed; a copper electrode detects these changes, turning touch into measurable signals.
Using sensors and vibration patterns
The patch encodes text by splitting each ASCII character's binary code into four two-bit segments, one per sensor; four segments of two bits each are more than enough to cover all 128 ASCII codes. The number of presses on a sensor within a short time window sets that segment's value. Feedback works in reverse: each vibration actuator pulses a number of times corresponding to its segment. The result is a tactile communication method directly aligned with ASCII.
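The article does not spell out the exact press-count convention, so the following is a minimal Python sketch of the idea, assuming the segment value simply equals the number of presses (0–3) and that the most significant segment comes first; the function names `encode_char` and `decode_presses` are illustrative, not from the paper.

```python
# Sketch of the four-segment ASCII encoding described above.
# Assumption: a segment's value equals the press count on its sensor (0-3).

def encode_char(ch: str) -> list[int]:
    """Split one ASCII character (0-127) into four 2-bit segments,
    most significant segment first. Each segment maps to the number
    of presses on one sensor."""
    code = ord(ch)
    assert 0 <= code < 128, "only 7-bit ASCII is supported"
    return [(code >> shift) & 0b11 for shift in (6, 4, 2, 0)]

def decode_presses(presses: list[int]) -> str:
    """Reassemble the character from the four per-sensor press counts."""
    code = 0
    for count in presses:
        code = (code << 2) | (count & 0b11)
    return chr(code)

segments = encode_char("G")      # 'G' = 0b01000111 -> [1, 0, 1, 3]
print(segments)                  # [1, 0, 1, 3]
print(decode_presses(segments))  # 'G'
```

The same four segment values could drive the feedback side: each actuator would pulse `segments[i]` times, mirroring the input scheme described in the article.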
Instead of collecting massive datasets to train the recognition model, the researchers built a mathematical model of pressing behavior. Each press has four phases – rise, peak, fall, and return – and variations in force, duration, and number of presses are sampled to generate synthetic data that mimics real sensor signals.
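A rough sketch of how such synthetic presses could be generated is shown below; the four phases follow the description above, but the waveform shapes, parameter ranges, and noise level are placeholder assumptions rather than the authors' actual model.

```python
# Minimal sketch: build a synthetic press from four phases
# (rise, peak/hold, fall, return) with randomly varied force and duration.
import numpy as np

rng = np.random.default_rng(0)

def synthetic_press(dt=0.01):
    """Return one simulated press as a 1-D force-versus-time trace."""
    peak = rng.uniform(0.5, 2.0)        # peak force, arbitrary units (assumed range)
    t_rise = rng.uniform(0.05, 0.15)    # phase durations in seconds (assumed ranges)
    t_hold = rng.uniform(0.05, 0.30)
    t_fall = rng.uniform(0.05, 0.15)
    t_rest = rng.uniform(0.05, 0.20)
    rise = np.linspace(0, peak, int(t_rise / dt))   # rise phase
    hold = np.full(int(t_hold / dt), peak)          # peak phase
    fall = np.linspace(peak, 0, int(t_fall / dt))   # fall phase
    rest = np.zeros(int(t_rest / dt))               # return-to-baseline phase
    return np.concatenate([rise, hold, fall, rest])

def synthetic_sample(n_presses):
    """Concatenate several presses and add noise to mimic a real sensor trace."""
    trace = np.concatenate([synthetic_press() for _ in range(n_presses)])
    return trace + rng.normal(0, 0.02, trace.shape)

# e.g. one training example labeled "this sensor was pressed 3 times"
signal = synthetic_sample(3)
```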
The patch has been demonstrated in two ways. In one, a user types “Go!” with a series of presses, and the computer decodes the text while sending tactile confirmation – letting interaction happen without looking. In the other, the patch controls a racing game: presses steer the car, and vibration intensity shows the distance to nearby vehicles, with stronger vibrations indicating closer objects.
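As a toy illustration of the racing-game feedback, a mapping like the one below turns distance into vibration intensity; the 5-metre range and linear scaling are assumptions for the sketch, not values from the study.

```python
# Closer objects -> stronger vibration, clipped to the range [0, 1].
def vibration_intensity(distance_m: float, max_range_m: float = 5.0) -> float:
    closeness = 1.0 - distance_m / max_range_m
    return max(0.0, min(1.0, closeness))
```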