Exploring the Role of Touch in the Digital World
This is Part 2 in a series exploring how we experience technology through the five senses.
In Part 1, we looked at why Human–Computer Interaction (HCI) matters far beyond tech teams. Now, we turn to our first sense: touch. From the buzz in your pocket to the friction of a scroll, touch is one of the most immediate ways we interact with the digital world. What does the human system of touch involve, and how much of it can current technology replicate?
TL;DR:
Touch is a vital but underused part of how we experience technology.
The human touch system includes five components: texture, temperature, pain, body position, and movement.
Most interfaces rely on simple tactile feedback, like vibrations and clicks.
New tools, including surgical simulators and mid-air haptics, point to richer possibilities.
Designing better experiences means understanding the full scope of touch and where technology still falls short.
Maybe your phone buzzed in your pocket three times today, or your watch tapped you with a reminder. These aren’t just notifications; they’re the digital world reaching out to touch you. But what does it mean to feel in a digital environment? And how far can technology go in simulating that feeling?
I’ve been thinking a lot about how we don’t just use technology, we feel it. And that changes everything.
Touch is how we make sense of the physical world. Haptics is how machines try to recreate that sense through vibrations, resistance, and, as we’ll discuss, even through mid-air signals. Understanding the difference helps us design better, more human experiences.
What Is Touch?
Touch is a collection of sensory systems, each tuned to a different kind of input, from texture to temperature to movement:
Tactile: Pressure and texture, like the feel of fabric or the surface of a touchscreen
Thermoreception: Temperature, such as sensing a warm mug or a cold doorknob
Nociception: Pain, like touching something sharp or hot
Proprioception: Awareness of body position, like knowing where your hand is without looking
Kinesthesia: Sense of movement, like the rhythm of walking or reaching for a glass
Back in the early days of computing, haptic feedback was reserved for high-stakes systems such as flight simulators and nuclear control rooms, places where precision and feedback were mission-critical. These days, we tap, swipe, and scroll without much thought. But the richness of what we feel through those interactions still lags behind the real world.
Touch Today: How We Interact
Most of what we engage with now is tactile: that satisfying click on a trackpad, the jolt when your game controller rumbles, or the soft pulse of your smartwatch.
Haptic feedback typically falls into two categories: passive and active. Passive haptics are simple cues, like a phone buzz that says, “Pay attention”. They alert us, but they don’t mimic physical qualities. Active haptics go further, for example, a VR controller that simulates resistance when you pull a virtual lever or renders the subtle texture of a surface. These systems aim to replicate the feeling of interacting with physical objects.
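The distinction can be sketched in code. A minimal, purely illustrative model (these function names are hypothetical, not a real device API): a passive cue is a fixed pulse, while an active cue shapes its output from a simulated surface, so what you feel changes with how your virtual finger moves.

```python
import math

def passive_buzz(duration_ms=200, amplitude=1.0):
    """A passive cue: one fixed-strength pulse that just says 'pay attention'.
    Returns one amplitude sample per millisecond."""
    return [amplitude] * duration_ms

def active_texture(bumps_per_mm, speed_mm_s, duration_ms=200):
    """An active cue: amplitude follows a simulated bumpy surface, so the
    felt frequency depends on how fast the (virtual) finger sweeps it."""
    samples = []
    for t in range(duration_ms):
        distance_mm = speed_mm_s * (t / 1000.0)  # finger position at time t
        # Spatial bumps become a temporal oscillation, scaled into [0, 1].
        samples.append(0.5 + 0.5 * math.sin(2 * math.pi * bumps_per_mm * distance_mm))
    return samples

flat = passive_buzz()                                        # uniform alert
rough = active_texture(bumps_per_mm=2.0, speed_mm_s=100.0)   # texture render
```

Doubling `speed_mm_s` doubles the felt frequency, which is exactly the property a passive buzz lacks: it carries no information about the object being touched.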
Some newer systems even create the illusion of being touched or nudged, and full-body haptic suits are showing up in training and immersive gaming.
How Well Does HCI Support Our Sense of Touch?
| Sensory System | What It Does | Current HCI Support |
|---|---|---|
| Tactile | Pressure, texture, surface vibration | Well developed; widely used in phones, smartwatches, and controllers |
| Thermoreception | Sense of temperature (hot or cold) | Emerging; found in research tools and haptic suits |
| Nociception | Perception of pain | Experimental; mainly in medical or conceptual work |
| Proprioception | Awareness of body position | Partial; used in VR training and rehab tech |
| Kinesthesia | Awareness of body movement | Limited; early use in full-body haptic systems |
Where Research Is Heading
Researchers like Dr. Katherine Kuchenbecker are exploring not just how we can use touch in technology, but how we can truly feel with it. Kuchenbecker’s work focuses on the emotional and communicative power of touch, and her team has developed surgical simulators that allow students to feel the tension and resistance of real tissue. This kind of haptic feedback makes training more intuitive and more human.
The results are compelling. In one study, junior doctors who trained with haptics made fewer errors and used drills with greater control. It’s a clear example of how feel can impact performance.
Other researchers, like Dr. Hiroyuki Kajimoto, are working on touch without direct contact. Electro-tactile stimulation uses tiny electric currents to trigger nerve responses, while mid-air haptics use ultrasonic waves to create touch sensations in open space. These approaches aim to create immersive and hygienic interfaces that don’t rely on traditional screens or surfaces.
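The core trick behind mid-air haptics is geometry: an array of ultrasonic transducers is driven so that all of their waves arrive in phase at one focal point, creating a pressure peak you can feel. A minimal sketch of that phase calculation (real systems such as Ultraleap's add modulation and much more; this only shows the focusing step):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # 40 kHz, a common ultrasonic transducer frequency

def focus_phases(transducers, focal_point):
    """Phase offset (radians) each transducer should emit with so that
    every wave arrives at focal_point in phase, summing to a pressure peak."""
    wavelength = SPEED_OF_SOUND / FREQ  # roughly 8.6 mm at 40 kHz
    phases = []
    for position in transducers:
        d = math.dist(position, focal_point)  # path length in metres
        # A longer path needs a phase advance so it arrives in sync.
        phases.append((2 * math.pi * d / wavelength) % (2 * math.pi))
    return phases

# A tiny two-element "array": both transducers sit equidistant from the
# focal point, so they should be assigned identical phases.
array = [(-0.05, 0.0, 0.0), (0.05, 0.0, 0.0)]
phases = focus_phases(array, (0.0, 0.0, 0.2))
```

Steering the focal point is then just recomputing the phases for a new target, which is how these systems can draw shapes and trace lines across an open palm.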
We’re already seeing these breakthroughs move beyond the lab, as research turns into tools we’re starting to feel in the real world.
Real-World Applications in Focus
Take the Teslasuit. It doesn’t just vibrate, it simulates pressure, motion, and even temperature across your body. In virtual reality, you can feel the kick of a recoil or the jolt of landing after a jump. That kind of realism is changing how we train, compete, and communicate.
Ultraleap’s tech lets you feel a button press without touching a thing. It uses ultrasound to create mid-air sensations. It’s cleaner, yes, but also opens up new ways to interact when screens aren’t practical or accessible.
Haptics are also making an impact in physical therapy, where tactile feedback can guide motion and correct alignment. In automotive safety, vibrating steering wheels alert drivers to hazards. And in the operating room, robotic systems are evolving to include force feedback for enhanced control.
For organizations investing in user experience, understanding these haptic advances isn’t academic. It’s about staying competitive as touch becomes a differentiator, not just a feature.
The Human Impact
As these technologies move from prototypes to everyday products, their impact on real people becomes even more important. And as they evolve, they raise hard questions: who do they serve, how reliably do they work, and do they foster trust?
Replicating the full range of touch isn’t simple, nor is it always welcome. Overuse can lead to fatigue, irritation, or even phantom touch: the sensation of being tapped when you aren’t.
The accessibility challenge is real, but it’s also revealing new possibilities. People with neuropathy, prosthetics, or sensory differences may not experience haptic feedback in the same way, or at all. But with thoughtful design, haptic technology can do more than work around limitations. A well-tuned system could help someone with low vision navigate by feel, or reroute sensation to help a prosthetic user regain awareness through alternate input.
Robots have already transformed surgery through precision tools. The next step is giving surgeons back the ability to feel, to know through touch what the machine or tool is doing inside the body. That could improve outcomes and extend a sense of connection and control.
Beyond surgical applications, some systems now let users customize their haptic experience: choosing intensity, pattern, even the type of feedback they prefer. And researchers are pushing further, using AI to adapt haptics in real time. Imagine a device that dials down its vibration when you're anxious, or guides you differently when you're in motion. A productivity app might use gentle pulses to cue a break, while a training simulation adjusts resistance to match your learning pace or stress level.
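One simple way such adaptation could work is a rule that scales feedback from contextual signals. This is a hypothetical sketch, not any shipping product's algorithm; the inputs and weights are illustrative assumptions:

```python
def adapt_intensity(base_intensity, stress_level, motion_level):
    """Hypothetical adaptive rule: soften feedback when the user seems
    anxious, strengthen it when they are in motion and might miss a
    subtle cue. All inputs are normalised to [0, 1]; the result is
    clamped back into [0, 1]."""
    intensity = base_intensity * (1.0 - 0.5 * stress_level) \
                               * (1.0 + 0.5 * motion_level)
    return max(0.0, min(1.0, intensity))

calm    = adapt_intensity(0.6, stress_level=0.0, motion_level=0.0)  # unchanged
anxious = adapt_intensity(0.6, stress_level=1.0, motion_level=0.0)  # halved
moving  = adapt_intensity(0.6, stress_level=0.0, motion_level=1.0)  # boosted
```

Real adaptive systems would learn these weights from user signals rather than hard-code them, but even a fixed rule like this illustrates the shift from one-size-fits-all buzzes toward context-aware feedback.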
Recognition Through Touch
To appreciate how far haptics have come and how far they still need to go, try this:
Put your hand in your pocket or your bag. Without looking, try to identify three objects by touch alone. What gives them away? Texture? Weight? Shape? Temperature?
Now, imagine a digital system trying to replicate that. What would it take? Most of what we get today is just a buzz. And when we reduce touch to that one sensation, we lose something vital: complexity, nuance, and meaning.
Closing Thought
Imagine shaking hands in virtual reality, or feeling a reassuring pat during a video call. That’s where haptics is going, toward digital experiences that feel more authentically human.
We may never replicate every detail of human touch. But we can design for what matters: connection, clarity, and care. That’s the opportunity and the challenge.
When did a piece of technology last really connect with you, not just functionally, but in a way you felt it mattered?
Next, we’ll explore how sound shapes our digital experiences and why what you hear can change what you feel.
Further Reading
Advanced Haptic Technologies and Human-Machine Integration by Shull et al. (2015)
Foundational review of wearable haptics, categorizing applications in sensory replacement, augmentation, and training for rehabilitation and accessibility.
Mid-Air Ultrasound Haptic Technologies and Applications by Rakkolainen et al. (2021)
Survey of contactless ultrasonic haptics, covering sensation rendering techniques and emerging immersive interaction models.
Electrotactile Feedback Systems for Enhanced Portability by Yang et al. (2025)
Systematic analysis of 110 studies on electrical haptics, exploring perception, device design, and VR/HCI applications.
Novel Force Feedback Technology Improves Suturing in Robotic Surgery (2025)
Controlled study showing 43% reduced force and improved precision for novice surgeons using haptic-enhanced robotic systems.
Interference Haptic Stimulation and Consistent Quantitative Tactility (2024)
Breakthrough electrotactile screen design using pressure-sensitive transistors, enabling stable tactile rendering validated via neural response.