
David Kepron

Brain Food: Feeling Through the Screen

Haptics and getting touchy-feely with tech


Before we open our eyes as newborns, we “feel” the new world with the largest and most finely tuned sense organ we have: our skin. Although we experience the world with all five senses, our computer-user experiences have been dominated by sight and sound, leaving out one of the most profound senses of all: touch. Increasing the sensory input of an experience heightens the perception of reality and makes the interaction more profound. When we can “feel through a screen,” we will be able to deliver customer experiences that are more intense on a sensorial level and more meaningful because they create stronger body-based memories.

As mobile technology adoption continues, the creators of digital-centric brand experiences come up against the problem of not being able to engage the shopper fully, in an embodied way. Pictures can be beautiful, but touching, smelling, hearing and even tasting round out the experience. While we access more and more products online and through our devices, our brains want the satisfaction that comes from engaging all of the senses. As we access the brand through the screen, we fall dramatically short of really “getting it,” even if we create content to maximize the impact on the form factor we are using.

Researchers today are addressing the problem that our technology interfaces are typically smooth touchscreens with no sense of texture or temperature. Dragging your finger across a smooth glass screen is very different from feeling the texture of fabric on your fingertips.

What if our screens could get us closer to embodied experiences by providing tactile sensations?

What if a shopper could swipe their fingers across the screen of a handheld tablet and feel silk, wool, wood or the rough texture of brick?

Even though scientists have studied the physiological and neurological components of touch for decades, computer scientists have lagged behind in transferring that understanding of the biology of touch to human-computer interfaces.


This is now changing.

Technology developments are introducing the “touchy-feely” nature of experience into our devices. Haptic technology provides stimulation to the body by way of touch-capacitive sensors or a device’s movement, delivering sensory information either from the user to the device or the reverse. The term “haptic” comes from the Greek haptesthai, meaning to touch, grasp or perceive. As an adjective, it means relating to, or based on, the sense of touch. As a noun, it is usually used in the plural (haptics) and refers to the science and physiology of the sense of touch.

Most of us have experienced a form of haptic technology at work with our smartphones when they are set to vibrate with alerts and calls. These forms of haptics require small motors that vibrate, creating a feeling of movement.
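For readers who build for the web, the sketch below shows this motor-driven feedback in miniature, using the browser’s standard Vibration API (navigator.vibrate). The pattern timings are illustrative, and device support varies (it is largely limited to Android browsers).

```ts
// Minimal sketch: pulsing a device's vibration motor from a web page.
// Uses the standard Vibration API; timings below are illustrative.

/** Play a short "alert" buzz: vibrate 200 ms, pause 100 ms, vibrate 200 ms. */
function alertBuzz(): boolean {
  if (!("vibrate" in navigator)) {
    return false; // desktop browsers and iOS Safari typically lack support
  }
  return navigator.vibrate([200, 100, 200]);
}

/** Cancel any vibration currently in progress. */
function stopBuzz(): void {
  if ("vibrate" in navigator) {
    navigator.vibrate(0);
  }
}
```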

We may also have experienced touchscreens with digital buttons that combine the sound of a click with the sensation of movement when touched. Some time ago, designers at Nokia created a touchscreen that made on-screen buttons feel as if they were real. When users pressed a button, they felt a subtle movement replicating the push down and pop up. Adding to the sensation, users also heard an audible click. To do this, Nokia engineers placed two small piezoelectric sensor pads (materials such as quartz that generate an electric charge under pressure) under the screen and designed them to move slightly when pressed. To make this convincing, the movement and sound were synchronized perfectly, simulating a real button moving when pressed.
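The sketch below is not Nokia’s implementation; it simply illustrates, with standard web APIs (Vibration and Web Audio), how pairing a very short vibration pulse with a simultaneous click sound produces the “button press” illusion described above. The 15 ms pulse length is an assumption for illustration.

```ts
// Conceptual sketch of the button-press illusion: fire a brief vibration
// pulse and a click sound at the same moment. Standard web APIs only;
// this is not Nokia's piezoelectric hardware, and timings are illustrative.

let audioCtx: AudioContext | null = null;

function playClick(): void {
  // Browsers require a user gesture before audio can play, so the
  // context is created lazily inside the press handler.
  if (!audioCtx) {
    audioCtx = new AudioContext();
  }
  // A ~10 ms burst of decaying noise reads to the ear as a mechanical click.
  const length = Math.floor(audioCtx.sampleRate * 0.01);
  const buffer = audioCtx.createBuffer(1, length, audioCtx.sampleRate);
  const data = buffer.getChannelData(0);
  for (let i = 0; i < length; i++) {
    data[i] = (Math.random() * 2 - 1) * (1 - i / length);
  }
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  source.start();
}

function pressButton(): void {
  // The illusion depends on the pulse and the click landing together.
  if ("vibrate" in navigator) {
    navigator.vibrate(15); // short pulse stands in for the piezo "pop"
  }
  playClick();
}
```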

We’re increasingly turning to digital devices for communication and, in many cases, interacting with them requires the touch or swipe of a finger. Digital interfaces keep improving, both in the quality of the images they can display and in reducing the lag between a touch and a response on the screen. But while touch technology is getting better at providing immediate cause and effect, the things we touch, pinch and drag across the screen give us no sensory input related to what they are. Everything on the screen feels like glass.

Today, some companies are reimagining haptics, creating touch surfaces that convey the feel of various materials and finishes. Senseg, for example, has created high-fidelity tactile interfaces with no moving parts. Its technology uses an electrostatic field to turn touchscreens into feelscreens. By passing an ultra-low electrical current into an insulated electrode, it can create a small attractive force between the finger and the screen. By varying this attractive force over the surface, a variety of physical perceptions can be generated, including physical edges, contours and textures.
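The toy model below is not Senseg’s API; it only sketches the underlying idea. A hypothetical driveAmplitude value (0 to 1), which a haptic driver might accept, is varied with finger position so that a repeating pattern reads as a ribbed texture and a step change reads as an edge. All dimensions and values are illustrative.

```ts
// Toy model of electrostatic texture rendering: perceived texture comes
// from varying the finger-to-screen attractive force as the finger moves.
// "driveAmplitude" is a hypothetical 0..1 control value, not a real API.

/** A spatial grating: alternating force bands feel like a ribbed texture. */
function ribbedTexture(xMillimeters: number): number {
  const periodMm = 2; // one ridge every 2 mm (illustrative)
  return 0.5 + 0.5 * Math.sin((2 * Math.PI * xMillimeters) / periodMm);
}

/** A step change in force reads to the fingertip as a physical edge. */
function edgeAt(xMillimeters: number, edgeMm: number): number {
  return xMillimeters < edgeMm ? 0.1 : 0.9;
}

/** Called on each touch-move event with the finger's position. */
function driveAmplitude(xMm: number, _yMm: number): number {
  // Left half of the surface: ribbed texture; right half: smooth with an edge.
  return xMm < 40 ? ribbedTexture(xMm) : edgeAt(xMm, 60);
}
```

Because there are no moving parts in such a scheme, the “texture” is entirely a function of where the finger is and how the force is modulated, which is what allows the effect to be silent and immediate.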


With no moving parts, the technology’s response to touch is immediate and silent. The technology is scalable to different form factors, from cellphones and touchpads to larger interactive displays. What’s even more interesting is that this haptic technology can be applied to almost any flat or curved, hard or soft, transparent or opaque surface, which extends the applications far beyond the touchscreens we use today.

In the future, haptic technology will allow us to feel through the screen, connecting us in a more natural touch-sensitive way to our environment. Being able to feel through the screen will give us a more profound and emotional connection to the virtual world and each other.

The long-standing criticism that digital customer experiences are devoid of the ability to “touch and feel” products may well be resolved with further developments in haptic technology. Online shopping will take a significant turn toward being more “real” when shoppers are able to touch what they are considering buying.

It is true that digital technologies such as augmented and virtual reality (AR/VR) can put shoppers in places that are visually compelling. However, we are a ways off from full adoption of these tools, and we haven’t yet found a way for people to spend extended periods of time in virtual space without the nauseating side effects of a brain fooled into believing it is moving when the body is telling it otherwise.

As technologies such as haptic computing evolve, they will support the dissolution of barriers to emotionally fulfilling and relevant shopping experiences. Shoppers will engage with technology that allows for a degree of physical interaction even when they are not in the store.

David Kepron is Vice President – Global Design Strategies with Marriott International. His focus is on the creation of compelling customer experiences within a unique group of Marriott brands called the “Lifestyle Collection,” including Autograph, Renaissance and Moxy hotels. A frequently requested speaker for retailers, hoteliers and design professionals nationally and internationally, David shares his expertise on subjects ranging from consumer behavior and trends, brain science and buying behavior, and store design and visual merchandising to creativity and innovation. David is also the author of “Retail (r)Evolution: Why Creating Right-Brain Stores will Shape the Future of Shopping in a Digitally Driven World,” published by ST Media Group Intl. and available online from ST Books. @davidkepron; www.retail-r-evolution.com.
