Sensory design principles emphasize the interconnected nature of human perception and prompt designers to explore non-visual solutions.
Life Is Multisensory
Smell-O-Vision.
AromaRama.
iSmell.
Real names of real products once thought to be the next big things in entertainment and technology. All three failed miserably, along with countless other olfactory gadgets and multisensory gizmos. iSmell bankrupted its founders, AromaRama faded into oblivion, and Smell-O-Vision made Time’s “100 Worst Ideas of the Century” in 1999.
Contraptions such as Smell-O-Vision and iSmell represent the lower rungs of practicality. They also reveal a profound impulse that permeates invention: the desire to form symbiotic ties between products and the people who use them.
Unfortunately, most digital designers attempt to establish these ties through sight and sound alone, as if humans were all eyes and ears. To some degree, that makes sense. The practical constraints of digital devices make vision and hearing the most obvious experiential targets. It would be unwise to advocate building AromaRama-like hardware into smartphones, tablets, and laptops.
Still, most human activities are multisensory. Everything from leisure to language requires a symphony of senses. Are vision and hearing the only modes of perception worth considering in the digital design process?
The Five-Sense Myth
If there were a sense hierarchy in digital product design, it would consist of sight, hearing, and touch. The reason is evident: mobile devices rely on visual, auditory, and tactile feedback. But not only are there more than three senses, there are more than the five commonly cited. Aristotle made that pentamerous proposition, but today, experts suggest that humans have between 9 and 33 distinct senses.
At a high level, there are four types of human sensory receptors, each attuned to a corresponding physical stimulus: photoreceptors (light), chemoreceptors (chemicals), thermoreceptors (temperature), and mechanoreceptors (mechanical forces). The information gathered from these receptors triggers processes such as vision, hearing, and smell (also called “sense modalities”). There are nine sense modalities — that is, sensations perceived after a stimulus:
- Vision: The power to see objects by use of the eyes
- Hearing: The faculty by which sounds are perceived in the ears
- Smell: To detect odors through the nose using the olfactory nerves
- Taste: The sense by which the tongue discerns the flavor of something
- Touch: The sense by which materials are perceived through physical contact
- Pain: A distressing sensation occurring in a part of the body
- Mechanoreception: The body’s perception of vibration, stretching, pressure, or other mechanical stimuli
- Temperature: The discernment of hot and cold through receptors in the skin
- Interoception: The detection of stimuli and sensations originating within the body
Each of the nine modalities has sub-senses that are up for debate. Some are considered plausible, and others are deemed radical.
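The receptor-to-stimulus-to-modality relationship described above can be sketched as a small data structure. This is an illustrative grouping only; the modality assignments are a simplifying assumption for readability, not an authoritative physiology reference, and the names are chosen for this sketch.

```python
# Illustrative sketch: the four receptor types, the stimulus each responds
# to, and example sense modalities each feeds into. The groupings are an
# assumption made for clarity, not a physiological taxonomy.
RECEPTOR_TAXONOMY = {
    "photoreceptors":   {"stimulus": "light",             "modalities": ["vision"]},
    "chemoreceptors":   {"stimulus": "chemicals",         "modalities": ["smell", "taste"]},
    "thermoreceptors":  {"stimulus": "temperature",       "modalities": ["temperature"]},
    "mechanoreceptors": {"stimulus": "mechanical forces", "modalities": ["touch", "hearing", "mechanoreception"]},
}

def modalities_for(receptor: str) -> list:
    """Return the example sense modalities fed by a given receptor type."""
    return RECEPTOR_TAXONOMY[receptor]["modalities"]
```

A structure like this makes the article's point concrete: only a narrow slice of the table (vision, hearing, touch) is routinely addressed by digital products, even though every receptor type participates in everyday experience.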
How Are Sensory Design and Digital Design Related?
Whether there are 5, 9, or 33 senses, designers prioritize sight, hearing, and touch because it’s impossible to taste an app, smell it, or feel its temperature. But what if it weren’t?
At the core of sensory design lies this reality: Every digital interaction is a sensory experience. The aim is to:
- Make sensory engagement more intentional and multifaceted
- Activate the senses in ways that bolster UX (enhanced navigation, improved discoverability, etc.)
- Create product (and brand) experiences that are more appealing and memorable
To leverage perception’s full potential, designers need a principled framework for including senses in the digital design process.