Humans are multi-sensory beings who make sense of the
world through multiple sensory modalities. However, most everyday technologies, including smartphones, tablets, and personal computers, fail to engage all the senses effectively. Instead, users rely heavily on vision to interface with these technologies, and as a result vision is disproportionately stimulated compared to the other senses. This hegemony of vision in design not only excludes the visually challenged but also creates sensory overload and fatigue in those who are not. Limiting the senses engaged while interfacing with digital devices further increases the cognitive burden placed on the user: when complex information is conveyed unimodally in highly stimulating environments, people must expend considerable energy and cognitive resources to attend to the information, grasp it, commit it to memory, and recall it.
This thesis aims to address contemporary sensory imbalance by investigating and hypothesizing effective, accessible multimodal interfaces for everyday technologies, and by offering considerations for designers who seek to leverage multiple senses to convey information. Because taste typically requires people to consume substances, its effects are difficult to control when designing for larger populations. This thesis therefore intentionally focuses on designing for touch, smell, hearing, and vision as a means of creating experiences that effectively disperse the intake of information across multiple senses. In particular, it investigates how interactions with everyday technology can be designed to engage the senses beyond vision, creating balanced, immersive, and inclusive experiences.