
Emothesia: interoception and interpersonal senses

Updated: Mar 18

Made possible through the Magic Leap + OpenBCI Emotibit / Apple Watch

Overview

Emothesia is a sensory augmentation system that interprets biosignals from yourself or someone else and maps them to your vision and hearing. It consists of an AR application (built on the Magic Leap 2) that provides visual-audio mapping and a dashboard display of heart rate, temperature, and electrodermal activity (EDA) signals captured by the OpenBCI Emotibit and the Apple Watch. By enabling you to listen to someone else's body, the system aims to enhance interpersonal awareness of emotional states and triggers. By providing an interoceptive sense of your own biosignals, it aims to enhance individual emotional awareness.

Treyden and Alice testing out Emothesia on the Magic Leap 2: a woman wearing the AR headset, controller in hand, moves toward a man.

Rationale

Alfred Adler once said that all problems are interpersonal problems. In modern societies, despite digital technology augmenting our intelligence with access to near-infinite information and automation ever increasing our efficiency, we still struggle with most of the same (if not exacerbated) interpersonal problems we have always had. We envision augmentation technology that enhances our connection with and understanding of each other as a core method of creating ripple improvements in the systems of society.


As a small step in that direction, we aimed to increase awareness of our own and each other's emotions. Humans have the natural ability to perceive parts of their own being and state – externally, via proprioceptive senses, or internally, via interoceptive senses. This self-awareness, often sharpened through spiritual and meditative practices, orients us in the world and positions us better to navigate complex situations. Emerging technologies such as AR headsets, haptic wearables, and biometric sensors provide the opportunity to augment our interoceptive senses through visual, audio, and haptic mapping of biodata. In this project, we explored initial prototypes of wearable interoception augmentation systems.




Development Process

We prototyped two systems, one focused on visual-audio interoception and the other focused on haptic intervention.


I. Visual-Audio Interoception

We developed an AR system based on the Magic Leap 2 and OpenBCI Emotibit with both explicit and implicit visual-audio representations of your biosignals. The explicit representation is a dashboard of animated icons that change in response to different biosignals (heart rate, temperature, EDA). The implicit representation consists of visual dimming of the world and heartbeat audio that plays on each of your heartbeats. The signals are processed in TouchDesigner and sent via OSC, over a mobile hotspot, to the Magic Leap headset. Heart rate processing performs peak recognition on the raw photoplethysmography (PPG) signal to detect heartbeats; each detected beat triggers the dimming pulse and the heart icon's animation. Temperature and EDA processing normalizes each signal to an experimentally calibrated range, which controls the color and height lerping of the temperature icon and the magnitude of random movement of the nervous icon, respectively.
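
For concreteness, here is a minimal Python sketch of that mapping – peak recognition on a PPG window, range normalization of temperature and EDA, and forwarding over OSC. The library choices (numpy, scipy, python-osc), OSC addresses, network endpoint, sample rate, and calibration ranges are illustrative assumptions, not the actual TouchDesigner network we used.

    import numpy as np
    from scipy.signal import find_peaks
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("192.168.1.50", 8000)  # hypothetical headset IP/port
    PPG_RATE_HZ = 25  # assumed Emotibit PPG sample rate

    def detect_heartbeats(ppg_window: np.ndarray) -> np.ndarray:
        # Peak recognition on a raw PPG window; each peak is treated as one beat.
        # A ~0.33 s refractory period (max ~180 bpm) suppresses double-counting.
        peaks, _ = find_peaks(ppg_window,
                              distance=max(1, int(0.33 * PPG_RATE_HZ)),
                              prominence=0.5 * np.std(ppg_window))
        return peaks

    def normalize(value, lo, hi):
        # Map a raw reading onto 0..1 against an experimentally calibrated range.
        return float(np.clip((value - lo) / (hi - lo), 0.0, 1.0))

    def send_frame(ppg_window, temperature_c, eda_microsiemens):
        for _ in detect_heartbeats(ppg_window):
            client.send_message("/emothesia/heartbeat", 1)  # drives dimming + heart icon
        # Normalized temperature drives color/height lerping of the temperature icon.
        client.send_message("/emothesia/temp", normalize(temperature_c, 30.0, 37.0))
        # Normalized EDA drives the jitter magnitude of the nervous icon.
        client.send_message("/emothesia/eda", normalize(eda_microsiemens, 0.1, 10.0))
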


II. Haptic Intervention

Research has shown that audio and haptic simulations of a heartbeat can affect your real heart rate. We developed a haptic intervention for influencing your own heartbeat on the Apple Watch. The system consists of a heart rate sensing module based on an open-source heart rate measurement app, a haptic actuation module through which the user can adjust the rate of a vibrational beat (to simulate a heartbeat), and an iPhone companion app that facilitates OSC communication with other watches or headset devices. The device could be used for hacking your own physiology – e.g., increasing your heart rate when you want to become more energetic – or for synchronizing with another person, where perhaps one person needs to calm down and the other needs to become more aroused. Research has shown that physiological synchronization is associated with improved team performance and stronger parent-child relationships [Interpersonal Autonomic Physiology: A Systematic Review of the Literature].
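
As an illustration only – the actual watch app runs on watchOS and is not shown here – the pacing and OSC-broadcast logic amounts to something like the following Python sketch. The OSC address, peer endpoint, and function names are hypothetical.

    import time
    from pythonosc.udp_client import SimpleUDPClient

    peer = SimpleUDPClient("192.168.1.60", 9000)  # hypothetical peer watch/headset

    def pace_haptic_heartbeat(target_bpm, duration_s, trigger_haptic):
        # Fire a simulated heartbeat at target_bpm and mirror each beat over OSC
        # so another device can entrain to the same rhythm.
        interval = 60.0 / target_bpm
        end = time.monotonic() + duration_s
        while time.monotonic() < end:
            trigger_haptic()  # platform-specific vibration (Taptic Engine on the watch)
            peer.send_message("/emothesia/haptic_beat", float(target_bpm))
            time.sleep(interval)

    # Example: pace a calmer ~55 bpm rhythm for one minute.
    # pace_haptic_heartbeat(55.0, 60.0, trigger_haptic=lambda: None)
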


Speculative notion: We explored the notion of interpersonal senses, i.e. senses that reflect similarities and differences between the biodata of two or more people in one cohesive language. For example, imagine a visual-audio system in which dimming and background noise reflect the absolute difference between your physiological state and someone else's – the more in sync you become, the brighter and quieter your world becomes. Interventions such as the haptic heartbeat synchronization described above could be integrated into this interpersonal sensing system to give participants more control over their shared state.
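
A minimal sketch of how such an interpersonal mapping might be computed, assuming already-normalized (0–1) signals; the signal names and weights are illustrative, not something we implemented:

    def interpersonal_dimming(mine, theirs,
                              weights=(("hr", 0.5), ("temp", 0.2), ("eda", 0.3))):
        # 0 = fully in sync (bright, quiet world); 1 = maximally out of sync.
        diff = sum(w * abs(mine[k] - theirs[k]) for k, w in weights)
        return min(1.0, diff)

    # Example with normalized (0..1) signals from two fairly synchronized people:
    level = interpersonal_dimming({"hr": 0.42, "temp": 0.55, "eda": 0.30},
                                  {"hr": 0.47, "temp": 0.50, "eda": 0.35})
    # level ~= 0.05 -> drive both the global dimmer and background-noise gain with it.
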


Findings

Through our work, we discovered the following insights – some that may present novel perspectives, and some that support existing ones:


I. Dimming and implicit sensory feedback create a higher sense of connection & embodiment.

Contrasting the explicit and implicit modes of sensory feedback, we found that implicit feedback – dimming the world and playing audio on heartbeats – created a greater sense of embodied connection to our own or others' biosignals. Dimming in particular, enabled by the Magic Leap 2's unique global dimming capability, created immersion through diminishment: the invasive nature of the whole world dimming created a greater sense of immediacy and intimacy with the source of the signal. The explicit icon dashboard provided more quantitative representations of the data, which we also found useful for comparisons across time – it may be difficult to realize from implicit feedback alone that your heart rate is slowing, but the dashboard shows your current biosignals in relation to a static range, allowing for bird's-eye monitoring of the state of the self. We recommend hybrid systems that integrate both explicit and implicit feedback.


II. Headsets present a barrier to interpersonal connection because they obscure the face.

Eye contact and recognition of microexpressions are crucial to interpersonal connection and communication – by obscuring the eyes and a large portion of the face, XR headsets interfere with face-to-face interactions. If systems like ours are to be used to enhance interpersonal connection, the cost of hiding the face will outweigh the benefit of augmenting our perception of each other with additional biosensory information until headsets become far sleeker in form factor. In select directional observation use cases, however, such as a therapist or doctor observing a patient, this system could be appropriate.


III. Hardware system constraints limit the immediate usability of emerging systems.

Due to the discomfort of wearing headsets for prolonged periods of time, along with the inaccuracies of biosensing based on wearable systems, our technology is still purely speculative. Significant hardware advancements would need to be made before such AR-biosensor based augmentation systems would be usable in daily life.


IV. Biosignal interpretation raises philosophical questions about the role of the machine in interpreting human data.

In exploring ways to interpret raw biosignal data into more meaningful feedback for the user, we found fundamental issues with the process of interpretation itself. Like language, the processes and metrics we develop for data processing and interpretation shape our phenomenological reality. In processing biosignals, common methods aim to classify human emotional states into discrete categories – namely, the six basic emotions defined by Paul Ekman: happiness, sadness, disgust, fear, surprise, and anger. The majority of current emotion recognition systems are based on this taxonomy. Yet this categorization is imperfect and can even be misleading, as evidenced by the numerous variations in the taxonomies different cultures employ (e.g., the Japanese "amae" (甘え), pleasure in relying on another's love; the Inuit "iktsuarpok", anticipation of someone's arrival; the Russian "toska", spiritual anguish), as well as by the fundamentally continuous nature of our emotional experiences.

