Wearable devices increasingly capture physiological signals related to users’ affective and cognitive states, yet these signals are commonly presented through numerical dashboards that are difficult to interpret and act upon in everyday contexts. We introduce EmoDrink, a Mixed Reality (MR) research framework for communicating physiology-informed wellbeing recommendations through embodied representations.

EmoDrink integrates smartwatch-derived signals with brief self-reports and contextual cues to generate a single beverage recommendation, then holds this recommendation constant while varying how it is presented: an abstract visualization, a generic embodied agent, and a personalized “future self” avatar.
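The study design described above can be sketched as follows. This is a minimal illustration, not EmoDrink's actual implementation: the input fields, the toy recommendation rule, and all function names are assumptions introduced here purely to show how one recommendation is computed once and then held constant across the three presentation conditions.

```python
from dataclasses import dataclass
from enum import Enum

# The three presentation conditions named in the abstract.
class Presentation(Enum):
    ABSTRACT_VIZ = "abstract visualization"
    GENERIC_AGENT = "generic embodied agent"
    FUTURE_SELF = "personalized future-self avatar"

# Hypothetical input bundle: smartwatch signal, self-report, context cue.
@dataclass
class Inputs:
    heart_rate: float   # smartwatch-derived physiological signal (bpm)
    stress_report: int  # brief self-report, e.g. on a 1-5 scale
    context: str        # contextual cue, e.g. "afternoon slump"

def recommend_beverage(inputs: Inputs) -> str:
    """Fuse the signals into a single recommendation (toy rule, for illustration)."""
    if inputs.stress_report >= 4 or inputs.heart_rate > 100:
        return "herbal tea"
    if inputs.context == "afternoon slump":
        return "water"
    return "green tea"

def render(recommendation: str, condition: Presentation) -> str:
    """Hold the recommendation constant; only the presentation varies."""
    return f"[{condition.value}] suggests: {recommendation}"

inputs = Inputs(heart_rate=110.0, stress_report=2, context="afternoon slump")
rec = recommend_beverage(inputs)  # computed exactly once
messages = [render(rec, c) for c in Presentation]
```

The key point of the design is visible in the last two lines: `recommend_beverage` runs once, so any difference in user response across the three `messages` can be attributed to the presentation condition rather than to the recommendation itself.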
