
Multimodal Health Sensing

Designing tools to support personalized, holistic health understanding using menstrual health as a case study.

🎯 Problem

Users struggle to make sense of fragmented, complex health data because existing tools do not support holistic understanding or personal context

🔬 Approach

Designed and deployed a prototype app that integrates user-tracked hormones, wearable data, and personal reflections to support multimodal menstrual health sensemaking

📊 Impact

Highlighted the challenges of interpreting complex multimodal data and derived design implications for helping users align their personal mental models with physiological signals

3 User Studies
80+ Participants
6500+ Days of Health Data Tracked

Smart Cooking Assistant

Context-aware AI assistant for hands-free cooking interactions and video tutorial support

Multimodal Assistants · UX Research · Applied AI

Modeling Wearable and Physiological Data Relationships

Building models to capture physiological variations across the menstrual cycle

Wearables · Multimodal Sensing · Statistical and Machine Learning

Industrial AR for Warehouses

Optimizing order-picking efficiency through head-worn display positioning and interface design

Head-Worn Displays · Interface Design · Statistical Analysis

Accessible Real-Time Captioning

Captioning on Glass (CoG): real-time captioning for deaf and hard-of-hearing users on head-worn displays such as Google Glass

Head-Worn Displays · Accessibility · Statistical Analysis

Innovation & Hackathon Projects


wondARland

AR Social Experience

ARCore · Unity · C#
View on Devpost

Pitch

Personalized Interview Guidance

Speech Recognition · Google Cloud APIs
View on Devpost

Ubi

Real-Time Translation

Speech Recognition · NLP APIs
View on Devpost

Scan

AR-based recognition of bills and coins

Computer Vision · Augmented Reality
View on Devpost