Data@Hand: A Self-Tracking App That Leverages Speech+Touch Interaction for Visual Exploration


Most mobile health apps employ data visualization to help people view their health and activity data, but these apps provide limited support for visual data exploration. Furthermore, despite its huge potential benefits, mobile visualization research in the personal data context is sparse. This work aims to empower people to easily navigate and compare their personal health data on smartphones by enabling flexible time manipulation with speech. We designed and developed Data@Hand, a mobile app that leverages the synergy of two complementary modalities: speech and touch. Through an exploratory study with 13 long-term Fitbit users, we examined how multimodal interaction helps participants explore their own health data. Participants successfully adopted multimodal interaction (i.e., speech and touch) for convenient and fluid data exploration. Based on the quantitative and qualitative findings, we discuss design implications and opportunities with multimodal interaction for better supporting visual data exploration on mobile devices.

Project Website

Source Code

*The camera-ready paper will be added soon!


Best Paper Honorable Mention Award
Data@Hand: Fostering Visual Exploration of Personal Data on Smartphones Leveraging Speech and Touch Interaction
Young-Ho Kim, Bongshin Lee, Arjun Srinivasan, and Eun Kyoung Choe
ACM CHI 2021 (To appear) (Full Paper) DOI PDF Slides GitHub Web BibTeX