Data@Hand: Multimodal Data Exploration of Personal Data
Members
Young-Ho Kim
Bongshin Lee
Arjun Srinivasan
Eun Kyoung Choe
Keywords
Personal data visualization
visual data exploration
speech
multimodal interaction
smartphone

Data@Hand is a cross-platform smartphone app that facilitates visual data exploration by leveraging both speech and touch interactions. To overcome smartphones' limitations, such as small screen size and the lack of precise pointing input, Data@Hand exploits the synergy of speech and touch: speech-based interaction takes little screen space, and natural language is flexible enough to cover the many ways people specify dates and date ranges (e.g., "October 7th", "Last Sunday", "This month"). Currently, Data@Hand supports displaying Fitbit data (e.g., step count, heart rate, sleep, and weight) for navigation and temporal comparison tasks.
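Data@Hand's actual language processing is richer than we can show here, but the core idea of resolving a relative date expression to a concrete date range can be sketched with a small, hypothetical helper (the function name and supported phrases are illustrative, not the app's real grammar):

```python
from datetime import date, timedelta

def resolve(expr: str, today: date) -> tuple[date, date]:
    """Resolve a relative date expression to an inclusive (start, end) range.
    Hypothetical sketch; Data@Hand's real grammar covers many more forms."""
    expr = expr.lower()
    if expr == "today":
        return today, today
    if expr == "this month":
        start = today.replace(day=1)
        # Last day of month: jump into next month, then step back one day.
        next_month = (start + timedelta(days=32)).replace(day=1)
        return start, next_month - timedelta(days=1)
    if expr.startswith("last "):
        weekdays = ["monday", "tuesday", "wednesday", "thursday",
                    "friday", "saturday", "sunday"]
        target = weekdays.index(expr.split()[1])
        # Most recent strictly-past occurrence of that weekday.
        delta = (today.weekday() - target) % 7 or 7
        d = today - timedelta(days=delta)
        return d, d
    raise ValueError(f"unsupported expression: {expr}")

# e.g., with today = 2021-10-13 (a Wednesday):
# resolve("last sunday", ...) yields the single day 2021-10-10.
```

Once an utterance resolves to a range like this, the same range can drive both the visualization's time axis and a comparison against another spoken range.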

Demo Video

Funding

  • National Science Foundation award #1753452 (CAREER: Advancing Personal Informatics through Semi-Automated and Collaborative Tracking, PI: Dr. Eun Kyoung Choe).
  • Young-Ho Kim was in part supported by the Basic Science Research Program through the National Research Foundation of Korea, funded by the Ministry of Education (NRF2019R1A6A3A12031352).

Publication

Honorable Mention Award
Data@Hand: Fostering Visual Exploration of Personal Data on Smartphones Leveraging Speech and Touch Interaction
Young-Ho Kim,
Bongshin Lee,
Arjun Srinivasan,
and Eun Kyoung Choe
ACM CHI 2021 (Full Paper)
FoodScrap: Capturing Rich Food Contexts with Speech
Members
Yuhan Luo
Young-Ho Kim
Bongshin Lee
Naeemul Hassan
Eun Kyoung Choe
Keywords
Food journaling
Speech input
Smartphone
OmniTrack

The factors influencing people’s food decisions, such as one’s mood and eating environment, are important information for fostering self-reflection and developing a personalized healthy diet. However, this information is difficult to collect consistently due to the heavy data capture burden. In this work, we examine how speech input supports capturing everyday food practices through a week-long data collection study.

Using OmniTrack for Research, we deployed FoodScrap, a speech-based food journaling app that allows people to capture food components, preparation methods, and food decisions. Using speech input, participants detailed their meal ingredients and elaborated on their food decisions by describing the eating moments, explaining their eating strategies, and assessing their food practices. Participants recognized that speech input facilitated self-reflection, but expressed concerns around re-recording, mental load, social constraints, and privacy.

Funding

Publication

FoodScrap: Promoting Rich Data Capture and Reflective Food Journaling Through Speech Input
Yuhan Luo,
Young-Ho Kim,
Bongshin Lee,
Naeemul Hassan,
and Eun Kyoung Choe
ACM DIS 2021 (Full Paper)
OmniTrack for Research: A Research Platform for Streamlining Mobile-based In-Situ Data Collection
Members
Young-Ho Kim
Bongshin Lee
Jinwook Seo
Eun Kyoung Choe
Keywords
In-situ data collection
research toolkit
mobile
web
OmniTrack

OmniTrack for Research (O4R) is a research platform for mobile-based in-situ data collection, which streamlines the implementation and deployment of a mobile data collection tool. O4R enables researchers to rapidly translate their study design into a study app, deploy the app remotely, and monitor the data collection, all without requiring any coding.

In-situ data collection studies (e.g., diary studies, experience sampling) are commonly used in HCI and UbiComp research to capture people's behaviors, contexts, and self-report measures. To implement such studies, researchers either rely on commercial platforms or build custom tools, which can be inflexible, costly, or time-consuming. O4R bridges this gap between available tools and researchers' needs.

Research Papers That Used OmniTrack for Research

OmniTrack for Research has been used by us and by other researchers to conduct studies published at peer-reviewed venues. Here is the list of publications that used OmniTrack for Research:

  1. Yuhan Luo, Young-Ho Kim, Bongshin Lee, Naeemul Hassan, and Eun Kyoung Choe
    FoodScrap: Promoting Rich Data Capture and Reflective Food Journaling Through Speech Input
    ACM DIS 2021 Link
  2. Eunkyung Jo, Austin L. Toombs, Colin M. Gray, and Hwajung Hong
    Understanding Parenting Stress through Co-designed Self-Trackers
    ACM CHI 2020 Link
  3. Young-Ho Kim, Eun Kyoung Choe, Bongshin Lee, and Jinwook Seo
    Understanding Personal Productivity: How Knowledge Workers Define, Evaluate, and Reflect on Their Productivity
    ACM CHI 2019 Link
  4. Sung-In Kim, Eunkyung Jo, Myeonghan Ryu, Inha Cha, Young-Ho Kim, Heejeong Yoo, and Hwajung Hong
    Toward Becoming a Better Self: Understanding Self-Tracking Experiences of Adolescents with Autism Spectrum Disorder Using Custom Trackers
    EAI PervasiveHealth 2019 Link

Funding

OmniTrack: A Flexible Self-Tracking App for Semi-Automated Tracking
Members
Young-Ho Kim
Jae Ho Jeon
Bongshin Lee
Eun Kyoung Choe
Jinwook Seo
Keywords
Flexible self-tracking
semi-automated tracking
mobile
OmniTrack

OmniTrack is a mobile self-tracking app that enables self-trackers to construct their own trackers and customize tracking items to meet their individual needs. OmniTrack was designed based on the semi-automated tracking concept: people can build a tracker by combining automated and manual tracking methods to balance capture burden against tracking feasibility. Under this notion, OmniTrack allows people to combine input fields to define the input schema of a tracker and to attach external sensing services, such as Fitbit, that feed sensor data into individual data fields. People can also use Triggers to let the system initiate data entry in a fully automated way.
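OmniTrack's internal data model is not reproduced here, but the semi-automated idea can be illustrated with a hypothetical schema sketch: a tracker mixes manually entered fields with sensor-fed ones, and a trigger initiates entries automatically (all names below are illustrative, not OmniTrack's real API):

```python
from dataclasses import dataclass, field

@dataclass
class Field:
    name: str
    kind: str               # e.g. "number", "text", "time"
    source: str = "manual"  # "manual", or an external service such as "fitbit"

@dataclass
class Trigger:
    schedule: str           # e.g. "daily 22:00"
    action: str = "log"     # automatically initiate a data entry

@dataclass
class Tracker:
    name: str
    fields: list
    triggers: list = field(default_factory=list)

# A semi-automated end-of-day tracker: step count is fed automatically
# from Fitbit, mood is entered manually, and a nightly trigger prompts
# (or creates) the entry, reducing the capture burden.
wrap_up = Tracker(
    name="Daily wrap-up",
    fields=[Field("steps", "number", source="fitbit"),
            Field("mood", "text")],
    triggers=[Trigger("daily 22:00")],
)
```

The design point is that automation level is chosen per field, not per tracker, which is what lets users tune the burden/feasibility trade-off.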

Demo Video

Funding

  • National Science Foundation under award number CHS-1652715.
  • National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. NRF-2016R1A2B2007153).

Publication

OmniTrack: A Flexible Self-Tracking Approach Leveraging Semi-Automated Tracking
Young-Ho Kim,
Jae Ho Jeon,
Bongshin Lee,
Eun Kyoung Choe,
and Jinwook Seo
PACM IMWUT (UbiComp 2017) (Full Paper)
Diary Study to Investigate Knowledge Workers' Holistic Nature of Productivity
Members
Young-Ho Kim
Eun Kyoung Choe
Bongshin Lee
Jinwook Seo
Keywords
Productivity
Diary study
Qualitative Analysis
OmniTrack

Existing productivity tracking tools are rarely designed to capture the diverse and nebulous nature of individuals' activities: for example, screen time trackers such as RescueTime do not support capturing work activities that do not involve digital devices. As the distinction between work and life has become fuzzy, we need a more holistic understanding of how knowledge workers conceptualize their productivity in both work and non-work contexts. Such knowledge would inform the design of productivity tracking technologies.

We conducted a mobile diary study using OmniTrack for Research, in which participants captured their productive activities along with their rationales for considering them productive. From the study, we identified six themes that participants consider when evaluating their productivity. Participants reported a wide range of productive activities beyond typical desk-bound work, from having a personal conversation with their dad to getting a haircut. We learned that the ways people assess productivity are more diverse and complex than we had expected, and that the concept of productivity is highly individualized, calling for personalization and customization approaches in productivity tracking.

Funding

Publication

Understanding Personal Productivity: How Knowledge Workers Define, Evaluate, and Reflect on Their Productivity
Young-Ho Kim,
Eun Kyoung Choe,
Bongshin Lee,
and Jinwook Seo
ACM CHI 2019 (Full Paper)
TimeAware: Leveraging Framing Effects to Enhance Personal Productivity
Members
Young-Ho Kim
Jae Ho Jeon
Eun Kyoung Choe
Bongshin Lee
KwonHyun Kim
Jinwook Seo
Keywords
Productivity
Personal data visualization
desktop
ambient display
framing effects

Screen time tracking is now prevalent, but we have little knowledge of how to design effective feedback on screen time information. To help people enhance their personal productivity through effective feedback, we designed and developed TimeAware, a self-monitoring system for capturing and reflecting on personal computer usage behaviors. TimeAware employs an ambient widget to promote self-awareness and lower the feedback access burden, and a web-based information dashboard to visualize people’s detailed computer usage. To examine the effect of framing on individuals’ productivity, we compared two versions of TimeAware, each with a different framing setting—one emphasizing productive activities and the other emphasizing distracting activities. We found a significant effect of framing on participants’ productivity: only participants in the negative framing condition improved their productivity. The ambient widget seemed to help sustain engagement with data and enhance self-awareness.

Funding

  • National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. NRF2014R1A2A2A03006998).

Publication

TimeAware: Leveraging Framing Effects to Enhance Personal Productivity
Young-Ho Kim,
Jae Ho Jeon,
Eun Kyoung Choe,
Bongshin Lee,
KwonHyun Kim,
and Jinwook Seo
ACM CHI 2016 (Full Paper)