Used together, mobile/wearable sensor readings and task/time/motion scripts could become a powerful yet straightforward way to detect and log a person’s activities. As an early step, we want to develop an interactive system for associating those two types of data.
This project has four main pieces:
1. Design and implement a scheme for a user to specify his/her activities in terms of a sequence of timed actions. It should allow for recursive decomposition and for re-use of “atomic” actions (e.g., sitting down, lifting an object, opening a door, walking between refrigerator and stove, etc.). This might be captured in or transferred to a calendar program for convenience.
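One way to sketch such a scheme is a recursive task tree in which a task is either a reusable atomic action or a sequence of sub-tasks. The class and action names below (`AtomicAction`, `Task`, `sit_down`, etc.) are hypothetical illustrations, not part of any existing implementation:

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass(frozen=True)
class AtomicAction:
    """A reusable leaf action with a nominal duration."""
    name: str           # e.g. "sit_down", "open_door"
    duration_s: float   # nominal duration in seconds

@dataclass
class Task:
    """A task is a sequence of sub-tasks and/or atomic actions (recursive)."""
    name: str
    steps: List[Union["Task", AtomicAction]] = field(default_factory=list)

    def total_duration(self) -> float:
        """Sum nominal durations over the whole recursive tree."""
        return sum(
            s.duration_s if isinstance(s, AtomicAction) else s.total_duration()
            for s in self.steps
        )

# Atomic actions are defined once and re-used across tasks.
SIT = AtomicAction("sit_down", 3.0)
WALK_FRIDGE_STOVE = AtomicAction("walk_fridge_to_stove", 5.0)
OPEN_DOOR = AtomicAction("open_door", 2.0)

cook = Task("cook_dinner", [OPEN_DOOR, WALK_FRIDGE_STOVE])
evening = Task("evening_routine", [cook, SIT])
```

A tree like this serializes naturally to calendar entries, since each node carries a name and a duration.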
2. Design and implement a scheme for interrogating the user via his/her smartwatch about the task that is about to start, is currently being executed, or has just been completed. The inquiry could offer a limited set of responses or a simple yes/no confirmation.
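The interrogation scheme might be modeled as a prompt object that knows which phase of a task it asks about and which limited response set it accepts. This is a minimal sketch with assumed names (`Prompt`, `Phase`), not a description of any particular smartwatch API:

```python
from dataclasses import dataclass
from enum import Enum
from typing import List

class Phase(Enum):
    ABOUT_TO_START = "about_to_start"
    IN_PROGRESS = "in_progress"
    COMPLETED = "completed"

@dataclass
class Prompt:
    task_name: str
    phase: Phase
    choices: List[str]  # limited response set; ["yes", "no"] for confirmations

    def text(self) -> str:
        """Render the question shown on the watch face."""
        templates = {
            Phase.ABOUT_TO_START: "Are you about to start '{t}'?",
            Phase.IN_PROGRESS: "Are you currently doing '{t}'?",
            Phase.COMPLETED: "Did you just finish '{t}'?",
        }
        return templates[self.phase].format(t=self.task_name)

def record_answer(prompt: Prompt, choice: str) -> dict:
    """Validate the response against the prompt's limited set and log it."""
    if choice not in prompt.choices:
        raise ValueError(f"{choice!r} is not one of {prompt.choices}")
    return {"task": prompt.task_name, "phase": prompt.phase.value, "response": choice}
```

Restricting responses to a small fixed set keeps each interruption short, which matters for compliance on a watch-sized screen.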
3. Design and implement a scheme to associate smartwatch sensor data with the tasks as logged via EMA (Ecological Momentary Assessment). Doing so establishes a labeled data set for a supervised machine learning algorithm (see below).
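The association step can be sketched as timestamp alignment: each EMA entry yields a `(start, end, label)` interval, and every sensor sample falling inside an interval inherits that interval's task label. The function and field names below are illustrative assumptions:

```python
from bisect import bisect_right

def label_samples(samples, intervals):
    """Attach task labels to sensor samples by timestamp.

    samples:   list of (t, value) tuples, any order of t
    intervals: list of (start, end, label), sorted by start, non-overlapping
    Returns a list of (t, value, label_or_None).
    """
    starts = [start for start, _, _ in intervals]
    labeled = []
    for t, value in samples:
        # Find the last interval starting at or before t.
        i = bisect_right(starts, t) - 1
        if i >= 0 and t <= intervals[i][1]:
            labeled.append((t, value, intervals[i][2]))
        else:
            labeled.append((t, value, None))  # sample outside any logged task
    return labeled
```

The output is exactly the labeled data set item 4 needs: feature rows paired with task labels, with unlabeled gaps preserved rather than silently dropped.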
4. Design and implement a machine learning approach to automatically identify some actions or whole tasks based on the smartwatch sensor data. As a bonus, upgrade the user interrogation scheme to issue EMAs that confirm or refute the identified actions/tasks.
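As a minimal illustration of the recognition pipeline, assume fixed-length windows of a one-dimensional sensor stream, summarized by mean and standard deviation and classified with a nearest-centroid rule. A real system would use richer features and a stronger learner; this sketch only shows the shape of the path from labeled windows to predictions:

```python
from math import dist
from statistics import mean, stdev

def features(window):
    """Summarize a window of sensor readings (>= 2 samples) as (mean, std)."""
    return (mean(window), stdev(window))

def fit_centroids(labeled_windows):
    """Average the feature vectors of each label into one centroid."""
    by_label = {}
    for window, label in labeled_windows:
        by_label.setdefault(label, []).append(features(window))
    return {
        label: tuple(mean(dim) for dim in zip(*feats))
        for label, feats in by_label.items()
    }

def predict(centroids, window):
    """Assign the label whose centroid is nearest in feature space."""
    f = features(window)
    return min(centroids, key=lambda label: dist(f, centroids[label]))
```

The predictions from such a model are what the bonus EMA upgrade would present to the user for confirmation or refutation.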
- A network of sensors that can record elements of the user’s activity is installed. These include sensors for indoor/outdoor localization and physiological parameters (heart rate, step count), plus data from smartphones and a wearable camera.
- A reflective Android UI that allows users to visualize their daily activities at the end of the day.