ACM ISWC 2015
Conference paper

An energy-aware method for the joint recognition of activities and gestures using wearable sensors

Abstract

This paper presents an energy-aware method for recognizing time-series acceleration data containing both activities and gestures, using a wearable device coupled with a smartphone. In our method, a small wearable device collects accelerometer data from a user's wrist, and each data segment is recognized using a minimal feature set chosen automatically for that segment. If our model finds that recognizing a collected segment requires high-cost features that the wearable device cannot extract, such as dynamic time warping for gesture recognition, the segment is transmitted to the smartphone, where the high-cost features are extracted and recognition is performed. Otherwise, only the minimum required set of low-cost features is extracted from the segment on the wearable device, and only the recognition result, i.e., the label, is transmitted to the smartphone in place of the raw data, reducing transmission costs. Our method automatically constructs this adaptive processing pipeline solely from training data.
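
The offloading logic described in the abstract can be illustrated with a minimal Python sketch. The helper names, the particular low-cost features, and the threshold-based decision below are hypothetical assumptions for illustration only; in the paper's method, the per-segment feature selection and the decision of when to offload are constructed automatically from training data.

```python
import numpy as np

def low_cost_features(segment):
    """Cheap statistics over a 3-axis acceleration segment (shape: [n, 3]),
    inexpensive enough to compute on the wearable device."""
    return np.concatenate([
        segment.mean(axis=0),                         # per-axis mean
        segment.std(axis=0),                          # per-axis standard deviation
        np.abs(np.diff(segment, axis=0)).mean(axis=0) # per-axis mean absolute difference
    ])

def needs_high_cost_features(features, threshold=1.5):
    """Hypothetical stand-in for the learned decision: flag segments whose
    low-cost features suggest that high-cost features (e.g. dynamic time
    warping for gestures) are required for reliable recognition."""
    return features[3:6].max() > threshold  # high per-axis variation

def process_segment(segment, classify_on_device, send_to_phone, send_label):
    """Adaptive pipeline sketch: recognize cheap segments on the wearable,
    offload expensive ones to the smartphone."""
    features = low_cost_features(segment)
    if needs_high_cost_features(features):
        # The wearable cannot extract the required high-cost features,
        # so the raw segment is transmitted for phone-side recognition.
        send_to_phone(segment)
    else:
        # Recognize on the wearable and transmit only the label,
        # avoiding the cost of sending raw acceleration data.
        send_label(classify_on_device(features))
```

In this sketch, `classify_on_device`, `send_to_phone`, and `send_label` are assumed callbacks supplied by the surrounding system; the energy saving comes from transmitting a single label instead of the raw segment whenever the low-cost features suffice.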