Activity Segmentation and Identification based on Eye Gaze Features
PETRA '18: Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference
[Best student paper award] In line with the ongoing digitalization of production processes, Human Computer Interaction (HCI) technologies have evolved rapidly in industrial applications, providing an abundance of versatile tracking and monitoring devices suitable for addressing complex challenges. This paper focuses on Activity Segmentation and Activity Identification, two of the most crucial challenges in pervasive computing, using only visual attention features captured through mobile eye-tracking sensors. We propose a novel, application-independent approach to segmenting task executions in a semi-manual industrial assembly setup by exploiting the expressive properties of the distribution-based gaze feature Nearest Neighbor Index (NNI) to build a dynamic activity segmentation algorithm. The proposed approach is enriched with a machine learning validation model acting as a feedback loop to classify segment quality. The approach is evaluated in an alpine ski assembly scenario with real-world data, reaching an overall detection accuracy of 91%.
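The NNI mentioned in the abstract is the standard Clark–Evans nearest-neighbor index from spatial statistics: the ratio of the observed mean nearest-neighbor distance among points (here, gaze fixations in a time window) to the mean distance expected under complete spatial randomness. A minimal sketch of computing it over a window of 2D gaze points is shown below; the function name and the idea of applying it per sliding window are illustrative, not taken from the paper itself.

```python
import numpy as np

def nearest_neighbor_index(points, area):
    """Clark-Evans nearest-neighbor index for a set of 2D points.

    NNI = (observed mean nearest-neighbor distance) /
          (expected mean distance under complete spatial randomness),
    where the expected distance is 0.5 * sqrt(area / n).

    NNI < 1 suggests clustered points (e.g. focused gaze on one region);
    NNI > 1 suggests dispersed points (e.g. visual scanning).
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # Pairwise Euclidean distances; put inf on the diagonal so a
    # point is never its own nearest neighbor.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    observed = d.min(axis=1).mean()
    expected = 0.5 * np.sqrt(area / n)
    return observed / expected
```

A segmentation scheme in this spirit could slide such a window over the gaze stream and place segment boundaries where the NNI crosses a threshold, signaling a switch between focused manipulation and scanning behavior.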