TY - GEN
T1 - Language Model Co-occurrence Linking for Interleaved Activity Discovery
AU - Rogers, Eoin
AU - Kelleher, John D.
AU - Ross, Robert J.
N1 - Publisher Copyright:
© 2020, IFIP International Federation for Information Processing.
PY - 2020
Y1 - 2020
N2 - As ubiquitous computer and sensor systems become abundant, the potential for automatic identification and tracking of human behaviours becomes all the more evident. Annotating complex human behaviour datasets to obtain ground truth for supervised training can, however, be extremely labour-intensive and error-prone. One possible solution to this problem is activity discovery: the identification of activities in an unlabelled dataset by means of an unsupervised algorithm. This paper presents a novel approach to activity discovery that utilises deep-learning-based language production models to construct a hierarchical, tree-like structure over a sequential vector of sensor events. Our approach differs from previous work in that it explicitly aims to deal with interleaving (switching back and forth between activities) in a principled manner, by utilising the long-term memory capabilities of a recurrent neural network cell. We present our approach and test it on a realistic dataset to evaluate its performance. Our results demonstrate the viability of the approach and indicate that it merits further investigation. We believe this is a useful direction to consider in accounting for the continually changing nature of behaviours.
AB - As ubiquitous computer and sensor systems become abundant, the potential for automatic identification and tracking of human behaviours becomes all the more evident. Annotating complex human behaviour datasets to obtain ground truth for supervised training can, however, be extremely labour-intensive and error-prone. One possible solution to this problem is activity discovery: the identification of activities in an unlabelled dataset by means of an unsupervised algorithm. This paper presents a novel approach to activity discovery that utilises deep-learning-based language production models to construct a hierarchical, tree-like structure over a sequential vector of sensor events. Our approach differs from previous work in that it explicitly aims to deal with interleaving (switching back and forth between activities) in a principled manner, by utilising the long-term memory capabilities of a recurrent neural network cell. We present our approach and test it on a realistic dataset to evaluate its performance. Our results demonstrate the viability of the approach and indicate that it merits further investigation. We believe this is a useful direction to consider in accounting for the continually changing nature of behaviours.
UR - http://www.scopus.com/inward/record.url?scp=85084183729&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-45778-5_6
DO - 10.1007/978-3-030-45778-5_6
M3 - Conference contribution
AN - SCOPUS:85084183729
SN - 9783030457778
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 70
EP - 84
BT - Machine Learning for Networking - 2nd IFIP TC 6 International Conference, MLN 2019, Revised Selected Papers
A2 - Boumerdassi, Selma
A2 - Renault, Éric
A2 - Mühlethaler, Paul
PB - Springer
T2 - 2nd International Conference on Machine Learning for Networking, MLN 2019
Y2 - 3 December 2019 through 5 December 2019
ER -