Dual-memory Neural Networks for Modeling Cognitive Activities of Humans via Wearable Sensors

Abstract

Wearable devices, such as smart glasses and watches, allow for continuous recording of everyday life in the real world over extended periods of time, potentially lifelong. This possibility helps us better understand the cognitive behavior of humans in real life and build human-aware intelligent agents for practical purposes. However, modeling human cognitive activity from wearable-sensor data streams is challenging because learning new information often results in the loss of previously acquired information, a problem known as catastrophic forgetting. Here we propose a deep-learning neural network architecture that resolves the catastrophic forgetting problem. Based on the neurocognitive theory of complementary learning systems in the neocortex and hippocampus, we introduce a dual memory architecture (DMA) that slowly acquires structured knowledge representations while rapidly learning the specifics of individual experiences. The DMA system learns continuously through incremental feature adaptation and weight transfer. We evaluate its performance on two image-stream datasets, the CIFAR-10 image stream and the 46-day Lifelog dataset collected from Google Glass, showing that the proposed model outperforms other online learning methods.
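
The abstract describes the slow/fast split and the use of weight transfer only at a high level. As a rough illustration of that idea, the numpy-only toy below is a minimal sketch under stated assumptions, not the paper's actual model: the names (SlowMemory, FastMemory, DualMemoryLearner) and the concrete choices (PCA features standing in for the slowly consolidated representation, an online softmax classifier as the fast learner, and warm-starting that classifier after each refit as a crude stand-in for weight transfer) are assumptions made for exposition.

# Minimal, illustrative sketch of a dual-memory online learner (assumptions noted above).
import numpy as np


class SlowMemory:
    """Slowly updated feature extractor: refit periodically on a buffer of samples."""

    def __init__(self, n_components=16):
        self.n_components = n_components
        self.mean = None
        self.components = None  # rows are principal directions

    def refit(self, X):
        # Consolidate structured knowledge from all buffered data (PCA here, as a stand-in).
        self.mean = X.mean(axis=0)
        _, _, vt = np.linalg.svd(X - self.mean, full_matrices=False)
        self.components = vt[: self.n_components]

    def transform(self, X):
        return (X - self.mean) @ self.components.T


class FastMemory:
    """Rapidly updated online softmax classifier over slow-memory features."""

    def __init__(self, n_features, n_classes, lr=0.05):
        self.W = np.zeros((n_features, n_classes))
        self.lr = lr

    def predict_proba(self, z):
        logits = z @ self.W
        logits -= logits.max()
        e = np.exp(logits)
        return e / e.sum()

    def update(self, z, y):
        # One SGD step on the cross-entropy loss for a single example.
        p = self.predict_proba(z)
        p[y] -= 1.0
        self.W -= self.lr * np.outer(z, p)


class DualMemoryLearner:
    def __init__(self, n_classes, refit_every=200):
        self.buffer_X, self.buffer_y = [], []
        self.slow = SlowMemory()
        self.fast = FastMemory(self.slow.n_components, n_classes)
        self.refit_every = refit_every
        self.n_seen = 0

    def observe(self, x, y):
        x = np.asarray(x, dtype=float)
        self.buffer_X.append(x)
        self.buffer_y.append(y)
        self.n_seen += 1

        if self.n_seen % self.refit_every == 0:
            # Periodically consolidate the slow memory on all data seen so far, then
            # keep the previous fast-memory weights as initialization (a crude stand-in
            # for weight transfer) and replay the buffer under the refit features.
            X = np.asarray(self.buffer_X)
            self.slow.refit(X)
            for xb, yb in zip(X, self.buffer_y):
                self.fast.update(self.slow.transform(xb[None, :])[0], yb)
        elif self.slow.components is not None:
            # Otherwise take a single fast-memory step on the new example.
            self.fast.update(self.slow.transform(x[None, :])[0], y)

    def predict(self, x):
        if self.slow.components is None:
            return 0  # no structured features learned yet
        z = self.slow.transform(np.asarray(x, dtype=float)[None, :])[0]
        return int(np.argmax(self.fast.predict_proba(z)))


# Usage sketch: stream 1,000 random 64-dimensional "sensor frames" with 5 classes.
rng = np.random.default_rng(0)
learner = DualMemoryLearner(n_classes=5)
for _ in range(1000):
    learner.observe(rng.normal(size=64), int(rng.integers(5)))
print(learner.predict(rng.normal(size=64)))

The split mirrors the neocortex/hippocampus analogy in the abstract: the slow memory is consolidated only periodically on accumulated data, while the fast memory is updated after every incoming example.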

Publication
Neural Networks (IF = 7.197, 2017)