Sequential consolidation using multiple task learning, sweep rehearsal and CVAE generated pseudo examples
LE3 .A278 2022
2022
Silver, Danny
Acadia University
Master of Science
Masters
Computer Science
A key component of a Lifelong Learning Agent is the integration or consolidation of new task knowledge with prior task knowledge. Consolidation requires a solution to several problems, most notably the catastrophic forgetting problem, where developing a representation for a new task reduces the accuracy of prior tasks. This research extends our prior work on consolidation using multiple task learning (MTL) networks and a task rehearsal or replay approach. The goal is to maintain functional stability of the MTL network models for prior tasks, while providing representational plasticity to integrate new task knowledge into the same network. Our approach uses (1) a conditional variational autoencoder (CVAE) to generate accurate pseudo-examples (PEs) of prior tasks, (2) sweep rehearsal, which requires only a small number of PEs for each training iteration, (3) appropriate weighting of PEs to ensure consolidation of new task knowledge with prior tasks, and (4) a novel network architecture we call MTL with Context inputs (MTLc), which combines the best of standard MTL and context-sensitive MTL (csMTL) architectures. Sequential learning of twenty classification tasks using a combination of the MNIST and Fashion-MNIST datasets shows that our CVAE-based approach to generating accurate PEs is promising and that MTLc performs better than either MTL or csMTL, with minimal loss of task accuracy over the sequence of tasks.
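The sweep-rehearsal idea described in the abstract — interleaving a small, freshly sampled set of weighted pseudo-examples of prior tasks into each new-task training minibatch — can be sketched as follows. This is a minimal illustrative sketch, not the thesis implementation: the function name, batch composition, and the per-example weighting scheme are assumptions for illustration.

```python
import random

def sweep_rehearsal_batches(new_task_data, pseudo_examples,
                            n_pe_per_batch=2, batch_size=8, pe_weight=0.5):
    """Yield minibatches mixing new-task examples with a few pseudo-
    examples (PEs) of prior tasks, in the spirit of sweep rehearsal.

    Hypothetical sketch: each yielded item is an (example, weight) pair,
    where the weight would scale that example's loss during training.
    A fresh small sample of PEs is drawn every iteration (the "sweep"),
    so the whole PE pool is rehearsed over many iterations without any
    single batch being dominated by prior-task data.
    """
    n_new = batch_size - n_pe_per_batch
    for i in range(0, len(new_task_data), n_new):
        # new-task examples carry full weight
        batch = [(x, 1.0) for x in new_task_data[i:i + n_new]]
        # a small, re-sampled set of prior-task PEs, down-weighted
        batch += [(x, pe_weight)
                  for x in random.sample(pseudo_examples, n_pe_per_batch)]
        random.shuffle(batch)
        yield batch
```

In a full system the `pseudo_examples` pool would be produced by the trained CVAE (inputs generated conditioned on a prior task, labeled by the existing network), and each batch would drive one gradient step on the consolidated MTL network.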
The author retains copyright in this thesis. Any substantial copying or any other actions that exceed fair dealing or other exceptions in the Copyright Act require the permission of the author.
https://scholar.acadiau.ca/islandora/object/theses:3740