Sequential consolidation of learned task knowledge
LE3 .A278 2004
2004
Silver, Danny
Acadia University
Bachelor of Computer Science
Honours
Computer Science
A major problem in life-long machine learning is how to sequentially consolidate learned task knowledge into a single long-term memory structure (domain knowledge) without loss of prior knowledge. Consolidated domain knowledge makes more efficient use of memory and can be used for efficient and effective transfer of knowledge when learning future tasks. Relevant background on artificial neural networks, inductive bias, and knowledge transfer using multiple task learning (MTL) networks is reviewed. A theory of sequential consolidation of task knowledge is presented that uses large MTL networks and a method of task rehearsal to overcome the stability-plasticity problem and the loss of prior knowledge. The theory is tested on two synthetic domains containing diverse sets of tasks, and it is shown that, under the proper conditions, new task knowledge can be consolidated without loss of existing knowledge.
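The rehearsal idea in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the thesis's implementation: the network size, the two synthetic tasks, and the training schedule are invented here, and a frozen copy of the long-term network stands in for the task-rehearsal mechanism by generating virtual targets for the previously learned task while the new task is trained into the same shared representation.

```python
import numpy as np

rng = np.random.default_rng(0)

def init(n_in, n_hid, n_out):
    """Small MTL net: one shared hidden layer, one output unit per task."""
    return [rng.normal(0.0, 0.5, (n_in, n_hid)), np.zeros(n_hid),
            rng.normal(0.0, 0.5, (n_hid, n_out)), np.zeros(n_out)]

def forward(params, X):
    W1, b1, W2, b2 = params
    H = np.tanh(X @ W1 + b1)      # shared internal representation
    return H @ W2 + b2            # one output head per task

def train(params, X, Y, lr=0.3, epochs=10000):
    """Full-batch gradient descent on mean squared error."""
    W1, b1, W2, b2 = [p.copy() for p in params]
    n = len(X)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)
        dY = (H @ W2 + b2 - Y) / n            # error signal per head
        dH = (dY @ W2.T) * (1.0 - H**2)       # backprop through tanh
        W2 -= lr * (H.T @ dY); b2 -= lr * dY.sum(0)
        W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(0)
    return [W1, b1, W2, b2]

# Two synthetic tasks defined over the same 2-D input space.
X = rng.uniform(-1, 1, (200, 2))
y_old = np.sin(X[:, 0])           # task already in long-term memory
y_new = X[:, 0] * X[:, 1]         # task to be consolidated next

# Stage 1: long-term memory is trained on the old task alone
# (the second output head is simply parked at zero for now).
longterm = train(init(2, 20, 2), X, np.c_[y_old, np.zeros(len(X))])

# Stage 2: task rehearsal. A frozen copy of the long-term net supplies
# virtual targets for the old task, so the new task can be trained into
# the same network without erasing the prior knowledge.
y_old_virtual = forward(longterm, X)[:, 0]
consolidated = train(longterm, X, np.c_[y_old_virtual, y_new])

out = forward(consolidated, X)
mse_old = float(np.mean((out[:, 0] - y_old) ** 2))   # retained knowledge
mse_new = float(np.mean((out[:, 1] - y_new) ** 2))   # newly consolidated
print(mse_old, mse_new)
```

If rehearsal works, both errors stay small after stage 2: the old task's mapping survives because its virtual examples were trained alongside the new task's real ones.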
The author retains copyright in this thesis. Any substantial copying or any other actions that exceed fair dealing or other exceptions in the Copyright Act require the permission of the author.
https://scholar.acadiau.ca/islandora/object/theses:435