Keywords: catastrophic forgetting; continual learning; deep learning; lifelong learning; machine learning
Machine learning has enjoyed rapid and substantial advances in the past few years. However, machine learning models cannot learn continually the way humans do. Humans are continual learners: they accumulate knowledge, use prior knowledge to learn better from new experiences, and retain knowledge from previous experiences. In contrast, current machine learning models learn in isolation, with no notion of time (e.g., past or present) in their closed-world learning. The goal of continual learning is to endow machines with this human learning mechanism, with significant implications for the machine learning community. This is a challenging problem, however, because current machine learning systems suffer from catastrophic forgetting: they cannot preserve previously learned knowledge. Catastrophic forgetting arises mainly because the model is trained sequentially over evolving data distributions. Consequently, the representations the model has learned for previous data shift to accommodate the new data, and the new representations are no longer adequate for the past data. While recent progress in continual learning is encouraging, our understanding of the catastrophic forgetting problem remains limited. This dissertation aims to better understand the continual learning problem and fill this knowledge gap by studying the theoretical and practical implications of catastrophic forgetting for deep learning models. We study catastrophic forgetting from various perspectives and show that the optimization, training regime, loss landscape, and architecture of neural networks all play a significant role in alleviating forgetting. We then use the gained insights to develop continual agents that are more robust to catastrophic forgetting.
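The forgetting mechanism described in the abstract is easy to reproduce. Below is a minimal PyTorch sketch (illustrative only, not code from the dissertation) that trains a small network sequentially on two synthetic tasks whose data distributions conflict, then reports how accuracy on the first task collapses once the weights adapt to the second.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(shift):
    # Two Gaussian blobs in 20 dimensions; `shift` controls where each
    # class sits, so different shifts give different data distributions.
    x0 = torch.randn(500, 20) + shift
    x1 = torch.randn(500, 20) - shift
    x = torch.cat([x0, x1])
    y = torch.cat([torch.zeros(500), torch.ones(500)]).long()
    return x, y

def train(model, x, y, epochs=200):
    # Plain full-batch SGD: no replay, regularization, or other
    # continual-learning mechanism, so forgetting is unmitigated.
    opt = torch.optim.SGD(model.parameters(), lr=0.05)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

@torch.no_grad()
def accuracy(model, x, y):
    return (model(x).argmax(dim=1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))

# Task 2 reverses the cluster-to-label mapping of Task 1, so adapting
# to it overwrites the representations that solved Task 1.
x1, y1 = make_task(shift=2.0)
x2, y2 = make_task(shift=-2.0)

train(model, x1, y1)
print(f"Task 1 acc after training on Task 1: {accuracy(model, x1, y1):.2f}")

train(model, x2, y2)  # sequential training: no access to Task 1 data
print(f"Task 1 acc after training on Task 2: {accuracy(model, x1, y1):.2f}")
print(f"Task 2 acc after training on Task 2: {accuracy(model, x2, y2):.2f}")
```

Run as-is, first-task accuracy typically drops from near 1.00 to near 0.00: the second task's flipped labeling is an extreme case of the evolving data distributions the abstract describes, and the new representations no longer serve the past data.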
Details
Title
Alleviating Catastrophic Forgetting in Continual Learning
Creators
Seyed Iman Mirzadeh
Contributors
Hassan Ghasemzadeh (Advisor)
Diane Cook (Advisor)
Janardhan Rao Doppa (Committee Member)
Yan Yan (Committee Member)
Awarding Institution
Washington State University
Academic Unit
School of Electrical Engineering and Computer Science
Collection
Theses and Dissertations
Degree
Doctor of Philosophy (PhD), Washington State University
Publisher
Washington State University
Number of pages
260
Identifiers
OCLC#: 1365768659; 99900898638301842
Language
English
Resource Type
Dissertation