NEW CHALLENGES IN THE DEVELOPMENT OF REUSABLE MACHINE LEARNING MODELS
BASIC INFORMATION
Ph.D. Student: Zaid Raad Saber Zubair
Advisor: Sebastián Ventura
Started on: February 2021
Keywords: Machine Learning, Lifelong Machine Learning
THESIS PROPOSAL
The current dominant paradigm for Machine Learning (ML) is to run an algorithm on a given dataset to generate a model, which is then applied to real-life performance tasks. We call this paradigm isolated learning because it considers neither related information nor previously learned knowledge. The fundamental problem with isolated learning is that it does not retain and accumulate knowledge learned in the past for use in future learning. In contrast, humans never learn in isolation or from scratch: we retain the knowledge acquired in the past and use it to aid future learning and problem-solving.
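To make the paradigm concrete, the following is a minimal sketch of isolated learning in Python using scikit-learn; the synthetic dataset and the choice of estimator are illustrative assumptions, not part of the thesis. A model is trained once on a single dataset and evaluated on its performance task, and nothing it has learned is carried over to any future task.

# Illustrative sketch of the isolated learning paradigm described above.
# The dataset and estimator are arbitrary placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train on one dataset and apply the model to its performance task.
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("Accuracy on the performance task:", model.score(X_test, y_test))
# Learning a new task would start again from scratch: nothing learned
# here is retained or reused.

Any subsequent task would require training a new model from scratch, which is precisely the limitation that Lifelong Machine Learning seeks to remove.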
Without the ability to accumulate and use prior knowledge, an ML algorithm typically needs a large number of training examples to learn effectively. Continual Learning (CL) aims to learn many different tasks without forgetting how to solve previously learned ones. CL is difficult to achieve because, when a model learns how to solve a new task, the knowledge of how to solve previously learned tasks is generally lost. This phenomenon, called Catastrophic Forgetting, occurs when a model trained on one task is modified to meet the objectives of a new task; as a result, its accuracy on the previous task drops drastically after only a few training updates on successive tasks. Catastrophic Forgetting is considered one of the main open problems in CL and is an active topic of research.
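The following minimal sketch illustrates Catastrophic Forgetting under an assumed toy setup in PyTorch: a small network is trained on a synthetic task A, then on a conflicting task B, and its accuracy on task A is measured before and after. The tasks, architecture, and hyperparameters are illustrative assumptions only, not a method proposed in this thesis.

# Minimal sketch (hypothetical setup) of Catastrophic Forgetting:
# train on task A, then on task B, and watch accuracy on A collapse.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(shift):
    # Two synthetic binary tasks that differ by an input shift,
    # so their decision boundaries conflict.
    x = torch.randn(2000, 20) + shift
    y = (x.sum(dim=1) > shift * 20).long()
    return x, y

def train(model, x, y, epochs=50):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))

xa, ya = make_task(shift=0.0)   # task A
xb, yb = make_task(shift=3.0)   # task B

train(model, xa, ya)
print("Task A accuracy after learning A:", accuracy(model, xa, ya))

train(model, xb, yb)            # sequential training, no access to task A data
print("Task A accuracy after learning B:", accuracy(model, xa, ya))
print("Task B accuracy after learning B:", accuracy(model, xb, yb))

In this toy setting, accuracy on task A typically falls to near chance after the model is updated on task B, while accuracy on task B is high; this drop is the Catastrophic Forgetting that the thesis aims to overcome.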
Thus, the main objective of this thesis will be the development and implementation of new Lifelong Machine Learning algorithms that efficiently reuse previously learned information in future learning, reducing memory consumption and training time while overcoming the catastrophic forgetting problem.