COUNTERFACTUAL EXPLANATION METHODS AND THEIR APPLICATION TO PREDICTIVE PROBLEMS IN EDUCATION
BASIC INFORMATION
Ph.D. Student: Ilias Naser
Advisors: José Raúl Romero, Aurora Ramírez
Started on: February 2023
Keywords: explainable artificial intelligence, machine learning, educational data mining
THESIS PROPOSAL
Machine learning builds predictive models from data that typically behave as “black boxes”, i.e., models whose internal reasoning is difficult to understand. This has led to the emergence of explainable artificial intelligence (XAI), an area of research focused on enabling users to understand the reasoning behind a model’s predictions, which can ultimately lead to greater trust and broader adoption of the technology.
There are two main approaches in XAI depending on the scope of the explanations: local methods, which explain individual predictions, and global methods, which provide a general understanding of the model. These XAI methods are typically applied after the machine learning model has been built, using inference, deduction and optimisation mechanisms. Among local methods, counterfactual explanations stand out: they describe alternative scenarios that would have occurred if certain feature values had been different. In other words, a counterfactual explanation indicates what would have needed to change in the input data to obtain a different prediction.
Machine learning is used in many application fields, with education being a domain of great interest. The area known as educational data mining (EDM) allows educators to discover knowledge about student behaviour, performance and outcomes. One of the most widely studied problems is the prediction of academic performance in order to detect at-risk students. Counterfactual explanations can help teachers understand how a student’s performance could be improved by proposing changes in their study habits or classroom behaviour, helping teachers design more targeted and effective interventions, as illustrated in the sketch below.
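To make the idea concrete, the following minimal sketch shows how a counterfactual could be obtained for a toy student-performance classifier. The feature names, the synthetic data, the scikit-learn model and the brute-force nearest-flip search are illustrative assumptions chosen for readability; they are not the methods proposed in this thesis.

```python
# Illustrative sketch: brute-force counterfactual search for a toy
# student-performance classifier. All feature names, values and the
# search strategy are hypothetical and chosen only for this example.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical features: [weekly_study_hours, attendance_rate, assignments_done]
X = np.array([
    [2, 0.50, 3],
    [10, 0.90, 9],
    [4, 0.60, 5],
    [12, 0.95, 10],
    [3, 0.40, 2],
    [8, 0.85, 8],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 0 = at risk, 1 = expected to pass

model = DecisionTreeClassifier(random_state=0).fit(X, y)

def counterfactual(model, x, candidates):
    """Return the candidate closest to x (L1 distance) whose prediction differs."""
    original = model.predict([x])[0]
    best, best_dist = None, np.inf
    for c in candidates:
        if model.predict([c])[0] != original:
            dist = np.abs(np.asarray(c) - np.asarray(x)).sum()
            if dist < best_dist:
                best, best_dist = c, dist
    return best

# Student currently predicted as at risk
student = np.array([3, 0.55, 4])

# Grid of plausible, actionable feature values (assumed ranges)
grid = [np.array([h, a, d])
        for h in range(2, 13)
        for a in np.arange(0.40, 1.01, 0.05)
        for d in range(2, 11)]

cf = counterfactual(model, student, grid)
print("Original prediction:", model.predict([student])[0])
print("Counterfactual:", cf, "-> prediction:", model.predict([cf])[0])
```

In practice, counterfactual generators are usually formulated as optimisation problems rather than grid searches; they normalise features so that distances are comparable, penalise the number of changed features, and restrict changes to attributes the user can actually act on.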
This PhD thesis aims to explore the potential of counterfactual methods in machine learning, proposing new methods that allow users to adapt explanations to their needs. Over the course of the doctoral thesis, the goal is to design and implement new methods for generating counterfactual explanations that improve on existing ones, to carry out comparative studies, and to evaluate the quality of the explanations from the user’s point of view. In addition, the proposed methods will be applied in the field of EDM, where several predictive problems could benefit from this type of explanation.
FUNDING
The development of this thesis is being supported by:
- Spanish Ministry of Science and Innovation and the European Regional Development Fund, under project PID2020-115832GB-I00