Ensembles of Explanation Distributions for Discovering Distribution Shifts

Open Bachelor/Master Thesis

Description:

As input data distributions evolve, the predictive performance of machine learning models tends to deteriorate. Predictive performance has traditionally been considered the key indicator to monitor, but model explanations have received increasing attention in recent years. In this work, we investigate how the predictive performance and the explanation characteristics of a model are affected under distribution shifts, and how these two indicators relate to each other for tabular data. We find that modeling explanation shifts can be a better indicator for detecting changes in predictive performance than state-of-the-art techniques based on representations of distribution shifts. We provide a mathematical analysis of different types of distribution shifts as well as synthetic experimental examples.

In https://arxiv.org/abs/2210.12369 we have described how to discover distribution shifts by considering changes in so-called explanation distributions. These distributions are built from Shapley value explanations of classified objects and make it possible to determine whether the way objects are classified changes between distributions.
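The following is a minimal sketch (not the code from the paper) of how such a comparison of explanation distributions can be set up, assuming the shap and scikit-learn libraries: Shapley values are computed for in-distribution data and for new data, and a simple discriminator is trained to tell the two explanation distributions apart; an AUC clearly above 0.5 then suggests an explanation shift. The function and variable names are illustrative only.

```python
# Sketch: detect explanation shift by comparing explanation distributions
# of in-distribution data and new data with a classifier two-sample test.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split


def explanation_shift_auc(model, X_source, X_target):
    """AUC of a detector separating the two explanation distributions.

    Values clearly above 0.5 indicate that the explanations of the new
    data differ from those of the in-distribution data.
    """
    # Small background sample keeps the SHAP computation cheap.
    explainer = shap.Explainer(model, X_source[:100])
    S_source = explainer(X_source).values
    S_target = explainer(X_target).values
    # Flatten possible multi-output SHAP arrays to one row per instance.
    S_source = S_source.reshape(len(S_source), -1)
    S_target = S_target.reshape(len(S_target), -1)

    # Label explanations by origin and train a simple discriminator.
    S = np.vstack([S_source, S_target])
    y = np.concatenate([np.zeros(len(S_source)), np.ones(len(S_target))])
    S_train, S_test, y_train, y_test = train_test_split(
        S, y, test_size=0.5, stratify=y, random_state=0)
    detector = LogisticRegression(max_iter=1000).fit(S_train, y_train)
    return roc_auc_score(y_test, detector.predict_proba(S_test)[:, 1])


if __name__ == "__main__":
    # Synthetic tabular example: train on source data, then compare the
    # explanation distribution of unshifted vs. shifted new data.
    rng = np.random.default_rng(0)
    X_id = rng.normal(size=(2000, 5))
    y_id = (X_id[:, 0] + X_id[:, 1] > 0).astype(int)
    model = GradientBoostingClassifier().fit(X_id, y_id)

    X_ref = rng.normal(size=(1000, 5))            # same distribution
    X_shift = rng.normal(size=(1000, 5))
    X_shift[:, 0] += 2.0                          # shifted first feature
    print("no shift AUC:", explanation_shift_auc(model, X_ref[:500], X_ref[500:]))
    print("shift AUC:   ", explanation_shift_auc(model, X_ref, X_shift))
```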

The task of this thesis is to use ensembles of explanation distributions in order to solidify these results; one possible starting point is sketched below.
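As a hypothetical illustration of where ensembles could enter, the sketch below repeats the detection with several independently trained models (here via bootstrap resampling and different random seeds, reusing the explanation_shift_auc function from the sketch above) and aggregates the resulting shift scores. The concrete ensemble design is exactly what the thesis should investigate; this is only one of several possible readings.

```python
# Hypothetical sketch: aggregate explanation-shift scores over an ensemble
# of models trained on bootstrap resamples of the in-distribution data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.utils import resample


def ensemble_explanation_shift(X_id, y_id, X_ref, X_new, n_members=5):
    """Mean and standard deviation of explanation-shift AUCs over the ensemble."""
    aucs = []
    for seed in range(n_members):
        Xb, yb = resample(X_id, y_id, random_state=seed)  # bootstrap training set
        member = GradientBoostingClassifier(random_state=seed).fit(Xb, yb)
        aucs.append(explanation_shift_auc(member, X_ref, X_new))
    return float(np.mean(aucs)), float(np.std(aucs))
```

Averaging over ensemble members can reduce the variance of the shift score, while the spread across members gives a first notion of how reliable a single detector's verdict is.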

