Discovering Image Distribution Shifts Using Explanation Distributions

Bachelor Thesis

Open Bachelor's Student Thesis

Description

As input data distributions evolve, the predictive performance of machine learning models tends to deteriorate. Traditionally, predictive performance has been considered the key indicator to monitor; in recent years, however, model explanations have also come into focus. In this work, we investigate how predictive performance and explanation characteristics are affected by distribution shifts, and how these two indicators relate to each other for tabular data. We find that modeling explanation shifts can be a better indicator for detecting changes in predictive performance than state-of-the-art techniques based on representations of distribution shifts. We provide a mathematical analysis of different types of distribution shifts as well as synthetic experimental examples.

In https://arxiv.org/abs/2210.12369 we have described how to discover distribution shifts by considering changes in so-called explanation distributions. So far, this technique has only been applied to structured (tabular) data.
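To illustrate the idea on tabular data, the following is a minimal sketch, assuming the general recipe of the paper above: map source and target samples into the explanation space (here: SHAP values of a trained model) and run a classifier two-sample test on the resulting explanation distributions. The synthetic data, the choice of GradientBoostingRegressor and LogisticRegression, and all hyperparameters are illustrative assumptions, not the exact experimental setup of the paper.

```python
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic source data: the target variable depends on the first two features.
X_source = rng.normal(size=(2000, 3))
y_source = X_source[:, 0] + X_source[:, 1] + rng.normal(scale=0.1, size=2000)

# Target data with a covariate shift on a feature the model relies on.
X_target = rng.normal(size=(2000, 3))
X_target[:, 1] += 2.0

# 1. Fit the predictive model on source data.
model = GradientBoostingRegressor().fit(X_source, y_source)

# 2. Map both datasets into the explanation space (SHAP values).
explainer = shap.Explainer(model, X_source[:100])
S_source = explainer(X_source).values
S_target = explainer(X_target).values

# 3. Classifier two-sample test on the explanation distributions:
#    if a detector separates source from target explanations (AUC >> 0.5),
#    an explanation shift is flagged.
S = np.vstack([S_source, S_target])
labels = np.concatenate([np.zeros(len(S_source)), np.ones(len(S_target))])
S_train, S_test, l_train, l_test = train_test_split(S, labels, random_state=0)
detector = LogisticRegression(max_iter=1000).fit(S_train, l_train)
auc = roc_auc_score(l_test, detector.predict_proba(S_test)[:, 1])
print(f"Explanation-shift detector AUC: {auc:.3f}")
```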

In this thesis, we investigate whether feature representations of images can successfully contribute to the discovery of distribution shifts.
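One possible starting point, sketched below under explicit assumptions, is to extract image feature representations with a pretrained backbone (here: a torchvision ResNet-18 with its classification head removed) and treat the resulting feature matrices like the tabular inputs in the sketch above. The backbone choice and the helper name extract_features are illustrative and not prescribed by the topic description.

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from torch.utils.data import DataLoader

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained backbone with the classification head replaced by an identity,
# so it outputs 512-dimensional penultimate-layer features.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval().to(device)

# Standard ImageNet preprocessing; the image datasets below are assumed to
# apply this as their transform.
preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(dataset) -> torch.Tensor:
    """Map an image dataset yielding (image, label) pairs to an (n, 512) feature matrix."""
    loader = DataLoader(dataset, batch_size=64, shuffle=False)
    feats = [backbone(x.to(device)).cpu() for x, _ in loader]
    return torch.cat(feats)

# With a "source" dataset and a possibly shifted "target" dataset:
#   F_source = extract_features(source_dataset).numpy()
#   F_target = extract_features(target_dataset).numpy()
# These feature matrices can then replace X_source / X_target in the tabular
# sketch, so that shift is sought in the explanation space of a model trained
# on the image features.
```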

Project Members
