Horizon CDT Research Highlights


Fair Privacy-Preserving Insights

  Ana Rita Pena (2019 cohort)

Decisions are increasingly being automated in a wide range of industries.

There has been a considerable amount of research on privacy protection and private learning, and Differential Privacy (DP) has recently established itself as a state-of-the-art privacy-enhancing technology. Concurrently, there has been growing interest in algorithmic fairness: the study of bias in decision-making systems and of proposed ways to mitigate it. For these newly developed tools to be deployed responsibly, they need to be studied together. There are instances in which the application of differential privacy affects different groups disproportionately, and instances in which certain fairness definitions end up harming the very groups they were meant to protect.

For automated decision making and classification to be fully adopted, it needs to live up to the values and expectations of society: privacy alone, without fairness, is not enough. Only limited research has been done to bridge DP and fairness, and the work that exists so far has relied solely on statistical fairness definitions (one of the three main categories of fairness definitions) in the context of classification tasks.
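As an illustration of what a statistical fairness definition looks like (not drawn from the author's own work), demographic parity asks that the rate of positive predictions be roughly equal across groups. A minimal sketch, assuming binary predictions and a binary group attribute; the function name and toy data are hypothetical:

import numpy as np

def demographic_parity_gap(y_pred, group):
    """Absolute difference in positive-prediction rates between two groups.

    y_pred : array of 0/1 model predictions
    group  : array of 0/1 group-membership labels
    """
    y_pred = np.asarray(y_pred)
    group = np.asarray(group)
    rate_a = y_pred[group == 0].mean()  # positive rate in group 0
    rate_b = y_pred[group == 1].mean()  # positive rate in group 1
    return abs(rate_a - rate_b)

# Toy example: a gap of 0 would indicate demographic parity is satisfied.
print(demographic_parity_gap([1, 0, 1, 1], [0, 0, 1, 1]))  # prints 0.5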

The aim of my PhD is to explore and develop ML algorithms that preserve minorities' privacy in an equitable way. This is a challenge because these communities have small populations: DP solutions are, by construction, far from optimal in this setting, meaning that the communities that need this protection the most are the ones getting the worst end of the deal.
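To see why small populations are problematic, consider the standard Laplace mechanism for a counting query: the noise scale depends only on the privacy budget and the query's sensitivity, not on the group size, so the relative error is far larger for small subgroups. A minimal sketch, assuming a count query with sensitivity 1 and an illustrative privacy budget; the group sizes are hypothetical:

import numpy as np

rng = np.random.default_rng(0)
epsilon = 0.1          # illustrative privacy budget
sensitivity = 1.0      # a count changes by at most 1 per individual
scale = sensitivity / epsilon

for true_count in (10_000, 50):   # large group vs small subgroup
    noisy = true_count + rng.laplace(0.0, scale, size=10_000)
    rel_error = np.mean(np.abs(noisy - true_count)) / true_count
    print(f"count={true_count:>6}  mean relative error ~ {rel_error:.1%}")

The same amount of noise that is negligible for the large group (roughly 0.1% relative error here) overwhelms the small subgroup's count (roughly 20%), which is the inequity this project aims to address.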

From a technical perspective there are two main challenges to achieving this goal: understanding how to make DP algorithms more equitable for small subgroup populations, and understanding how more complex definitions of fairness can be combined with DP in a practical manner.

This author is supported by the Horizon Centre for Doctoral Training at the University of Nottingham (UKRI Grant No. EP/S023305/1).