Carnegie Mellon University

(Un)Fairness Along the AI Pipeline: Problems and Solutions

thesis
posted on 2023-06-12, 18:29, authored by Emily Black
Artificial Intelligence (AI) systems now influence decisions impacting every aspect of people's lives, from the news articles they read to whether or not they receive a loan. While the use of AI may lead to great accuracy and efficiency in the making of these important decisions, recent news and research reports have shown that AI models can act unfairly: from exhibiting gender bias in hiring models to racial bias in recidivism prediction systems.

This thesis explores new methods for understanding and mitigating fairness issues in AI by considering how choices made throughout the process of creating an AI system (i.e., the modeling pipeline) impact its fairness behavior. First, I will show how considering a model's end-to-end pipeline allows us to expand our understanding of unfair model behavior. In particular, my work introduces a connection between AI system stability and fairness by demonstrating how instability in certain parts of the modeling pipeline, namely the learning rule, can lead to unfairness by having important decisions rely on arbitrary modeling choices.

Second, I will discuss how considering ML pipelines can help us expand our toolbox of bias mitigation techniques. In a case study investigating equity with respect to income in tax auditing practices, I will demonstrate how interventions made along the AI creation pipeline, even those not related to fairness on their face, can not only be effective for increasing fairness but can often reduce tradeoffs between predictive utility and fairness.

Finally, I will close with an overview of the benefits and dangers of the flexibility that the AI modeling pipeline affords practitioners in the creation of their models, which I call model multiplicity, including a discussion of the legal repercussions of this flexibility.

History

Date

2022-07-18

Degree Type

  • Dissertation

Thesis Department

  • Computer Science

Degree Name

  • Doctor of Philosophy (PhD)

Advisor(s)

Matt Fredrikson