Carnegie Mellon University

Methods for Calibrated Uncertainty Quantification and Understanding its Utility

thesis
posted on 2025-09-05, 19:09 authored by Youngseog Chung
<p dir="ltr">As machine learning models have become more capable of handling complex data, they have been entrusted with an increasing array of predictive tasks. With this growing reliance on model predictions, assessing whether a given prediction is reliable has become equally important. Uncertainty quantification (UQ) plays a critical role in this context by providing a measure of confidence in a model's predictions. In this thesis, I address the problem of UQ in machine learning in three stages.</p><p dir="ltr">The first part presents an overview of evaluation in UQ, along with open-source software that provides utilities for evaluating, visualizing, and recalibrating predictive uncertainty. The second part discusses algorithms designed to produce well-calibrated predictive uncertainties in regression models, which output distributions over continuous-valued targets. The first work in this part presents a suite of algorithms for training univariate probabilistic regression models, and the second work extends these methods to the multivariate setting. The third part examines how predictive uncertainties can be utilized in decision-making. Since the application setting dictates how the uncertainties are used, I present a collection of works that apply them in single-step decision-making, sequential decision-making, and model-based reinforcement learning.</p>
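To make the notion of calibration in probabilistic regression concrete, the sketch below checks average quantile calibration for a Gaussian predictive model: for each expected quantile level p, a well-calibrated model should see roughly a fraction p of the observed targets fall below its predicted p-quantile. This is a minimal illustration of the general idea only, not the thesis's own software or algorithms; the function names and the choice of 19 evenly spaced levels are assumptions, and SciPy is assumed available for the Gaussian quantile function.

```python
import numpy as np
from scipy.stats import norm

def quantile_calibration(y_true, mu, sigma, levels=np.linspace(0.05, 0.95, 19)):
    """For each expected quantile level p, compute the observed fraction of
    targets falling at or below the predicted p-quantile. A well-calibrated
    model has observed coverage close to p at every level."""
    observed = []
    for p in levels:
        q = norm.ppf(p, loc=mu, scale=sigma)  # predicted p-quantile per point
        observed.append(np.mean(y_true <= q))
    return levels, np.array(observed)

# Illustrative check: a correctly specified Gaussian model should be
# near-calibrated (observed coverage tracks the expected levels).
rng = np.random.default_rng(0)
mu = rng.normal(size=10_000)          # predicted means
sigma = np.ones(10_000)               # predicted standard deviations
y = rng.normal(loc=mu, scale=sigma)   # targets drawn from the model itself
exp_p, obs_p = quantile_calibration(y, mu, sigma)
miscal = np.mean(np.abs(obs_p - exp_p))  # mean absolute calibration error
```

A recalibration step, as discussed in the first part of the thesis, would then adjust the predicted quantiles so that this observed-vs-expected gap shrinks on held-out data.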

History

Date

2025-07-23

Degree Type

  • Dissertation

Thesis Department

  • Machine Learning

Degree Name

  • Doctor of Philosophy (PhD)

Advisor(s)

Jeff Schneider