Carnegie Mellon University

First-Order Methods in Convex Optimization: Acceleration, Conditioning, and Rescaling

Posted on 2019-05-29, 16:10; authored by David Gutman
This thesis focuses on three themes related to the mathematical theory of first-order methods for convex minimization: acceleration, conditioning, and rescaling.
Chapters 1 and 2 explore the acceleration theme. In Chapter 1, we give a novel proof of the O(1/k) and O(1/k²) convergence rates of the proximal gradient and accelerated proximal gradient methods for composite convex minimization. The crux of the new proof is an upper bound constructed via the convex conjugate of the objective function. Chapter 2 extends the approach of Chapter 1 to the convergence analysis of Bregman proximal first-order algorithms for convex minimization. We provide novel proofs of the convergence rates of the Bregman proximal subgradient, Bregman proximal gradient, and a new accelerated Bregman proximal gradient algorithm under fairly general and mild assumptions. Our accelerated Bregman proximal gradient algorithm attains the best-known accelerated rate of convergence when suitable relative smoothness and triangle scaling assumptions hold, yet it requires no prior knowledge of any relative smoothness or triangle scaling constants.
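
To make the setting concrete, here is a minimal sketch of the two methods analyzed in Chapter 1, in Python with NumPy. The names grad_f and prox_g and the fixed step size are illustrative assumptions rather than the thesis's notation, and the sketch omits the conjugate-based analysis that is the chapter's actual contribution.

    import numpy as np

    def proximal_gradient(grad_f, prox_g, x0, step, iters=1000):
        # Minimize f(x) + g(x), with f smooth and g "simple" (cheap proximal
        # map). Attains an O(1/k) objective-gap rate for step <= 1/L, where
        # L is the Lipschitz constant of grad_f.
        x = x0
        for _ in range(iters):
            x = prox_g(x - step * grad_f(x), step)
        return x

    def accelerated_proximal_gradient(grad_f, prox_g, x0, step, iters=1000):
        # FISTA-style accelerated variant with the faster O(1/k^2) rate.
        x, y, t = x0, x0, 1.0
        for _ in range(iters):
            x_next = prox_g(y - step * grad_f(y), step)
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)
            x, t = x_next, t_next
        return x

For example, with f(x) = ||Ax - b||²/2 and g(x) = λ||x||₁ (the lasso), prox_g is soft-thresholding, prox_g(v, t) = sign(v) · max(|v| - λt, 0), and step = 1/||A||² (spectral norm squared) satisfies the step-size condition.
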
Chapter 3 explores the conditioning theme by proposing a condition number of a differentiable convex function relative to a reference set and distance function pair. This relative condition number is defined as the ratio of a relative smoothness constant to a relative strong convexity constant. We show that the relative condition number extends the main properties of the traditional condition number, both in terms of its geometric insight and in terms of its role in characterizing the linear convergence of first-order methods for constrained convex minimization.
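
In symbols, a schematic rendering of this definition (the bounds below are a standard way to state relative smoothness and relative strong convexity; the thesis's exact formulation relative to the reference set and distance pair may differ in detail): for a differentiable convex f, a reference set S, and a distance-like function D,

    \[
    f(y) \le f(x) + \langle \nabla f(x),\, y - x \rangle + L\, D(x, y)
    \quad\text{and}\quad
    f(y) \ge f(x) + \langle \nabla f(x),\, y - x \rangle + \mu\, D(x, y)
    \qquad \text{for all } x, y \in S,
    \]
    \[
    \kappa(f; S, D) = \frac{L}{\mu}.
    \]

Taking S = ℝⁿ and D(x, y) = ||y − x||²/2 recovers the classical condition number L/μ of a smooth, strongly convex function, which is the sense in which the relative notion extends the traditional one.
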
Chapter 4 explores the rescaling theme. In this chapter, we propose three enhanced versions of the projection and rescaling algorithm's basic procedures, using an efficient algorithmic implementation of Carathéodory's Theorem. Each of these enhanced procedures improves upon the order of complexity of its analogue in Peña and Soheili (Math Program 166(1):87–111, 2017) when the dimension of the subspace is sufficiently smaller than the dimension of its ambient space.
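
For reference, a minimal NumPy sketch of the Carathéodory reduction underlying such enhanced procedures (a generic implementation for illustration; the thesis's efficient version and its integration with the basic procedures are more refined): given a point written as a convex combination of m points in ℝⁿ, it prunes the representation to at most n + 1 active points.

    import numpy as np

    def caratheodory_reduce(points, weights, tol=1e-12):
        # points: (m, n) array; weights: (m,) nonnegative, summing to 1.
        # Returns weights with at most n + 1 nonzeros that represent the
        # same point sum_i weights[i] * points[i].
        pts = np.asarray(points, dtype=float)
        w = np.asarray(weights, dtype=float).copy()
        n = pts.shape[1]
        active = np.flatnonzero(w > tol)
        while active.size > n + 1:
            P = pts[active]
            # Affine dependence: c != 0 with sum(c) = 0 and sum_i c_i P_i = 0.
            # The last right singular vector of the n x (k-1) matrix below
            # lies in its null space because k - 1 > n.
            _, _, Vt = np.linalg.svd((P[1:] - P[0]).T)
            c_rest = Vt[-1]
            c = np.concatenate(([-c_rest.sum()], c_rest))
            if c.max() <= tol:
                c = -c  # ensure a positive entry so some weight can reach zero
            pos = c > tol
            alpha = np.min(w[active][pos] / c[pos])
            # Shifting along the dependence keeps the represented point and
            # the weight sum unchanged, keeps all weights nonnegative, and
            # drives at least one weight to zero.
            w[active] -= alpha * c
            w[w < tol] = 0.0
            active = np.flatnonzero(w > tol)
        return w

Roughly speaking, keeping such convex-combination representations small is what drives the improved complexity when the subspace dimension is much smaller than the ambient dimension.
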

History

Date

2019-05-16

Degree Type

  • Dissertation

Department

  • Mathematical Sciences

Degree Name

  • Doctor of Philosophy (PhD)

Advisor(s)

Javier Peña
