Carnegie Mellon University
zing_master_philosophy_2024.pdf (1.06 MB)

Structure Learning with Continuous Optimization: A Sober Look and Beyond

thesis
posted on 2024-06-26, 19:08, authored by Ignavier Ng

This thesis investigates the cases in which continuous optimization for directed acyclic graph (DAG) structure learning can and cannot perform well, explains why this happens, and suggests possible directions for making the search procedure more reliable. Reisach et al. (2021) suggested that the remarkable performance of several continuous structure learning approaches is primarily driven by a high agreement between the order of increasing marginal variances and the topological order, and demonstrated that these approaches do not perform well after data standardization. We analyze this phenomenon for continuous approaches assuming equal and non-equal noise variances, and show that the statement may not hold in either case by providing counterexamples, justifications, and possible alternative explanations. We further demonstrate that nonconvexity may be a main concern, especially for the non-equal noise variances formulation, and that recent advances in continuous structure learning fail to yield improvement in this case. Our findings suggest that future work should take the non-equal noise variances formulation into account to handle more general settings and to enable a more comprehensive empirical evaluation. Lastly, we provide insights into other aspects of the search procedure, including thresholding and sparsity, and show that they play an important role in the final solutions.
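The variance-ordering phenomenon the abstract refers to can be illustrated with a minimal sketch (not code from the thesis; the chain graph, edge weight, and sample size are illustrative assumptions). In a linear SEM whose edge weights exceed 1 in magnitude, marginal variances grow along the topological order, so sorting variables by variance recovers that order; after standardization, all marginal variances equal 1 and this signal vanishes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Linear SEM over the chain X0 -> X1 -> X2 with equal unit noise variances.
# Var(X0) = 1, Var(X1) = b^2 * Var(X0) + 1, Var(X2) = b^2 * Var(X1) + 1,
# so variances increase along the topological order when |b| > 1.
b = 1.5
X0 = rng.normal(size=n)
X1 = b * X0 + rng.normal(size=n)
X2 = b * X1 + rng.normal(size=n)
X = np.column_stack([X0, X1, X2])

# Sorting by marginal variance recovers the topological order [0, 1, 2].
order_by_variance = np.argsort(X.var(axis=0))

# After standardization all marginal variances are 1, so the
# variance-sorting heuristic carries no information about the order.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
standardized_variances = Xs.var(axis=0)
```

Whether a given continuous structure learning method actually exploits this signal, and whether standardization alone explains its performance, is precisely what the thesis examines.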

History

Date

2024-05-12

Degree Type

  • Master's Thesis

Department

  • Philosophy

Degree Name

  • Master of Science (MS)

Advisor(s)

Peter Spirtes, Kun Zhang
