Authored by Ioannis Koutis, Gary L. Miller, Ali Sinop, David Tolliver
Linear systems and eigenvalue computations on symmetric diagonally dominant
(SDD) matrices occur ubiquitously in computer vision, computer
graphics, and machine learning. In the past decade a multitude of specialized
solvers have been developed to tackle restricted instances of SDD systems
for a diverse collection of problems including segmentation, gradient
inpainting, and total variation. In this paper we explain and apply the support
theory of graphs, a set of techniques developed by the theoretical computer
science community, to construct SDD solvers with provable properties.
To demonstrate the power of these techniques, we describe an efficient
multigrid-like solver based on support theory principles. The
solver tackles problems in fairly general and arbitrarily weighted topologies
not supported by prior solvers. It achieves state-of-the-art empirical results
while providing robust guarantees on the speed of convergence. The method
is evaluated on a variety of vision applications.
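For readers who want a concrete picture of the problem class, the sketch below is not the solver described in the paper; it only assembles the weighted graph Laplacian of a small grid graph, which is symmetric diagonally dominant (each diagonal entry is at least the absolute sum of the off-diagonal entries in its row), and solves the resulting SDD system with Jacobi-preconditioned conjugate gradients via SciPy. The grid size, edge weights, and right-hand side are placeholder choices, and the simple diagonal preconditioner stands in for the multigrid-like, support-theory preconditioner of the paper.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg, LinearOperator

# Weighted Laplacian of a 4-connected grid graph: a canonical SDD matrix
# in vision problems (segmentation, inpainting, total variation).
n = 32                                    # grid side length (placeholder)
rng = np.random.default_rng(0)
W = sp.lil_matrix((n * n, n * n))
for i in range(n):
    for j in range(n):
        u = i * n + j
        if i + 1 < n:                     # edge to the neighbour below
            w = rng.uniform(0.1, 1.0)
            W[u, (i + 1) * n + j] = w
            W[(i + 1) * n + j, u] = w
        if j + 1 < n:                     # edge to the right-hand neighbour
            w = rng.uniform(0.1, 1.0)
            W[u, i * n + j + 1] = w
            W[i * n + j + 1, u] = w
W = W.tocsr()
L = sp.diags(np.asarray(W.sum(axis=1)).ravel()) - W   # L = D - W is SDD

# Zero-mean right-hand side keeps the singular Laplacian system consistent.
b = rng.standard_normal(n * n)
b -= b.mean()

# Conjugate gradients with a Jacobi (diagonal) preconditioner; this is a
# generic baseline, not the support-theory preconditioner of the paper.
d = L.diagonal()
M = LinearOperator(L.shape, matvec=lambda v: v / d)
x, info = cg(L, b, M=M, maxiter=500)
print("converged" if info == 0 else f"cg returned {info}",
      "| residual:", np.linalg.norm(L @ x - b))
```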