Carnegie Mellon University
Fast kernel matrix-vector multiplication with application to Gaussian process learning

journal contribution
posted on 1993-05-01, 00:00 authored by Alexander Gray
Abstract: "A number of core computational problems in machine learning, both old and new, can be cast as a matrix-vector multiplication between a kernel matrix or class-probability matrix and a vector of weights. This arises prominently, for example, in the kernel estimation methods of nonparametric statistics, many common probabilistic graphical models, and the more recent kernel machines. After highlighting the existence of this computational problem in several well-known machine learning methods, we focus on a solution for one specific example for clarity, Gaussian process (GP) prediction -- one whose applicability has been particularly hindered by this computational barrier. We demonstrate the application of a recent N-body approach developed specifically for statistical problems, employing adaptive computational geometry and finite-difference approximation. This core algorithm reduces the O(N²) matrix-vector multiplications within GP learning to O(N), making the resulting overall learning algorithm O(N). GP learning for N = 1 million points is demonstrated."
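For context, the O(N²) bottleneck the abstract describes is the dense kernel matrix-vector product itself. The sketch below shows this naive operation for a Gaussian (RBF) kernel; the kernel choice and bandwidth parameter `h` are illustrative assumptions, not details taken from the paper, which replaces this computation with a tree-based N-body approximation.

```python
import numpy as np

def kernel_matvec(points, weights, h=1.0):
    """Naive O(N^2) kernel matrix-vector multiply.

    points:  (N, d) array of data points
    weights: (N,) weight vector
    h:       Gaussian kernel bandwidth (illustrative choice)
    """
    # Pairwise squared distances via broadcasting: (N, N) matrix.
    diff = points[:, None, :] - points[None, :, :]
    sq_dist = np.sum(diff ** 2, axis=-1)
    # Full N x N Gaussian kernel matrix -- the quadratic cost
    # in both time and memory that the paper's method avoids.
    K = np.exp(-sq_dist / (2.0 * h ** 2))
    return K @ weights

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 2))
w = rng.standard_normal(100)
v = kernel_matvec(X, w)
```

A tree-based N-body method computes an approximation to the same product without ever forming `K`, which is what brings the per-multiplication cost down to O(N).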
