Carnegie Mellon University
Deflating the Dimensionality Curse Using Multiple Fractal Dimensions

journal contribution
posted on 1980-01-01, 00:00 authored by Bernd-Uwe Pagel, Flip Korn, Christos Faloutsos
Nearest neighbor queries are important in many settings, including spatial databases (find the k closest cities) and multimedia databases (find the k most similar images). Previous analyses have concluded that nearest neighbor search is hopeless in high dimensions, due to the notorious “curse of dimensionality”. However, a precise analysis over real data sets has remained an open problem. The typical and often implicit assumption in previous studies is that the data is uniformly distributed, with independence between attributes. However, real data sets overwhelmingly disobey these assumptions; rather, they are typically skewed and exhibit intrinsic (“fractal”) dimensionalities that are much lower than their embedding dimension, e.g., due to subtle dependencies between attributes. We show how the Hausdorff and correlation fractal dimensions of a data set can yield extremely accurate formulas that predict I/O performance to within one standard deviation. The practical contributions of this work are our accurate formulas, which can be used for query optimization in spatial and multimedia databases. The theoretical contribution is the “deflation” of the dimensionality curse. Our theoretical and empirical results show that previous worst-case analyses of nearest neighbor search in high dimensions are over-pessimistic, to the point of being unrealistic. Performance depends critically on the intrinsic (“fractal”) dimensionality, as opposed to the embedding dimension that the uniformity assumption incorrectly implies.
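
The key quantity in the abstract is the intrinsic (correlation) fractal dimension of the data set, as opposed to its embedding dimension. As a minimal illustration only, the Python sketch below estimates the correlation fractal dimension D2 by the standard box-counting procedure: count grid-cell occupancies at several cell sizes r and fit the slope of log S2(r) versus log r. This is not the paper's I/O prediction formulas; the function name, the choice of radii, and the demo data are assumptions made for the example.

```python
import numpy as np

def correlation_fractal_dimension(points, radii):
    """Estimate the correlation fractal dimension D2 of a point set.

    For each grid side r, assign points to cells of side r, compute
    S2(r) = sum_i p_i^2 over the cell occupancy fractions p_i, and fit
    the slope of log S2(r) against log r; D2 is that slope.
    """
    points = np.asarray(points, dtype=float)
    log_r, log_s2 = [], []
    for r in radii:
        cells = np.floor(points / r).astype(np.int64)   # grid-cell index per point
        _, counts = np.unique(cells, axis=0, return_counts=True)
        p = counts / len(points)                        # fraction of points in each occupied cell
        log_r.append(np.log(r))
        log_s2.append(np.log(np.sum(p ** 2)))
    slope, _ = np.polyfit(log_r, log_s2, 1)             # slope of the log-log plot
    return slope

if __name__ == "__main__":
    # Points on a line embedded in 3-D: embedding dimension 3,
    # intrinsic (fractal) dimension close to 1.
    rng = np.random.default_rng(0)
    t = rng.random((20000, 1))
    line = np.hstack([t, 2.0 * t, -t])
    print(correlation_fractal_dimension(line, radii=[0.01, 0.02, 0.05, 0.1]))
```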

Publisher Statement

All Rights Reserved

Date

1980-01-01
