
Deflating the Dimensionality Curse Using Multiple Fractal Dimensions

journal contribution
posted on 01.01.1980, 00:00 by Bernd-Uwe Pagel, Flip Korn, Christos Faloutsos
Nearest neighbor queries are important in many settings, including spatial databases (find the k closest cities) and multimedia databases (find the k most similar images). Previous analyses have concluded that nearest neighbor search is hopeless in high dimensions, due to the notorious “curse of dimensionality”. However, a precise analysis over real data sets is still an open problem. The typical and often implicit assumption in previous studies is that the data is uniformly distributed, with independence between attributes. However, real data sets overwhelmingly disobey these assumptions; rather, they typically are skewed and exhibit intrinsic (“fractal”) dimensionalities that are much lower than their embedding dimension, e.g., due to subtle dependencies between attributes. We show how the Hausdorff and correlation fractal dimensions of a data set can yield extremely accurate formulas that can predict I/O performance to within one standard deviation. The practical contribution of this work is our accurate formulas, which can be used for query optimization in spatial and multimedia databases. The theoretical contribution is the “deflation” of the dimensionality curse. Our theoretical and empirical results show that previous worst-case analyses of nearest neighbor search in high dimensions are over-pessimistic, to the point of being unrealistic. The performance depends critically on the intrinsic (“fractal”) dimensionality, as opposed to the embedding dimension that the uniformity assumption incorrectly implies.
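The abstract's central quantity, the correlation fractal dimension (often written D2), can be estimated by box-counting: for each grid side r, sum the squared occupancy fractions of the grid cells and measure the slope of log S(r) versus log r. The sketch below is a minimal illustration of that idea, not the paper's actual procedure; the function name, radii, and the least-squares fit are assumptions for the example.

```python
import numpy as np

def correlation_dimension(points, radii):
    """Estimate the correlation fractal dimension D2 via box-counting.

    For each cell side r, partition space into a grid of side r, compute the
    fraction p_i of points falling in each occupied cell, and form
    S(r) = sum_i p_i**2.  D2 is the slope of log S(r) against log r.
    (Illustrative sketch; not the paper's exact estimator.)
    """
    points = np.asarray(points, dtype=float)
    log_r, log_s = [], []
    for r in radii:
        # map each point to the integer coordinates of its grid cell
        cells = np.floor(points / r).astype(np.int64)
        _, counts = np.unique(cells, axis=0, return_counts=True)
        p = counts / len(points)
        log_r.append(np.log(r))
        log_s.append(np.log(np.sum(p ** 2)))
    # least-squares slope of log S(r) vs. log r
    slope, _ = np.polyfit(log_r, log_s, 1)
    return slope
```

For data lying on a line embedded in 2-D this yields a value near 1 rather than the embedding dimension 2, which is the distinction between intrinsic and embedding dimensionality that the abstract emphasizes.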


Publisher Statement

All Rights Reserved
