On the Estimation of $\alpha$-Divergences
We propose new nonparametric, consistent estimators of the Rényi-$\alpha$ and Tsallis-$\alpha$ divergences between continuous distributions. Given two independent and identically distributed samples, a `brute force' approach would be to estimate the underlying densities and plug them into the corresponding divergence formulas. However, it is not our goal to consistently estimate these possibly high-dimensional densities, and our algorithm avoids estimating them. We use simple statistics based on $k$-nearest-neighbor ($k$-NN) distances, and, interestingly, we can still prove that the proposed divergence estimators are consistent under certain conditions. We also show how to use them for mutual information estimation and demonstrate their efficiency in numerical experiments.
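To make the $k$-NN idea concrete, the sketch below implements one standard construction of this type from the $k$-NN estimation literature; it is an illustration, not necessarily the paper's exact estimator. Here $\rho_k(i)$ is the distance from the $i$-th point of the first sample to its $k$-th nearest neighbor within that sample, $\nu_k(i)$ is the distance to its $k$-th nearest neighbor in the second sample, and the ratio of these distances acts as a surrogate for the density ratio $p/q$, without ever estimating $p$ or $q$ themselves. The function name `renyi_alpha_divergence`, the parameter defaults, and the gamma-function bias-correction constant `B` are illustrative assumptions.

\begin{verbatim}
# Hedged sketch of a k-NN based Renyi-alpha divergence estimator.
# Assumes 0 < alpha != 1 and 1 - k < alpha < k + 1 so that B is finite.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma

def renyi_alpha_divergence(X, Y, alpha=0.8, k=5):
    """Estimate D_alpha(p || q) from X ~ p (n x d) and Y ~ q (m x d)
    using only k-nearest-neighbor distances."""
    n, d = X.shape
    m = Y.shape[0]
    # rho[i]: distance from X[i] to its k-th nearest neighbor in X \ {X[i]}
    # (query k+1 neighbors because X[i] matches itself at distance 0)
    rho = cKDTree(X).query(X, k=k + 1)[0][:, -1]
    # nu[i]: distance from X[i] to its k-th nearest neighbor in Y
    nu = cKDTree(Y).query(X, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    # Bias-correction constant for the alpha-power of the k-NN
    # density-ratio surrogate (an assumption of this sketch).
    B = gamma(k) ** 2 / (gamma(k - alpha + 1) * gamma(k + alpha - 1))
    # Empirical estimate of the integral of p^alpha * q^(1 - alpha).
    ratios = ((n - 1) * rho ** d / (m * nu ** d)) ** (1.0 - alpha)
    return np.log(B * np.mean(ratios)) / (alpha - 1.0)

# Sanity check on two univariate Gaussians with equal variance, where
# the closed form is alpha * (mu1 - mu2)^2 / (2 * sigma^2) = 0.1 here.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(2000, 1))
Y = rng.normal(0.5, 1.0, size=(2000, 1))
print(renyi_alpha_divergence(X, Y, alpha=0.8))
\end{verbatim}

For mutual information, one standard route is to apply the same estimator with the first sample drawn from the joint distribution and the second from the product of the marginals (e.g., obtained by independently permuting one coordinate on a held-out half of the data); the Tsallis variant follows by replacing the logarithm with $(\hat{I} - 1)/(\alpha - 1)$, where $\hat{I}$ is the quantity inside the logarithm.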