Undersmoothed kernel entropy estimators

Liam Paninski and Masanao Yajima

In press, IEEE Transactions on Information Theory

We develop a "plug-in" kernel estimator for the differential entropy that is consistent even if the kernel width tends to zero as quickly as $1/N$, where $N$ is the number of i.i.d. samples. Thus, accurate density estimates are not required for accurate kernel entropy estimates; in fact, when estimating entropy it pays to sacrifice some accuracy in the corresponding density estimate.
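To make the plug-in idea concrete, here is a minimal sketch of a leave-one-out Gaussian-kernel plug-in entropy estimate, comparing a standard (Silverman-rule) bandwidth with an undersmoothed bandwidth on the order of $1/N$. This is a generic illustration of plug-in kernel entropy estimation, not the specific estimator or bias analysis of the paper; the function name and bandwidth choices are illustrative assumptions.

```python
import numpy as np

def kernel_entropy(x, h):
    """Leave-one-out plug-in kernel entropy estimate with a Gaussian kernel.

    Illustrative sketch only: computes -mean(log p_hat(x_i)), where p_hat
    is a kernel density estimate that excludes the i-th sample itself.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # pairwise squared distances; self-terms excluded via -inf in log-space
    d2 = (x[:, None] - x[None, :]) ** 2
    log_k = -d2 / (2.0 * h ** 2)
    np.fill_diagonal(log_k, -np.inf)  # leave-one-out: drop the self-kernel
    # numerically stable log-sum-exp over each row
    m = log_k.max(axis=1)
    log_p = m + np.log(np.exp(log_k - m[:, None]).sum(axis=1))
    log_p -= np.log((n - 1) * h * np.sqrt(2.0 * np.pi))
    return -log_p.mean()

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
h_silverman = 1.06 * x.std() * len(x) ** (-1.0 / 5.0)  # standard KDE bandwidth
h_tiny = 1.0 / len(x)                                  # undersmoothed: h ~ 1/N
est_silverman = kernel_entropy(x, h_silverman)
est_tiny = kernel_entropy(x, h_tiny)
# true differential entropy of N(0,1) is 0.5*log(2*pi*e) ~ 1.419
print(est_silverman, est_tiny)
```

With the well-smoothed bandwidth the estimate lands near the true Gaussian entropy; the undersmoothed bandwidth gives a much rougher density estimate, and the point of the paper is that consistency of the entropy estimate can nonetheless survive such aggressive undersmoothing.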